The Federal Trade Commission (FTC) announced it is cracking down on companies that use AI schemes to deceive consumers. The crackdown is part of a new law enforcement sweep known as Operation AI Comply.
According to the agency, companies that offer AI tools capable of churning out fake reviews, for instance, will not be spared.
Companies that claim they can use AI to help consumers make more money also fall within the sweep and face enforcement action.
“Using AI tools to trick, mislead, or defraud people is illegal,” said FTC chair Lina M. Khan. “The FTC’s enforcement actions make clear that there is no AI exemption from the laws on the books.”
Three of the firms named in the sweep, FBA Machine, Ecommerce Empire Builders, and Ascend Ecom, claimed that the business opportunities they offered could generate passive income through AI-powered online stores. The FTC’s complaints allege that these businesses cheated consumers out of millions of dollars and rarely delivered on their promises.
DoNotPay settles with FTC over AI “robot lawyer” claims
The enforcement action also includes DoNotPay, an AI company that recently attracted attention for claiming to have created the world’s first “robot lawyer.” The FTC noted that the AI-powered service offered legal advice without human lawyers and even let users file assault lawsuits and generate legal documents.
However, the company could not substantiate its claims that the AI’s output matched the expertise of a human lawyer. The service had not been adequately tested, and no licensed attorneys were involved in operating it. Under the FTC settlement, DoNotPay will pay $193,000 and notify consumers who used the service. The company is also prohibited from making misleading claims about its ability to replace a professional legal service.
In a post on X last year, DoNotPay CEO Joshua Browder stated that the company had received “threats from State Bar prosecutors” and that he could end up in jail if he went ahead with plans to have an AI lawyer argue in court via smart glasses.
Rytr faces consequences for AI-generated fake review tool
The FTC has also gone after Rytr, an AI writing tool that generates consumer reviews. The service let users produce an unlimited number of reviews, which frequently contained fabricated details unrelated to the products in question. These fake reviews misled potential buyers and distorted the online marketplace.
As part of the settlement, Rytr is also barred from offering or advertising any AI-generated review services in the future. According to the FTC, this will help combat the spread of fake information in consumer reviews.
Under Khan’s leadership, the FTC has focused on actions against firms that deploy AI in deceptive ways. Earlier this year, the agency launched an inquiry into partnerships between leading tech corporations and fast-growing generative AI companies.