FTC cracks down on ‘deceptive’ AI projects, including an AI lawyer

The United States Federal Trade Commission has taken action against companies it claims have misled consumers using artificial intelligence, including a firm that billed itself as offering an AI lawyer.

The FTC said on Sept. 25 that it was launching “Operation AI Comply” as part of a new “law enforcement sweep,” cracking down on five companies “that have relied on artificial intelligence as a way to supercharge deceptive or unfair conduct that harms consumers.”

“Using AI tools to trick, mislead, or defraud people is illegal,” said FTC Chair Lina Khan in a statement. She added that the enforcement action makes it clear there is “no AI exemption from the laws on the books.” 

“By cracking down on unfair or deceptive practices in these markets, FTC is ensuring that honest businesses and innovators can get a fair shot and consumers are being protected.”

The FTC took action against DoNotPay, a company that marketed itself as the “world’s first robot lawyer” but, according to the agency, failed to provide valid legal services.

DoNotPay promised it could “sue for assault without a lawyer” and “generate perfectly valid legal documents in no time,” but the agency claimed it “could not deliver on these promises.”

The FTC accused the company of failing to test whether its AI chatbot performed on par with a human lawyer and alleged that it did not hire or retain any attorneys.

The service also claimed it could check small business websites for legal violations using just an email address, a feature the FTC said was “not effective.”

DoNotPay settled by agreeing to pay $193,000 and to notify consumers about the limitations of its services.

The FTC also targeted Ascend Ecom, a firm accused of running a fraudulent online business scheme that claimed AI could help consumers earn significant passive income. According to the lawsuit, the company allegedly defrauded consumers of over $25 million.

Related: Hong Kong prepares AI guidelines for finance sector

Ecommerce Empire Builders (EEB) was charged with falsely claiming to help consumers build lucrative AI-powered e-commerce businesses. According to the FTC, the company’s promises of high earnings did not materialize, leading to consumer complaints.

The fourth firm targeted by the FTC, Rytr, allegedly marketed an AI writing assistant that generated misleading consumer reviews, contributing to a marketplace of false information. A proposed settlement would prohibit the company from offering such services in the future.

Finally, the FBA Machine, a scheme promising guaranteed income through AI-powered online stores, was hit with a lawsuit from the FTC in June. The scheme allegedly cost consumers over $15.9 million, and the case is ongoing.

Magazine: All rise for the robot judge: AI and blockchain could transform the courtroom