New York (EFE).- The Federal Trade Commission (FTC) has opened an extensive investigation into OpenAI to determine whether its artificial intelligence (AI) chatbot ChatGPT has violated consumer protection laws by putting users' reputations and personal data at risk, The Washington Post reported on Thursday.
The commission this week sent the Sam Altman-led startup a 20-page civil investigative demand posing several questions to OpenAI, including about its practices for training AI models and its handling of users' personal information.
ChatGPT called to account
Among the various examples detailed in the lengthy letter, seen by The Washington Post, was a 2023 incident in which the company disclosed a bug that allowed some users to see information from other users' chats as well as payment-related information belonging to other users.
ChatGPT gained huge popularity after its free public launch last November, reaching a record 100 million users within two months.
The FTC asked OpenAI to provide detailed descriptions of all complaints it has received about its products making “false, misleading, derogatory, or harmful” statements.
In addition, according to the letter, the commission is investigating whether the company engaged in unfair or deceptive practices that resulted in “reputational damage” to consumers.
Chatbot data under scrutiny
In one of its responses, ChatGPT said that a lawyer (who actually exists) had made sexually suggestive comments and attempted to grope a student on a school trip, citing an article the chatbot said had appeared in The Washington Post.
But no such article existed, the trip never took place, and the lawyer said he had never been accused of harassing a student.
The agency also demanded a detailed description of the data OpenAI uses to train its products and of how it is working to prevent what the technology industry calls “hallucinations,” a problem that occurs when a chatbot's responses are well structured but completely wrong.
If the FTC determines that a company has violated consumer protection laws, it can impose fines or place the company under a consent decree, which can dictate how it handles data.