Federal Trade Commission issues open warning to AI companies



Summary

In a new publication, the FTC clarifies that AI companies are not exempt from applicable laws.

The Federal Trade Commission has issued a warning to AI companies that fail to meet their obligations to protect the privacy and confidentiality of user data. The FTC is an independent U.S. agency that enforces consumer protection and fair competition laws. It has the authority to take action against companies that violate the law, and it can impose fines and force companies to change their practices.

In a recent blog post, the FTC emphasized that AI companies, especially those offering model-as-a-service products, must honor their privacy commitments to their customers and users. To continually improve their models, these companies need constant access to new data, which can conflict with those privacy obligations. The FTC does not name specific companies, but virtually all providers of large models — including OpenAI and Anthropic — also offer them via an API.

FTC may require deletion of AI models trained on unlawfully obtained data

The FTC warns that companies that fail to live up to their privacy promises may be violating laws enforced by the agency. This includes promises that customer data will not be used for undisclosed purposes, such as secretly training AI models. Previous enforcement actions have already required companies that improperly obtained consumer data to withdraw all products — including models and algorithms — developed in whole or in part with that data.

The FTC also emphasizes that AI companies must honor their commitments to customers regardless of how or where those commitments were made — for example, in promotional materials, in the terms of service on a company’s website, or on online marketplaces. Companies that violate these commitments could face FTC enforcement action.

Finally, the FTC notes that failing to inform customers about material aspects of data collection and use can be just as deceptive as making false promises. The agency can and has taken action against companies that concealed material facts capable of influencing customers’ purchasing decisions.

“There is no AI exemption from the laws on the books. Like all firms, model-as-a-service companies that deceive customers or users about how their data is collected—whether explicitly or implicitly, by inclusion or by omission—may be violating the law,” the article states.
