OpenAI Faces EU Complaint Over ChatGPT's 'Hallucinations' About People
Privacy advocates have launched a new challenge against OpenAI, targeting the company's inability to correct false information generated by its ChatGPT AI chatbot. This development highlights growing concerns about AI accuracy and data protection in the rapidly evolving field of generative AI.
Key Points:
- Vienna-based nonprofit noyb filed a complaint with Austria's data watchdog
- The complaint alleges violations of the EU's General Data Protection Regulation (GDPR)
- ChatGPT reportedly provided inaccurate information about a public figure's birth date
- OpenAI allegedly cannot correct or erase false information generated by its system
The GDPR Challenge
The complaint centers on OpenAI's apparent inability to comply with GDPR requirements, which give individuals the right to have incorrect information about them corrected. According to noyb, OpenAI admitted it cannot prevent its systems from displaying false information and only offered to block or filter results based on specific prompts.
Maartje de Graaf, a lawyer for noyb, stated: "Making up false information is quite problematic in itself. But when it comes to false information about individuals, there can be serious consequences."
Broader Implications
This case could have far-reaching consequences for AI companies operating in the EU. If upheld, it may force significant changes in how AI models are trained and operated, potentially slowing the rollout of generative AI tools in the region.
OpenAI's Growing Legal Challenges
The complaint adds to a growing list of legal issues facing OpenAI, including:
- Copyright lawsuits over training data
- Privacy concerns about scraped data
- Potential defamation suits stemming from AI hallucinations
What's Next
Under the GDPR, regulators can levy fines of up to 4% of a company's global annual turnover. While investigations can take years to resolve, the outcome of this case will be closely watched by AI firms operating in Europe.
As the AI industry continues to grapple with the challenge of hallucinations, this legal action underscores the need for robust solutions that balance innovation with individual rights and data protection.