OpenAI’s ChatGPT is Breaking GDPR, Says Noyb
Noyb, the Austria-based European Center for Digital Rights, has filed a complaint against OpenAI with the Austrian data protection authority (DSB).
The non-profit has accused the AI company of failing to ensure the accuracy of personal data provided by ChatGPT, according to a statement published on April 29.
Noyb said OpenAI has been aware for months that its AI-powered chatbot provides users with false personal information, yet it has failed to address the problem.
“OpenAI openly admits that it is unable to correct incorrect information on ChatGPT. Furthermore, the company cannot say where the data comes from or what data ChatGPT stores about individual people,” Noyb indicated.
“The company is well aware of this problem, but doesn’t seem to care. Instead, OpenAI simply argues that ‘factual accuracy in large language models (LLMs) remains an area of active research.’”
Noyb claimed that these failings mean OpenAI is breaking the EU’s General Data Protection Regulation (GDPR).
In its complaint, the non-profit asked the DSB to take the following actions:
- Investigate OpenAI’s data processing and the measures taken to ensure the accuracy of personal data processed in the context of ChatGPT
- Force the AI company to comply with the complainant’s access request and bring its processing in line with the GDPR
- Impose a fine to ensure future compliance
Maartje de Graaf, a data protection lawyer at Noyb, commented in a public statement: “It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law, when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around.”
Processing False Personal Data Is Unlawful Under GDPR
Noyb explained that although inaccurate information may be tolerable for certain uses – when a student uses ChatGPT for help with their homework, for instance – it becomes unlawful in the EU when the inaccurate data is personal information.
“Since 1995, EU law requires that personal data must be accurate. Currently, this is enshrined in Article 5 of GDPR,” argued the non-profit.
Under the same regulation, Article 16 of GDPR grants EU citizens a right to rectification of inaccurate data, and they can also request that false information be deleted.
Finally, under the “right to access” in Article 15 of GDPR, companies must be able to show which data they hold on individuals and what the sources are.
Noyb cited the example of a public figure who repeatedly asked OpenAI to rectify or erase the false date of birth ChatGPT provided about them, but their efforts were in vain.
“It is clearly possible to keep records of training data that was used to at least have an idea about the sources of information. It seems that with each ‘innovation,’ another group of companies thinks that its products don’t have to comply with the law,” De Graaf said.
Previous efforts by several EU data protection authorities have been “fruitless,” Noyb noted.
Noyb was founded in 2017 by Austrian privacy campaigner Max Schrems, who has been fighting data privacy abuses since the early 2010s.