Austria: Data Privacy Complaint Filed against ChatGPT for Giving False Information

The Austrian advocacy group Noyb on Monday filed a complaint against OpenAI, the developer of ChatGPT. The group claims that the chatbot violates the General Data Protection Regulation (GDPR) because it gives out false information about individuals. According to Noyb, there is no procedure for correcting or deleting the false information, as the law requires.

ChatGPT is a chatbot that runs on artificial intelligence (AI): users can ask it questions in plain conversational language, and it answers them.

Noyb filed its complaint together with an individual impacted by the tool. The complainant is a public figure whose identity has been withheld.

Noyb charges that when asked to give the complainant’s birthday, ChatGPT “repeatedly provided incorrect information instead of telling users that it doesn’t have the necessary data.”

The GDPR, however, requires that personal data be accurate. It also establishes a right to have inaccurate information rectified or deleted. And under the right of access enshrined in the GDPR, companies must be able to disclose what data they have stored about individuals and where that data comes from.

OpenAI Must Follow Regulations

Maartje de Graaf, data protection lawyer at Noyb, said in a statement: “Making up false information is quite problematic in itself. But when it comes to false information about individuals, there can be serious consequences. It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around.”

According to Noyb, OpenAI refused the complainant’s request to rectify or delete the data pertaining to him, arguing that it wasn’t possible to correct the data. “OpenAI says it can filter or block data on certain prompts,” Noyb reports, “but not without preventing ChatGPT from filtering all information about the complainant.”

The company, Noyb adds, “failed to adequately respond to the complainant’s access request.”

Said de Graaf: “The obligation to comply with access requests applies to all companies. It is clearly possible to keep records of training data that was used at least [to] have an idea about the sources of information.”

Data Protection Authorities Should Impose Fines

Noyb and the complainant filed their complaint with the Austrian data protection authority (DSB) and have asked the authority to investigate OpenAI’s data processing. The complaint asks the DSB to clarify what measures the company has taken to ensure the accuracy of the personal data it processes.

OpenAI must also “comply with the complainant’s access request” and “bring its processing in line with the GDPR.” Noyb also requests that the DSB impose a fine on the company.

Criticism of Chatbots

According to Noyb, OpenAI itself admits that its chatbot generates “responses to user requests by predicting the next most likely words that might appear in response to each prompt.” The company therefore cannot guarantee the accuracy of the answers ChatGPT gives.

“This is very much a structural problem,” the advocacy group says. According to a November 2023 report by the New York Times, “chatbots invent information at least 3 percent of the time – and as high as 27 percent.” The invention of false information is also known as “AI hallucination.”

OpenAI has already run into trouble with European data privacy regulations. Last year the Italian data protection authority imposed a temporary restriction to prevent ChatGPT from processing user data from Italy. The authority criticized OpenAI, saying “there appears to be no legal basis underpinning the massive collection and processing of personal data” used to train the company’s algorithms. (dpa / js)