Mariane Bunn

EU task force uncertain on ChatGPT's data compliance



For over a year, a European Union task force has been examining how the EU's data protection laws apply to OpenAI's ChatGPT, the popular chatbot. Its preliminary findings reveal uncertainty over key legal issues, including whether OpenAI's data handling methods are lawful and fair. This matters because violations of the EU's privacy laws could lead to fines of up to 4% of OpenAI's global annual turnover, and the company could be ordered to halt any non-compliant data processing activities.


Despite the lack of clarity from EU privacy enforcers, OpenAI is likely to continue operating as usual. That stance is bolstered by the fact that there are currently no laws in force specific to AI, and any such EU legislation is still years away from being enforced. The scrutiny of ChatGPT was partly triggered by complaints, including one filed with Poland's data protection authority over the chatbot generating false information about an individual.


The General Data Protection Regulation (GDPR) applies to any collection and processing of personal data, which includes the activities of large language models like ChatGPT that train on data scraped from the internet. The GDPR also empowers data protection authorities to order the cessation of non-compliant data processing, a power demonstrated when Italy temporarily banned ChatGPT over privacy concerns. OpenAI resumed service in Italy after making changes demanded by the Italian data protection authority, but the legal scrutiny continues.


The task force's report suggests that OpenAI needs a valid legal basis for processing personal data at all stages, from collection to the output of ChatGPT. It also highlights the challenges of processing sensitive data and suggests that OpenAI could improve its legal standing by implementing safeguards to reduce privacy risks.


The task force's existence may influence GDPR enforcement decisions on ChatGPT by delaying them. For example, Poland's data protection authority indicated it would wait for the task force's findings before proceeding with its investigation. The situation underscores the varied approaches of EU data protection authorities to regulating emerging technologies like ChatGPT. OpenAI has not yet responded to the task force's preliminary report. Source: TechCrunch
