OpenAI to let Europeans store data locally amid GDPR concerns
And it won’t be used for training
ChatGPT-maker OpenAI is on course to let some European customers store and process their data within European borders.
The initiative comes as the company faces increasing pressure to comply with the EU’s General Data Protection Regulation (GDPR) and aims to build trust with users in the region.
Businesses, educational institutions and developers can use the new data residency option to store data locally from OpenAI's products, such as ChatGPT Enterprise, ChatGPT Edu and its API platform.
Under its European data residency plans, the AI firm lists features it believes support GDPR compliance, such as advanced encryption and a Data Processing Addendum (DPA).
The company also confirmed data processed under this option will not be retained or used to train its AI models unless users explicitly grant permission.
GDPR guidelines mandate strict protections for personal information, and OpenAI's localised data storage is likely a step to meet these requirements.
This move comes amid increasing regulatory scrutiny of the company's data practices in Europe, including a €15 million fine levied last December by Italy's data protection agency, which found shortcomings in how ChatGPT handled personal data.
Although the move lets European customers create new projects and choose to keep and process their data within Europe, OpenAI notes that existing projects will not have the option for European data residency.
“As of now, European residency can only be configured for new projects, existing projects cannot be updated to have European data residency after creation,” it explained.
With this announcement, OpenAI follows other tech firms such as GitHub, Microsoft and Amazon, which have also implemented localised data storage in Europe.
While it remains debatable whether data residency alone can address broader concerns about transparency in how AI systems handle personal data, the move is likely to bring the company closer to meeting the requirements of Europe's tightening regulatory landscape.
It also represents a significant climb-down for a company that had threatened to leave Europe before the AI Act came into force.