
    Italy Fines OpenAI €15 Million for Violating GDPR Data Privacy Rules

    Italy’s data protection authority, the Garante, has fined OpenAI €15 million ($15.66 million) for violating the EU’s General Data Protection Regulation (GDPR) with its ChatGPT application. The fine follows findings that OpenAI processed personal data without proper consent and failed to report a security breach in March 2023.

    The Garante also criticized OpenAI for not providing age verification mechanisms, potentially exposing children under 13 to inappropriate content. The company was ordered to run a six-month public campaign to inform users about how their data is used to train ChatGPT and their rights to manage that data under GDPR.

    This penalty builds on Italy’s previous actions, including a temporary ChatGPT ban in March 2023, which was lifted after OpenAI addressed data concerns.

    OpenAI has disagreed with the fine, calling it excessive and pledging to appeal, while reaffirming its commitment to respecting user privacy.

    The fine follows a recent European Data Protection Board opinion clarifying that an AI model that anonymizes personal data before deployment may not violate GDPR, as long as no personal data is processed during operation. The EDPB also issued new guidelines on GDPR compliance for data transfers outside the EU, which are open for public consultation until January 2025.
