Meta's plan to use user data for AI training is facing a legal challenge from a data advocacy group, raising concerns about privacy and data protection. The European Center for Digital Rights, also known as NOYB (None Of Your Business), has sent a cease-and-desist letter to Meta, threatening an injunction and a potential class-action lawsuit if the company proceeds with its plan to use personal data from European Facebook and Instagram users to train its AI models.
Meta announced that it would begin using personal data from European users of its Instagram and Facebook platforms for AI training starting May 27. The company claims this will improve the performance of Meta AI by helping it better understand European languages, culture, and history. Meta intends to use publicly available posts, comments, and user interactions with its AI tools to train its AI models in the EU. However, Meta has stated that it will not use private messages or data from users under 18 for AI training.
The core of the dispute lies in Meta's reliance on "legitimate interest" as the legal basis for processing user data for AI training, instead of obtaining explicit opt-in consent. Under the General Data Protection Regulation (GDPR), companies must have a valid legal basis for processing personal data. While one such basis is obtaining explicit consent from users, another is "legitimate interest," which permits processing unless that interest is overridden by users' fundamental rights and freedoms.
NOYB argues that Meta's "legitimate interest" claim is invalid and that the company should obtain explicit consent from users before using their data for AI training. The group points to a previous European Court of Justice ruling that found Meta could not use "legitimate interest" as a basis for targeted advertising. NOYB argues that if Meta cannot claim a "legitimate interest" for targeted advertising, it cannot do so for AI training either.
Several data protection authorities in EU countries, including Belgium, France, and the Netherlands, have expressed concerns about Meta's approach. The Verbraucherzentrale North Rhine-Westphalia (NRW), a German consumer protection organization, has issued a formal warning to Meta, urging the company to stop training its AI models on data from European users.
Meta has introduced an opt-out mechanism for users who do not want their data used for AI training. However, NOYB argues that this opt-out approach does not satisfy the GDPR's requirements because it places the burden on users to defend their privacy, rather than requiring Meta to obtain explicit consent.
If NOYB's injunction request succeeds, Meta could face significant financial penalties. NOYB estimates that Meta could be liable for billions of euros in damages, potentially €500 for each EU monthly active user. Meta also faces the risk of having to delete AI models that were unlawfully trained on EU data.
Meta has defended its AI plans, stating that it has a legitimate interest in training AI tools and that it complies with EU regulations. The company also points out that other companies, such as Google and OpenAI, use data from European users to train their AI models. Meta says it is giving EU users a clear way to object to their data being used for AI training, via email and in-app notifications.
The outcome of this legal challenge could have significant implications for how tech companies handle user data in the context of AI development. It could set a precedent for whether companies can rely on "legitimate interest" as a basis for processing user data for AI training or whether they must obtain explicit consent.