
Tech giants vs. supervisory bodies – the struggle to protect the privacy of Europeans continues

Austrian advocacy group NOYB has achieved a significant victory in its battle against Meta's data practices. Following 11 complaints filed by NOYB, Ireland's Data Protection Commission (DPC), Meta's lead data protection supervisory authority, announced that Meta will halt its plans to use EU/EEA user data for artificial intelligence (AI) training. The decision comes after Meta had argued that it had a "legitimate interest" in processing this data and had offered users only a complex and potentially misleading "opt-out" option.

Meta's plans to use years' worth of users' personal posts, photos, and tracking data for its AI technology, set to take effect on June 26, faced strong opposition from NOYB and several European data protection authorities. Despite initially approving Meta's AI rollout, the DPC reversed its stance after intensive engagement with Meta and pressure from other regulators, announcing that it welcomes Meta's decision to pause its plans to train its large language model on public content shared by adults on Facebook and Instagram across the EU/EEA. The change in the DPC's position is attributed to the collective pressure from NOYB and other organizations such as the Norwegian Consumer Council.

Meta expressed dissatisfaction with the DPC's reversal, highlighting that EU/EEA users would not be able to use its AI services for the time being. Activists, however, criticized Meta's approach, pointing out that the company could still deploy its AI technology on the basis of valid opt-in consent from users.

The fight with regulatory bodies is not new to Meta – since the introduction of its new advertising model in November 2023, which affects Facebook and Instagram users in the EU, the European Commission has also placed the company's practices under investigation. The Commission's investigation, coordinated with data protection authorities, is to conclude within 12 months of the opening of proceedings on 25 March 2024. In May, additional proceedings were launched against Meta concerning the protection of minors. On 1 July, the European Commission informed Meta of its preliminary finding that its "pay or consent" advertising model violates the Digital Markets Act (DMA). The model forces users to choose between paying for an ad-free experience or consenting to personalized ads, without offering a less personalized but otherwise equivalent alternative. The key issues with Meta's model are the lack of a less personalized alternative (users cannot opt for a service that uses less of their personal data but is otherwise equivalent to the personalized-ads service) and the inadequate consent mechanism (users cannot freely exercise their right to consent to the combination of their personal data).

The Commission's preliminary view is that Meta's model does not comply with Article 5(2) of the DMA, which requires gatekeepers to seek users' consent for combining their personal data across services and to offer users who refuse a less personalized but equivalent alternative, rather than conditioning access to the service on that consent. If the findings are confirmed, Meta could face fines of up to 10% of its worldwide turnover, rising to 20% for repeated infringements. The Commission may also impose additional remedies, such as requiring Meta to sell parts of its business or restricting acquisitions. These developments mark critical steps in the ongoing struggle for data privacy, underscoring the role of advocacy groups and regulatory bodies in holding tech giants accountable for their data practices.