Data Privacy Hits Back Against Meta in 2023
Data privacy sits at the top of a long list of problems plaguing Meta, and 2023 further deepened the hole the social media giant finds itself in. Meta and data protection regulators have had an adversarial relationship for years, ever since the EU’s General Data Protection Regulation went into effect and Facebook essentially ignored the majority of requirements, instead opting to find loopholes that let the company continue business as usual.
After years of legal battles, 2023 saw the walls close in on the since-rebranded Meta. The regulatory game of whack-a-mole will continue, but the message is clear: neither Meta nor any other company can defeat data privacy regulations.
The most recent development is the European Data Protection Board's (EDPB) decision, published on November 1, to further crack down on Meta's processing of user data for behavioral advertising (personalized ads).
To fully understand the context of that decision and its real impact–many sites are claiming it bans Meta’s behavioral advertising, which is simply not true–we need to go back to the beginning of 2023.
At the start of the year, the Irish Data Protection Commission (DPC) levied roughly $410 million in fines against Meta for violating Articles 5, 6, 12, and 13 of the GDPR. Those fines stem from Meta's attempts to exploit a (perceived) loophole in the GDPR regarding legal bases for processing user data.
When the GDPR took effect in 2018, Meta shifted its data processing to rely not on user consent but on a "contract" legal basis, arguing that the processing was a contractual necessity to deliver users' unique news feeds and experiences on the Facebook and Instagram platforms.
Those initial rulings found that argument unacceptable, forcing Meta to adopt a different legal basis for its data processing (the $410 million figure was particularly large due to additional transparency violations).
Meta responded by switching its legal basis for data processing to "Legitimate Interests." The aforementioned November EDPB ruling found that basis, too, to be invalid for most of the data processing Facebook and Instagram conduct.
The November ruling came at the request of the Norwegian Data Protection Authority, and while it will force Meta to make yet another change to its data operations, it does not restrict or ban those operations outright, as some have reported. Instead, it dictates that Meta must obtain user consent to process data for personalized advertisements.
Meta has tried to get ahead of this by announcing a paid version of its social media platforms in Europe. Under this model, users in Europe can pay a monthly subscription to access the platforms ad-free; those who do not pay will be shown a consent notice that includes permission for personalized ads. Users who do not give Meta consent will not be able to use the platforms.
Few companies are in Meta's position, so the takeaways from these decisions do not necessarily apply universally to future GDPR enforcement, but the interpretations of both the Norwegian Data Protection Authority and European courts indicate that the enshrined legal basis of "Legitimate Interests" has undergone a transformation.
Whereas Meta likely read it as covering legitimate business interests (a business cannot function properly or fund itself without the data processing), the EU now seems to interpret the basis as covering legitimate technical interests, meaning the service itself cannot operate without the data processing in question.
This would be more in line with how GPS applications function, as they need to process sensitive geolocation data to work and provide basic utility to the user. A change of that magnitude could force other companies to shy away from using “Legitimate Interests” as a basis for data processing activities.
The company also earned a $1.3 billion fine in May, thus far the largest GDPR fine in history, for illegally transferring user data from Europe to the United States. While Meta is in the process of appealing this decision, European courts have ruled against the tech giant time and time again on these matters.
Of course, Meta’s data privacy problems do not end there. Even with a potential path out of its perpetual GDPR troubles through the new pay-or-consent model, it faces resistance on the other side of the Atlantic in the U.S. as well.
As of this writing, 42 state Attorneys General are jointly bringing suit against Meta for violations of the Children's Online Privacy Protection Act (COPPA). The lawsuit claims Meta has perpetrated a scheme "to exploit young users," designing its social media platforms to be "addictive" to young users while failing to take necessary precautions to "limit privacy harms."
Age-gating any platform is incredibly difficult, but here it is doubly so as COPPA requires parental consent before a company can process the data of children under 13 years old. The overwhelming likelihood is that there are millions of children under 13 using Facebook and Instagram without parental consent, and while the lawsuit might have a difficult time proving that, Meta knows the issue exists and has yet to create any meaningful solution to it.
Further details of the U.S. lawsuit are currently redacted, so this saga will play out over time. But with Meta already facing so much regulatory pressure in Europe that it has publicly mused about shutting down its services there, the opening of another legal front could bring the tech giant to its knees.
After Meta was hit with hundreds of millions of dollars' worth of data privacy fines in 2022, 2023 has added more than $1.7 billion to that number, chipped away at its legal bases for data processing, and brought further government scrutiny upon the company.
It will likely continue its fight against regulators to minimize and settle fines and keep its services running (in the black) globally, but 2023 has undoubtedly been the year from hell for Meta on this front.
The message to Meta, and to any other company still looking to skirt or merely pay lip service to data protection and privacy, is clear: comply or else.