TikTok GDPR Fine Shows Emphasis on Protecting Children's Data

James Grieco
Sep 19, 2023

Late last week, the Irish Data Protection Commission announced a roughly $369 million fine for TikTok over multiple GDPR violations in its handling of children’s accounts. The decision comes almost exactly a year after a similar fine was handed down to Instagram, and marks the second time TikTok has been fined over GDPR violations in 2023. 

The first occurrence was in April, when the UK data regulator fined the Chinese-owned social media giant more than $13 million over the platform illegally processing the data of 1.4 million children under 13 years old who were using the app without parental consent. 

The Irish DPC’s report notes that TikTok failed to provide children using the platform with transparent sign-up language, resulting in millions of child users having their accounts set to public view by default. Through this negligence, TikTok vastly underestimated the risk and potential privacy harms posed to children using the app. 

In addition to finding that users between the ages of 13 and 17 were guided through the sign-up process in a way that left their accounts set to public, the Irish DPC found that the “family pairing” scheme designed to provide parental consent for underage users had no verification method to check that the adults paired with child users were actually their parents or guardians.

The size of the fine directly reflects the scope of the violations’ impact as well as the privacy harms opened up by the violations. 

This is the fifth high-profile data protection violation involving children’s data so far this year, counting both GDPR fines issued to TikTok and the three fines the United States’ FTC has issued to Edmodo, Amazon, and Microsoft, with a potential YouTube case also looming. The FTC fines, which total $51 million, all stem from violations of COPPA, the Children’s Online Privacy Protection Act.

To put things in perspective, since the GDPR came into effect in 2018, no single year had seen more than four publicized cases of COPPA and children-related GDPR violations, until now. 2023 has been a busy year. 

Seeing this level of enforcement is encouraging, as any worthwhile regulation needs enforcement to give it real meaning, but the cluster of cases also throws the surprising scarcity of enforcement actions overall into sharp focus. 

Nearly every GDPR fine has targeted Big Tech companies, and this most recent TikTok fine is no different. The overall zeal and consistency of European watchdogs, and of their counterparts globally, do not match the ambitions of the regulations themselves. 

There are numerous reasons why this is, the most prominent being that data protection commissions lack the funding to regularly take up cases and wade through years of legal battles to finalize financial penalties. 

This may be true, but it does not help advance data protection. Whenever a fine is publicized, the public does not sympathize with the company that committed the violations. No one is defending TikTok’s dark patterns and underwhelming privacy by design. 

Increased societal scrutiny of children’s safety, and of whether companies treat children’s data with the utmost care and respect, is a welcome sight, and DPCs have reflected that heightened awareness with the record number of related violations in 2023. However, children’s data is not the only data that should inspire this kind of regulatory intensity and proactive enforcement. After all, the public strongly favors data privacy laws, and every privacy harm devalues the internet, facts that data protection commissions need to keep in mind when approaching enforcement.

The spirit of the GDPR, the CCPA, and the ever-growing number of data protection laws around the world is to champion everyone’s data rights and data privacy. If enforcement revolves mostly around violations involving children’s data, these regulations will continue to fall short of the influence they should have. With the level of enforcement we’ve been seeing in cases involving children applied more broadly, we’ll all be on our way to truly meaningful data rights for all.