Minor changes: Regulators focus on protecting children’s data


Whitney Houston sang, "I believe that children are our future," and it seems that data privacy regulators strongly agree. Regulations on minors’ privacy are evolving rapidly. Across jurisdictions, new laws and amendments are tightening how businesses collect, use, and share children’s data. Prominent examples include the FTC’s January 2025 update to COPPA, which requires verifiable parental opt-in for third-party advertising and expands the definition of personal data to include biometric and government-issued identifiers.
In the UK, the ICO’s Age‑Appropriate Design Code (the “Children’s Code”) requires online services likely to be accessed by under‑18s to apply privacy‑by‑default settings and to design with the child’s best interests at heart. Other countries, such as Australia, Singapore, France, India, and Brazil, are introducing or drafting similar protections.
This regulatory push stems from the understanding that children are uniquely vulnerable. "It's time to strengthen privacy protections, ban targeted advertising to children, demand tech companies stop collecting personal data on our children," US President Joe Biden said in his 2022 State of the Union Address, expressing the sentiment behind the new and revised laws we’re now witnessing.
Spotting trends and focal points
- Next-level age verification
The era of “just type your birthday” is ending. Regulators are pushing for hard verification, not self-declaration. Utah has gone so far as to require app stores to step in, blocking downloads for minors unless age is confirmed. For businesses, age assurance is quickly becoming a baseline compliance feature that raises technical costs and user-experience concerns. It’s also worth noting that some verification methods are themselves invasive, relying on facial recognition, credit card details, and the like.
- Designed for kids’ safety
New rules emphasize privacy by design, where the most protective settings are the default. That means geolocation, behavioral tracking, and profiling must be off until explicit consent is given. The ICO’s Children’s Code and similar laws go further, requiring Data Protection Impact Assessments (DPIAs) that specifically examine risks to children before launch. In practice, companies need to bake “child-first” principles into product development cycles, ensuring privacy settings aren’t an afterthought but a structural part of the service.
- Advertising and profiling limits
Restrictions on advertising to minors are expanding. The EU’s Digital Services Act prohibits ads targeted at users under 18 on the basis of profiling, while other regulators are adding strict rules around behavioral data processing. The trend reflects a recognition that children are particularly vulnerable to manipulation and deserve specific protection.
- Stronger parental control and consent
Regulations now demand usable tools that allow parents to manage kids’ privacy settings, see what data is collected, and revoke permissions at any time. The updated COPPA, for example, requires companies to obtain parental opt-in before using children’s data for targeted advertising. This reflects regulators’ insistence on informed oversight and transparency, with tools designed for real-world use rather than legal compliance only. The expectation is to make parents active gatekeepers, not passive signatories.
- Higher protected ages
Many new rules stretch protections to 16 or even 18. This expands the number of users treated as sensitive, forcing services to embrace stricter data practices. The implication here is clear: older teenagers are still kids, and companies are no longer free to profile them for profit. Businesses now need to rethink how to categorize users and adjust their product and monetization strategies.
The bigger picture: what this says about data privacy regulation overall
Children’s data protection is a test case for how privacy regulation is evolving. The common thread is that companies can no longer rely on passive compliance, like a checkbox buried in terms and conditions. Instead, they must actively prove that protections work in practice.
The shift also signals a cultural change: vulnerabilities are not opportunities to monetize but areas requiring special restraint. Just as minors’ rights push companies to tread carefully with profiling, targeting, and consent, the same mindset is extending into the wider regulatory environment, where accountability, transparency, and proactive design are becoming standard.