

What Organizations Need to Know About Protecting Children’s Data in the Age of AI

Michael Ariyo - April 2, 2026

Children’s privacy used to live in a compliance checklist somewhere between COPPA and “we’ll deal with it later.” Later is here. AI has turned every recommendation engine, voice feature, and chat tool into a potential compliance exposure, and regulators are no longer waiting for companies to figure it out.

Every month, DataGrail brings together privacy attorneys, security practitioners, and compliance professionals for a candid peer exchange on what’s actually happening in the field. Led by privacy lawyers Katelyn Ringrose and Tyler Henry (McDermott Will & Schulte), cybersecurity expert Kenneth Vignali (SPM Advisors), and privacy leader Steve Lappenbusch (Daxko), this month’s huddle covered the legal and ethical ground around children’s data and what organizations need to do to keep up. 

Participants refreshed on landmark children’s privacy regulations and enforcement cases, then dug into the added complexities that arise once AI enters the picture. Children’s safety is already emerging as a major focus of early federal AI policy and enforcement, adding even more pressure to get it right. The group closed by comparing notes on the costs and benefits of various approaches to risk mitigation. 

Read on for the main takeaways.

The regulatory maze of children’s privacy

COPPA remains the foundation in the U.S., but it doesn’t tell the full story of children’s privacy and AI governance. Privacy teams must also consider:

  • Age-appropriate design codes (e.g., the UK’s “Children’s Code,” published in 2020)
  • App store accountability bills that push age-rating obligations onto developers (e.g. the Texas App Store Accountability Act)
  • Social media addiction laws targeting algorithmic feeds (e.g. California Protecting Our Kids from Social Media Addiction Act) 
  • Age-gating requirements for mature content (e.g. Louisiana’s Act 440)
  • AI chatbot laws (e.g. California Senate Bill 243)

Unlike most privacy laws, proposals in states like Oregon and Washington would create a private right of action for violations involving minors. That shifts the risk from regulatory penalties to active litigation exposure. 

It’s not just companies selling AI products that should worry about this intersection. Any company using AI to power recommendation algorithms, digital profiling, voice features, learning tools, chat tools, or even targeted ads should evaluate their risk exposure. 

As Steve Lappenbusch put it, “The risk surface and the depth of risk are both expanding at the same time.”

Enforcement realities: Children’s privacy isn’t just a big tech problem

Three cases anchored the enforcement discussion:

Amazon settled with the FTC in 2023 for $25 million over its failure to honor parents’ requests to delete children’s voice data collected via Alexa.

NGL Labs, far smaller and less well-known, was forced to shut down significant portions of its business entirely after allegedly deceptive claims about AI moderation and other issues. 

Florida Attorney General James Uthmeier classified Roku’s voice-activated remote as a smart speaker under state law and argued that secondary indicators, such as the use of a children’s profile, should have been sufficient to trigger heightened children’s privacy obligations. 

Between these three cases and recent comments from the Federal Trade Commission, it’s clear that regulators will prioritize children’s privacy moving forward. Attendees also anticipate that the current trend of multi-state attorney general coalitions will extend to children’s privacy enforcement.

The verification problem: To verify or not to verify

Children’s safety laws increasingly expect an age verification step before granting access to mature content or disabling certain privacy safeguards. Executing that idea is complicated. Facial age estimation is biased and unreliable. Behavioral profiling is guesswork. Hard ID checks exclude people who do not have government-issued identification. Purchasing age data from third-party vendors could land your company on a data broker list. Operating system signals from Apple or Google help, but they place enormous pressure on platform providers. And any of these approaches can undercut an earnest data minimization effort that protects all users. 

The deeper problem is structural. Children have thin data profiles. No credit cards, limited browsing history, few digital footprints. Vendors selling verification tools often cannot accurately distinguish a child from an adult who simply does not have much of an online presence. Many organizations also do not collect birth dates, which means there is no mechanism to automatically update a user’s permissions when they come of age. Compliance is too often frozen in time rather than built to evolve.
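The “frozen in time” problem has a straightforward technical remedy: derive minor status from a stored birth date at the moment of each check, rather than stamping an immutable flag at signup. A minimal sketch, assuming the organization lawfully collects a birth date (all names and the age threshold here are hypothetical and jurisdiction-dependent):

```python
from datetime import date

# Assumption: 18 is the relevant threshold; many laws draw lines at 13 or 16.
ADULT_AGE = 18

def age_on(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def is_minor(birth_date: date, today: date) -> bool:
    # Recomputed on every check, so permissions update automatically
    # the day the user comes of age -- no batch job or stale flag.
    return age_on(birth_date, today) < ADULT_AGE
```

The design choice is that no system of record ever stores “is a minor” as a fact; it stores the birth date once and evaluates status on demand, so compliance evolves with the user rather than freezing at account creation.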

What you can do about children’s privacy and AI governance now

Challenging as this landscape is, every privacy team can make meaningful progress toward protecting children’s privacy today. 

Here are five tips from our guests:

  1. Ask whether you need the data at all. If you can’t explain why you collect children’s data, do you really need it? That question is where minimization starts.
  2. Map where children’s data lives across your systems, including what your AI models are ingesting or were trained on. Know and document your risks. 
  3. Treat data vendors like any other procurement partner. Not all brokers operate to the same standard, and their problems become your compliance exposure.
  4. Account for data exhaust. Every interaction generates data that often goes untracked in compliance planning. It often becomes a focal point in audits.
  5. Build privacy in from the start. Data minimization, encryption, and role-based access can all help protect children’s data and strengthen your compliance posture.
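To make tips 1 and 5 concrete, here is a minimal sketch of combining role-based access with field-level minimization so that child records expose only what a given role needs. Every field name, role, and the record shape are hypothetical, not an implementation the huddle prescribed:

```python
# Assumption: the minimal field set safe to expose for minors' records.
CHILD_SAFE_FIELDS = {"user_id", "display_name"}

# Assumption: fields each internal role is permitted to see at all.
ROLE_FIELDS = {
    "support":   {"user_id", "display_name", "email"},
    "analytics": {"user_id"},  # no direct identifiers for analytics
}

def redact(record: dict, role: str) -> dict:
    """Return only the fields this role may see; tighten further for minors."""
    allowed = ROLE_FIELDS.get(role, set())
    if record.get("is_minor"):
        # Intersect with the child-safe set: a minor's record is always
        # at least as restricted as an adult's for the same role.
        allowed = allowed & CHILD_SAFE_FIELDS
    return {k: v for k, v in record.items() if k in allowed}
```

The point of the intersection is that minimization for children composes with, rather than replaces, the existing access model: an unknown role sees nothing, and a known role never sees more of a child’s record than it would of an adult’s.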

Children’s data sits at the center of where AI regulation, state law, and federal enforcement are all heading at once. For most organizations, the question is not whether this applies to them. It is how much runway they have left to get ahead of it.


Get in on our next community discussion! Request to join Privacy Roundtable to stay in the loop on the next Privacy Huddle event. 

