Reimagining data governance for children

Insights from innovation

Melanie Penagos, UNICEF Innocenti, and Emma Day, Tech Legality
05 September 2025

In a small island town near the equator, a 13-year-old girl named Melody spends her afternoons playing games on her mother’s smartphone. For her, this is more than just entertainment: it’s a way of connecting with friends and finding her place in the world.

What Melody doesn’t see is what happens behind the screen. The game collects her personal information: her name, phone number, photo, location, and even her patterns of behavior. Some of this information is sold to advertisers, while other data could be misused if it fell into the wrong hands. Although Melody’s country has strong child data protection laws in place, weak implementation means she remains vulnerable.

Melody’s story is not unique. Around the world, children’s data is constantly collected, analyzed, and monetized, often without their knowledge or meaningful safeguards. This raises a pressing question: how can societies turn broad legal principles of data protection into practical, context-specific solutions that truly safeguard children?


Our compendium of Innovations in Data Governance for Children explores forward-looking approaches to this challenge. These examples come from across the globe and demonstrate how governments, regulators, companies, civil society, and even children themselves can play a role in shaping safer digital environments.

Six innovations for responsible child-centered data governance systems

Across contexts, the case studies demonstrate six distinct ways stakeholders are rethinking data governance to put children at the center:

  • Children’s codes: Regulators in some countries have developed detailed guidance for companies on how to apply data protection laws to children. These codes prohibit harmful practices like profiling children for commercial purposes and mandate default protections, such as high privacy settings and restrictions on geolocation.
     
  • Compliance tech: A new market is emerging for digital tools that help companies navigate complex regulations across multiple jurisdictions, such as differing ages of consent for data processing, requirements for age assurance, and prohibitions on targeted advertising to children. Regulators can also use these tools to scan app stores or analyze privacy policies at scale, making oversight more effective.
     
  • Risk and impact assessments: These require companies to consider children’s rights during the design phase and throughout the product lifecycle, shifting the focus from reacting to harm to preventing it. Now required in some countries in the form of data protection, AI, or child rights impact assessments, they help ensure that privacy-by-default and child rights-by-design are integrated into the development process, so potential harms can be identified and mitigated before they occur.
     
  • Regulatory sandboxes: Regulatory sandboxes provide a safe, collaborative space for regulators, companies, civil society, and children to test new technologies and shape rules before products are widely deployed. Cross-border sandboxes allow regulators to learn from one another and develop consistent guidance, helping small or under-resourced data protection authorities (DPAs) navigate cutting-edge technologies like data-driven AI while giving companies greater regulatory certainty.
     
  • Certification schemes: These are emerging as a key tool to support the implementation of children’s data protection laws and standards. Used across sectors such as EdTech and age assurance, they assess company performance against regulations or voluntary standards. Well-designed schemes help companies demonstrate compliance and support regulators in monitoring adherence, while robust oversight of certification bodies is essential to maintain accountability and effectiveness.
     
  • Industry standards: These help translate legal and regulatory requirements into practical guidance for developing child-centered digital products and services, giving engineers and developers concrete instructions to implement regulations, embed child rights-by-design, promote data protection, and reduce potential harms. When standards-making bodies include a broad range of civil society voices in their processes and make their standards publicly accessible, they support transparency and stakeholder engagement.

Why these innovations matter

No single actor can solve the challenge of protecting children’s data. Policymakers need practical tools, companies need clearer guidance and implementation support, regulators need scalable oversight methods, and civil society needs access to information about how companies use children’s data, along with support to advocate effectively for children’s rights. Parents and children themselves also have a vital role to play in shaping solutions that are both protective and empowering.


The innovations explored in this compendium demonstrate that effective data governance is not just about laws: it’s about creating systems of accountability, collaboration, and shared responsibility across multiple stakeholders.

Looking ahead

Children today are growing up with technologies that evolve faster than most legal frameworks can adapt. From AI and machine learning to emerging neurotech, new challenges will continue to test the boundaries of existing protections.

The good news is that many of the innovations highlighted here are adaptable and scalable, though implementing them at scale will require dedicated resources. What matters is ensuring that children’s voices, and the perspectives of those who advocate for them, are included in shaping the future of digital governance.

By learning from these six innovations, we can move closer to a digital world where children like Melody are free to play, learn, and connect without compromising their safety or their rights.

For more, visit UNICEF Innocenti's page on good governance of children's data.