Data governance for children: An emerging priority area for privacy professionals
Recognition of the importance of child rights in data governance is growing

Expert perspective | 4 minute read
Over 4,000 privacy professionals - including regulators, data protection officers, and privacy lawyers from the private sector and governments - attended the IAPP Global Privacy Summit in Washington, DC in April. Children’s rights were on the agenda, reflecting a growing recognition of their importance in data governance.
UNICEF chaired a discussion with the UK Information Commissioner’s Office (ICO), the Irish Data Protection Commission (DPC), and Apple on why all data protection compliance processes should consider children’s data. The aim was to reach a wider audience of privacy professionals who may not always think about children in their work, and to convince them that they must.
A push for better government regulation of children’s data
With children making up a third of internet users, few companies that process data do not also process children’s data. Children, however, are more vulnerable than adults and are less able to understand the long-term implications of consenting to the collection of their data. Concerns around the collection of adults’ data are therefore magnified when it comes to children. How children’s data is handled shapes how it is then used to inform decisions that affect them. The UNICEF Manifesto on Good Governance of Children’s Data offers ten actions that challenge prevailing approaches to children’s data, including the need for policy innovations and a call for companies to stop placing the burden of data protection on children themselves and instead take more responsibility for implementing child rights by design.
"UNICEF is working with regulators and the private sector to support policy innovations on children’s data, and to build partnerships needed to inspire new and better regulations, at a pace needed for the digital age."
The UK ICO and the Irish DPC have pioneered good data governance for children, and their approach is being emulated around the world. The UK ICO introduced the Age Appropriate Design Code (AADC) in 2021, articulating how online services must interpret UK data protection law when processing children’s data. The AADC requires data processing and service design to be age appropriate, and provides that age assurance tools should be used in proportion to the level of risk and in compliance with the code’s other standards and with the law. Similarly, the Irish DPC’s Fundamentals for a child-oriented approach to data processing set out principles and recommended measures for applying privacy by design and by default. The Fundamentals cover issues such as transparency, the exercise of children’s rights, age verification, and profiling for marketing purposes, among others.
Both codes take as their starting point the UN Convention on the Rights of the Child, and emphasise that any processing of children’s data must be in the best interests of the child. Similar codes have been produced in France and the Netherlands, and are under consideration in California and other jurisdictions. Importantly, the European Data Protection Board will soon issue a similar set of Guidelines on children’s data processing which, although not binding, reflect the common position and understanding that the authorities agree to apply.
These codes are already a momentous step forward for children’s rights, but the UK ICO and the Irish DPC are not stopping there. For the Irish DPC, the protection of children and other vulnerable groups is a key strategic goal; it plans to publish further guidance documents imminently and will also contribute to the forthcoming EDPB Guidelines. The UK ICO likewise plans to develop and publish ongoing guidance, and the AADC will be subject to review. A further priority for the UK ICO will be to broaden and deepen its supervision activities and to strengthen international collaboration.
For Apple, privacy by design is part of the business model, and the company recently introduced heightened protections for children, including minimising the collection and use of data, on-device processing, transparency and control, and data security. Apple also provides controls that allow parents to monitor and limit their children’s use.
Industry self-regulation for teens: The US approach
In the absence of specific government regulation of children’s data in the US, industry bodies have provided a self-regulatory framework. The IAPP hosted another panel discussion on children’s rights, Getting teen privacy right, led by BBB National Programs and focused on the US. A 2020 BBB National Programs study found that teenagers are major participants in today’s digital environment and often engage with digital media unaware of the hidden data ecosystem driving it. It also found that teen-directed apps are more likely to serve ads, use third-party trackers, and request more permissions than apps for a general audience, leaving teenagers exposed to greater privacy risks than adults. The BBB National Programs foundation, the US Center for Industry Self-Regulation, recently published a Teenage privacy program roadmap, designed to help companies develop digital products and services that consider and respond to the heightened potential for risks and harms, and to promote responsible collection and management of teen data.
Regulatory futures and the implications for children
Other sessions touched on hot topics in privacy law that also have far-reaching implications for children’s rights. Experts from the Future of Privacy Forum, Hogan Lovells, and Google discussed how to evaluate algorithms in ways that incorporate privacy and ethics. They highlighted growing regulatory trends towards requiring fairness, transparency, accountability and explainability in AI/ML products, all of which are likely to be held to a higher standard when it comes to children. Other speakers raised the challenges for regulators in ensuring that privacy protections are in place for all users and non-users of AR/VR technology as the metaverse expands. With Lego recently announcing a partnership with Epic Games to invest in the metaverse for kids, and Tencent also reportedly increasing its investments in metaverse gaming, the metaverse may be the next frontier for children’s rights.
UNICEF is working with regulators and the private sector to support policy innovations on children’s data, and to build partnerships needed to inspire new and better regulations, at a pace needed for the digital age.