A fiduciary approach to child data governance
Issue brief | Fiduciary structures provide a framework for protecting children's data rights
Children’s data rights, like children themselves, often require supervision.
Nearly every system that creates value or risk also creates ways, explicit or implicit, to participate in that system. In data and algorithmic systems, defining and sharing data inputs carries both value and politics. Social media platforms often give users some control over other users' behaviour, whether as a direct moderator, as a sharer, or by reporting abuse. And, of course, buying stock in a digital platform company may entitle a person to recoup financial value and, in some systems, to exercise decision-making authority. The underlying systems in these examples vary, but all are shaped by the decisions, resources, and participation of large numbers of people: people who must have the legal authority to consent or, in the case of children, the consent of an authorized adult acting on their behalf.
The primary difference between children's data governance and general data governance is the presumption that children cannot effectively represent their own interests. Nearly every modern conception of data rights and governance focuses on locating responsibility for decisions: privacy law, data protection law, and even the Health Insurance Portability and Accountability Act (HIPAA) all use models of consent and public interest to justify data sharing. Yet, just as in the physical world, children, especially those below the age limits set by data rights laws, cannot legally consent to the agreements that form the basis of the digital world's legitimacy. And children are not the only group unable to represent their own interests in how data rights are created, shared, and used to shape the world on their behalf.