AI policy guidance: how the world responded
Key takeaways from the public consultation on child-centred AI policies and systems
This work is part of UNICEF's AI for Children project.
4-minute read
In 2019, we at UNICEF set out on a project to understand the impact that the growing proliferation of artificial intelligence (AI) systems is having on a key section of our population that is so often overlooked when those AI systems are designed and run: children.
This work culminated late last year with the launch of guidance to help support and promote children's rights in AI policies and practices, and to help the world understand how AI systems can uphold or undermine those rights. The guidance was intended not just for our own work, but for anyone championing more child-centred AI policies and systems.
This guidance was released as a draft, not as the final word on the best ways to support the rights of children in a world growing ever more accustomed to – if not dependent on – AI. In fact, we welcomed input from stakeholders in the fields of AI and children's rights, and were excited to receive many dozens of responses representing civil society, academia, governments, the private sector, international organizations, and individuals.
Those responses were overwhelmingly positive, with most respondents finding the purpose of the guidance to be clear and the nine requirements for child-centred AI to be understandable. More importantly, many of these responses included valuable recommendations.
Said one: “This guidance is extremely welcome, in particular the nine requirements for child-centred AI. These requirements and the accompanying recommendations have the potential to foster practical implementation and innovation in the best interests of children.”
Involve parents and educators
One of these recommendations was for AI developers, policymakers and other stakeholders to involve parents and educators, and to develop guidelines on how they can best navigate AI with children. Even though governments and industry must ensure that all AI systems are aligned with the rights, needs and realities of children, much of children's interaction with AI-based systems will take place in the home or at school. Parents and teachers therefore need to be aware of the challenges, risks and opportunities that AI can bring to children.
Respondents stressed the need to provide “short courses for parents, children and teachers over the mechanism of data and the meaning of consent, privacy, etc.”
Consider the different realities of children
Another recommendation was to consider the different realities of children. After all, no two children are the same, and neither are any two contexts. Yes, global guidance is important and necessary, but it is also important to consider the specific realities of minorities, for example, or children with disabilities.
“It is crucial for children, especially those who are classified as victims or witnesses of violence, not to remain 'trapped in a negative AI-mediated correlation' through a risk profile for their lives,” said a respondent.
In fact, several respondents urged the inclusion of explicit recommendations on, for example, children with disabilities, children from marginalised groups and very young children, as well as recommendations on sexual exploitation and victims or witnesses of violence, and a greater range of children’s ages. We were also asked to consider the physical impact of AI on children, from their interaction with robots to the effects of extended use of screens and tablets.
Look at the bigger picture
Respondents also urged us to look at the bigger picture. AI does not exist in a vacuum, and most of children's encounters with it are mediated through other systems and interfaces. The guidance therefore needs to cover, or be extended to cover, other technologies and digital platforms.
Support governmental action
Others stressed the need to support governmental action. Good intentions and fine words are important, but without action there is no change. Incentives for concrete action must be developed, while ensuring that inaction is not an option.
Above all, include children!
Above all, our respondents said, include children! As so many parents know, children often understand technologies like AI better than adults do. This was already evident during the consultation workshops held with children, but it also clearly emerged as a top recommendation: ensure children are part of the implementation of this guidance, and provide them with the means to be heard and to make their suggestions part of the action.
Recommendations, we were told, need to stress that children should be able to initiate and share in decision making, defining the policies that impact upon them rather than just “being heard”.
“Children need to be involved in analysing and interpreting data collected of them,” a respondent said.
The next version of the guidance, which will include input from this consultation, will be released later this year.