Digital misinformation / disinformation and children
10 things you need to know
3 minute read
Misinformation is false or misleading information that is unwittingly shared, while disinformation is deliberately created and distributed with an intent to deceive or harm. Together they range from satire and parody to dangerous conspiracy theories. Here are 10 things you need to know about how they affect children.
1. Misinformation and disinformation (mis/disinformation) online is a pressing public issue.
The rapid spread of mis/disinformation online affects everyone, both online and offline. Because children are active digital users, mis/disinformation is very much a part of their lives. Mis/disinformation on social media spreads farther, faster, and deeper than truthful information. Hot-button and divisive issues, such as immigration, gender politics and equality, and vaccination are common subjects.
2. There can be real-world consequences of mis/disinformation.
Mis/disinformation has been used to incite violence and crime against ethnic minorities, resulting in the deaths and displacement of children. It has also led to lower child COVID-19 vaccination rates, undermined trust in journalism and science, and drowned out marginalized voices.
3. While mis/disinformation is often spread by people, algorithms are a key part of the mis/disinformation flow.
Algorithms drive personalized news feeds and curate search results, content, and friend recommendations by tracking user behaviour. Algorithms sometimes promote misleading, sensationalist and conspiratorial content over factual information and can be key vectors in amplifying the spread of mis/disinformation.
4. Children are vulnerable to the risks of mis/disinformation.
Because of their evolving capacities, children cannot always distinguish between reliable and unreliable information. As a result, not only can they be harmed by mis/disinformation, but they may also spread it among their peers. Even very young children or those without access to social media networks may be exposed to mis/disinformation through their interactions with peers, parents, caregivers and educators.
5. At the same time, children can challenge and debunk mis/disinformation.
Children can be targets and objects of mis/disinformation, but they can also actively counter its flow. They can contribute to online fact-checking and myth-busting initiatives, such as those countering COVID-19 misinformation in Nepal. UNICEF Montenegro’s Let’s Choose What We Watch programme has given young people opportunities to practice their media literacy and journalism skills and so improve the quality of reporting on child rights.
6. Education is important.
Equipping children with critical reading and thinking skills can help them determine the veracity of information. Because mis/disinformation moves easily between online and offline settings, it is important to develop children’s critical thinking skills in non-digital contexts as well.
7. Collective action is required to protect children.
Policymakers, civil society organizations, technology companies, and caregivers including parents and educators must work together to protect children from the harms of mis/disinformation. At present, efforts to slow the spread of mis/disinformation remain uncoordinated, and there is little reliable data on the scale of the problem.
8. Policymakers should devise child rights-based regulations around mis/disinformation.
UNICEF recommends that policymakers devise regulation to protect children from harmful mis/disinformation, while enabling children to safely access diverse content. Regulation should focus on requiring procedures for classifying content and ensuring transparency and accountability. Finding the balance between rights-based online protection and freedom of expression is a very significant policy challenge.
9. Technology companies can help combat mis/disinformation.
Technology companies are key actors in combating mis/disinformation. UNICEF recommends that they fully implement their self-declared policies and invest more in both human and technical approaches. Technology companies should be transparent about mis/disinformation on their platforms and how they are combating it, and should prioritize meaningful connections and a plurality of ideas for children in the design of digital systems.
10. Civil society should provide policy guidance on mis/disinformation.
Civil society, including academia and international organizations, should conduct research on the impact of mis/disinformation on children and the efficacy of counter-measures, so that their findings can inform advocacy and policy responses.
Read more in the full report on digital misinformation/disinformation and children