A safer digital world, one chat at a time

The Safer Chatbots project improves young people's experiences with automated text services when they reach out for help.

Gerda Binder and Isabelle Amazon-Brown, UNICEF Gender
A girl checks her smartphone.
UNICEF/UN015591/Prinsloo
07 March 2022

"Nandi is 17 and dropped out of school before graduating. She lives with her mother and younger siblings in Khayelitsha, a South African township. She spends her days doing chores or running errands, and in her down-time, messaging her friends or browsing instagram. Nandi has a boyfriend: an older guy with a reliable job and a car who she met at a party.  At first, she enjoyed the physical side of their relationship, but recently, things have been ‘off’. She’s not sure if she likes him as much anymore, and last time he made a move on her, she pushed him away and told him she wasn’t in the mood. But Nandi’s boyfriend insisted, and forced her to have sex with him. That night, she stayed up late hunting for answers on her phone to explain the feelings of guilt and shame overwhelming her, and stumbled on an advert for a WhatsApp service promising trustworthy, anonymous sex and relationship advice. After typing “hi”, she was greeted with a chatty, friendly message, encouraging her to ask a question.

“Is it bad if your boyfriend forces you?” Nandi messaged. “Whoops! I don’t understand that, can you try typing it another way?” came the reply. She blocked the contact in frustration and switched off her phone. Nandi has no idea where to turn.”

Cartoon of a girl checking her phone with a question mark.
Safer Chatbots

Nandi’s story is representative of one of the ways in which young people around the world are using automated services like chatbots to disclose traumatic experiences, including gender-based violence, in an attempt to get help. But as highlighted in a UNICEF Learning Brief, too often they fail to get the support they need. They may instead receive error messages, be nudged towards irrelevant information or, worse, be met with an automated response that exacerbates their feelings of isolation or guilt, doing further harm in the process.

The Safer Chatbots project, led by the UNICEF East Asia Pacific Gender Section, aims to change this: first by raising awareness of this cross-sectoral problem, and then by offering a solution in the form of tried-and-tested, open-access safeguarding mechanisms. Our approach was developed in collaboration with experts in technology, child protection and gender-based violence, and piloted in multiple countries and languages to ensure its replicability. By implementing the Safer Chatbots guidelines and technical templates (a simplified sketch of the flow follows the list below), you can ensure that girls like Nandi:

  • receive an instant, automated acknowledgement that what they typed may indicate they’re in distress;
  • are offered the choice to correct the chatbot’s assumption OR seek further help from a trained (human) professional;
  • are provided with warm, empathic messages of encouragement if they confirm they’re in distress;
  • are given clear referral details for appropriate services, as well as a safe word to discreetly trigger the information again in the future.
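
To make those four steps concrete, here is a minimal sketch of a keyword-triggered safeguarding handler in Python. Everything in it is illustrative: the keyword list, the message wording, the helpline number and the safe word "SAFE" are placeholders invented for this example, not the vetted templates or referral directories that the Safer Chatbots guidance provides.

    # Illustrative sketch of the safeguarding flow described above.
    # Keywords, wording, the helpline number and the safe word are placeholders,
    # not the vetted content from the Safer Chatbots guidance.

    DISTRESS_KEYWORDS = {"forced", "forces", "hurt me", "scared", "abuse"}
    SAFE_WORD = "SAFE"  # lets a user discreetly recall the referral details later

    REFERRAL_MESSAGE = (
        "You can reach a trained counsellor any time on 0800-000-000 (free call). "
        f"Type {SAFE_WORD} at any point to see this information again."
    )

    def looks_like_distress(text: str) -> bool:
        """Very rough keyword check; a trained classifier can replace this step."""
        lowered = text.lower()
        return any(keyword in lowered for keyword in DISTRESS_KEYWORDS)

    def handle_message(text: str, state: dict) -> str:
        """Return the bot's next reply and update minimal conversation state."""
        if text.strip().upper() == SAFE_WORD:
            return REFERRAL_MESSAGE  # safe word: re-send the referral details

        if state.pop("awaiting_confirmation", False):
            if text.strip().lower() in {"1", "yes"}:  # user confirms they want help
                return ("Thank you for telling me. What happened is not your fault, "
                        "and you deserve support. " + REFERRAL_MESSAGE)
            return "No problem! What would you like to ask me?"  # user corrects the bot

        if looks_like_distress(text):
            state["awaiting_confirmation"] = True  # acknowledge and offer a choice
            return ("It sounds like something difficult may have happened. "
                    "Would you like details of a trained (human) counsellor? "
                    "Reply 1 for yes, 2 to carry on chatting.")

        return "Whoops! I don't understand that, can you try typing it another way?"

In a real deployment the same four steps would be built as flows in a platform such as Turn, TextIt, RapidPro or Bothub rather than in standalone Python, but the logic is the same.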

The Safer Chatbots implementation guidance offers options for all levels of chatbot (with or without AI) and is designed for popular chatbot-building platforms such as Turn, TextIt, RapidPro and Bothub, though our approach is platform-agnostic. As well as the simpler, keyword-based mechanisms covered by our DIY implementation guidelines, we teamed up with Girl Effect and Weni to develop an open-access AI model that aims to respond to users in distress with a higher degree of sophistication and accuracy. This ‘plug-in’ solution is available for reuse and adaptation, free of charge. We are actively seeking partners with access to anonymised disclosure examples to help us improve the model’s global accuracy and relevance.
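
The AI plug-in's real interface is not described in this article, so the scoring function in the sketch below is purely hypothetical. The point is only that a DIY keyword check and a more sophisticated model can sit behind the same function signature, so the rest of a chatbot flow does not change when a team upgrades from one to the other.

    # Illustrative only: the open-access AI model from Girl Effect and Weni has
    # its own interface, which is not shown here. This sketch simply shows how a
    # DIY keyword check and a classifier can be swapped behind one function
    # signature, leaving the rest of the chatbot flow untouched.

    from typing import Callable

    DistressDetector = Callable[[str], bool]

    def keyword_detector(text: str) -> bool:
        """DIY option: plain keyword matching, no AI required."""
        keywords = {"forced", "hurt me", "scared", "abuse"}  # placeholder list
        return any(keyword in text.lower() for keyword in keywords)

    def model_detector(score_text: Callable[[str], float],
                       threshold: float = 0.8) -> DistressDetector:
        """Wrap any scoring model as a detector. `score_text` is a hypothetical
        function returning a distress probability between 0 and 1."""
        def detect(text: str) -> bool:
            return score_text(text) >= threshold
        return detect

    # The safeguarding flow only depends on the DistressDetector signature,
    # so moving from keywords to an AI model is a configuration change.
    detector: DistressDetector = keyword_detector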

Cartoon of a failed chatbot interaction.
Safer Chatbots

If you think your chatbot’s users, like Nandi, deserve to be acknowledged when they may need quality, timely, human-led help, explore the Safer Chatbots guidelines today, or get in touch with the team to find out more and help us make the digital world more human, one chat at a time: gbinder@unicef.org and isabelle.amazon@gmail.com