COVID-19 and children’s digital privacy
How do we use technology and data to combat the outbreak now, without creating a ‘new normal’ where children’s privacy is under constant threat?
Expert perspective | 4 minute read
In an effort to contain the coronavirus pandemic, governments are employing strict isolation measures — over 3 billion people are under lockdown — and using extensive digital surveillance to track and monitor people. When applied ethically and responsibly, public health surveillance can be a critical component of outbreak and epidemic response. In countries like South Korea, in combination with additional measures such as widespread testing, it is proving to be successful in dramatically reducing new infections. The unprecedented scope of current digital measures, however, is pushing the limits of privacy. How do we balance children’s right to privacy with their right to health, both enshrined in the Convention on the Rights of the Child? How do we use the technology and data available to combat the outbreak now, without creating a ‘new normal’ where children’s privacy is under constant threat?
Digital health surveillance in the time of COVID-19
The use of data, including anonymized and aggregated location data, for epidemic and humanitarian responses is not new. But the scale of the COVID-19 response makes it different. Never before have so much data and such wide-ranging tools been available to governments, many of which now have expanded powers, including for health surveillance. Governments are using digital technology and data to trace contacts — identifying everyone who was in close contact with an infected person and following up regularly for symptoms — to track the quarantine of those carrying the virus, and to understand human movements more generally.
Data sources can include mobile phone location data (sometimes of citizens in aggregate, other times for individuals), call record tracking, CCTV footage, and border control travel histories. Some governments require people to regularly submit geolocated selfies or to actively check in through downloaded apps that collect self-reported health data. In other cases, data are tracked in the background, for example through so-called ‘electronic fences’: a physical boundary is mapped around quarantined people, and authorities are notified immediately when their mobile phones break the fence. When this happens, in-person follow-ups by police can take place within hours. Some governments are publicly sharing personal details of people with COVID-19, including age, gender, and street address. Others do not reveal individuals’ names but reveal enough information for easy identification.
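The ‘electronic fence’ idea above amounts to a simple geofence check: compare a phone’s reported position against a fixed quarantine point and flag any reading outside an allowed radius. The sketch below is illustrative only — the 100-metre radius and the haversine distance calculation are assumptions for the example, not details of any specific government system.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points given in degrees."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def breaches_fence(phone_lat, phone_lon, home_lat, home_lon, radius_m=100.0):
    """True if the phone's reported position lies outside the quarantine radius."""
    return haversine_m(phone_lat, phone_lon, home_lat, home_lon) > radius_m
```

A reading at the quarantine address passes the check; one a kilometre away would trigger the notification to authorities that the article describes.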
Impact on children
Children’s data are being collected in the COVID-19 surveillance sweep. UNICEF is already concerned about the need to protect children’s digital footprints, and in this time of unusually high internet usage and heightened surveillance, such worries are amplified. Children are a particularly vulnerable category of data subjects, and their data demand extra care. The UNICEF Responsible Data for Children initiative highlights that if children’s data can lead to individual identification, children can be placed at physical risk or at risk of stigmatization. For example, in Hong Kong a 13-year-old girl wearing a quarantine monitoring bracelet was seen dining at a restaurant, then followed, filmed and ultimately shamed online. Beyond cases such as this, little to nothing is known about the impact current health surveillance measures are having on children. Until we understand how children’s data are captured, categorized and reported, we will be unable to measure the actual impact.
One explanation for this lack of information is that the many stakeholders in the data collection value chain — software developers, mobile network operators, government officials and data analysts — are typically not trained in children’s rights, or do not prioritize children’s interests. Often they neither design for nor implement the additional protections that children and their data require.
What can be done?
Out of crisis can come opportunity for immediate and lasting change. Below are some suggestions:
- Innovate to minimize trade-offs: Developers of technology-driven solutions should safely explore all avenues for minimizing the trade-off between data privacy and data sharing, especially for children. For example, while there has not been a formal evaluation of the TraceTogether app in Singapore, it claims to support contact tracing without collecting or sharing an individual user’s location data.
- Apply rights-based principles: Governments should apply rights-based principles to health-related data tracking so that these measures are temporary, necessary, proportionate and transparent. Responsible technologists can promote accountability from the sidelines: for example, as a proactive measure, privacy experts have issued an open letter to the United Kingdom health chiefs demanding transparency and effective protection in a pandemic tracking app, expected to be released there soon.
- Learn and adapt: Beyond the immediate crisis, policymakers and technologists must build on valuable lessons learned now when adapting privacy guidelines for the future. Current guidelines may “not consider many of the risks presented by the novel digital surveillance measures that countries have enacted in response to COVID-19”. Further, future guidelines must give adequate attention to the rights of children, which are often not sufficiently prioritized.
- Include children: A core requirement for both safer innovation and the development of stronger privacy standards is the application of guidelines for child rights in the design of digital tools and measures. Where possible, children should be included in the design and testing process from the beginning. Children are part of the digital health surveillance ecosystem and should not be an afterthought in the creation of tech-driven solutions or policymaking. Health-related data collected from all users, including children, can be beneficial to societies, but only if it is collected safely and without losing the trust of the youngest generation of internet users.
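The first suggestion above notes that apps like TraceTogether claim to support contact tracing without collecting location data. The core privacy-preserving idea can be sketched as follows: phones broadcast short-lived random tokens over Bluetooth and record only the tokens they overhear, never any coordinates. This is a minimal illustration of that general approach — the class and method names are hypothetical, and this is not TraceTogether’s actual protocol.

```python
import os

class ProximityLog:
    """Illustrative, location-free contact tracing: devices exchange
    unlinkable random tokens instead of reporting where they have been."""

    def __init__(self):
        self.my_tokens = []        # tokens this phone has broadcast
        self.heard_tokens = set()  # tokens overheard from nearby phones

    def new_broadcast_token(self):
        """Generate a fresh random token to broadcast (rotated regularly
        so observers cannot link broadcasts to one person)."""
        token = os.urandom(16).hex()
        self.my_tokens.append(token)
        return token

    def record_nearby(self, token):
        """Store a token overheard from a nearby device, locally only."""
        self.heard_tokens.add(token)

    def exposure_check(self, infected_tokens):
        """On diagnosis, a user shares only their broadcast tokens; everyone
        else checks locally whether they overheard any of them."""
        return bool(self.heard_tokens & set(infected_tokens))
```

Because matching happens on each device against random tokens, neither the health authority nor other users ever learn where an encounter took place — only that one occurred.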