Online misinformation and disinformation in the age of COVID-19

A conversation with disinformation expert Renée DiResta

The Global Insight team
09 July 2020
Expert Q&A  |  6 minute read

How well do we understand the interplay between social media and the spread of misinformation and disinformation during COVID-19? Should regulators ask more of social platforms in combatting the infodemic? What principles should we teach young people to help enhance their media literacy? The Global Insight team invited Renée DiResta, a technical research manager at Stanford Internet Observatory, to share her thoughts and observations on misinformation, disinformation and the rapidly evolving social media dynamics in the wake of COVID-19. In the Q&A below, our partnership lead Yoonie Choi followed up with Renée on some of the most interesting aspects of our discussion. The transcript has been edited for length and clarity.

Can you tell us about the work you do?

Renée DiResta: Stanford Internet Observatory is a cross-disciplinary program for the study of abuse in current information technologies, with a focus on social media. Our work looks at how a range of different types of actors — state governments, extremist groups, mercenary PR firms, spammers, domestic activists — leverage the social media ecosystem to spread narratives. Sometimes we are simply looking at how tools shape public conversations; other times we’re looking specifically at how these tools are misused. 

What lessons can we learn from COVID-19 that can help the overall cause of fighting disinformation? 

RD: Disinformation refers to information that is spread with the deliberate intent to influence and deceive someone. That’s a very particular thing. Usually disinformation is put to use to achieve a particular objective, often political.

What we’re seeing with COVID-19 is primarily misinformation; people are looking for information about the disease, treatments, latest developments, and sharing what they find, but the intent is usually to try to help their friends and loved ones stay informed, and stay safe. The disease is so new that many theories have been prematurely reported as fact, and many official communications — particularly on topics like masks — have changed as new information has emerged. There’s a lot of noise on social platforms, and as people turn to these platforms to get information, they are sometimes receiving and spreading misinformation. What this tells us is that how platforms surface and curate information is a key challenge going forward — there is a glut of content today, and as platforms rank it for us via search algorithms and newsfeeds, getting authoritative, accurate information in front of the public on health topics is critical.

Boy sits at a computer next to his father
UNICEF/UNI323338/Gevorgyan
On 14 April 2020, in Yerevan, Armenia, Danny, age 7, sits at a computer next to his father.

Speaking of disinformation, how much do you see the whole conspiracy theory business being pure fabrication versus selective stitching together of actual facts? And how does that impact the efficacy of fact-checking?  

RD: There's always some truth in a conspiracy theory. In content from the anti-vaccine movement, there are always some historical moments that people can point to — like vaccines that actually didn’t work well — to support a particular conspiracy theory. What happens is that these real moments are selectively edited and then daisy-chained to try to create a narrative that if one claim is true, all of the other claims in the anti-vaccine article or video are also plausible and true.

One of the challenges with debunking is that it requires a degree of nuance and the ability to unbundle the content, which can take a lot of time. So, the people who want to mislead can come up with a short slogan, spicy meme, or selectively edited video designed to touch on people’s emotions to get it shared, make it go viral. The fact-check — conveying the appropriate historical context, say, or clarifying how scientists incorporated new findings to develop a better vaccine — often requires a long explanation, and that’s not particularly well-suited to social media dynamics. 

When asked about what you would like to see more of from social media companies, you mentioned some interesting tactics these companies could use, such as throttling accompanied by rapid fact-checking. Could you elaborate on that?

RD: Moderating content generally takes one of three forms: the platforms can remove it entirely, they can downrank it in curation (meaning, it appears lower in search results, or is not proactively pushed via a recommendation engine), or they can fact-check it and present the corrected information alongside the original post. Fact-checking is generally outsourced to partners such as news organizations. The challenge is that information moves so quickly. Social platforms are high-velocity, and they’re built for virality. They are designed to help information move from person to person. It takes time to fact-check, and fact-checks usually don’t go viral — they often don’t even reach the same audience. At the same time, taking down content inhibits free expression. The challenge is to strike the right balance. Slowing distribution of health misinformation to give fact-checkers some time to get a correction out is one potential option.  

Community leader participates in a Coronavirus awareness campaign
UNICEF/UNI322644/Haro
Yayé Modi, a neighborhood chief in Niamey, Niger, participates in a Coronavirus awareness campaign in April 2020.

We also touched on the sad reality that authoritative information is usually not presented in a compelling way (e.g. PDFs from UN agencies) and is thus shared less on social media. How can public institutions stay relevant, credible and interesting when sharing information at a time when the mode of communication is more critical than ever?

RD: So much of the dynamics around what spreads is related to how the content is designed, how the narratives are framed, and how posts are initially boosted across networks. Social media design privileges certain types of content — compelling visuals, video, memes. People are drawn to first-person stories, to relatable speakers who feel authentic. They’re more likely to share content when they’re made to feel like active, valued participants in a campaign for a cause that matters to them. Scientific communication has more traditionally relied on presenting statistics and facts, and on speaking precisely, because accuracy is paramount. It’s a very top-down approach to communicating information to the public. It’s often slow. The challenge for public health institutions today is how to update their communication style, to communicate authentically and transparently, while retaining that commitment to accuracy.

You also mentioned the importance of enhancing people’s media literacy, such as teaching them what good, authoritative sources are. What does this mean for young people and children? And what role can UNICEF play in this area?

RD: First of all, most young people today aren't hanging out on Facebook. If we're going to be totally honest, Facebook is for a much older demographic than you would think at this point, and we see this reflected in what content spreads. If you want to reach younger audiences, more of them are on platforms like TikTok or in WhatsApp chat groups. We see doctors in Brazil who are active and engaged on WhatsApp, trying to ensure that they're meeting communities where they are. India, same thing. There are a handful of American doctors who are producing little videos on TikTok, sharing accurate health and vaccination information. And they do go viral! You just have to be creative, fun and do interesting things with good music.

On the flip side, helping students realize that anyone can produce a video with selective editing, that slick production value doesn’t translate to the ideas in the video having merit, is also important. There are so many micro-media sites on the internet, including many publications that are rather surreptitiously funded. Teaching young people about how micro-media and media operate is valuable. Media literacy is not only about how to do an internet search; it also involves helping people realize that anyone, anywhere can put up a very official-looking website. I remember when I was writing papers in high school, my teachers explaining what it meant for something to be a valid source…and then, in college, that expanding to, “if you're going to use Wikipedia, you must click back through the link to the primary source, validate that source, and then quote the primary source.” These lessons are relatively timeless regardless of the topic.

UNICEF can help its audience understand how to be a good judge of health communications. In general, we encourage people to think before sharing, because in the social media era we are all part of the process of spreading information within our communities. Putting out good information and helping young people learn how to share information that will help their communities are important parts of the process.


Photo of Renée DiResta


Renée DiResta is the technical research manager at Stanford Internet Observatory. She regularly writes and speaks about the role that tech platforms and curatorial algorithms play in the proliferation of disinformation and conspiracy theories. Prior to her current role, Renée was the Director of Research at Yonder. She has degrees in Computer Science and Political Science from the Honors College at SUNY Stony Brook. @noUpside