Technology and the power of narratives in times of uncertainty
A conversation with innovation and forecasting expert Mr. Nicholas Davis.
Expert Q&A | 5-minute read
Our world is often shaped by great stories, but this can be a double-edged sword when the narrative limits our imagination. In the age of Industry 4.0, this has particular implications for how we conceptualize technology: Are we deceiving ourselves by equating open source data with public goods? Why is it so difficult to stop treating governance and ethics as an afterthought when we design new technology? The Global Insight team invited Nicholas Davis, Special Advisor to the CEO of Gavi, the Vaccine Alliance, to share his observations and reflections on the changing narratives emerging from the COVID-19 era. In the Q&A below, our partnership lead Yoonie Choi followed up with Nicholas on some of the most interesting aspects of our discussion. The transcript has been edited for length and clarity.
Can you tell us about the work you do?
Nicholas Davis: I spend a lot of time talking about futures. I'm a firm believer, like some of your earlier speakers, that the world essentially runs on narratives. It runs on stories, particularly stories about the future.
At the World Economic Forum, I led the work on the Fourth Industrial Revolution, focusing on how emerging technologies could disrupt industries, labour markets, societies and governments, and how technology can be used to empower communities to create a human-centred future. The story we told was very consciously tied to the idea of values, and to the idea of technology as a public good. We tried to create a socio-technological imaginary by articulating an entire narrative and way of talking about the subject.
Since then I've taken up a role as Special Advisor to the CEO on Innovation at Gavi, where I coordinate internally and externally to bring more coherence to our innovation efforts.
You conducted a meta-analysis of 40 different post-COVID scenarios that were published in the past few months. Was there anything striking?
ND: I think it really varies from place to place, and we often swing between more devolved and more centralized decision-making and governance approaches. To me, it seems that our collective imagination for government is becoming narrower: we're thinking more and more of governance as an authoritarian, top-down, centralized function, and less about our ability to coalesce at smaller levels. So even though I know that community-led policing, for instance, can fail spectacularly, I also become concerned when we pin our hopes on centralized systems, which is essentially what we are doing. At least that's what the scenarios are revealing. And it makes me worry about what happens in places further from the beaten path, or where local community needs are so specific that you do want more devolved and local control.
You said this will be the decade of technology governance and technology ethics. What’s your view on the perception that governance and ethics will always be a step behind technology?
ND: One of my favourite podcasts digs into the question, ‘How do you take advantage of a shift in narrative and meaning and turn that into interesting normative change in a particular direction?’ And normative change is really important when talking about technology today, because technology needs to serve us rather than the other way around. It [technology] has to be incredibly human-centred, rather than product-centred, and it really needs to have values as a feature, not a bug. You can't build a set of technologies these days, particularly the more powerful ones like artificial intelligence, and then just wait to determine whether, later on, it's biased against minorities or marginalized groups. If you do that, then you are contributing to that problem. This is exactly why MIT has now removed its database of images used in facial recognition from the internet and from use by AI researchers.
That [MIT] database is inherently flawed because it starts from the perspective of “Let's build it and then check to see if there are any bugs in the form of discrimination or any kind of security concerns in there.” This is technology backwards. Today, you have to start with the idea that this piece of facial recognition technology has to be designed to be as neutral as possible when it comes to the variety of faces that will be in front of it, and most technology providers and developers, as I'm sure many of you are aware, don't work like that: you have a great idea and you put it together. So the key learning is that, whether you're UNICEF or another institution looking to grapple with emerging technologies (such as blockchain, AI, space and geospatial technologies, or hugely powerful and ubiquitous new sensors), we really need a new set of ethics and a new way of looking at these technologies.

Do you think it’s organic, in the sense that these self-fulfilling and self-defeating narratives have sprung up on their own? Or are there underlying factors that people are building narratives around, either to justify prevailing ideas or to push them forward?
ND: For me, this idea of narratives defining the limits of our imagination, and really shaping it, is huge. That means we need to be a little bit artificial in pushing our own narratives a bit further: being a bit more crazy, and bringing in more creativity or foresight techniques, just to push our thinking beyond what we read in the paper, what's on Netflix, and the stories that are so dominant in our culture. And I am absolutely convinced that there's a huge amount of manipulation in those stories […] in the array of narratives that get told in popular culture. They are often dominated by commercial motives, and we see huge efforts especially on social media […] and I'm really worried about that. But I know that the rising awareness that narratives are being manipulated in the media is also helping us to track them down. Hopefully, we'll get better at dealing with that particularly pernicious problem.
Where do you see UNICEF being able to push to promote the idea of technology as a public good that serves everyone, not just the most privileged?
ND: Open source data is not a very useful public good if it just exists as .jpeg files that people need to spend huge amounts of effort to analyze and do something with. A road is not a public good if what's outside your house is just a pile of concrete and bitumen and you have to build the road yourself in order to drive on it. So when I say tech as a public good, I'm really talking about combined, collaborative action to create something where someone can simply ask a question like “How often is there water on this plot of land?” or “What was the damage done by locusts here in northern Kenya in the last three seasons?” Those are really important and useful strategic questions for agricultural and commodities analysis, but also for the farmers themselves, and it shouldn't just be rich, Western Silicon Valley traders who have access to that information. So being able to provide that platform is critical here.
With some technologies today, there’s a “double dipping” going on: the public pays for the tech to be developed in the first place, and then we pay for it again by volunteering our data to serve and train algorithms. Who do you think will have to pay for the rest of the Fourth Industrial Revolution?
ND: I think you raise a really important point here, which is the tension that comes from the public paying for most of the basic research (through public university systems in most countries around the world, especially in the U.S. until recently). A lot of that research has also been funded by overseas students over the last 20 years or so, and with the pandemic we're seeing that funding fall off considerably. In the case of UK research funding, they're no longer going to get grants from the EU, and they're not going to get the flow-through of international students either, so that's going to put a lot of basic research under threat.
So it's going to have to be a combination of things: public largesse, plus dramatic tax reform, with new tax rules for tech platforms and tech companies, which is already well underway. We have been waiting for the OECD to come together and resolve some of the issues there through the BEPS (base erosion and profit shifting) process, hoping that it will prevent a lot of the tech companies from shifting their profits out of jurisdictions into low-tax areas like Ireland. So tax reform is one thing that will help pay for these things.
Then I mentioned philanthropy as well, like the Helmsley Foundation putting $10 million towards public goods, and so on. If you look at technology advances in health and think about where they're coming from, a huge amount of innovation and technology is being led by the Gates [Foundation], among many others. And that should be private philanthropy money with a little bit of public assistance chipped in, rather than the other way around.
Finally, unless we get the governance right, it will be our data and our privacy that pay for it. We will be monetized as consumers, and you will literally pay for it as a tax on everything you buy, when you don't need to.
Nick Davis is an Australian strategy professional and scenario expert focused on the links between the private sector, non-profits and public policy in economic development, stakeholder management, organizational performance and risk management. He is also the author of journal articles, reports and other publications on foreign investment, regional development, industry-level strategic analysis and scenarios. @nickdavis97