Sander van der Linden, Ph.D., is Professor of Social Psychology in Society and Director of the Cambridge Social Decision-Making Lab in the Department of Psychology at the University of Cambridge.
Professor van der Linden, we have seen many attempts to debunk climate and health disinformation. How well do fact checks fare?
Sander van der Linden: People might acknowledge a fact check, yet continue to use the debunked misinformation in their reasoning; they cannot unhear it. US studies tested this effect using some of Donald Trump’s false statements. People who were exposed to fact checks did acknowledge that he may have lied, but that did not change how they felt about voting for him. Misinformation simply lingers. And in terms of scale, a fact check will hardly ever reach as large an audience as the original misinformation.
So you developed a ‘vaccine’ against misinformation that immunises people before they even encounter it. How did that come about?
Sander van der Linden: We found that some epidemiology models can be used to predict the spread of viral information just like the spread of a biological pathogen. In some cases, misinformation acts like a simple virus, infecting humans after a single exposure. The logical consequence of this was: if misinformation behaves like a virus, then we should be able to produce a vaccine to counter it.
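The epidemiological analogy can be made concrete with a minimal SIR-style simulation. This is an illustrative sketch only; the parameter values and the discrete-time formulation are assumptions for demonstration, not the models the lab actually used.

```python
# Minimal discrete-time SIR sketch of misinformation spread.
# S = susceptible, I = "infected" (actively sharing the false claim),
# R = recovered (no longer sharing). All values are population fractions.
# Parameters are hypothetical, chosen only for illustration.
def simulate_sir(beta=0.3, gamma=0.1, days=160, i0=0.001):
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):
        new_infections = beta * s * i   # exposure converts susceptibles
        recoveries = gamma * i          # sharers lose interest over time
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        peak = max(peak, i)
    return s, i, r, peak

# With beta > gamma (reproduction number R0 = beta/gamma > 1),
# a single "seed" exposure grows into a large outbreak before burning out.
s, i, r, peak = simulate_sir()
```

In this toy model, prebunking would correspond to moving people from S straight to R before the claim reaches them, which lowers the effective reproduction number.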
How does your misinformation vaccine work?
Sander van der Linden: It works just like a traditional vaccine, which injects us with a weakened dose of a pathogen and thereby triggers the production of antibodies. Ideally, this then confers immunity against future infections. So we tried pre-emptively exposing people to severely weakened doses of misinformation. It is a two-pronged process: first, we warn people in advance, and then we refute the misinformation before they even encounter it. This activates their intellectual and psychological immune system.
What makes forewarning people so crucial?
Sander van der Linden: The forewarning kick-starts the immunisation process and alerts the psychological immune system. Psychologically speaking, it gives people a sensation of relevance, as it tells them why they should care about a specific bit of information. Otherwise, it would just get ignored, like many fact checks. This is just how we navigate the world; we ignore most of the information surrounding us if nobody warns us that something could affect us negatively.
Once you’ve warned people, you ‘prebunk’ misinformation. Can you give us an example of how that works?
Sander van der Linden: We tested that approach on climate change misinformation. In 2016, a story went viral on Facebook claiming that 31 000 US scientists had signed the ‘Oregon Petition’, which asserted that there was no scientific consensus on global warming being man-made. In our experiment, we first warned people that politically motivated actors are trying to deceive them. This warning is quite powerful, because people are very sensitive to manipulative intent. Nobody likes to be deceived.
We then refuted the actual misinformation in advance. To do that, we had to expose people to a weakened dose of the false claim and then ‘prebunk’ it. We told our participants that they may come across a false petition, but did not tell them in advance which one. After that, we told them that many signatories of the actual petition were fake, like Charles Darwin and the Spice Girls. We also pointed out that most of the 31 000 signatories were not scientists with PhDs and we put the false claim in context. Even if all of the signatories had been scientists, that would have only accounted for 0.1% of all US science graduates.
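The scale argument above is simple arithmetic and can be checked directly. The 0.1% figure is the interview’s; the calculation below only shows what denominator that figure implies.

```python
# Sanity-check the "0.1% of all US science graduates" framing,
# taking the interview's figures at face value (illustrative only).
signatories = 31_000
claimed_share = 0.001  # 0.1%, as stated above

# If 31 000 signatories were 0.1% of US science graduates,
# the implied total pool of graduates would be:
implied_graduates = signatories / claimed_share  # 31 million
```

The point of the prebunk is that even under the most generous assumption, the signatories are a vanishingly small slice of the relevant population.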
How did the participants react to this inoculation?
Sander van der Linden: After the inoculation, we exposed them to the full dose of misinformation by letting them read the petition’s website. We then tested their beliefs against a control group’s beliefs. It turned out that the treatment group had been successfully inoculated against the misinformation. Notably, we found no backfire effect. There is a fear that exposing people who already hold sceptical views on climate change to even small bits of misinformation as part of a fact check could reinforce their original belief in that misinformation, but we found no such effect. On the contrary, after the inoculation, our participants were able to recognise further climate change misinformation and protect themselves from absorbing it.
How would that have gone without the inoculation?
Sander van der Linden: Consensus is a very persuasive tool, but misinformation can be used to undermine it. We first confronted our participants with the large scientific consensus: more than 97% of scientists agree that global warming is man-made. Without any prior knowledge, participants might assume a consensus of around 60%. After seeing the facts, their estimate would then rise by about 20 percentage points to more closely match the facts. Then we showed them the Oregon Petition’s misinformation, which we already expected could cause some confusion. However, we were shocked to learn that the misinformation had completely wiped out the initial learning effect. The inoculation prevented that wipeout effect to a substantial degree.
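The three outcomes described above can be laid out as simple arithmetic. The starting estimate and the size of the learning effect come from the interview; the share of the effect retained under inoculation is our assumption for illustration, not a study figure.

```python
# Perceived scientific consensus, in percent (illustrative numbers).
actual_consensus = 97.0
naive_estimate = 60.0
learning_effect = 20.0  # rise after seeing the consensus message

after_facts = naive_estimate + learning_effect        # ~80
after_misinfo_alone = naive_estimate                  # wipeout: gain erased
# Inoculated participants keep most of the gain; the retained
# fraction (0.8) is a hypothetical value, not a measured one.
after_misinfo_inoculated = naive_estimate + 0.8 * learning_effect
```

The contrast between the last two quantities is the substance of the finding: without inoculation the learning effect is cancelled, with it most of the effect survives.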
Do you have to develop and deliver a new vaccine for every single falsehood?
Sander van der Linden: Building specific passive vaccines for every single myth takes a lot of effort, so we thought of a way to let people generate their own counterarguments instead of us highlighting the facts in ‘prebunks’. We built a simulated environment – a sandbox really – where people could actively use a weakened dose of common fake news techniques, such as conspiracy theories, trolling and impersonation. This way, they could discover for themselves how these work. Then we exposed them to deceiving headlines that were crafted using the same techniques, but that came from different domains. It turned out that being exposed to a specific technique does help people recognise that same technique in a different context.
And you made it fun to get actively inoculated by developing a game that lets you step into the shoes of a misinformation producer.
Sander van der Linden: Yes. In our game ‘Bad News’, you can try the same techniques that false news producers use. Your goal is to produce false news content in a simulated environment in order to gain followers and build credibility among your audience. For example, one of the game’s fabricated tweets was seemingly written by Donald Trump and contained a declaration of war on North Korea, which in reality never happened. Also, if you look closely, we’ve tweaked subtle but crucial details, like the name of the tweet’s author. It looks almost real, but if you give it a closer look, it becomes clear that all of it is fake.
How do you know whether playing the game has had any effect?
Sander van der Linden: We ran tests based on simulated headlines that employ the same techniques the players had encountered in the game, but the content was different than what they had been inoculated against. For example, there was a conspiracy headline claiming that a rich group of bankers was manipulating the Bitcoin exchange rate. When we evaluated participants’ responses after the game, we found that our treatment group found misinformation significantly less reliable than the control group, because they recognised the technique being used. Plus, the effect is even visible when people encounter different techniques than those they have been trained on. So we do see cross-protection.
Let’s talk about herd immunity. How do you maximise the vaccine’s reach?
Sander van der Linden: We have already reached more than a million visitors to ‘Bad News’. We have also produced short videos that highlight how techniques like emotional language and false dilemmas work. That is not a fully active inoculation, but it has a much wider reach. We tried to scale it up further by collaborating with Google Jigsaw. They let us run an experiment with five million users on YouTube as part of an ad campaign, in which we inserted our ‘prebunk’ videos into the ad slot that appears right before a video plays. If users clicked on our ‘prebunk’ ad, we would then test how their ability to discern fact from fiction had changed.
How well did that work?
Sander van der Linden: Better than we expected. We boosted users’ misinformation literacy by 5%. That may not sound like much, but it needs to be put in perspective: brand-lift ad campaigns typically yield only small effects.