
Renée DiResta: “The effectiveness of countering Covid-19 disinformation depends on the degree of trust your government enjoys”


Renée DiResta is the Technical Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social and other media networks. Renée's areas of research include disinformation and propaganda by state-sponsored actors, as well as health misinformation and conspiracy theories. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.


Can you tell us a couple of things about the Virality Project that you have launched at the Stanford Internet Observatory?

Renée DiResta: The purpose of the project is to look at similarities and differences in the ways nation states communicate on the issue of Covid-19, both to their citizens and in their outward-facing online public diplomacy. We have started looking at China, Russia, Saudi Arabia, Venezuela, and the United States, and more specifically the ways in which information is targeted at particular audiences and how Covid-19 disinformation is playing out in these countries.


Have you seen anything that resembles coordination in terms of how different state actors are pushing Covid-19 disinformation online?

Renée DiResta: We are looking at ways in which one country picks up another's narrative. For instance, we have seen Russia pick up Iranian and Chinese state media content and run it on its own channels. What you see is Russian media, RT or Sputnik, reporting, for example, that the Iranian foreign minister suggested coronavirus was a US bioweapon. In effect, they repeat the other government's conspiracy theory without stating that they themselves support it. We've seen Russia do this in the past with the Hong Kong protests, for example, where they would pick up and run narratives about violent protesters from Chinese media. This information overlap is central to ensuring that certain themes are repeated and reinforced.


You mentioned RT and how it factors into the misinformation issue. In the context of Covid-19 misinformation, were you able to compare how legacy and social media contribute to this problem?

Renée DiResta: It's very difficult to separate legacy and social media at this point, and I've spent a lot of time trying to emphasize that these two things are not distinct anymore. Legacy media has a presence on social media now, too. The aspect of social media that is still distinct is when it's peer-to-peer unique content that's spreading, where people are creators as well as conduits. So in our research we try to examine this differentiation between bottom-up, user-generated content and top-down media or state-media-generated content.


You have extensive experience researching anti-vaccination disinformation. How much of an overlap have you seen between the actors engaged in that and those engaged in Covid-19 disinformation?

Renée DiResta: There is an extensive amount of overlap. Covid-19 misinformation is bigger than just anti-vaxxers. Anti-vaxxers are highly active in the Covid-19 disinformation space, but other groups have come into the conversation as well. Anti-vaccine communities have been active on online platforms for about ten years, and they have developed their networks extensively over the last five. Anti-vaxxers don't want any vaccines because they believe the safety studies are falsified or that the government or vaccine makers are lying to them. So that dynamic is very much an undercurrent in the Covid-19 conversation in the US right now. But we also see anti-vaxxers in Spain, Italy, Romania, Poland, and Ukraine, so there are a number of other countries where they are prevalent in the Covid-19 discussion.


You have mentioned that one of the challenges governments will face is disinformation in relation to the Covid-19 vaccine – if and when we manage to create one. Do you have any thoughts about how to prepare for that?

Renée DiResta: The effectiveness of countering Covid-19 disinformation really depends on the degree of trust your government enjoys. The same dynamic applies to disinformation around elections. There are certain places where government counter-narratives just work better. Those places usually have a high degree of trust in the media and a high degree of trust in the government. So there is going to be a really big spectrum in how these situations play out. In the US we have a low degree of trust in government and a low degree of trust in media, as well as a pervasive belief that small online media are somehow better or less “compromised” than large institutional media and the government. That's the perception. The media sources that people choose to follow are integral to this particular problem, because depending on who they trust, they're either going to get good information or bad information.


How do you see Covid-19 disinformation affecting the upcoming presidential election?

Renée DiResta: Any story can feed a conspiracy theory if it's framed in a certain way. For example, in the recent US protests a lot of people were in close proximity to each other, so the chances of a second wave have increased significantly. In some media we may see the protesters, most of whom are of the opposite political party to the president, positioned as the instigators who caused the second wave. I think there will be a lot of conspiracy theories about the death count. The death toll has already been a topic of conspiracy theories in the US, with online influencers suggesting that certain types of deaths were misclassified as Covid-19 related to inflate the toll and ruin the president's chances of re-election. There are so many different layers to this.


Are there any aspects or strategies of Covid-19 disinformation that are not currently discussed by policymakers?

Renée DiResta: The policy focus should be on the systemic problems that enable the various forms of information disorder, not on specific features or tactics. Unfortunately, every now and then you see feature-focused measures like the “bot bill” pass, and they are usually out of date before the law is signed. The bot bill is a law that legislators in California passed to regulate automated bots on social media. But there are a lot of problems with that law. It was poorly defined: legislators couldn't really articulate what they meant by “bots”, and they couldn't really articulate who should be responsible for them, so they ultimately put the responsibility for declaring that an account is automated on the account owner, which is bizarre. People who want to stay on the right side of the law, people who use bots for business or artistic endeavours, are the ones who are going to declare, but they are also the ones whose bots aren't harmful or manipulative anyway. No Russian troll running malicious automation is going to abide by that law. So it's important for regulators to understand that they shouldn't be doubling down on these small tactics, but should instead focus on high-level aspects such as the structural pillars that result in online harms and how we can address those.

