Zarine Kharazian from the Digital Forensic Research Lab on research into Covid-19 disinformation

Zarine Kharazian is Assistant Editor with the Digital Forensic Research Lab, a start-up within the Atlantic Council that focuses on researching and combating disinformation and protecting democratic institutions and norms from those who would seek to undermine them in the digital engagement space. At the DFRLab, she has covered disinformation trends in the United States as well as the South Caucasus, with a particular focus on actors and operations that challenge existing definitional boundaries, and on developing novel research techniques to account for them.


Can you talk a bit about your current work on Covid-19 disinformation? What aspects of the issue is the Digital Forensic Research Lab currently investigating? 

The DFRLab primarily studies the informational aspects of the pandemic – particularly misinformation, disinformation, and incomplete information about it. Several months in, we have already seen some of the first empirical research assessing the relationship between the consumption of misinformation and public health outcomes in the early stages of the outbreak. This growing body of work increasingly suggests that the quality of the information we consume shapes our beliefs – and those beliefs, in turn, can mean the difference between life and death. 

Another defining feature of the online conversations we’ve studied related to Covid-19 disinformation is that they are multipolar – there is no one actor dominating the discussion, but rather, various communities that promote aspects of Covid-related conspiracies for varying reasons. Some are ideologically motivated (state actors, QAnon, antivaxxers, etc.); others are motivated by profit (promoting an ebook, podcast, or so-called “natural healing” businesses). A portion appear to be a combination of both. The diversity of actors, sources, and motivations is what makes disinformation about Covid-19 so difficult to eradicate.  


How do you see Covid-19 disinformation playing out in the upcoming US election? In that context, do you think measures taken by tech platforms recently can be effective in protecting electoral integrity? 

Election-related disinformation often breaks down into two types – disinformation about the issues and the candidates, and disinformation about the electoral process. Amid the pandemic, this latter form of disinformation poses a heightened risk to electoral integrity. False information about voting procedures revised in response to the pandemic, such as mail-in ballots, may discourage people from voting. When targeted at specific groups of people, this kind of disinformation can amount to voter suppression. 

Platforms have generally been stricter with regard to disinformation related to the electoral process than with disinformation about political issues – and they’ve been stricter with coronavirus disinformation as well. The steps they’ve taken to combat disinformation – employing a spectrum of possible responses, from downgrading content to removing it completely, as well as introducing friction in the user experience by prompting users to read an article before they retweet it – are welcome ones. But these measures will have limited effect without a full-spectrum response that also includes government and civil society. 


In an article you published you mentioned members of the QAnon community heavily promoted the Plandemic video. Can you talk a bit about QAnon, why it has got so big and what its link with Covid-19 disinformation is specifically? Are these groups coordinating amplification?

QAnon is a fringe movement that really describes a sprawling web of connected and unfounded conspiracy theories, ranging from the existence of a worldwide shadow government to celebrity paedophile rings involving political and cultural elites. Its adherents believe that someone using the name “Q,” often believed to be a senior member of the Trump administration, is anonymously posting cryptic messages, called “Q Drops,” to online forums alerting them to the existence of an ongoing plot by a worldwide political and cultural elite to exert control over governments and institutions. According to the theory, Trump is battling these nefarious forces, and the so-called “deep state” is working in turn to foil him.  

What started as a fringe movement on obscure internet imageboards has been catapulted into the mainstream by far-right influencers and politicians, including the US president himself. Trump has regularly amplified QAnon accounts and conspiracies on Twitter, and a recent tally by Media Matters counted 60 current or former US congressional candidates who seem to have embraced the theory.  

One of the reasons for QAnon’s widespread and enduring appeal is that it is an all-encompassing and versatile belief system that can quickly integrate additional conspiracies without compromising its underlying worldview. This is what QAnon has done with Covid-19 disinformation. Covid-19 conspiracy theories about government cover-ups, global vaccination conspiracies, or allegations that the virus is “a hoax” peddled by “shadowy elites” already resonate within the QAnon belief system, and it doesn’t take much additional work – either psychologically or organizationally – to integrate them into it.  


Your research into Plandemic related activity on Facebook showed rapid link sharing among groups. What does that tell you? 

The rapid link sharing related to Plandemic underscores two points: first, that conspiracies like Plandemic spread through certain online communities at an alarmingly high rate, at times more quickly than platforms can suppress them; and second, that much of this spread is cross-platform and iterative.  

In studying the spread of Plandemic, we saw that the top URLs being rapidly shared to Facebook groups were links to the video hosted on YouTube. When YouTube started taking copies of the video down – and sometimes in anticipation of those takedowns – people uploaded the video to alternative platforms that have a reputation for lax content moderation standards. Links to copies of the video hosted on these alternative platforms were in turn shared to the same Facebook groups – and so the chain of Plandemic’s spread was never disrupted, even after YouTube began taking down copies of the video. The cross-platform nature of this activity limits the efficacy of content moderation by any single platform – harmful content usually finds niche online refuges for itself as long as there is demand for it. 


You have also talked about alt-tech platforms. Can you explain what they are and why – or indeed if – we should be concerned about them? 

Alt-tech platforms are social media platforms that brand themselves as alternatives to the major platforms, such as Facebook, YouTube, and Twitter. They typically claim a “censorship-free” philosophy, advertising the fact that they have more lax content moderation standards than their more popular counterparts. Because of those lax standards, they often serve as safe harbours for far-right and extremist online communities that have been de-platformed by mainstream hosts.  

One of the arguments in favour of de-platforming harmful content and its producers is that, even if the content moves to an alternative host, it loses a significant mainstream audience in the process. The thinking is that only the most hardcore of conspiracy believers will follow a conspiracy theorist from a popular platform, like YouTube, to a relatively obscure one, like BitChute. There is some logic to that, but the danger of these alt-tech platforms lies in the fact that the content they host almost never stays confined to them – links to the harmful content often make it back to communities on more popular platforms, luring more casual consumers to increasingly extremist content.  


You have also reported on Facebook accounts being hacked by a foreign, rival power to spread Covid-19 disinformation. Can you elaborate on how coronavirus disinformation can be weaponized in the context of international relations? 

We generally see two types of foreign disinformation: covert influence that works to exacerbate existing vulnerabilities in societies; and overt, opportunistic influence that attempts to further a particular geopolitical agenda. In the context of coronavirus, we have seen more of the latter than the former.  

Russia and China have both engaged in “mask diplomacy,” highlighting their shipments of medical equipment and aid to countries battling the coronavirus and contrasting these responses with those of Western governments, several of which are struggling to contain their own outbreaks. These efforts tend to rely primarily on official diplomatic and government accounts, not covert or fake accounts – and they don’t always qualify as disinformation, because they do not always employ intentional falsehood. But they still constitute an effort to influence hearts and minds abroad in both governments’ favour – in that sense, they are information operations.  

We have also seen state-sponsored disinformation from foreign powers and their proxies – in most cases, this has been motivated by geopolitical competition and regional objectives. Pro-Kremlin outlets, for example, have surfaced claims blaming US-funded biolabs in Armenia, Georgia and Kazakhstan for the coronavirus outbreak as the Kremlin continues to push for Russian specialists’ access to some of those labs. This is a recurring narrative – as many disinformation narratives are – that resurfaces every once in a while and is refurbished to reflect major ongoing public health threats; in most of the world right now, that happens to be the coronavirus. 

