Interview with Dr. Ed Pertwee: vaccine hesitancy and lessons learnt

Rumors, conspiracy theories and misinformation during a health crisis: “The problem seems to be informational reliance on social media, not social media usage per se,” says expert Dr. Ed Pertwee.

With possible new Covid-19 waves in the back of our minds, understanding how to communicate during a health emergency is critically important for saving lives.

We asked Dr. Ed Pertwee, Research Fellow at the Vaccine Confidence Project, London School of Hygiene and Tropical Medicine, about social uncertainty, vaccine hesitancy and the role of social media. How can governments communicate better, and should they use new communication tools such as artificial intelligence-based chatbots to disseminate health information in the future?


Dr. Pertwee, in your recent research you mention that rumors and conspiracy theories linked to Covid-19 can be read as expressions of fears and anxieties that emerge in times of acute social uncertainty. Do politicians and social media platforms have a role in reducing social uncertainty?

Ed Pertwee: The Covid-19 pandemic itself, and the policy uncertainty and information overabundance that have often accompanied it, have all likely contributed to people feeling less secure, less in control of their lives and more uncertain about the future.

However, these are not the only sources of social uncertainty. To pick one of many possible examples, technological change is another major source. Conspiracy theories falsely linking Covid-19 to the development of 5G technology, for instance, can be seen as part of a longer history of health scares around new technologies; we have seen similar scares in the past around high-voltage power lines, microwaves and mobile phones, among others.

Conspiracy theories like these appeal to a segment of the population that is uncomfortable with the pace of technological change and distrustful of the motivations of governments and technology companies.

Politicians and technology companies can certainly help alleviate some of the fears and anxieties expressed in Covid-19 conspiracies, for example through greater transparency and accountability. However, the current social uncertainty has so many causes, from pandemics and technological change to financial crises, wars and ecological disasters, that it is not within the control of any single national government or any single sector of society.


Regarding vaccination, do social media play a role in influencing vaccination uptake and vaccine safety concerns?

Ed Pertwee: The available evidence suggests that digital media do play a role in influencing people’s attitudes towards vaccines, but it’s difficult to quantify the scale of the effect because there are so many other factors at play.

Some academic studies have used experimental designs to try to control for these other factors, but we need to be careful about how we interpret them, as experimental settings can never precisely mimic the real-world contexts in which people consume digital content.

One key finding across a number of different studies is that in terms of the relationship between social media and vaccine hesitancy, the problem seems to be informational reliance on social media, not social media usage per se. In other words, the issue is not how much you use social media, or how often you encounter misinformation there, but the extent to which you come to rely on social media as a sole or main source of information about vaccines.


Do you think that Facebook, Instagram, Twitter and YouTube have a responsibility to tackle the spread of misinformation on their platforms? Are adequate policies in place to minimise the spread of misinformation?

Ed Pertwee: Social media platforms do have a responsibility to address the spread of misinformation and other harmful content on their platforms. At the very least, they need to ensure that their recommendation algorithms are not promoting such content and that they are not profiting from it.

An investigation by a UK-based campaign group early in the pandemic estimated that English-language anti-vaccination accounts alone could be worth nearly 1 billion euros to the platforms, a huge sum of money. Since then, under pressure from politicians and the public, platforms have begun to adopt stronger policies to address misinformation about Covid-19 and vaccines more generally.

There is an ongoing debate around how far platforms should go in terms of removing potentially harmful content, versus softer approaches such as signposting users to credible information sources.

While content removal may be justified in extreme cases, I would personally want to avoid a scenario in which social media platforms end up with a quasi-regulatory role in relation to online speech. This is not a role to which they are necessarily well suited, and in any case, these are private companies so there are major questions around democratic accountability.


What would be your advice to social media platforms on how to communicate during future pandemics?

Ed Pertwee: I’d like to see more transparency from social media companies about what’s happening on their platforms. Often there’s a lack of basic information, for example around how recommendation algorithms work or how policies for tackling misinformation are being enforced. If we had better data, there would be a greater chance of developing effective measures to address misinformation and related issues.


In your opinion, should other communication tools, such as SMS or voice calls, be used to disseminate adequate information related to vaccination and/or the pandemic in the future?

Ed Pertwee: Within the public health world there’s some interest in conversational artificial intelligence – “chatbots” – as a communication tool. The Vaccine Confidence Project, which is the research group that I belong to at the London School of Hygiene and Tropical Medicine, has been involved in collaborative research on this topic with Hong Kong University, the National University of Singapore and the Health Intervention and Technology Assessment Programme in Thailand.

This is a new field of research, but emerging findings from this and other studies suggest that chatbots may have a role to play in supplementing other forms of vaccine communication, for example during crisis situations where healthcare systems are overwhelmed and it’s difficult to get a face-to-face appointment with a healthcare provider.


Do you think it would be better if scientists (instead of politicians) disseminated pandemic-related information? Would that increase trust among the general population?

Ed Pertwee: I’m not convinced that a more technocratic model of health communication would necessarily help increase public trust. Scientists certainly have a vital role to play in communicating facts and evidence, especially during a health emergency, and ensuring that they can do this free from political interference is clearly important for public trust in science.

However, insofar as politicians are the ones deciding which interventions to enact (or not enact), I think they should be responsible for communicating the rationale for those decisions to the public.
