European Science-Media Hub

Interview with Philipp Schmid on science denialism, misinformation & the importance of public confidence in the safety & effectiveness of the Covid-19 vaccine


Dr. Philipp Schmid is a scientific researcher at the Department of Psychology at the University of Erfurt (Germany). Philipp’s research analyses the psychological roots of science denialism. He is the lead author of the WHO guideline on how to respond to vocal vaccine deniers in public and a co-author of the Debunking Handbook 2020.

Can you tell us a couple of things about your background and how you came to conduct research into disinformation and science denialism?

I am a psychologist and currently a lecturer in statistics and methods at the Psychology Department of the University of Erfurt in Germany. I wrote my PhD thesis on effective strategies to counter science denialism, and this continues to be my main research interest. As a student I worked on research projects focusing on why individuals reject or delay vaccination even when given the statistical facts. Individuals often encounter misrepresented data or emotional narratives shared by science deniers, and some refuse life-saving prevention measures because of that content. The awareness that disinformation can kill motivates me to support people in making informed decisions.

In a paper you co-authored with Cornelia Betsch and Marius Schwarzer, you criticised legacy media’s approach to objectivity (presenting science-based views and those of deniers as equal) as contributing to the amplification of false narratives and conspiracy theories. Can you talk a bit about that? Have you had discussions with media professionals and ethicists on this issue?

Like psychologists, journalists are usually motivated to help individuals stay informed and make informed decisions. Journalists’ so-called balancing of viewpoints is highly important for democratic discussions of different opinions but can be misleading in discussions about scientific facts. For example, media reports and discussions on how we want to address the climate crisis should include viewpoints from all parts of society. However, the question of whether climate change is real is an empirical question and can thus only be addressed by the scientific community. Balancing viewpoints on whether climate change is real could leave the public with the impression that no one actually knows the answer; that is, balancing viewpoints can mislead. Several scientific studies have shown the misleading effects of balancing. For example, a study by Dixon and Clark (2013) shows that balanced media reports about the autism-vaccine controversy can decrease the intention to vaccinate a future child.

Media professionals also suggest potential solutions for this issue. One of the ideas is so-called weight-of-evidence reporting. A pioneer of weight-of-evidence reporting, Sharon Dunwoody, describes this method as seeking “not to determine what’s true but, instead, to find out where the bulk of evidence and expert thought lies on the truth continuum and then communicate that to audiences.” In the study by Cornelia Betsch, Marius Schwarzer and myself, we tested different weight-of-evidence strategies and found that journalists can effectively reduce the negative impact of messages of science denialism simply by warning the public about the potential impact of balancing. The work on weight-of-evidence reporting is a good example of how psychologists and journalists can work together to tackle some of the biggest challenges of our time.

Another report you co-authored, the Debunking Handbook, raised a couple of interesting points that I would love to hear you expand on, such as the fact that people will believe falsehoods if they are repeated often enough. Also, is there any consensus on the ‘backfire’ effects of correcting disinformation? In terms of interventions, which strategies work best with science deniers?

The handbook reflects the current state of knowledge on debunking misinformation and is intended for all practitioners of science communication. It also addresses the issue of repeating misinformation in corrections. A robust finding in psychological studies is that repeated information is more likely to be judged true than novel information because it has become more familiar. Early studies also found that this can be the case when one seeks to correct a myth, because corrections usually repeat the myth. That means that by correcting the myth one might end up reinforcing it. This is called the familiarity backfire effect. However, a number of newer studies found no consistent evidence for systematic backfire effects. In conclusion, I think there is no need to fear the backfire effect. Nonetheless, one should be mindful not to give undue exposure to myths. If a specific myth is not widely known among the public, then there is obviously no need to correct it.

Another issue is how to debunk. The Debunking Handbook provides some detailed guidance for that. For example, debunking is more likely to be successful if one uses a fact-sandwich, that is, the myth should be embedded in scientific facts and connected to an explanation. The recommended structure is fact-myth-fallacy-fact: lead with the fact, warn that misinformation will follow, point out the logical or argumentative fallacies underlying the misinformation, and conclude by reinforcing the fact. Such detailed corrections promote sustained belief change over time.

Given that the next big challenge will be people’s willingness to take the Covid-19 vaccine, how would you advise health authorities and governments to prepare? What do you foresee as the main problems in its acceptance?

Representative surveys of public opinion, like the Covid-19 Snapshot Monitoring (COSMO) in Germany, reveal that confidence in the safety and effectiveness of the vaccine will play a major role. A lack of confidence in vaccination is often the result of misinformation, and a lesson from the Debunking Handbook is that in times like these we should focus on prevention first. Debunking will not be necessary if we can inoculate the public against misinformation. According to psychological inoculation theory, individuals can respond adequately to arguments from science deniers if they learn to formulate counterarguments in advance. Research on science denialism (e.g. Diethelm and McKee, 2009) shows that science deniers tend to use the same five rhetorical techniques to persuade others: fake experts, conspiracy theories, false logic, impossible expectations and cherry-picking. For example, vaccine deniers use impossible expectations when stating that vaccines should be 100% safe, because no medical product, from heart surgery to pain-killers, is ever 100% safe. These techniques will most likely be used against the Covid-19 vaccine, so knowing this in advance can help design communication strategies that warn the public and equip them with solid counterarguments.

However, apart from misinformation, complacency is another important determinant of vaccine hesitancy. Complacency relates to a lack of awareness of the threat of the vaccine-preventable disease. The Covid-19 Snapshot Monitoring (COSMO) in Germany shows that if Covid-19 is not perceived as a threat, then the intention to vaccinate decreases. Complacency could be a major driver of vaccine hesitancy among individuals who are not considered members of a risk group, such as younger individuals. However, a study by Betsch et al. (2017) reveals that individuals are more willing to get vaccinated if they know about the concept of herd immunity. It follows that highlighting the social benefit of vaccination could increase the willingness to get vaccinated.

Why do you think a growing number of people across the globe believe in Covid-19 conspiracy theories? If the causes of this are psychological, does that mean we have to ground any response in psychology rather than just science?

Work by Karen Douglas at the University of Kent shows that conspiracy theories are appealing for reasons that are rooted in epistemic, existential and social motives. First, individuals aim to find causal explanations for major events such as a pandemic. Conspiracy theories (e.g. Covid-19 is a biological weapon) provide a simple explanation; that is, they satisfy an epistemic need. Second, individuals aim to feel safe. Other conspiracy theories (e.g. Covid-19 does not exist) provide a simple way out of a threatening environment by offering an illusion of safety; that is, they satisfy an existential need. Third, individuals aim to belong to a social group, and conspiracy theories are an easy way to define an ingroup (the believers) and an outgroup (the conspirators who are supposedly in control and guilty); that is, they satisfy a social need. Independent of the motives, studies show that belief in conspiracy theories leads to a decrease in the willingness to comply with prevention measures such as wearing masks. Identifying effective measures to prevent the spread of Covid-19 can only be one building block in an effective fight against the disease. Another building block is to understand human behaviour and facilitate informed decisions.

Useful links:
Weight-of-Evidence Strategies to Mitigate the Influence of Messages of Science Denialism in Public Discussions.
Effective strategies for rebutting science denialism in public discussions.
How to respond to vocal vaccine deniers in public.

Consult all our exclusive interviews on infodemic
