Interview with Sara Degli-Esposti, Research Director of the TRESCA project and Principal Investigator (PI) for the Spanish National Research Council (CSIC). She is a Research Fellow at the Institute of Public Goods and Policies (IPP-CSIC) and an Honorary Research Fellow at the Centre for Business in Society, Coventry University (UK). Her areas of research include the effects of institutional trust on people’s acceptance of digital surveillance technologies, automation, digital rights and compliance with European data protection laws, cybersecurity economics and algorithmic accountability. With a Master’s in Business Administration and Quantitative Methods and a PhD in Information Management, Sara is keen on applying mixed-methods research design and non-parametric statistics to tackle her research questions.
What will the CSIC’s contribution to the project be?
How dangerous is fake news and what is the best way to fight misinformation?
Sara Degli-Esposti: Between March and June 2020, during the pandemic, there was a radical increase in the amount of fabricated information around COVID-19. As reported by NewsGuard’s Coronavirus Misinformation Tracking Center, a large number of conspiracy theories built on the ideas that SARS-CoV-2 was a human-made bioweapon (engineered either in China or in the US, depending on the narrator), that it contained HIV-like insertions, that its contagiousness was exacerbated by 5G technology, but that (on the bright side) it could be cured with garlic or lemon. However, it would get worse if you had been vaccinated against flu. So, to recap, you start out by believing something very general and end up with some clear instructions about what to do. Your brain is satisfied because it is making almost no effort, accumulating only consonant cognitions. The problem is that you have been infected by an infodemic that is pushing you towards making very bad health decisions.
Fighting misinformation is not easy. Certain individuals are predisposed to refrain from changing their beliefs even in the face of good corrective evidence, and ideology and personal world views can be major obstacles to de-biasing. Skepticism can reduce misinformation effects, as it leads to more cognitive resources being allocated to the task of weighing up the veracity of both the misinformation and the correction. Memory effects play an important role here. It is important to minimise the unnecessary explicit repetition of misinformation to avoid boosting memory and retrieval effects. When providing factual alternatives to the retracted inaccurate information, it is important to offer an alternative causal explanation of the event to fill the gap left by the retracted information. Another good strategy is to lead people to self-affirming corrections through the use of educational tools for refuting misinformation. Warnings at the time of the initial exposure to misinformation are also useful.
Can anybody be vulnerable to conspiracy theories or is there a specific profile?
Sara Degli-Esposti: As we explain in a TRESCA blog post on conspiracy theories, conspiracy theories are powerful because they offer a very vague and therefore flexible and adaptable causal explanation that can work well in many circumstances. Having a bunch of bad, powerful people plotting against people-like-you can explain almost anything going wrong in your life. Think about it. It is a theory made up purely of assumptions that don’t need to be proved. Once you believe in it, you can use it to explain almost everything. Once you have started using it as an overarching theory, you will retain only information consonant with that theory. Why? Because of cognitive dissonance. Everybody has a tendency to prefer consonant over dissonant cognitions. This means that we tend to seek out and remember information that is congruent with what we already believe. This process can easily lead people to disregard information that runs counter to their opinions or knowledge. Social media and recommendation systems on digital news outlets can further reinforce this bias by showing us information consistent with what we have seen in the past. This phenomenon is known as the filter bubble. Information travelling on encrypted messaging applications (think of WhatsApp as an example) also has the potential to reinforce this type of confirmation bias by delivering messages from strangers forwarded to us by our trusted contacts. Often these messages are completely fabricated!
Does the TRESCA project take gender equality into account? If so, how?
Sara Degli-Esposti: The TRESCA team is a diverse, gender-balanced team. Not only is the CSIC team led by a woman, who also happens to be more junior than the rest of her team, but Marina Tulin’s role in assisting Jason Pridmore in leading the Erasmus University coordination team, and the TRESCA project as a whole, is also an important example of building female leadership within academia. This is particularly important within social science research, a field dominated by women, but one in which their contributions are all too often overlooked. TRESCA will continue to highlight the contributions of women researchers to our work. Moreover, the Science|Business team is led by Jenny Lee, the Marketing and Communications Manager, and the ZSI team is coordinated by Pamela Bartar. TRESCA has also benefited enormously from the work of Kurzgesagt’s Elisabeth Steib, who is working to fact-check, develop and script our project’s video. The team is also very diverse in terms of skills and types of experience. We have managed to create a small laboratory where the scientists involved in the project are continuously being asked questions by science communication professionals about their public role and engagement.