European Science-Media Hub

A scientist’s opinion: interview with Ionica Smeets on hype in press releases


Interview with Ionica Smeets, professor of science communication at Leiden University, the Netherlands.

Prof. Smeets, how do you feel when you encounter buzzwords such as ‘groundbreaking’, ‘world first’ and ‘landmark study’ in a press release?


Ionica Smeets: One of the first things a journalist should realise is that press releases tend to overstate findings using these buzzwords. I recall reading one research paper that looked at how some animal studies were portrayed as medical breakthroughs. Years later, the paper’s authors revisited the studies’ actual effects and found that most had not really been breakthroughs. We have seen an immense rise in bold claims and buzzwords in press releases during the COVID-19 pandemic.

Who should take responsibility for this kind of miscommunication in press releases?

Ionica Smeets: Scientists, press officers and journalists often shift the blame onto each other. But all parties are involved: while scientists and press officers are responsible for their share of misinformation in press releases, journalists are responsible for checking that information before passing it on to the public. This triangle is actually a complicated relationship.

But there is one crucial difference: scientists and journalists are held accountable if they propagate misinformation. Press officers are not.

Ionica Smeets: That is an interesting way to look at it. It is true that if something goes wrong, the responsibility won’t fall on the press officers but on the scientists. We had such a case in the Netherlands a few years ago after a study on the quality of fish came out. The university issued a wildly exaggerated press release and the case was brought before the national body for scientific integrity. The paper’s author blamed the communication office, but the committee ruled that the scientist was responsible. I think that is the right way around. If the press office publishes hyped-up press releases, then it is your job as a scientist to protest.

Isn’t that asking scientists to do the job of press officers?

Ionica Smeets: Universities need to make sure they hire press officers who know their job in the first place. They need strategies that prioritise quality, and they need to instruct science communicators accordingly. The trouble is, we currently see many universities prioritising marketing and branding. That is also important, but the focus should be on releasing correct and honest information. That is what we need in order to gain the public’s trust and support for science. And as scientists, it is our duty to get this right – after all, we belong to the same institutions and share the same goals as our press officers.

Scientists like you conduct studies into how press releases misrepresent their field. Does that help?

Ionica Smeets: I think it does indeed help, but the insights we gather are limited to that specific field. What we really need to understand is this: what are the underlying structural issues, and what drives hype and misrepresentation? Only then can we draw conclusions for everybody involved in science communication, from press officers to consumers, and of course also journalists. Those insights then become the foundation for how we can improve trust in science news and develop journalistic codes.

What kind of journalistic codes?

Ionica Smeets: One mechanism that we looked into is how journalists work with expert quotes. More than 99 % of British and 84 % of Dutch science articles include at least one quote. But most of the time, this is just the quote from the press release. Only 7 % include an additional quote from an independent expert. As a result, these articles lack context. And articles without a quote from an independent expert are exaggerated more often. Alas, not all newspapers have a code for how to handle quotes.

Do you have an example of such a ‘quote code’?

Ionica Smeets: The Dutch newspaper de Volkskrant has an official and open code about how to handle quotes. It mandates that whenever a journalist writes about science, they have to call an independent expert, ask for an assessment of the research and then quote them in the article. Regardless of whether they agree with the findings or not, the journalist would always quote them to show readers that they have done their due diligence. Again, not every newspaper does this. Many journalists are pressed for time, so they don’t call up independent experts.

Is that why most journalists just take press releases at face value?

Ionica Smeets: It’s one of the reasons, yes. Many general journalists have to write six or seven articles a day, and they are so pressed for time that they just take the material that universities hand them. This issue is specific to general journalists who cover science. They place great trust in universities’ press releases, something they would never do with press releases from a company or political party. We are also seeing more of this behaviour because truly independent science journalists have become so rare.

How so?

Ionica Smeets: There are very few positions available to staff science journalists, so much of the science news coverage is being written by general journalists. It is really difficult to make a living as a dedicated science journalist. This introduces another conflict of interest: many are also working in science communication jobs. This, in turn, raises the question of how you can write independently about a university if you are also working on their brochures and project reports. You just can’t.

So the problem is systemic. How could we tackle it?

Ionica Smeets: Science media centres are bridging the gap to some extent. They are doing a great job as intermediaries between science and journalists. They are doing what journalists are often unable to do because of time pressure: collecting a number of independent comments on new studies that put findings and methods into context. We have a hunch that their approach really works, but as a researcher, I would like to investigate and quantify the difference they make. One way to do that would be to set up a media monitoring system that continuously compares media coverage with and without the intervention of media centres.
