A scientist’s opinion: interview with Petroc Sumner on hype in press releases

Interview with Petroc Sumner, psychologist and co-director of the InSciOut project on science in the media.


What do we know about how the quality of press releases influences news stories?

Petroc Sumner: There is a strong correlation between the quality of press releases and the quality of the news stories that are based on them. Whether you look at the terms and phrases press releases use to describe a study, at the percentages, risks and quotes they give, or even at the funding sources and conflicts of interest they declare, if you find it in the press release, you will likely find it in the news story as well. Most of the data we have are correlational, but there is only one plausible conclusion: journalists rely on press releases as at least one of the main sources, if not the main source, for these news stories.


Would you say that there is an intrinsic motivation to overstate study findings?

Petroc Sumner: The motivations are complex. It is inevitable there will be some exaggerations because there is an intrinsic desire to make the news. But there is also a very strong, competing desire to be seen as trustworthy. At the moment, press releases have a stamp of trustworthiness about them. If, 10 years from now, people trust blogs or isolated opinions instead of university press releases, then the universities will have lost something very valuable. It is more important to keep that stamp of authority than for any individual story to make headlines.


What qualifies as an exaggeration?

Petroc Sumner: For us to know where exaggerations come from, we first needed to define what an exaggeration is. For example, one way to exaggerate is to interpret correlational data as being causal when nothing in the original study points to that. That causal interpretation often comes either in the press release or at the news stage. We have defined another area of exaggeration: articles that talk about humans when the original study was on mice. For example, a news story referring to pregnant mothers implies that the study was conducted on humans, when in fact it investigated pregnant mice.


Why would press offices imply that a treatment is effective for humans when it has only been tested on mice?

Petroc Sumner: One of the historical motivations to withhold that kind of information has nothing to do with exaggeration. Press officers tried not to draw attention to the fact that universities were running laboratory experiments on animals. In the past, this secretiveness created a vicious cycle and encouraged the public to assume that scientists had something to hide. Luckily, a lot of work has been done to persuade scientists to be open and straight with the public about their work with animals. Now this pattern of withholding information about animal experiments is slowly beginning to break down.


While health-related press releases generally say too little about study limitations, they do tend to give too much advice. What do you make of that?

Petroc Sumner: It is quite common for articles on exercise or diet to give advice to readers, even when the study does not. Both journalists and press officers tell us they see it as their duty to reinterpret findings for their audience. Because news stories are aimed at different audiences than scientific articles are, you would not necessarily expect the same advice in the original publication. That is a reasonable objection and may be the reason why we were not able to replicate our findings in this area in a replication study we conducted. In that sense, this third kind of exaggeration is different.


Which elements of a press release are most prone to being misinterpreted by journalists?

Petroc Sumner: The first few lines are absolutely critical. Both scientists and press officers have told us anecdotally that the headline is there to hook the reader. Some assume that, as long as they explain it accurately in the body of the press release, it does not matter if the headline is too simplistic. I would strongly disagree with that. We know from priming effects in psychology that what you read first is going to influence how you interpret the rest. A lot of news stories are very short, but even when they are longer, many readers never get to the bottom of the story. They only read the first few lines. That is why it is important to get the headlines right.


But aren’t accurate science headlines often boring?

Petroc Sumner: If a study is not experimental and you cannot write a deft, causal headline, previous advice was to use correlational expressions such as ‘is associated with’ or ‘is correlated to’, which are quite cumbersome. If you write ‘cabbages cut cancer’, that is very headline-able. But ‘cabbages are correlated with lower rates of cancer’? That is not a headline anymore. So we stopped advising press officers to use these expressions and asked them instead to use expressions like ‘might’ or ‘may’ as a compromise.


How do readers react to this difference in wording?

Petroc Sumner: We did a study where we compared how strongly readers interpreted different phrases and found three levels of strength. ‘Cabbages cut cancer’ is the strongest because it is causal. When we introduced ‘can’, that was still perceived as being stronger than all the correlational expressions. We also found that readers treat ‘is correlated to’ and ‘may’ more or less equally. If you write ‘might cut’ or ‘is correlated with cutting’, that makes no difference to readers.


After you published a paper on exaggerations in UK press releases in 2014, the hype rates dropped. Was that causal or correlated?

Petroc Sumner: It was correlated. We looked at a collection of press releases before that paper, and then again at another collection within the year after the paper came out. What we found is that the rates of exaggeration were lower afterwards. So we know that a change happened, correlated in time. But we do not know whether our paper caused that impact.
