
One of UnHerd’s key concerns is how we can better measure and report our fast-changing times. With that in mind, we are running a series on which subject areas, institutions, and parts of the world are least helped by how the news media reports on them. Tom Chivers kicks us off…
Here is a fact that many people might not realise, and that explains a lot of what the media gets wrong about science: usually, scientific studies are just not that interesting or important. That may sound ridiculous, but it's true. Scientific studies, plural, are interesting and important. But any given one of them, not so much. It's not always true (the first papers about the discovery of the Higgs boson or gravitational waves were pretty interesting and important), but often, especially in complex areas like healthcare or psychology, it is.
That's because science is messy. Say you're trying to find out whether wine gums prevent or cause haemorrhoids, so you do a literature review. You find 20 studies, but they all say different things. Five of them say people who eat wine gums are a bit more likely to get piles. But six say a bit less. Two of them say "much less likely", one says "much more likely". And six find no significant effect at all. Findings like these are completely normal: any real effects are often hard to tease out from noisy data.
If you actually wanted to know what the impact of wine gums on haemorrhoids was, you'd look at all 20 studies in the aggregate. Maybe, after carefully looking at the data and checking that the studies were well conducted, you'd conclude, cautiously, that they might have a small protective effect.
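You can see why the studies disagree with a toy simulation (the numbers here are made up for illustration, not taken from any real research): give 20 imaginary studies the same small true effect plus independent sampling noise, and the individual results will point in different directions even though pooling them recovers the effect.

```python
import random

random.seed(1)

TRUE_EFFECT = -0.05   # hypothetical small protective effect (negative = fewer piles)
NOISE_SD = 0.15       # assumed sampling noise in each study's estimate
N_STUDIES = 20

# Each "study" reports the true effect plus its own random noise.
estimates = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_STUDIES)]

positive = sum(1 for e in estimates if e > 0)   # studies suggesting more piles
negative = N_STUDIES - positive                 # studies suggesting fewer piles
pooled = sum(estimates) / N_STUDIES             # naive aggregate of all 20

print(f"Studies pointing each way: {positive} more likely, {negative} less likely")
print(f"Pooled estimate: {pooled:.3f} (true effect: {TRUE_EFFECT})")
```

A single study drawn from a list like this can say almost anything; only the aggregate is informative.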
But if you wanted to tell people that wine gums cause haemorrhoids, to scare them, then you'd just take one of the studies that show the opposite. Then you might, for instance, put it on the front page of your newspaper, under the headline "Wine gums cause piles, says new study". It would be literally true; the study does say that. But it would also be nonsense, because the evidence does not show that. As a wise blogger once said: beware the man of one study.
Unfortunately, the media, by its nature, is largely made up of Men Of One Study, or more accurately Stories Of One Study. That's because we are incentivised, as journalists, to show new things and sudden change. We need events. Plane crashes, not mortality risk statistics.
In science, the events are usually the publication of new studies. I found 1,700 examples of the exact phrase "new study says" on the Daily Mail website alone. New study says half a glass of wine could stop some babies breathing. New study says a glass of wine a day can lead to the shakes. A new study says coffee can make you live longer. Some of these studies may accurately represent the state of reality, but many will not.

It's also worth remembering that the media is incentivised to find the most dramatic studies, because drama sells, and that the most dramatic, surprising results are the least likely to be true. (If it's surprising, we weren't expecting it, so it doesn't fit the existing body of evidence.)
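The point about surprising results can be made precise with a back-of-the-envelope Bayes calculation. The plausibility figures below are illustrative assumptions, not measurements; the power and false-positive rate are just the conventional 80% and 5%.

```python
def prob_true_given_significant(prior, power=0.8, alpha=0.05):
    """P(hypothesis is true | a study found a significant effect)."""
    true_pos = prior * power          # true hypothesis, correctly detected
    false_pos = (1 - prior) * alpha   # false hypothesis, fluke significant result
    return true_pos / (true_pos + false_pos)

# A mundane claim that fits existing evidence: assume 50% plausible beforehand.
print(f"Mundane claim, given a positive study: {prob_true_given_significant(0.5):.0%}")

# A dramatic, surprising claim: assume only 5% plausible beforehand.
print(f"Surprising claim, given a positive study: {prob_true_given_significant(0.05):.0%}")
```

With these assumed numbers, a significant result for the mundane claim leaves it very likely true, while the same result for the surprising claim leaves it closer to a coin flip: the more dramatic the headline, the more sceptical the reader should be.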
It's not fair to pick on the Mail. It happens across the industry, although some outlets do it more than others. And it is, I think, the fundamental problem of how science is represented in the media. At my old employer, BuzzFeed, we used to try to get around it by not reporting on single studies unless we felt they met a threshold of believability and importance; otherwise we tried to do original reporting. But that's time-consuming and expensive, and few reporters have that luxury.
I've spoken to science journalists who have to write five stories a day. All you have time to do is find the sexiest press releases and write them up. As the media withers, and there are fewer reporters to fill the same space, the problem won't go away.
Normally it's nice to end this sort of piece on a "how we'll fix it" note, but honestly I don't know how. The incentives to publish dramatic stories on the back of a single study are so extreme. All I can recommend is that readers, if they see a story about substance X having effect Y on humans, look out for the phrase "new study says" and treat it with extreme caution.