This is a romp through Dr. Goldacre’s analysis of weak claims and poorly reported science. He argues that journalists should cite, and link to, the sources of the research behind the headlines. He also argues that we, the unsuspecting public, should know how to read scientific studies for ourselves, and should question the reports rather than swallow the conclusions whole.
So if you’ve ever read a science-y headline and thought to yourself, “that doesn’t sound right”, this book is for you. It takes a look at the scientific method, points out some of the pitfalls in constructing a good experiment, and in the process gives some pointers on what to look for when evaluating a scientific story:
- Who funded the study?
- How well was the experiment designed?
- How large was the sample?
- Did the experiment follow the scientific method, testing a single, simple hypothesis?
- Does the report cherry-pick the data, citing a small selection of studies to prove a point rather than all the available research?
In the past three weeks, three cases have popped up on social media that demonstrate the need both to hold journalists to a higher standard and to educate us all.
(1) Proving nothing: A Swedish family ate organically for two weeks, and tests showed a drop in the concentration of pesticides in their urine.
The family had their urine tested for various pesticides while on their usual diet, then switched to organic food for two weeks. Their urine was tested daily over those two weeks, and by the end there was almost no pesticide in it.
Note that “organic” doesn’t mean pesticide-free, so the family could still have consumed some pesticide with their organic meals. The article doesn’t report on whether that was tested for.
The article calls this a “staggering result”. No, not staggering: school-level biology. You could do the exact same test with vitamin C. Give people a high vitamin C diet for a month, then remove vitamin C from their diet. Hey presto! No vitamin C in the urine.
This report hits the trifecta: small sample size, poor design, and funding by a supermarket with a range of organic foods. Essentially this “experiment” simply proved that the Swedish family have well-functioning kidneys.
(2) Faked data: There was a really interesting study done on attitudes to same-sex marriage. It concluded that conversation with a gay surveyor/canvasser could induce long-term attitude change. The study seemed to be well constructed, with a good data set supporting the conclusion. The optimistic news was widely reported late last year when the study was released.
But when scientists started digging into the data and trying to replicate the results, something didn’t stack up. The study has now been retracted by one of the authors, and it seems there will be a further investigation.
It’s not always the journalists at fault.
(3) We’re easily fooled: A daily dose of chocolate helps you lose weight.
But it turns out that it’s rather easy to generate the research and results to “prove” this, and extremely easy to get mainstream media to report on it, as John Bohannon proved by setting up this experiment and the associated PR.
So there can be flaws or outright fraud in science. Journalists can, on occasion, twist the story to deliver the headline. And we, the public, are ready to believe reports that reinforce our own opinions, and all too ready to believe good news about chocolate.
It turns out that if it sounds too good to be true, we should ask more questions.
Many of the articles in this book were originally published in the Guardian, and if you want to read more on bad science, Dr. Goldacre has his own site with the helpfully short title: Bad Science. He campaigns for greater journalistic responsibility in reporting science, for using the scientific method to test policy decisions, and for better education on the scientific method.
He’s right, on all three.