We were sailing back to our home port and a dense fog descended. Suddenly we couldn’t see more than a boat length ahead. My father, a mariner by profession, plotted a course and steered by it, sending my brother and me forward as lookouts.
My mother was convinced we were sailing in the wrong direction, that we’d steered off course (and this was before the reassurance of GPS). “No,” said my father, “you must trust your instruments.”
We made it safely home; it was an early lesson in believing data.
The amount of data produced and collected every day continues to grow. “Big Data” is a well-known, though poorly understood, term. In many companies we’ve moved on to “data-driven decisions”. But we’re not always good at believing the data.
I was in a meeting recently where the most senior person in the room looked at a graph of Twitter follower growth and said, “I just don’t believe this data”. The data showed that goals for follower numbers would not be met. Leaving aside the argument over whether follower numbers are a good goal, the data don’t lie. If there’s a straight line of progress that won’t reach the goal, then you need to change something or accept missing the goal.
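That kind of straight-line check is easy to make concrete. Here’s a minimal sketch of the arithmetic behind it, using invented follower counts purely for illustration (none of these figures come from the meeting in question):

```python
# A straight-line projection of follower growth toward a target.
# All numbers here are hypothetical, for illustration only.

def weekly_growth(counts):
    """Average week-over-week growth from a series of counts."""
    deltas = [b - a for a, b in zip(counts, counts[1:])]
    return sum(deltas) / len(deltas)

def projected_count(counts, weeks_remaining):
    """Straight-line projection: last count plus average growth per week."""
    return counts[-1] + weekly_growth(counts) * weeks_remaining

# Eight weeks of observed counts, and a goal six weeks out
observed = [10000, 10400, 10850, 11200, 11700, 12100, 12500, 13000]
goal = 20000

projection = projected_count(observed, weeks_remaining=6)
print(f"Projected: {projection:.0f}, goal met: {projection >= goal}")
```

If the projection falls short, no amount of disbelief changes the line; only a change in the underlying growth rate will.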
It made me think about when we believe data and when we should be sceptical.
We tend to measure progress against an expected path, and in a large organisation we invariably report that progress upwards through the organisation. In our plans and projections that progress follows a nice upward curve. But the reality is different: every project encounters setbacks, and the graph is more jagged than smooth.
In fact a smooth graph, where targets are always met, should raise questions.
Years ago I was chatting to a guy who had left his previous company after about four months. He left because the targets for the quarter were increased by 25%, and everyone met them. As an experienced business person he knew that a situation where every business unit met the stretch goal in the first quarter it was applied was very, very unlikely. His suspicions were raised and he left as quickly as he could. A year later the company collapsed under its own lies. The company? Enron.
In his articles (and books) Ben Goldacre campaigns for greater journalistic care in reporting data, and for better education on the scientific method. He points to the dangerous habit of pharmaceutical companies of cherry-picking their data: choosing studies that support their product and ignoring those that don’t.
I said earlier that we should trust the data, but we also need to know how the data were collected, what errors might be inherent in the collection methodology, and what limits there might be to interpreting the data. This should be part of everyone’s mental toolkit. It would help us evaluate all those advertising claims, refute 90% of the nonsense on the internet, be honest about progress towards goals, and finally make data-driven decisions.