Column We’re all being misled. Why? Because we don’t understand numbers
Misunderstandings and lazy reporting of statistics can lead to everything from wrongful convictions to tragic deaths, writes David Robert Grimes.

JUST LAST WEEK Fianna Fáil came out with a truly cracking howler: “Fianna Fáil is now getting more than 100 per cent less access to Prime Time than the Labour Party in the same position.”

This is poorly worded on several fronts, but since 100 per cent is the totality of something, the quote literally implies Fianna Fáil were receiving negative coverage, a state of affairs that chunks of the Irish population might perhaps wish were possible.

But such faux pas abound everywhere we look. Statistics can be a useful way to represent data, but lack of understanding and innumeracy often render them tools of obfuscation, turning potential insight into real ignorance. In this regard, the quip about the three kinds of lie attributed to Mark Twain, "lies, damned lies and statistics", rings undoubtedly true.

Take, for example, an HIV test. The Western blot/ELISA test is 99.99% accurate. Your doctor, ashen-faced, apologetically tells you that your test came back positive. Your heart sinks, and a myriad of questions creep up on you. But what is the probability you actually have HIV? This might seem like a redundant question; surely a positive result from so accurate a test all but guarantees the virus is present. Yet if you are a typical low-risk Irish person, a positive test means only about a 50% chance that you actually have HIV.

How does this make any sense? The paradox is resolved by a little extra information: the base infection rate for HIV among low-risk Irish people is 0.01%, or about 1 in 10,000. If 10,000 low-risk people walked into a clinic to get tested, you would expect one of them to have HIV, and it is practically certain this person will test positive. However, among the remaining 9,999 uninfected people, because the test does have a small margin of error, you would expect it to produce one false positive. So within this reference class of low-risk people, you have two positive results, only one of which is a true positive. Hence the 50% figure.
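The natural-frequency reasoning above can be sketched in a few lines of Python. The function name and structure here are illustrative; the 1-in-10,000 base rate and 0.01% false-positive rate come from the example itself:

```python
def positive_predictive_value(population, infected, false_positive_rate):
    """Chance that a positive test result is a true positive,
    assuming the test catches every genuine case."""
    true_positives = infected
    false_positives = (population - infected) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# 10,000 low-risk people, 1 genuinely infected, 0.01% false-positive rate
ppv = positive_predictive_value(10_000, 1, 0.0001)
print(f"{ppv:.0%}")  # roughly 50%
```

Despite the test's 99.99% accuracy, the rarity of the infection means a positive result is only about as likely as a coin flip to be genuine.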

‘If you find this surprising, you are not alone’

If you find this surprising, you are not alone. A statistic in isolation does not give the full picture. Many doctors, scientists and other experts get such odds incredibly wrong, sometimes with tragic consequences: patients are sometimes diagnosed with diseases they don't have due to a statistical blunder. In the early days of HIV, this led to a number of suicides among patients who were actually HIV-free. Even today, a surprisingly high proportion of medical professionals make this mistake when interpreting diagnostics.

Statistical thinking is not intuitive. Yet we live in an age where we are constantly assailed by percentages, some potentially informative and some utterly dubious. Statistics can also be reported in a multitude of ways, with seemingly different interpretations. Imagine the odds of getting a certain disease were two in a million, or 1/500,000. Now let's say a study finds that eating lots of red meat brings this up to three in a million. The absolute risk increase is one in a million, or 0.0001% – a minor, almost negligible risk.

But this is more likely to be reported as a relative risk increase. Imagine the screaming headlines – “Eating red meat makes you 50 per cent more likely to get this disease!” They both convey the same information, yet one is far more likely to shift copy. The numbers themselves do not lie – it is how they are read that causes problems.
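The gap between the two framings is easy to verify with a short Python sketch, using the figures from the red-meat example above:

```python
baseline = 2 / 1_000_000       # two in a million, i.e. 1/500,000
with_red_meat = 3 / 1_000_000  # three in a million after the study

absolute_increase = with_red_meat - baseline             # one in a million
relative_increase = (with_red_meat - baseline) / baseline

print(f"absolute risk increase: {absolute_increase:.6%}")  # 0.000100%
print(f"relative risk increase: {relative_increase:.0%}")  # 50%
```

Same data, but "50 per cent more likely" makes a far better headline than "0.0001 percentage points".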

The ambiguity of statistics also appears in finance. With just the right amount of finesse, one can make a loss appear as a profit to the unwary. A stock falling in value by 50% and then increasing by 80% seems to imply that its final value would be 30% greater than the initial amount. But this is a fallacy, because the two percentages refer to different things.

Using real numbers, this is easy to see: €1,000 falls by 50% to €500. This then increases by 80% to €900. The stock has fallen in value, but because the two percentages subtly refer to different base amounts, we are fooled into assuming there has been an increase. This is why talk of currency values, house prices and interest rates is all too often essentially meaningless: percentages on their own tell us practically nothing.
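The asymmetry can be made explicit in a few lines of Python. The helper function here is illustrative; the key point is that each percentage change applies to the running value, not the original:

```python
def apply_changes(value, *changes):
    """Apply a sequence of percentage changes (-0.50 means a 50% fall,
    0.80 means an 80% rise), each relative to the current value."""
    for change in changes:
        value *= 1 + change
    return value

final = apply_changes(1_000, -0.50, 0.80)
print(final)  # 900.0 -- a 10% overall loss, not a 30% gain
```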

‘Several women were convicted on the basis of numerical ineptitude’

Yet despite all this, many experts have the unfortunate habit of abusing statistics. Sir Roy Meadow acted as an expert witness in a number of cot-death trials in the late 1990s. Using atrocious statistics, he convinced juries that these were almost certainly murder cases. As a result, several women were convicted on the basis of numerical ineptitude. These convictions drew the ire of the Royal Statistical Society and were eventually overturned, but the ordeal is widely believed to have contributed to the later death of Sally Clark, one of the wrongly accused women. Journalists and media outlets regularly commit jaw-dropping statistical abuse, perhaps from the urge to make a story more powerful, perhaps through yet another example of our pervasive innumeracy, or perhaps both.

So what can be done about this? It is important to state again that statistics themselves are not the problem – our lack of understanding is. The first step should be to improve that very understanding. Darrell Huff's How To Lie With Statistics and Gerd Gigerenzer's Reckoning with Risk are entertaining introductions to statistics for general audiences, and light on maths. Wikipedia also has a number of well-written entries on the subject, including Misuse of Statistics.

When possible, media outlets should report figures in natural terms where statistics are likely to mislead, as this is a far more intuitive way of stating changes in risks and values. If it is 20 degrees on Tuesday, everyone understands that a fall of four degrees means 16 degrees on Wednesday. Writing this as a fall of 20% is not nearly as clear, and the percentage is not even symmetric: Tuesday was four degrees warmer than Wednesday, yet expressed relative to Wednesday that same difference becomes a 25% rise. The natural statement stays consistent, a difference of four degrees either way, while the percentage changes depending on the direction of comparison. For this reason, natural statements of finance, risk and value should be preferred over bare percentages whenever possible.
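The temperature example can be checked in a couple of lines of Python:

```python
tuesday, wednesday = 20.0, 16.0
difference = tuesday - wednesday   # 4 degrees, whichever way you look at it

fall = difference / tuesday        # Wednesday expressed relative to Tuesday
rise = difference / wednesday      # Tuesday expressed relative to Wednesday

print(f"a fall of {fall:.0%}, or a rise of {rise:.0%}")  # 20% vs 25%
```

The absolute difference is the same four degrees in both directions, but the percentage depends on which day you take as the baseline.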

The crux of the issue is that we must learn to understand statistics. Numeracy can be improved, and it is essential that we, as a society, improve it. HG Wells is often quoted as saying that "statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write." That day is now. Until we understand statistics and data as more than just arcane numbers or soundbites, we will continue to fall victim to misinformation.

David Robert Grimes is a doctor of medical physics with a keen interest in the public understanding of science. He writes a blog on science and medicine called Three Men Make A Tiger.
