Tuesday, 28 May 2013

Confusing numbers

We should always be alert to the pervasive phenomenon of badly reported statistics!

I've just been doing some online research into statistics on the proportions of people who say they do, or do not, believe in God. As a committed Christian believer, this subject is obviously of some interest to me. I came across this report in the Guardian (11 October 2010):


A report for the Church of England last week suggests that for many young people in the UK, Christianity is no more than a "faint cultural memory". A minority are explicitly atheist – about one in eight – and around four times as many believe in either a personal god or a vague higher power. But by far the largest category are those who just find the question bewildering and not very interesting: "I don't really know what to think" got 43% of the answers.

This is a typical example of reporting where the statistics just do not add up!

(a) 'About one in eight' are explicitly atheist. That's about 12.5%.
(b) 'Around 4 times as many believe ...' So, that's about 50%.
(c) Which leaves only 37.5% for the third category of response - but this is reported as 43%.
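
Just to spell the arithmetic out, here is a little Python check of the figures taken at face value. This is my own sketch, not anything from the report; the variable names are simply labels for the three groups of respondents.

    # Sanity check of the reported figures, taken at face value.
    atheist = 1 / 8             # 'about one in eight'         -> 12.5%
    believer = 4 * atheist      # 'around four times as many'  -> 50.0%
    dont_know = 0.43            # "I don't really know what to think"

    print(f"atheist:   {atheist:.1%}")                  # 12.5%
    print(f"believer:  {believer:.1%}")                 # 50.0%
    print(f"left over: {1 - atheist - believer:.1%}")   # 37.5%, not the reported 43%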

If this 43% is correct, the discrepancy might be explained as the result of compounding two rounding errors. 

'About 1 in 8' could be a proportion somewhere between '1 in 7.5' (13.3%) and '1 in 8.5' (11.8%).
So, assuming the smallest proportion that might count as 'about 1 in 8', we could have 11.8% in category (a).

This means the most we can have in category (b) is 100% – 43% – 11.8% = 45.2%. And this would be acceptably described as 'about 4 times' the proportion in category (a) – it is about 3.8 times.
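
The same back-of-the-envelope stretching of the rounding can be written out in a few lines of Python (again my own sketch, not the report's own figures):

    # Push the rounding on 'about 1 in 8' as low as it will plausibly go.
    dont_know = 0.43
    atheist_low = 1 / 8.5                         # smallest reading of 'about 1 in 8' -> 11.8%
    believer_max = 1 - dont_know - atheist_low    # the most that is left for (b)      -> 45.2%
    ratio = believer_max / atheist_low            # about 3.8 -- still roughly '4 times'

    print(f"{atheist_low:.1%}  {believer_max:.1%}  ratio {ratio:.1f}")
    # prints: 11.8%  45.2%  ratio 3.8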

But unfortunately, the report told us that (c) was by far the largest category. And 45.2% is larger than 43%! Even reading 'about 4 times' generously, category (b) cannot come in below 43% without pushing the ratio down towards 3.6, which hardly counts as 'about 4 times'.

I just cannot find any way in which all the statistical statements in this report can be correct. 
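
For what it is worth, a crude brute-force search confirms this. The tolerance ranges below are my own readings of the wording ('about 1 in 8' as 11.8%–13.3%, 'around 4 times' as a ratio of 3.8–4.2), so treat it as a sketch rather than anything definitive:

    # Look for ANY combination that satisfies every statement at once.
    dont_know = 43.0
    consistent = []
    for a10 in range(118, 134):           # atheists: 11.8% .. 13.3%, in tenths of a percent
        atheist = a10 / 10
        for r10 in range(38, 43):         # ratio: 3.8 .. 4.2, in tenths
            believer = atheist * r10 / 10
            if atheist + believer + dont_know <= 100 and believer < dont_know:
                consistent.append((atheist, r10 / 10))

    print(consistent)    # [] -- the 'don't know' group is never the largest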

Why did the reporter not just give us the straight percentages?


