Did you see the BBC's report on how, contrary to what we all thought, the UK actually loves religious values and wants more religious influence on our lives? Now I should start out by saying, for the record, I'm not a big fan of religion and don't think people should have a say in law-making, or how I live my life, simply because they feel more comfortable living by the moral standards of, say, an Iron-Age Middle-Eastern society. I fully understand why people would feel more comfortable with the simple black-and-white morals of an ancient and distant society, which save one from dealing with the scary complexities of a modern pluralist society, but then I also fully understand why other historical re-enactors like to dress up as Vikings at weekends.
Anyway, as I wasn't sure I believed the BBC's conclusions - which claimed the majority of Britons wanted much more religious influence over their lives - I thought I'd delve into the source of the data a little, to see if they would persuade me. It turns out the survey was conducted by a polling organization called Comres. Comres, on their website, boast all sorts of big-name clients and proudly declare they are a member of the British Polling Council and the Association for Qualitative Research. Sounds like a group of researchers who know what they're doing.
Looking at their portfolio of 'Social polls' is interesting. Most of their recent 'social polls' have been about religion, and all these were conducted at the behest of Christian-interest groups. Hmm. Why might Christian-interest groups give so much business to Comres? I wondered.
I then went back and looked at the results of the Comres/BBC poll [PDF link]. Gosh, what detailed analysis! The data are there, broken down in minute detail by gender, age, social grade and region. Big pages full of scary numbers: this looks like a thoroughly rigorous and scientific study!
But let's look at these numbers in a little more detail. Down on page 3 we can see how the sample of 1045 people breaks down into religious groups. Of the 1045 people surveyed, it turns out 639 were Christians and 279 were of no religion. Now this immediately sent alarm bells ringing. To show you why, let me digress slightly into some introductory sampling theory...
In human research, an early step is to identify the population of interest. This is the group about which you want to reach a conclusion. For example, if you want to learn something about the opinions of all the people in the UK, your population is 'all the people in the UK'. In an ideal world you would then conduct a census, whereby you speak to every member of this population. At the end of such research you know exactly what its opinions are.
Of course, when you're dealing with really large populations - like the population of the UK - conducting a census becomes logistically difficult, so you instead use a sample. A sample is a subset of your population which you hope will behave exactly like the population: the population in miniature, a microcosm of it, at a manageable size. If it does, that's great: you've learnt something about a big population from studying a convenient number of people. But if your sample is in some way biased, or behaves differently to the population as a whole, you will reach false conclusions about the population. The only information you have about the population comes from your sample, so every effort must be taken to ensure that sample isn't biased in some way. Ideally this is done by keeping the sample large, and using methods such as random sampling to choose the people included.
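To make that concrete, here's a small simulation of the effect. Every number in it is invented purely for illustration - none of it comes from the ComRes data - but it shows how a skewed recruitment channel wrecks an estimate even when the sample is a perfectly respectable size:

```python
import random

random.seed(42)

# Hypothetical population: 50% hold some opinion, 50% don't.
# (Illustrative numbers only -- nothing here comes from the actual poll.)
population = [True] * 100_000 + [False] * 100_000

# A properly random sample of 1,000 people tracks the population.
random_sample = random.sample(population, 1000)
p_random = sum(random_sample) / len(random_sample)
print(f"random sample estimate: {p_random:.2f}")  # close to the true 0.50

# A biased recruitment channel: opinion-holders are three times as
# likely to end up in the sample (say, because they were recruited
# at meetings of like-minded people).
biased_sample = [x for x in population
                 if random.random() < (0.015 if x else 0.005)]
p_biased = sum(biased_sample) / len(biased_sample)
print(f"biased sample estimate: {p_biased:.2f}")  # close to 0.75
```

Note that the biased sample is actually twice the size of the random one, and it's still miles off: a bigger sample doesn't save you if the recruitment itself is skewed.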
And this is what first worries me about Comres's sample. This survey is being used to represent the views of the UK population (it certainly is in the BBC article). For it to have any validity, then, the sample has to be a smaller version of the UK population - it has to look just like the UK population, in miniature, or else we can't meaningfully generalize from it. But here's the thing: the UK population isn't 61% Christian and 27% non-religious, which is what the raw counts give you (I'm ignoring Comres's 'weighted' numbers, as they haven't bothered to report what they were weighted by). Nor is the population of the UK around 2% Muslim, nor does it have exactly 10 times more Muslims than Jews. This sample is clearly biased. With the majority self-identifying as Christians, whatever the sample 'says' is simply going to represent Christian views (at least to the extent Christians all agree with one another on things). Strange - you'd expect a member of the British Polling Council to be a bit more careful than that.
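For what it's worth, transparent weighting is not a mysterious business. Here's a sketch of the usual post-stratification idea - every group and every number below is invented for illustration, not taken from the poll - and it shows why the weighting targets matter so much: the 'weighted' answer depends entirely on them, which is exactly why they should be reported.

```python
# Post-stratification sketch (all numbers invented for illustration):
# re-weight each group so the sample's group shares match the
# population's, instead of letting an over-sampled group dominate.

# sample: group -> (number of respondents, fraction agreeing with a statement)
sample = {"A": (600, 0.80), "B": (300, 0.20), "C": (100, 0.50)}
n = sum(count for count, _ in sample.values())

# Unweighted estimate: dominated by the over-sampled group A.
unweighted = sum(count * agree for count, agree in sample.values()) / n
print(f"unweighted: {unweighted:.3f}")

# Weighting targets: the groups' (assumed) shares of the real population.
population_share = {"A": 0.40, "B": 0.45, "C": 0.15}

# Weighted estimate: each group's agreement rate scaled by its
# population share rather than its skewed sample share.
weighted = sum(population_share[g] * sample[g][1] for g in sample)
print(f"weighted:   {weighted:.3f}")
```

In this made-up example the headline figure moves from 59% to 48.5% once the over-sampled group is weighted back down - so a 'weighted' number with unreported targets tells you almost nothing.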
Where did this sample bias come from? Critically, we cannot know. Any sort of proper scientific report would contain full details of how the sample was recruited, so we could read the report fully informed and judge its findings according to the strengths or weaknesses of its methodology. But Comres's report doesn't bother to say how the sample was recruited. Given the massive skew towards representing Christians, I'm tempted to suspect they did a lot of their polling outside churches, at religious group meetings, or something similar. But I don't know, because they don't tell us. Nor do we know how they defined groups like 'Christian', 'non-religious' and so on. This lack of detail really matters: you're going to see very different results if you define 'Christianity' as 'I actively go to church at least once a week, am born-again and believe Jesus Christ is my personal saviour' or if you define it to include all those people who say they're Church of England as a sort of 'default' option because they don't feel very strongly one way or the other (like my mother), or who choose that option because they feel 'spiritual, like there must be something bigger' and so won't choose the non-religious tag. In this case I suspect Christianity was defined somewhat like the first of these options, and the wishy-washy undecided made up the non-religious group. But again, I can't tell because these crucial details aren't reported.
Moving on from the sampling, let's look at the survey itself. It included questions like "The media reports my religion fairly and accurately (agree/disagree/don't know)". From experience of similar surveys I can tell you this is a very strange question to ask someone who is not religious. It simply doesn't make sense - not having a religion is not a religious position, except possibly for some of the more hard-line atheists. Asking someone who isn't religious about their religion is like asking someone who doesn't own a hat about their hat: what are they to answer other than "Huh? I don't have one"?
But it seems Comres have something of a history here. Let's look at their questions in other surveys. How about their "Rescuing Darwin" survey, conducted at the behest (i.e., payment) of Theos, a Christian think-tank? Here we see questions like "Young Earth Creationism is the idea that God* created the world sometime in the last 10,000 years. In your opinion is Young Earth Creationism: definitely true, probably true, probably untrue or definitely untrue". With 11% saying this is definitely true, again alarm bells are ringing about which particular evangelical church they got their sample from (and again, they don't tell us), but let's ignore that for a moment as we're looking at the questions. How about question 3: "Atheistic evolution is the idea that evolution makes belief in God unnecessary and absurd. In your opinion is Atheistic evolution: definitely true, probably true, probably untrue or definitely untrue" - to which 30% answered 'definitely untrue'. I'm sorry, is that question dispassionate and scientific, carefully designed to elicit opinion, as it should be, or is it emotive and written in the language of fundamentalist Christianity? There's plenty more of this sort of thing in Comres's oeuvre.
(* 'God' you notice. Not '...the idea that a powerful entity created the world' but 'God', with a capital letter.)
So what's the conclusion? Basically, it rather looks as though Comres have established themselves as the polling organization of choice for religious groups wanting to find the 'right answers' in national opinion polls. With dubious questions which only make sense to a subset of those questioned (seriously: go and read the rest of the Rescuing Darwin questions), and apparently biased samples (which we can't even properly evaluate, without information on where they came from), they seem always to support exactly what the paying customer wants to find - which is nice, as that's a good way of getting repeat business.
I'm tempted to call for important organizations such as the General Medical Council to stop giving their business to such a polling organization, but I think the bigger question here is why on earth the supposedly dispassionate BBC News commissioned this particular organization - with their track-record of questionable polling in the interests of religious bodies - to conduct their snapshot survey of religious feeling in the UK. And I'm also curious as to why the BBC didn't notice the rather flagrant sample bias in the data they eventually received. I would be very very interested in knowing the religious background of the individual who commissioned this 'research'. Very interested indeed.
And this is where I turn all Ben Goldacre: this doesn't really bother me because of its religious aspects, but rather because it is the sort of thing which gets proper and effective researchers a bad name. Public opinion polls can play an important role in testing the Zeitgeist, and also contribute a great deal to our modern discourse about society. But for them to have any use they have to be done properly, and reported transparently. This sort of thing not only breeds distrust of opinion polls in general, but is also a classic example of how you can't just believe any sort of research reported in the media but rather need to go back to the source of the data and evaluate where they came from. I know this for a fact: I learnt it from a rigorous survey of me.
EDIT: Here's something I just typed in the comments to this post. Oh, and there's another issue I forgot to mention in the post, which is a shame as it was one of the things that really bothered me. One of the questions was "Our laws should respect and be influenced by UK religious values (agree/disagree)". Surely that's two questions rolled into one! It lumps people who think the law should respect religion in with those who think the law should also be influenced by religion. Because those are really quite separate ideas: personally I wouldn't be too worried by the first part of the question (as I think the law should respect our right to believe what we want), but I'd vehemently oppose the second part. Tricksy, I'd say. I do wish I'd remembered to put that into the main article!