Dear Reader, I did it again.
I could say that I’m blogging this because it could be used in the classroom. (It could, as a discussion about using data in context.) I could justify it with the fact that I’ve recommended books by the scientist-communicator in question. (And will again, because they’re ace.) I could talk about the challenges of the inevitable religious questions in a science lab, which we’ve all faced. (Like the year 10 who honestly believed, as he’d been told, that human bodies were literally made of clay like his holy book said.)
But the truth is I got annoyed on Twitter, got into a bit of a discussion about it, and don’t want to give up without making the points more clearly. So if you’re not up for a bit of a rant, come back when I’ve finally sorted out the write-up from the #ASEConf in Sheffield.
(I should point out that family stuff is a bit tricky at the moment, due to my Dad breaking his brand-new, freshly-installed hip. Before he’d even left the ward. So it’s possible that I’m procrastinating before lots of difficult decisions and a long journey to the Wild South.)
Appropriate Context?
A PR write-up of an academic study has been shared by several people online. The tweet I saw was from @oldandrewuk, who I presume shared direct from the page or RSS as it used the headline from there.
I responded pointing out the source of the research funding, the Templeton Foundation, which was founded to promote and investigate religious viewpoints. He suggested I was ‘poisoning the well’, a phrase I vaguely recognised but to my shame couldn’t pin down.
"a fallacy where irrelevant adverse information about a target is preemptively presented to an audience, with the intention of discrediting or ridiculing everything that the target person is about to say." (Wikipedia)
I agree that this was preemptive, but would challenge the judgment that the information is irrelevant. The Templeton Foundation has a history of selectively funding and reporting research to fulfil its aim of promoting religious viewpoints. I thought of this information as providing valuable context; the analogy I used later in the discussion was that of tobacco companies funding research showing limited effects of plain packaging. This was fresh in my mind due to recent discussions with another tweeter, outside of education circles. So when does providing context become a form of introducing bias? An interesting question.
Correlation and Causation?
Another point I made was that the data shared in the press release (although not in the abstract) seemed to hint at a correlation between the respondents’ religious views and their criticism of Richard Dawkins. It’s not unreasonable to suggest that this might be causative. The numbers, extracted:
- 1581 UK scientists responded to the survey (if any of these answers mentioned Dawkins, it isn't reported anywhere I can see)
- 137 had in-depth interviews
- Of these, 48 mentioned RD during answers to more general questions*
- Of these 48, 10 were positive and 38 negative
*Before I look at those numbers in a little more detail, I'd like to point out: at no time were the scientists asked directly for their view on Richard Dawkins. The 89 who didn't mention him might have been huge fans or his archenemies. They might never have heard of him. To be fair, the paper does suggest some follow-up work on 'celebrity scientists'. But I'd love to have seen data from a questionnaire on this specific point, addressed to all of the scientists.
Of the 48 who mentioned him:

| | Religious | Non-religious | Total |
|---|---|---|---|
| Positive about his work | 0 | 10 | 10 |
| Negative about his work | 15 | 23 | 38 |
| Total | 15 | 33 | 48 |

I suggested that the apparent link had been glossed over in the press release. That not a single scientist identified as religious had been positive about his work stood out for me. I wasn't surprised that even non-religious scientists had identified problems; he is, let's face it, a polarising character! But the balance was interesting, particularly as roughly a third of those mentioning him were religious, which seemed a higher proportion than I remembered for UK scientists. But the makeup of the 137, in terms of religious belief vs non, wasn't in the available information.
The Bigger Picture
I wanted more information, but the paper wasn’t available. Thankfully, #icanhazpdf came to my rescue. I had a hypothesis I wanted to test.
And so more information magically made its way into my inbox. I have those numbers, and it turns out I was right. The religious breakdown isn't made perfectly clear, perhaps because religious orientation, or the lack of it, is the focus of other papers by the same authors. But the numbers are there.
According to the paper, 27% of UK scientists surveyed are religious (ranging from 'slightly' to 'very'). It isn't clear whether this figure comes from the questionnaire or applies specifically to the 137 interviewed. (EDIT: I've reached out to the authors and they weren't able to clarify.) 27% of the 137 gives 37 who are religious, and therefore exactly 100 who are not. I've used these numbers because I have nothing better, but I've labelled them 'inferred' below.
Now, there are loads of ways to interpret these numbers. I'm sure I've not done it in the best way. But I've had a go.

| | Religious (inferred: 37) | Non-religious (inferred: 100) |
|---|---|---|
| Mentioned Dawkins negatively | 15 | 23 |
| Mentioned Dawkins positively | 0 | 10 |
| Did not mention Dawkins | 22 | 67 |

What stands out for me is that religious scientists make up just over a quarter of those in the sample, but well over a third of those critical of Dawkins’ approach to public engagement. What’s clearer from this table is that the religious scientists were more likely to mention him in the first place, and as pointed out earlier these mentions were all negative. Is the difference significant?
- 15 of 37 religious respondents were negative: 41%
- 23 of 100 non-religious respondents were negative: 23%
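For anyone who wants to check that gap themselves, here's a minimal sketch in Python (my own, not from the paper) that runs Fisher's exact test and a chi-squared test on the two-by-two table. Bear in mind that the religious and non-religious totals are the inferred 37 and 100 from above, so treat the output as indicative at best.

```python
# Quick check of the religious vs non-religious split in negativity about Dawkins.
# NOTE: the totals of 37 religious and 100 non-religious are *inferred* from the
# reported 27% figure, not numbers stated directly in the paper.
from scipy.stats import chi2_contingency, fisher_exact

interviewed = 137
religious = round(0.27 * interviewed)       # 27% of 137 -> 37 (inferred)
non_religious = interviewed - religious     # -> 100 (inferred)

negative_religious = 15                     # all religious mentions were negative
negative_non_religious = 23

# Rows: religious / non-religious; columns: negative about Dawkins / not negative
table = [
    [negative_religious, religious - negative_religious],
    [negative_non_religious, non_religious - negative_non_religious],
]

odds_ratio, fisher_p = fisher_exact(table)
chi2, chi2_p, dof, expected = chi2_contingency(table)

print(f"Negative: {negative_religious}/{religious} religious, "
      f"{negative_non_religious}/{non_religious} non-religious")
print(f"Fisher's exact test: odds ratio {odds_ratio:.2f}, p = {fisher_p:.3f}")
print(f"Chi-squared (Yates corrected): {chi2:.2f}, p = {chi2_p:.3f}")
```

With cell counts this small, Fisher's exact test is probably the safer of the two.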
I can’t help but think that’s a significant – although perhaps unsurprising – difference. Religious respondents were nearly twice as likely to be negative. So my hypothesis is supported by this data; the religious are over-represented in those who mentioned Dawkins during their answers. I’m surprised that this correlation escaped the Templeton-funded researchers. An equally correct headline would have been:
Scientists identifying as religious almost twice as likely to criticise Richard Dawkins' approach to engagement unprompted.
Conclusions
I think in a lot of ways the numbers here aren’t the big story. I don’t think any of them are particularly surprising. I don’t have any answers for myself about the difference between providing necessary and important context, and ‘poisoning the well’ as @oldandrewuk put it. But I do have two insights that are important to me, if nobody else.
- The headline used in the press release is subtly misleading: "Most British scientists *cited in study* feel Richard Dawkins' work misrepresents science." My italics highlight the problem; the 38 who were negative are a majority of the 48 who mentioned him, not of the 137 interviewed.
- The data used was selected to show one particular aspect of the results, and arguably some links of interest were left unexplored. Mining general interview answers after the fact can never be as good as a study designed to test one particular question. Only by reading the information closely was it clear how the researchers had made their judgments.
I'd like to highlight that, as seemed fair to me, I invited @oldandrewuk to comment here following our discussion on Twitter. He has so far chosen not to do so.
Conflicts of Interest
To be transparent, I should point out for anyone who doesn’t realise that I’m an atheist (and humanist, and secular). I often also disagree with Dawkins’ communications work – in fact, if they’d asked me the same questions there’s a fair chance I would have made the point about him causing difficulties for the representation of science to non-scientists – but that’s why I recommend his science books specifically!
Links
The wonderful @evolutionistrue posted about this research too. As a contrast, have a look at how EvangelismFocus wrote it up.