I am in the midst of reading Joseph Heath’s Enlightenment 2.0, which was shortlisted for this year’s Donner Prize. It covers much of the same ground as other recent books about how humans think, such as Daniel Kahneman’s and Jonathan Haidt’s. Collectively, these books are having a powerful impact on my views of the world and on my scholarship.
Heath’s book is a great read. It is very accessible and provides an excellent summary of the literature on cognitive biases and decision making (at the very least, it’s consistent with Kahneman’s and Haidt’s books!).
Among many important and interesting tidbits, Heath argues that one of the major problems all citizens face, academics and non-academics alike, is confirmation bias (indeed, there is research showing that philosophers and statisticians, who should know better, suffer from the same cognitive biases). It’s why some scholars insist on the need to reject the null hypothesis when engaging in causal inference.
Confirmation bias exerts a powerful effect on how we perceive the world and make decisions. Certainly in my subfield, and I assume in many others marked by strong normative debates and positions, there is a strong temptation to give in to it.
In the words of Joseph Heath:
The whole “normative sociology” concept has its origins in a joke that Robert Nozick made, in Anarchy, State and Utopia, where he claimed, in an offhand way, that “Normative sociology, the study of what the causes of problems ought to be, greatly fascinates us all” (247). Despite the casual manner in which he made the remark, the observation is an astute one. Often when we study social problems, there is an almost irresistible temptation to study what we would like the cause of those problems to be (for whatever reason), to the neglect of the actual causes. When this goes uncorrected, you can get the phenomenon of “politically correct” explanations for various social problems – where there’s no hard evidence that A actually causes B, but where people, for one reason or another, think that A ought to be the explanation for B. This can lead to a situation in which denying that A is the cause of B becomes morally stigmatized, and so people affirm the connection primarily because they feel obliged to, not because they’ve been persuaded by any evidence.
Let me give just one example, to get the juices flowing. I routinely hear extraordinary causal powers being ascribed to “racism” — claims that far outstrip available evidence. Some of these claims may well be true, but there is a clear moral stigma associated with questioning the causal connection being posited – which is perverse, since the question of what causes what should be a purely empirical one. Questioning the connection, however, is likely to attract charges of seeking to “minimize racism.” (Indeed, many people, just reading the previous two sentences, will already be thinking to themselves “Oh my God, this guy is seeking to minimize racism.”) There also seems to be a sense that, because racism is an incredibly bad thing, it must also cause a lot of other bad things. But what is at work here is basically an intuition about how the moral order is organized, not one about the causal order. It’s always possible for something to be extremely bad (intrinsically, as it were), or extremely common, and yet causally not all that significant.
I actually think this sort of confusion between the moral and the causal order happens a lot. Furthermore, despite having a lot of sympathy for “qualitative” social science, I think the problem is much worse in these areas. Indeed, one of the major advantages of quantitative approaches to social science is that it makes it pretty much impossible to get away with doing normative sociology.
Incidentally, “normative sociology” doesn’t necessarily have a left-wing bias. There are lots of examples of conservatives doing it as well (e.g. rising divorce rates must be due to tolerance of homosexuality, out-of-wedlock births must be caused by the welfare system etc.) The difference is that people on the left are often more keen on solving various social problems, and so they have a set of pragmatic interests at play that can strongly bias judgement. The latter case is particularly frustrating, because if the plan is to solve some social problem by attacking its causal antecedents, then it is really important to get the causal connections right – otherwise your intervention is going to prove useless, and quite possibly counterproductive.
In the subfield of Aboriginal politics, there are powerful incentives to ascribe everything that has gone wrong with Aboriginal communities post-contact to the British and later the Canadian state. Those who say otherwise are routinely hammered and ostracized by the public and by some members of the academy, who often do not take even a moment to consider their work seriously. Say what you want about the books and articles of Tom Flanagan, Frances Widdowson, and Ken Coates, but at least they provide us with an opportunity to test for confirmation bias. Causal inference requires eliminating rival explanations; otherwise, how can you be sure that A causes B?
In many ways, it is for these reasons that I’ve long been suspicious and wary of ideology (and certainty), whether it comes from the right or the left. Someone who is hard-core left or right, it seems, is more likely to be driven by confirmation bias. I’ve seen dozens of episodes in my life where ideologues (from the left and the right), or those with strong views of the political world, refuse to budge when confronted with overwhelming evidence. It’s irrational, in many ways. And so I long ago vowed to try to avoid becoming one of them and to embrace uncertainty. Sure, I will take a strong position in my articles, books, and op-ed columns, but I’m always ready and willing to change my mind.
Perhaps it’s a cowardly way of approaching politics and scholarship (so I suppose I should never run for office!), but for me it conforms to my goal of striving toward causal inference and certainty.