Can Reason and Bias Coexist?
“Humans may be vulnerable to bias and error, but clearly not all of us all the time, or no one would ever be entitled to say that humans are vulnerable to bias and error.” — Steven Pinker, Enlightenment Now
On occasion, big ideas escape scholarly circles and take hold in the popular consciousness. Freed from the constraints of academic rigour, such ideas quickly find themselves where they don’t belong. For instance, confused people have taken the Big Bang as proof of Genesis 1:3 (“Let there be light”), used quantum mechanics to peddle half-baked science and spirituality, and enlisted natural selection to justify all manner of abuse. Lately, a similar sense of confusion has emerged around the idea of cognitive biases.
Through careful reasoning and experimentation, we’ve gained a better understanding of the human mind, including a refined knowledge of our cognitive blind spots. As a result, concepts such as motivated reasoning, confirmation bias, and in-group–out-group bias have seeped into common parlance. For the most part, this is a good thing: by spotting our biases, we’re better able to correct for them. However, some people have interpreted the research on bias in a negative, ironic, and dangerous light. Since we’re thoroughly biased, they say, the pursuit of truth is a fool’s game. Because our certainty emerges from a hopelessly biased perspective, we can never actually be sure of anything. As such, we should give up the naïve belief that we can use reason to improve our lives, or to persuade others to update their views.
Now, such naysayers do have a point: we are indeed biased and regularly stray from sober-minded reasoning. But although we are biased, we are not hopelessly so. We can still use reason to understand and improve our world. And by learning how we’re led astray by bias, we can actually reason more carefully.
When people claim that human bias precludes the possibility of reasoned discourse, they’re committing a common error (and perhaps indulging postmodern sympathies). This error flows from the mistaken belief that human reasoning is invalid whenever it’s influenced by emotion. By thinking that effective reasoning can only take place in the absence of emotion, intuition, or any hint of bias, people grant themselves licence to ignore the realm of reason. Of course, as mere mortals we’re never free from emotion, but fortunately reason and emotion can — and do — coexist. Contrary to widespread belief, we learn to reason not by ditching emotion for a world of Platonic forms, but by skilfully steering it down new tracks. Let me explain.
We are ruled, at base, by intuition. Without intuition to guide us, we could not learn anything. From our most outlandish conspiracy theories to our most rational scientific assumptions, intuition props up all of our beliefs. After all, unless we can intuit that something is true (or false), we cannot incorporate it into our worldviews. And though it often misleads us, only by intuition’s light can we approach truth and better ways of thinking. Therefore, our ability to reason depends upon the character of our intuition. To reason well, we must intuit wisely.
But, critics will object, if intuition — an irrational feeling — informs our reasoning, then aren’t we fooling ourselves? How can reason possibly emerge from an irrational base? Paradoxically, many of the people who make this objection tend to be the types who claim to abhor reductionism, arguing that it ignores the holistic, emergent properties of the universe. As we’ll see, human reason is itself an emergent property, arising from the confluence of thought, emotion, and an adroit application of bias. To argue that human reason cannot exist is to take reductionism to an absurd extreme.
To understand this point, consider what takes place when learning mathematics. Math, the epitome of pure reason, is typically viewed as a detached, emotionless affair. Numbers are manipulated according to precise rules of logic, seemingly unmuddled by the imperfections of human feeling. Yet this (un)romantic view obscures the fact that emotion informs even our most rational of pursuits.
To learn mathematics, we must rejig our intuitions around numbers and the relations between them. Essentially, we train our emotions to respond favourably to the time-tested rules of mathematics. And to carry out this training, we take advantage of a number of pre-existing inclinations of the human mind. We use the fact that we’re biased to trust authority figures (teachers), biased to respond to repetition (drills), and biased to seek sense in data (pattern recognition) to produce mathematical knowledge. And from our imperfect minds — so full of emotion and bias — something akin to perfect reason emerges.
Regardless of subject matter or practice, all knowledge forms in this way, by bootstrapping intuition to ever-higher levels. As we bias our intuition to endorse rules of logic, numerical literacy, and a preference for scientific — rather than anecdotal — thinking, we strengthen our capacity for reason, in any field. Therefore, it makes little sense to say that the sheer presence of emotion or bias nullifies the existence of reason. Indeed, reason and emotion can happily coexist.
This may all sound, well, reasonable, but haven’t psychologists reached the opposite conclusion? Aren’t the experts convinced that reason is merely the slave of the passions, existing solely to support our preconceptions and prejudices? While psychologists have indeed learned of many ways in which reason gets sidetracked, they have certainly not stripped reason of all its strength (if they had, how could they hope to convince anyone of this fact without contradicting themselves?).
Readers might be familiar with the work of Jonathan Haidt, a social psychologist and ardent critic of what he calls “the rationalist delusion”. Haidt believes that we’re overconfident in our ability to reason; rather than lead us to higher truth, reason mainly serves to validate emotions such as tribal allegiance or disgust. Haidt makes an important point: we’re generally not as reasonable as we think, and we often recruit reasons solely to vindicate our preconceptions. Given this, one might think that Haidt has no trust in reason.
Yet even Haidt, who has accused the New Atheists of “sacralising reason”, has some faith in reason. He fully grants that it can impact our views. As he writes, “Words and ideas do affect friends, allies, and even strangers … If one can get the other person to see the issue in a new way, perhaps by reframing a problem to trigger new intuitions, then one can influence others with one’s words.”
The psychologists David Pizarro and Paul Bloom agree, writing: “Prior reasoning can determine the sorts of output that emerge from [intuition] … One of the most effective ways to change one’s intuitive moral responses, then, is to change one’s thoughts or appraisals about an issue.”
* * *
The interplay between reason, intuition, and bias has implications for how we conduct our discourse. Too often, we think that we’ve won an argument simply by spotting the other side’s bias, or by calling attention to their motivated reasoning. But just because someone is biased does not mean they’re wrong, and pointing to the intuition that underlies an argument is not, in itself, a counter-argument. Humans are imperfect animals, and as our sciences of mind progress we’ll be tempted to evade issues by appealing to any number of cognitive flaws. But ideas and arguments are right or wrong on their merits, whatever the emotions and biases that give rise to them.
Few big ideas come without fallout, and the idea of cognitive biases is no different. Where knowledge of our biases makes us more reflective, we’re the better for it. But where such knowledge deters us from reasoning and distracts from the issues at hand, we’d best draw back and reorient ourselves. Reason exists and reasons matter, regardless of what anyone says. And if we hope to improve our world, we must honour this fact.
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108(4), 814–834.
Pizarro, D. A., & Bloom, P. (2003). The intelligence of the moral intuitions: Comment on Haidt (2001). Psychological Review, 110(1), 193–196.