Confirmation Bias: What Limits Our Intellect

Omid Panahi, RDT
Published in The Startup
Nov 19, 2019 · 6 min read

We live in an unprecedented information age characterized by ever-growing polarization among the populace. The advent of the internet, some have argued, means that the information readily available to us far exceeds our ability to scrutinize and comprehend it. This has, in turn, created an “anything goes” scenario in which facts, evidence, and reason seem to hold less and less weight in the public eye.

What was once considered objective reality can now be easily dismissed in the service of “alternative facts,” and the wildest of conspiracy theories can be made to seem “plausible” through sufficient mental gymnastics. This circumstance can be partly attributed to the psychological phenomenon of confirmation bias, namely the “tendency to seek, interpret, and use evidence in a manner biased toward confirming [one’s] existing beliefs or hypotheses.” The presence and potency of this human instinct mean that, absent a conscious effort to counteract it, the veracity of a piece of information becomes secondary to whether it is consistent with our preconceived notions. Needless to say, this poses a significant hurdle for any enterprise that deals in fact and empiricism, most notably academia and the journalistic profession.

The origins of this cognitive bias lie in the evolutionary history of our species. In order to keep themselves out of harm’s way, our ancestors were under great pressure to process information as quickly as possible, and a genetic predisposition toward confirmation bias offered a relatively efficient way to examine new information, thus rendering it a favorable trait in terms of natural selection. Evolutionary biologists have noted, however, that human beings “are better able to rationally process information, giving equal weight to multiple viewpoints, if they are emotionally distant from the issue.”

Psychologists and cognitive scientists have also attempted to make evolutionary sense of confirmation bias, often characterizing it as an effective argumentative maneuver. The cognitive scientist and author Hugo Mercier, for instance, has credited this lapse in human reasoning with helping us “devise and evaluate arguments that are intended to persuade other people.” This, however, raises the question of why, in an evolutionary context, persuasive prowess of this sort would enhance an individual’s chances of survival and reproduction. Nor is it clear, for that matter, whether the evolutionary benefits of this cognitive trait truly outweighed its costs: after all, confirmation bias represents a palpable failure of human rationality and works to cloud our judgment in a wide variety of ways.

This psychological phenomenon has also been a subject of discourse in philosophical circles, making appearances in the literature long before it could have been studied within a scientific framework. The English philosopher and statesman Francis Bacon, often referred to as the “father of empiricism,” had the following to say about the human mind in 1620:

The human understanding when it has once adopted an opinion . . . draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects; in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate . . . .

The multidisciplinary study of confirmation bias demonstrates a fascinating, twofold fact about the human brain, namely its susceptibility to error but also its capacity for self-reflection. The former, I would argue, is a mundane reality that merely ties our species to the rest of the animal kingdom, and the latter a cognitive feat that sets us slightly apart. Although errors in judgment are rather common among our evolutionary counterparts, we are perhaps the only species capable of contemplating and consciously correcting them. In this regard, we could justifiably be said to be relatively “intelligent.”

One area in which irrational biases can have grave consequences is the medical profession, given the extraordinarily high stakes involved. Although practitioners within the medical community tend to be better versed in their own psychology and better equipped to combat their subconscious biases, they are certainly not immune to confirmation bias. This was demonstrated by a 2011 study conducted by Mendel and colleagues, which found that “confirmation bias is present in some psychiatrists’ and medical students’ information search,” leading to “poorer diagnostic accuracy.” The researchers found that 13 percent of psychiatrists and 25 percent of medical students displayed a form of confirmation bias after making a preliminary diagnosis, which rendered them less likely to correct that diagnosis in the face of abundant contradictory evidence.

A recently published federal study on the effectiveness of invasive surgery relative to medical therapy among heart-disease patients, titled ISCHEMIA, is currently making the rounds in the medical community as well as the news media; and it, too, could be said to evince the presence of confirmation bias in medical practice. The trial found bypass procedures and stents (small wire cages used to open narrowed arteries) to be no more effective than drug therapy at reducing the likelihood of heart attacks and heart-related deaths, thereby settling a long-standing controversy in cardiology. As The New York Times reports, however, previous studies had pointed in the same direction but failed to deter doctors, many of whom had “called earlier research on the subject inconclusive and the design of the trials flawed.”

In his 2007 book How Doctors Think, the physician and writer Jerome Groopman points to the possibility of a complementary relationship between confirmation bias and another cognitive phenomenon often called the “availability heuristic,” namely the “tendency to judge the likelihood of an event by the ease with which relevant examples come to mind.” He provides the hypothetical example of a businessman whose assessment of a financial undertaking is skewed by the experiences most familiar to him, writing that “a businessman may estimate the likelihood that a given venture could fail by recalling difficulties that his associates had encountered in the marketplace, rather than by relying on all the data available to him about the venture.” Needless to say, this sort of misjudgment would be deeply troubling in a medical setting, and Groopman floats the idea of inductive-reasoning courses meant to educate new doctors about their own subconscious biases.

Although the literature is rather rich in documented instances of confirmation bias, it has far less to offer concerning the neurobiological mechanisms underlying this cognitive phenomenon. One proposal, put forth in a relatively recent paper by Talluri and colleagues, ties confirmation bias to human decision making and selective attention. The researchers conducted a novel experiment in which participants viewed two successive random-dot motion stimuli and judged their direction of movement, and they were able to demonstrate that the “participants’ sensitivity for the second stimulus was selectively enhanced when that stimulus was consistent with the initial choice.” They dubbed this mechanism “choice-dependent selective gain modulation.”

Talluri et al.’s findings show that “categorical decisions bias the acquisition of new evidence by overweighting the evidence consistent with the decision and underweighting the inconsistent one, akin to the mechanism of selective attention,” according to an analysis of their paper by Prat-Ortega and de la Rocha. Selective attention is the human brain’s means of scrutinizing the information it deems most relevant, which impels it to “[target] certain aspects of the available information while neglecting others.” As the authors suggest, these results make a valuable contribution to our understanding of the brain and go a long way toward explaining the human propensity for confirmation bias.
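To make the mechanism concrete, here is a minimal toy simulation in Python. It is not Talluri et al.’s actual model: the Gaussian evidence samples, the simple averaging rule, and both gain values are illustrative assumptions of mine. It merely shows how amplifying choice-consistent evidence and attenuating choice-inconsistent evidence pulls an observer’s final estimate toward their initial decision.

```python
import random

# Toy sketch of "choice-dependent selective gain modulation".
# NOT Talluri et al.'s actual model: the Gaussian evidence, the
# averaging rule, and both gain values are illustrative assumptions.

GAIN_CONSISTENT = 1.3    # amplify evidence that agrees with the choice
GAIN_INCONSISTENT = 0.7  # attenuate evidence that disagrees with it

def final_estimate(first_sample, second_sample):
    """Combine two noisy motion signals (+ = rightward, - = leftward)."""
    # The observer commits to a categorical choice after the first stimulus.
    initial_choice = 1 if first_sample > 0 else -1
    # The second stimulus is re-weighted according to whether it is
    # consistent with that choice: the confirmation-bias step.
    if (second_sample > 0) == (initial_choice > 0):
        second_sample *= GAIN_CONSISTENT
    else:
        second_sample *= GAIN_INCONSISTENT
    return (first_sample + second_sample) / 2

random.seed(1)
# Many trials in which the second stimulus, on average, points the
# opposite way from the first. An unbiased observer would average
# out near zero; the biased one stays pulled toward the initial choice.
estimates = [final_estimate(random.gauss(1.0, 0.5), random.gauss(-1.0, 0.5))
             for _ in range(10_000)]
print(sum(estimates) / len(estimates))  # noticeably greater than zero
```

Averaged over many trials, the biased estimate sits well above zero even though the two stimuli are, on balance, equal and opposite, which is exactly the overweighting of choice-consistent evidence that the paper describes.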

Science is an exceptionally effective tool for uncovering the wonders of the human brain, but also for revealing its limitations. Confirmation bias, as I have suggested throughout this piece, represents a fundamental flaw in our capacity for reason, logic, and rationality: precisely the cerebral functions that we believe distinguish our species from our evolutionary cousins. Equipped with this understanding, we can at least partially explain the rise of online echo chambers, conspiracy-theory enthusiasts, ultra-religious cults, the “fake news” phenomenon, and so on. Although a whole host of questions remain unanswered, that is nevertheless a feat worth celebrating.

Follow me on Twitter (@OmidPPanahi), Instagram (@omidp.panahi), and Snapchat (@omidp.panahi).

The Rhinestoned Cowboys Podcast: https://bit.ly/2LEued4
