How do cognitive biases impact your team?
Bias is real, and everyone has it
We all like to think that, when it comes to interacting with one another, we’re objective, logical, and able to evaluate all the data available. However, our communication, and therefore our interactions and decisions, are influenced by a wide variety of cognitive biases.
All humans have biases. If making a decision meant considering every single possible option, it would be impossible. Because of time limits and the sheer amount and complexity of information, we must sometimes rely on mental shortcuts. These shortcuts are what allow us to decide and act quickly.
Our brains weigh only three pounds, yet account for 20 percent of the body’s total energy usage, so it makes sense that we want to save energy there as much as we can.
Bias is a natural side effect of our “shortcut” brain
As a result, the human mind is considered to be a “cognitive miser” (1). That means the human mind often seeks to avoid spending computational effort. This isn’t laziness; it’s biology!
It may not be totally possible to eliminate the brain’s predisposition to take shortcuts, but understanding that bias exists can be useful when making decisions.
Bias arises from various processes that are difficult to distinguish. One of these is the use of information-processing shortcuts: information is perceived and processed through a filter of personal experience and preferences. This mechanism is called a heuristic. Heuristics are like a rule of thumb for your brain: a rough guide that, while not always perfect, is helpful enough to get the job done. It saves a lot of brainpower and often leads you to the right decision anyway. Sometimes, however, it leads to bias. And bias is not helpful.
Negative biases are stronger than positive ones
It turns out bad memories are measurably more powerful than positive ones: the brain reacts more strongly to stimuli it deems negative. Our capacity to weigh negative input evolved for a good reason: from the beginning of time, our survival has depended on our ability to sense, recognize, and avoid danger.
Unfortunately, that often means we make negative associations between people and their characteristics, regardless of overall accuracy. Humans read someone else’s behavior and intentions through their own lens.
Biases may impair our decision making and cause us to misjudge another person and their intentions. This is especially dangerous when it comes to teams, and their cohesion and productivity.
Biases can be difficult to spot, especially those based on deeply held beliefs, so addressing bias requires a deft hand.
The ultimate bias: the bias blind spot
Research by Emily Pronin, a psychologist at Princeton University, and her colleagues has found that people tend to recognize how bias affects judgment and decision making, except when it’s their own.
The bias blind spot is the failure to notice our own cognitive biases. We are more likely to notice bias in others than in ourselves because our own biases are ingrained in our world view (naïve realism). In other words: we all see ourselves as more objective than others.
How does bias impact teams?
We may be drawn to a way of working without being aware of it. For instance, we tend to hire people who share our own ways of seeing the world, and we are unaware that we are doing so.
Bias unconsciously pushes us to build teams whose members share the same characteristics and viewpoints. These teams are more inclined to the “groupthink” phenomenon and will often experience a loss of creativity and critical thinking (conformity of viewpoints in such groups is tied to confirmation bias). Diverse teams bring different ways of thinking, providing a natural check on conformity and creating better outcomes.
Of course, how teams diversify matters immensely. There’s evidence that forced diversification programs may actually create greater bias. Diversity is not a tool to apply blindly, but an objective.
One person trying to understand another may seem like a full-time job. But trying to understand how cognitive biases influence others without acknowledging our own can only ever be half effective.
References:
● Stanovich, Keith E. (2009). “The cognitive miser: ways to avoid thinking”. In What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven: Yale University Press, pp. 70–85. ISBN 9780300123852. OCLC 216936066. See also other chapters in the same book: “Framing and the cognitive miser” (chapter 7); “A different pitfall of the cognitive miser: thinking a lot, but losing” (chapter 9).
● Kahneman, Daniel (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux. ISBN 978-0374275631.
● Dawes, R. M. (1998). “Behavioral decision making and judgment”. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), The Handbook of Social Psychology (4th ed., Vol. 1, pp. 497–548). Boston: McGraw-Hill.
● Pronin, Emily; Lin, Daniel Y.; Ross, Lee (2002). “The Bias Blind Spot: Perceptions of Bias in Self Versus Others”. Personality and Social Psychology Bulletin, Vol. 28, No. 3, pp. 369–381. doi: 10.1177/0146167202286008.
Originally published at https://www.linkedin.com on July 26, 2017.
