Buster Benson’s Cognitive Bias Cheat Sheet
Albert Einstein is often credited with saying, "The more I learn, the more I realize how much I don't know."
This has been a humbling lesson for me as a manager. Almost daily, I’m confronted by the fact that not only do I not know everything, but I don’t know most things.
This was especially apparent to me when I stumbled into the world of cognitive biases. Cognitive biases are mental habits we’ve developed as humans to make sense of our world. In many cases, these biases are useful, but they can lead to blind spots in how we lead our teams and make decisions for our organizations.
Benson groups cognitive biases around four mental problems that we use them to address. “Every cognitive bias is there for a reason — primarily to save our brains time or energy,” he writes. “If you look at them by the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.”
Here’s an overview of each problem, along with Benson’s catalog of the mental workarounds we use to address it, many of which can lead to particular biases that affect our decision making. (The bulleted items are quoted straight from Benson.)
Problem 1: Too Much Information
To avoid information overload, our brains filter out tons of inputs to focus on what appears to be the most useful.
- We notice things that are already primed in memory or repeated often.
- Bizarre/funny/visually-striking/anthropomorphic things stick out more than non-bizarre/unfunny things.
- We notice when something has changed.
- We are drawn to details that confirm our own existing beliefs.
- We notice flaws in others more easily than flaws in ourselves.
Problem 2: Not Enough Meaning
Having filtered down the information coming into our brains, we fill in the gaps with what we already know or with what seems right.
- We find stories and patterns even in sparse data.
- We fill in characteristics from stereotypes, generalities, and prior histories whenever there are new specific instances or gaps in information.
- We imagine things and people we’re familiar with or fond of as better than things and people we aren’t familiar with or fond of.
- We simplify probabilities and numbers to make them easier to think about.
- We think we know what others are thinking.
- We project our current mindset and assumptions onto the past and future.
Problem 3: Need to Act Fast
Especially at work, we have to avoid analysis paralysis and make decisions even when there’s ambiguity.
- In order to act, we need to be confident in our ability to make an impact and to feel like what we do is important.
- In order to stay focused, we favor the immediate, relatable thing in front of us over the delayed and distant.
- In order to get anything done, we’re motivated to complete things that we’ve already invested time and energy in.
- In order to avoid mistakes, we’re motivated to preserve our autonomy and status in a group, and to avoid irreversible decisions.
- We favor options that appear simple or that have more complete information over more complex, ambiguous options.
Problem 4: What Should We Remember?
We can’t remember every piece of information that comes into our brains, so we pick and choose. “What we save here is what is most likely to inform our filters related to problem 1’s information overload, as well as inform what comes to mind during the processes mentioned in problem 2 around filling in incomplete information,” Benson writes. “It’s all self-reinforcing.”
- We edit and reinforce some memories after the fact.
- We discard specifics to form generalities.
- We reduce events and lists to their key elements.
- We store memories differently based on how they were experienced.
Benson’s “Cognitive Bias Cheat Sheet” goes into much more detail and is well worth diving into. Consider keeping it bookmarked, and happy managing.
Originally published at Manager Companion.