
In-Depth: How Biases Easily Distort Our Beliefs (In The Workplace)

A long-read about 8 cognitive and social biases, their underlying research and how to (hopefully) reduce their impact

Christiaan Verwijs
Aug 10, 2020 · 18 min read

In this series of ‘In-depth’ posts, we take time to consider the bigger picture. This series is not about easy answers or practical tips, but about developing a more complete understanding of what may be going on.

If you take the time to fully read and digest this post, you will (hopefully):

  • Understand how easily your thinking, and the thinking in groups, is biased, tainted or distorted by eight very common cognitive and social biases. The less susceptible you think you are, the more you usually are.

The writing and research for this post were made possible by our patrons. If you like it, and you’d like more of it, please consider supporting us too.

Setting the Stage

Have you noticed how people often only look for what confirms what they already believe? Or how people make blanket statements about entire groups based on individual observations?

I have always been fascinated by cognitive and social biases. As we’ll see in this post, we often vastly overestimate our ability to arrive at sound, rational conclusions that adhere to the facts. While this is already a bias itself (called the “Bias blind spot”), it has big ramifications for our work in organizations too. Because what does it mean about our work when our reasoning is often so flawed? When the beliefs we have, and the assumptions we make, are shaped and distorted by bias?

For me personally, it's one of the reasons why I like the Scrum Framework, as it gives guide-rails to help us think and validate our assumptions with data. It's also why I like Liberating Structures and how they purposefully include different perspectives and voices to reduce bias. Neither will make you invulnerable to bias, but they hopefully reduce its impact.

This post is about eight biases, and how they manifest in the workplace. My hope is that after reading this post, you’ll be more aware of them and can hopefully reduce their impact.

This post was made possible by our dear patrons. As you can imagine, quite a lot of research goes into writing something like this. If you like it, and it's helpful, please consider supporting us too.

When is something a bias?

A listener of the science podcast Radiolab once called in to share a weird observation. He had noticed how he would often run into the same combination of vehicles at the intersections he crossed: one cyclist, one car and one truck. He judged the odds of that happening to be so low that he became suspicious. What was going on here? A statistician then went on to explain that the odds are actually quite high when you consider the number of intersections you cross and the number of vehicles that pass. The observation was further exaggerated because the listener only remembered the instances where the combination matched what he expected. Nothing suspicious was going on. Instead, the listener had experienced two biases: sampling bias (underestimating the odds) and confirmation bias (only remembering confirmations).
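To get a feel for why the statistician was right, here is a small back-of-the-envelope simulation. All the numbers in it (the traffic mix, the number of crossings) are invented for illustration; the point is only that many opportunities make a seemingly rare coincidence quite likely:

```python
import random

# Illustrative assumptions only; none of these numbers come from the podcast.
VEHICLE_TYPES = ["car", "truck", "cyclist"]
WEIGHTS = [0.6, 0.2, 0.2]        # assumed mix of traffic
CROSSINGS_PER_MONTH = 200        # assumed number of intersections crossed
TRIALS = 10_000                  # simulated months

def special_combination():
    """True if exactly one car, one truck and one cyclist pass at a crossing."""
    vehicles = random.choices(VEHICLE_TYPES, weights=WEIGHTS, k=3)
    return sorted(vehicles) == ["car", "cyclist", "truck"]

months_with_hit = sum(
    any(special_combination() for _ in range(CROSSINGS_PER_MONTH))
    for _ in range(TRIALS)
)
print(f"Months with at least one such coincidence: {months_with_hit / TRIALS:.0%}")
```

Under these assumptions, the combination shows up almost every month, even though it feels unlikely at any single crossing.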

Although this is an innocent example of biases, their effects are not always as benign. Biases are at the root of many sociological problems, like racism, sexism, political rifts, and general inequality.


So what do we mean by “bias”? A naive interpretation would be to assume that biases are inherently wrong. For a long time, behavioral scientists would have agreed with this view. More recently, however, biases have come to be understood as shortcuts that reduce the cognitive capacity we need to make decisions (e.g. Gigerenzer, 2000). Even though they are distortions, they can sometimes result in the right conclusions. But the fact remains that biases can easily lead to wrong beliefs that actively hurt yourself and/or others, especially when you’re unaware of them.

Bias 1: Confirmation bias

The first, and most researched, bias is confirmation bias. Initially coined by Wason (1960), this bias manifests when we only look for confirmation of our beliefs. It happens, for example, when we don’t trust a certain person or group of people, and then only see behavior that fits with our belief (or even interpret it as such) without considering observations where it doesn’t (Oswald & Grosjean, 2004). This is further compounded by how our beliefs shape our understanding of their intentions in the first place. This creates a self-fulfilling prophecy where the belief strengthens itself.

This bias often manifests as a “positive test strategy” (Klayman & Ha, 1987) where people only test assumptions by looking for what confirms them, but not by also considering what would falsify them. It is one of the primary mechanisms behind “echo chambers” on the internet, where people constantly reaffirm and strengthen beliefs that are in conflict with the facts.

Examples in the workplace

  • When organizations start initiatives or change programs and only see or look for evidence that supports the belief that it is working, but fail to see where it doesn’t work or even causes damage.

How to reduce this bias

Like all other biases, this bias can be reduced by being aware of when it happens. So whenever you or a group you are with needs to validate an assumption or a belief, also consider what you’d need to see to challenge that belief. The Liberating Structure Myth Turning is a good example of this strategy. This is also a reason to actively look for information that conflicts with your beliefs, or surround yourself with people with different beliefs.

Bias 2: Fundamental attribution error

In our day-to-day life, we often attribute the behaviors of others to their inherent traits — like personality, experience, and skills. But as it turns out, our behavior is determined to a much larger degree by the situation than by inherent traits. Ross & Nisbett (2011) offer an extensive overview of research in this area.

This naive psychology where we attribute behavior to inherent traits is an example of a bias called the fundamental attribution error. Initially coined by cognitive psychologist Lee Ross (1977), it happens when people underestimate the influence of the situation on the behavior of others while overestimating the influence of their personal traits and beliefs (Berry, 2015). This bias is also known as the ‘correspondence bias’.

Examples in the workplace

  • When you attribute the mistake that someone in your team makes to their lack of skill, their inherent clumsiness or overall intelligence without considering situational factors like time pressure, the novelty of the problem, and the (lack of) support that this person received from others.

For each of these examples, a situational view might’ve resulted in different behavior on the part of others. The problem with the fundamental attribution error is that it puts responsibility entirely with the other person (and their personality, skills, experience). Even worse, it can lead us to blame the other person or get angry at them.

How to reduce this bias

More recent analyses have shown that this bias isn’t as fundamental as previously thought (Malle, 2006). For example, the bias mostly disappears when people are made aware of how situational factors influence the behavior (e.g. Hazlewood & Olson, 1986). So one way to reduce the influence of this bias is to ask: “How can I explain the behavior through the situation instead of their personality or other traits inherent to them?”.

Bias 3: False causality

When two events or activities happen together, people often conflate them by seeing one as causing the other where no connection exists in reality. This is called a “false causality”, and it is captured in the maxim that “correlation is not causation”. When two events happen at the same time, meaning that they are “correlated”, it does not mean that one causes the other. There are many examples of this:

Examples in the workplace

  • The interpretation of marketing metrics — like conversion rates and customer satisfaction — can easily lead to false causalities. This happens when a current marketing activity is seen as causing the changes in scores where they are only correlated in reality.
Marketing metrics are particularly prone to false causalities, where rises or drops in metrics are attributed to running campaigns.

How to reduce this bias

One way to disprove false causalities is to look for cases where one event happens, but not the other. When two things always happen at the same time, there might be another variable causing both. For example, the incidence of violent crime tends to rise and drop along with ice-cream consumption. Does the eating of ice-cream cause violent crime? Or do people eat more ice-cream because of crime? Of course not. Instead, it is well known that both violent crime and the consumption of ice-cream increase as it gets warmer.
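A minimal sketch (with made-up numbers) can make this visible: two variables that both depend on a hidden third variable will correlate with each other, and that correlation largely disappears once you hold the third variable constant:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000

# A hidden common cause: daily temperature (illustrative numbers only).
temperature = rng.normal(loc=20, scale=8, size=n)

# Both variables depend on temperature, not on each other.
ice_cream_sales = 5 * temperature + rng.normal(0, 20, size=n)
violent_incidents = 2 * temperature + rng.normal(0, 20, size=n)

print("corr(ice cream, crime):",
      round(np.corrcoef(ice_cream_sales, violent_incidents)[0, 1], 2))

# Holding the common cause roughly constant makes the correlation vanish.
mild_days = (temperature > 19) & (temperature < 21)
print("corr on days of roughly 20 degrees:",
      round(np.corrcoef(ice_cream_sales[mild_days],
                        violent_incidents[mild_days])[0, 1], 2))
```

The first correlation is clearly positive, while within the narrow temperature band it drops to roughly zero: exactly what you would expect if temperature, rather than ice cream, drives both.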

Bias 4: Regression fallacy

This fallacy is caused by a statistical effect called regression to the mean. It implies that an extreme score on a variable is likely to be followed by one that is closer to the average, provided that nothing has profoundly changed in-between measures. The fallacy happens when we attribute the drop in the second score to anything other than this statistical consequence, like skill, a particular intervention, beginner’s luck, or time.

Examples in the workplace

  • When a Scrum Team scores much higher than their average on a metric of their choosing (e.g. happiness, velocity, defects), and if nothing has profoundly changed in-between measures, the second score is likely to be much closer to the average (and thus, lower).

People often fail to understand this fallacy because they underestimate how much of their behavior and their outcomes are influenced by randomness (Taleb, 2007).
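A quick way to convince yourself (or a team) of this effect is to simulate it. The sketch below uses made-up numbers: hundreds of identical teams measured twice, with nothing but random sprint-to-sprint noise in between:

```python
import numpy as np

rng = np.random.default_rng(7)

# 500 hypothetical teams that share the same "true" average happiness of 7.0,
# measured in two consecutive sprints with random noise (illustrative numbers).
true_average = 7.0
sprint_1 = true_average + rng.normal(0, 1.0, size=500)
sprint_2 = true_average + rng.normal(0, 1.0, size=500)

# Select only the teams that scored exceptionally high in the first sprint.
extreme = sprint_1 > 8.5
print("Sprint 1 mean of selected teams:", round(sprint_1[extreme].mean(), 2))
print("Sprint 2 mean of the same teams:", round(sprint_2[extreme].mean(), 2))
# Nothing changed between the sprints, yet the second score falls back toward 7.0.
```

The teams selected for their extreme first score drop back toward their true average in the second measurement even though nothing changed; attributing that drop to an intervention, to skill, or to time would be the regression fallacy.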

How to reduce this bias

The best way to avoid this fallacy is by being cautious when you interpret a single extreme score followed by one that is closer to the average. Instead of attributing the difference between the first and the second score to an intervention, to skill or time, it may simply be a regression to the mean.

Bias 5: Anchoring bias

This is a cognitive bias where recently acquired information influences the decision of a person more than it should (Tversky & Kahneman, 1974).

Examples in the workplace

  • When teams estimate work, hearing an initial estimate is likely to “anchor” further estimates. So when people are asked how much time something will take, and they are offered an initial estimate of 20 days, their own estimates will gravitate towards that number.

How to reduce this bias

The anchoring bias explains why Planning Poker requires participants to show their estimates at the same time. It also explains why Liberating Structures often start with giving people a few minutes of silent thinking before moving into group interactions. It may not prevent anchoring bias entirely, but it hopefully dampens it.

I’ve personally found it helpful to distance myself from a decision for a while and revisit it with fresh eyes. The influence of the initial anchor is smaller then, especially when I take care not to anchor myself again.

Bias 6: Survival bias

Survival bias happens when failures are ignored while you are evaluating if a process or decision is the right one (Shermer, 2014). It is a more specific form of confirmation bias.

A famous example of survival bias is how, in the Second World War, Allied planes were reinforced in those areas where ground crews observed many bullet holes. It seemed like a good idea, until the mathematician Abraham Wald pointed out that this was the damage on the planes that survived, and that these holes were obviously not critical enough to make them crash or explode. Instead, he recommended reinforcing the areas without bullet holes (Mangel & Samaniego, 1984).

You’ll end up with the wrong conclusions about safety and maintenance if you only look at the planes that survived the trip back.
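You can mimic Wald’s reasoning with a small simulation. The hit areas and loss probabilities below are invented purely for illustration:

```python
import random

random.seed(1)
AREAS = ["wings", "fuselage", "engine", "cockpit"]
# Assumed chance that a hit in this area brings the plane down (illustrative).
LOSS_RISK = {"wings": 0.05, "fuselage": 0.05, "engine": 0.6, "cockpit": 0.6}

observed_hits = {area: 0 for area in AREAS}
for _ in range(10_000):
    hit = random.choice(AREAS)               # hits land roughly uniformly
    survived = random.random() > LOSS_RISK[hit]
    if survived:                             # ground crews only see survivors
        observed_hits[hit] += 1

print(observed_hits)
```

In the output, engine and cockpit hits look rare among the returning planes. Not because those areas are rarely hit, but because planes hit there rarely make it back; that is exactly the damage you never get to count.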

Examples in the workplace

  • HR departments can conclude that their recruitment process is working well because it is producing suitable candidates. But the fact that some candidates “survived” the process isn’t enough to conclude that it works. How many suitable candidates were (wrongly) rejected? How many candidates did the process miss that would’ve been more suitable? Without that data, survival bias is likely.

How to reduce this bias

The best way to reduce this bias is to be skeptical of copying whatever made someone or something successful (i.e. a “survivor”) without also considering the failure rate. Search for examples of where it didn’t work.


Personally, this is why I’m always very skeptical of “best practices” and success stories I hear at conferences. Although their success may be real, it doesn’t mean that the practices they used were the cause of it.

Bias 7: Illusory superiority

In a study among academic teachers, 94% rated themselves as above average in terms of their teaching skills (Spring, 1977). Most drivers consider their own driving skills above average (Roy & Liersch, 2013). And people overestimate the contribution of their country to world history (Zaromb et al., 2018). In short, most people have an inflated and overly optimistic view of their own abilities and contributions compared to others, or compared to the group they are part of.

This self-serving optimism also manifests in other biases. The “optimism bias” happens when we underestimate our own chance of misfortune and disaster compared to others. For example, Weinstein (1980) found that people consistently rate their own chance of developing health problems, like alcohol addiction, much lower than that of others.

Another variation is the “Dunning-Kruger effect”, where the less experienced people are at a skill, the more likely they are to overestimate their ability (Kruger & Dunning, 1999). Similarly, the less people know about a field of expertise, the more confident their opinions about it tend to be — even when those opinions are wrong.

Examples in the workplace

  • This bias easily leads to frustration when people feel they are contributing more to the team than others — even when that is not true in reality. Because people can’t see the whole system, and how much everyone contributes to it, they tend to overestimate their own contribution.

Whatever the case, this bias shows that most people are (overly) optimistic about their own abilities and the confidence of their beliefs.

How to reduce this bias

Illusory superiority is difficult to overcome, as the bias exists because people are unaware of it. In general, it helps to encourage diversity in opinions and viewpoints in groups and to create space for people to voice their views without fear of being judged for it.

Personally I’ve found the Liberating Structure Conversation Cafe a great way to do this. Once there is openness in groups, people can learn to see that others may have more experience with something than they do, and start trusting them.

Bias 8: Social conformity

Another class of biases is related to our social nature. For tens of thousands of years, our ancestors had to depend on others to survive. So being able to fit in with a group was a core survival strategy.

One example of this is our susceptibility to follow the beliefs of the majority in our group, even if that belief is objectively wrong. The psychologist Solomon Asch demonstrated this (1955) with his famous “conformity studies”. A group of people, with one real participant, had to collectively pick the shortest or longest line out of a set of lines of different lengths. The real participant was unaware that the other members were all confederates of the researchers. After some initial rounds of this simple task, the confederates would eventually collectively pick the same wrong line. The researchers found that 74% of the participants followed the opinion of the majority in at least one round, even though it was objectively wrong. While most participants knew it was the wrong answer but went along because of the social pressure, some participants actually convinced themselves they’d picked the right answer. The findings of this study have frequently been reproduced in other cultures, environments, and groups (e.g. Bond & Smith, 1996).

Social conformity can also cause groups to reject people with different perspectives.

Social conformity plays a big role in another social bias called “Groupthink” (Janis, 1972). Here, our desire for social conformity with a group takes precedence over critical reflection — even when the decisions of the group are unethical or dangerous.

Examples in the workplace

  • When decisions need to be made in a group, the opinion of the majority will likely be followed — even if that opinion is factually wrong or at least questionable. This effect becomes more pronounced as the majority increases and starts appealing to social norms (e.g. “Don’t be so difficult all the time” or “Let's just get along”).

Our social nature makes it hard to prevent social conformity. Plus, social conformity is also useful in many cases. It's important to be aware of it, and how powerfully it can distort beliefs and decisions.

How to reduce this bias

We know from research (Asch, 1955) that social conformity decreases as the minority increases, becomes more visible, or is purposefully given space.

Closing words

This post captures only a handful of cognitive, social, and logical biases. It demonstrates how flawed our reasoning and thinking can be. Then again, “flawed” may be too strong a word. A milder perspective is to understand biases as shortcuts that our brains have evolved to reduce the processing capacity needed for decisions and to allow snap judgments.

But although biases may serve a helpful function, they can easily lead to dangerously wrong beliefs. They are often the foundation of racism, intolerance, and fear of others. On a smaller scale, they impact the decisions we take in our workplaces, with our colleagues, and within our teams.

Perhaps approaches like the Scrum Framework and Liberating Structures can help here too — that is my hope — but it starts with recognizing that biases exist and they distort our ability to arrive at solid conclusions and well-grounded beliefs.

You can already support us with $1/month. Find out more on patreon.com/liberators

References

  • Asch, S. E. (1955) “Opinions and social pressure”. Readings about the social animal.
