What we really mean when we talk about unconscious bias

Andrea Jones-Rooy, Ph.D.
Jun 14, 2019 · 7 min read


Unconscious bias is probably the most widely used term in discussions of diversity. It’s used so much that, in my experience, two different groups of people typically shut down when I announce the topic in workshops.

The first group are people who assume I’m there to call them racist and sexist. To be fair, I am, but there’s a bit more to it. The second group are people who are sick to death of hearing about unconscious bias but don’t see things improving. My goal in this article, and in those trainings, is to get everyone excited about what unconscious bias is from a social scientific perspective, which I believe helps us understand why it’s important, and might even shed light on what to do about it.

In the corporate world, unconscious, or implicit, bias refers to some variant of: the assumptions and preferences we have that we are unaware of that inform our view about a person or idea. In my workshops, I usually add something to the effect of — a bias isn’t terribly problematic from a diversity or objectivity perspective until it affects how we judge a person’s performance or ideas. It’s when these biases influence our assessment of someone that we get into trouble. For example, suppose you are particularly taken with French culture. This is fine until you hire exclusively French people, which, obviously, is a kind of discrimination against people already in the office who are afraid of mimes.*

This so far isn’t likely news to anyone who has thought about diversity for more than one second. The real fun (and challenge) comes when we unpack where all this comes from. To do that, we need to split the universe of biases into two categories: cognitive biases, and what I lump (technical term) together as socialized biases.

Cognitive biases

Cognitive biases are what’s talked about in most unconscious bias trainings. These are things like “recency bias” — a tendency to judge people based only on recent evidence — or “halo and horn bias” — allowing one positive or negative experience with someone to color your entire assessment of them.** An example is: Someone gives a lousy presentation, and then you assume they suck at everything forever. (This may sound extreme, but we actually may rule people out that fast.)

Cognitive biases come to us largely from behavioral economics, which is a subfield of economics focused on how people … behave. If it comes as a shock to you that there’s a separate branch of economics about behavior, you are not alone. Indeed, you might even be asking, then what is the rest of economics about? Math.

For a long time, economists tried to make precise predictions about human decision-making using equations about expected value, which is how good an outcome is for you multiplied by the probability of that outcome. For example, the 2017 Mega Millions jackpot was worth about $500 million and the odds of winning it were about one in 300 million, or about 0.0000003%, giving it an expected value of about $1.67 (value * probability). Most lottery tickets cost about $2, so according to this logic, you should never buy them.***
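The arithmetic is simple enough to sketch in a few lines. These figures are the rough approximations from above, not official lottery records:

```python
# Expected value of a lottery ticket: payoff times probability of winning.
# Numbers are the article's rough 2017 Mega Millions approximations.
jackpot = 500_000_000        # ~$500 million prize
p_win = 1 / 300_000_000      # ~1 in 300 million odds of winning

expected_value = jackpot * p_win   # about $1.67
ticket_price = 2.00

print(f"Expected value of a ticket: ${expected_value:.2f}")
print(f"Net expected return: ${expected_value - ticket_price:.2f}")
```

Since the expected value (~$1.67) is below the ticket price ($2), a strictly expected-value-maximizing agent would never play.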

But here’s the snag: People do buy lottery tickets. This means the frame of pure math might not be up to the task of actually explaining human behavior. Are we all idiots? Are we all being lied to, or manipulated? Yes — but in some pretty deep, systematic, and powerful ways.

Social scientists Daniel Kahneman and Amos Tversky are credited with formally putting this on the map in economics with an insane paper they published in 1979 in the journal Econometrica, where they took down the world of expected utility on its own terms. (Do have a read for some serious drop-the-mic sh*t.) They showed experimentally that people behave differently when faced with two identical choices framed differently. Specifically, subjects were presented with two alternatives with the same expected value, but in one you stood to potentially gain some money, and in the other you stood to potentially lose some money. They found that:

“…preferences systematically violated the axioms of expected utility theory.”

And:

“In light of these observations we argue that utility theory…is not an adequate descriptive model and we propose an alternative account of choice under risk.”

Kapow! Right? I know.
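Their framing result can be sketched with made-up numbers (the specific payoffs below are illustrative, not the ones from the 1979 paper). The two options in each frame have identical expected value, yet people reliably prefer the sure thing when it's framed as a gain and the gamble when it's framed as a loss:

```python
# Two choices with identical expected value, framed as a gain vs. a loss.
# Expected utility theory says the framing shouldn't matter.

def expected_value(outcomes):
    """Sum of payoff * probability over all possible outcomes."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Gain frame: a sure $450, or a 90% chance of $500 (10% chance of nothing).
sure_gain  = [(450, 1.0)]
risky_gain = [(500, 0.9), (0, 0.1)]

# Loss frame: a sure -$450, or a 90% chance of -$500 (10% chance of nothing).
sure_loss  = [(-450, 1.0)]
risky_loss = [(-500, 0.9), (0, 0.1)]

assert expected_value(sure_gain) == expected_value(risky_gain)  # same EV
assert expected_value(sure_loss) == expected_value(risky_loss)  # same EV
# Yet subjects tend to take the sure thing in the gain frame and the
# gamble in the loss frame, violating expected utility theory's axioms.
```

The point is that nothing in the math distinguishes the two options in either frame; the preference flip has to come from how our heads process gains versus losses.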

From there — behavioral economics took off. It’s now woven alongside research in psychology and political science, too, where many scientists are empirically uncovering all kinds of systematic biases in humans. There are now hundreds of documented biases — some of which make their way to your diversity trainings — and many of which you can read about in this fun list.

I am biased(!) in favor of social science, but I think knowing where all this comes from is cool, and hopefully it will make the next PowerPoint slide you see on recency bias a little more interesting. I also summarize some of what I think are the more interesting and fun ones in the image below. To give just one example of how deep this research runs, it turns out scientists have spent a tremendous amount of effort trying to understand how humans experience, predict, and recall the durations of things.

There are hundreds more of these

Now, the problem is: Many unconscious bias trainings stop with a list of biases and a message that “we are all biased; now, get back out there!” Ok, but when we talk about diversity we need to go deeper than, say, “similarity bias”. We also need to recognize that humans in different cultures carry biases that are more contextual and experience-based than the scaffolding-in-our-heads style cognitive biases, which (in large part, as far as we can tell) are fairly universal features of most human brains.

Socialized biases

When we talk about unconscious biases, fascinating as they are, I worry that we have become so focused on them that we’ve lost sight of — or used them as problematically minimizing euphemisms for — real, persistent, and deep biases. I’m talking about racism, sexism, ableism, ageism, veteran-ism, and more. As much as I love behavioral economics, it’s not helpful or fair to wrap all these up in “similarity” and “availability” bias and call ourselves cured.

The biases I described above largely operate across cultures and groups of people. But a number of scientists have generated evidence that there are real cultural differences in other kinds of behaviors and decision-making — especially when the decision-making involves making guesses about what other people are going to do, not just about the information in front of them. Many scholars have shown variants of this, but a paper I particularly love is one by Joseph Henrich et al. from 2001 that shows that people in different societies play interactive games with differing levels of cooperation.
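One of the interactive games used in this line of research is the ultimatum game. A toy version can be sketched like this; the acceptance threshold is a hypothetical stand-in for a culturally socialized fairness norm, and the specific numbers are illustrative, not from the Henrich et al. data:

```python
# A toy ultimatum game: one player proposes a split of a pot, the other
# accepts or rejects. A rejection leaves both players with nothing.

def ultimatum(pot, offer, min_acceptable):
    """Proposer offers `offer` out of `pot`; the responder accepts only
    if the offer meets their (culturally shaped) fairness threshold."""
    if offer >= min_acceptable:
        return pot - offer, offer   # (proposer's share, responder's share)
    return 0, 0                     # rejection: both walk away empty-handed

# A purely self-interested responder should take any positive offer...
print(ultimatum(100, 1, min_acceptable=0))    # (99, 1)
# ...but a responder with a strong fairness norm rejects an "insulting"
# split, leaving both players worse off in pure payoff terms.
print(ultimatum(100, 1, min_acceptable=30))   # (0, 0)
print(ultimatum(100, 40, min_acceptable=30))  # (60, 40)
```

Expected utility alone predicts everyone plays like the first responder; the cross-cultural finding is that where people actually set that threshold varies systematically by society.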

Anyone who has ever traveled abroad or moved from one company to another is not going to be surprised by this — oh, people behave differently in different societies? Thanks, science.

But this is actually a powerful point. It means we don’t all have roughly the same distribution of cognitive biases in our heads everywhere. It means that something in our social environment is giving us other biases — like how cooperative we think we should be, or how cooperative we think the person we’re working with is going to be. Or, you know, how women or black people or disabled people or older people are likely to or ought to behave.

This game-theoretic entry point by Henrich et al. is only one very small addition to centuries of thinking about the biases we carry in our heads about certain groups of people. Yes, they can change over time and place, but there are also some pretty persistent ones: across the planet, we seem to have come to pretty consistent conclusions about, for example, people of color and women as second-class citizens. (It also turns out that in much of social science, for example, there’s a bias against research on gender bias.)

Where do these biases come from? “Socialized” isn’t a particularly helpful answer, as it could point to anything from “the media” to “our parents” (is there anything else, really?). What we do know is that they’re deep and hard to change. But we also know they do change, which, if you’re feeling optimistic, could be a reason for hope.

There are also millions more of these, both in terms of types of biases and examples of them. I hesitate to even share these because I feel like I’m reinforcing stereotypes, but I think it’s important to call out perceptions that have been documented about different groups. I also want to acknowledge that as a white, non-disabled, non-veteran Millennial, I have almost no business even listing these, but I want to try to understand. Also, I am truly horrible at captions.

Connecting Cognitive and Socialized Biases

I believe cognitive and socialized biases do not operate in isolation. We know from behavioral economics that we tend to rely on biases more when we’re dealing with uncertainty or limited information. This happens to describe most workplace scenarios — how will this project pan out? Is this person we’re hiring any good? — which means we’re really, really likely to use these cognitive and socialized biases in these contexts.

And they can work together. Similarity bias might mean I’m more likely to be candid with other people who look, sound, and think like me, which means even less opportunity for me to learn about how my socialized biases might be inaccurate. And one of the big hurdles to diversity, in my view, is that because so many people in positions of power and influence look, think, and sound the same, they’re less likely to even have colleagues around them who can help call it all out.

I was going to talk more about what to do about all this, but (1) if I really knew, diversity would be solved, and (2) this article is too long, so I’ll offer some thoughts on what I think can actually nudge us in less biased directions, on both the cognitive and socialized fronts, in another article!

Thanks for reading, as always!

* Treading in stereotypes is, apparently, a cornerstone of my diversity education philosophy.

** A more woke colleague than I once pointed out to me that halo and horn bias has some heavy Christian leanings.

*** I tried to get the right numbers for this but the records for the Mega Millions are surprisingly insanely confusing, so they might be wrong. But you get the idea.
