To Be Human Is to Be Biased

LRN
Apr 5, 2016


By Andrew Soren & Katie Gilbert

At LRN, we believe that principled performance and long-term success emerge when decisions are inspired by sustainable values. We often say this requires more humanity at the core of our organizations. To be human, however, is to be biased. Even when we think we're making decisions based on our values, the thinking traps and mental shortcuts our minds lean on can prevent us from making the best decision.

Too frequently, organizations try to tackle biases by imposing process and regulating decision-making with more rules. In this post we’ll show another way. By increasing our self-awareness, fostering cultures with the right behaviors, and architecting our choices, we can create more human operating systems that give everyone the freedom to arrive at better decisions.

HOW BIASES WORK:

The term bias is quickly becoming yet another piece of jargon in the workplace. But biases are an integral part of how we think and an important thing to understand. When we use the term bias, we're talking about a misalignment between the ways we see the world around us and what might be considered objective or purely rational. Biases can lead to distorted views, poor judgment, and misinterpretation, and they can make us appear irrational. We're often told we need to deal with our biases, as if we could just tuck them away and act in a purely rational manner. But we can't.

Biases show up in just about every decision we make. For example, biases have big impacts on the ways that we lead and manage people in our organizations. Whether we're hiring, recognizing, coaching, rewarding, promoting, or firing, every moment that matters between an employee and their manager can be a moment where bias hinders potential. For over 15 years, Implicit Association Tests (IATs) have been used to look at the ways we unconsciously see the world, and researchers have used them to explore the implications of our biases for real-world business decisions. One study, for example, found that recruiters who held an implicit bias against Arabs rejected more resumes with Arab-Muslim names on them (Rooth, 2010). The same researcher then sent job applicants with exactly the same qualifications, but different body weights, into interviews (Agerström & Rooth, 2011). The only thing that reliably predicted whether an applicant was passed over for a callback was the hiring manager's score on an IAT measuring obesity bias. How do your unconscious associations affect the way you manage your talent?

Implicit associations are what Nobel Prize-winning behavioral economist Daniel Kahneman calls a heuristic, or mental shortcut. He says this happens because we have two basic systems constantly at work in our minds when we make decisions. The first system (Jonathan Haidt calls it the elephant) works fast and automatically, judging things based on familiar patterns. The second system (the rider) is slow and deliberate, carefully focusing us on the details.

The rider, perched on the elephant's back, seems like it's in control of the situation because it's using all of these sensible inputs to make rational decisions and steer the course. But put a mouse or a peanut in front of the elephant and see who's really in control. Sometimes the rider's methods are in fact just a means of justifying choices the elephant has already made. The hiring manager who rejects the resume with the Arab-Muslim name, or the candidate who is overweight, is not doing so based on the candidate's rational merits, even if they can argue that's exactly what they're doing when called on it.

Consciously or not, bias shows up in certain predictable patterns. We assume that because examples come to mind easily, something must be common or true, a bias known as the availability heuristic. We hate losing about twice as much as we like winning, a bias known as loss aversion, which sits at the heart of prospect theory. An object we already own or use seems more valuable to us than it otherwise would, a bias known as the endowment effect. When it comes to complex decision-making, our elephants and riders work together to support effective decisions most of the time, but sometimes they lead us astray.
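To put a rough number on the loss-aversion claim above, here is a minimal sketch in Python (not from the original post) of the value function Kahneman and Tversky proposed in prospect theory. The parameters are the commonly cited estimates of about 0.88 for the curvature of gains and losses and about 2.25 for the loss-aversion coefficient; treat the exact figures as illustrative assumptions.

```python
# Illustrative sketch of the prospect-theory value function.
# Parameter values are commonly cited estimates (alpha = beta ~ 0.88,
# loss-aversion coefficient lam ~ 2.25); they are assumptions for illustration.

def subjective_value(outcome, alpha=0.88, beta=0.88, lam=2.25):
    """Return the felt (subjective) value of a monetary gain or loss."""
    if outcome >= 0:
        return outcome ** alpha              # gains are slightly discounted
    return -lam * ((-outcome) ** beta)       # losses are weighted roughly 2x

gain = subjective_value(100)     # about +57.5
loss = subjective_value(-100)    # about -129.5
print(f"A $100 gain feels like +{gain:.1f}")
print(f"A $100 loss feels like {loss:.1f}")
print(f"Losses loom about {abs(loss) / gain:.1f}x larger than equal gains")
```

Change the parameters and the ratio moves, but the qualitative pattern is the point: losses get weighted roughly twice as heavily as equivalent gains, which is exactly the asymmetry described above.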

NOW WHAT:

It’s tempting to believe that the only way to manage bias is to impose rules and processes that protect our organizations and penalize people for having biases. That might work if we could be programmed like computers. But no amount of rules or process will stop our human minds from working the way they do; we’ll find a way to rationalize our biases regardless of how much regulation is put in place to prevent it. The only way to shore up our decision-making is to create leadership, culture and governance solutions that embrace the humanity of biases.

Step one — increase self-awareness. As each of us acts as a leader in our sphere of influence, our greatest opportunity to make better decisions is to follow the ancient imperative to “Know Thyself”. We must become aware of our individual thinking traps, challenge ourselves, argue the other side of the debate, find the evidence that could suggest our elephant assumptions are right OR wrong. To do any of that requires us to be mindful and deliberate in the way that we lead and open to the probability that we are not perfect. When we do find that we might be falling prey to bias, we should check in with others. By discussing our assumptions out loud we can benefit from the perspective of our colleagues.

Step two — establish cultural norms. If being human is being biased, tackling the problem is not a solo sport. We need to foster cultures where our teams, our peers, our coaches, and our leaders feel comfortable raising issues and speaking up. All of us are smarter than one of us, as the saying goes. That requires educating everyone in our companies about biases and how they work, rewarding the right behaviors, and holding values that make it easier for everyone to be on the lookout for bias. It also means removing the barriers that stop us from feeling free to speak up when we otherwise know that we should.

Step three — be “choice architects”. Ideally, each of us would be self-governing and accountable for challenging biases in ourselves and others. But the real world isn’t ideal. We have the best of intentions, yet there are times when we don’t have the wherewithal to be mindful and challenge our automatic beliefs. So how can we nudge ourselves (and our companies) in the right direction by creating systems that help us prioritize and pinpoint our focus? What metrics should we focus on? What defaults and recommended solutions can we crowdsource from our organization and craft into playbooks that respond to those metrics? And how can we do that while still giving our colleagues the autonomy to choose their own solution, rather than forcing the recommended approach on them?
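To make the idea of a default more concrete, here is a minimal sketch in Python of what a playbook-driven default with an explicit override might look like. Everything in it is hypothetical, including the metric names, thresholds, and recommended actions (callback_rate_gap, single_rater_reviews, and so on); the sketch only illustrates the pattern of nudging toward a recommendation without taking away the choice.

```python
# Hypothetical sketch of a bias "playbook" that nudges with a default
# recommendation while leaving the final call to the decision-maker.
# All metric names, thresholds, and actions are made up for illustration.
from typing import Optional

PLAYBOOK = {
    # metric name -> (threshold that triggers the nudge, recommended default action)
    "callback_rate_gap":    (0.10, "re-review rejected resumes with identifying details masked"),
    "single_rater_reviews": (5,    "add a second reviewer to the next screening round"),
}

def recommend(metric: str, value: float) -> Optional[str]:
    """Return the playbook's default action if the metric crosses its threshold."""
    if metric not in PLAYBOOK:
        return None
    threshold, action = PLAYBOOK[metric]
    return action if value >= threshold else None

def decide(metric: str, value: float, override: Optional[str] = None) -> str:
    """Apply the default unless the decision-maker explicitly chooses otherwise."""
    if override is not None:
        return override           # autonomy preserved: an explicit choice wins
    return recommend(metric, value) or "no action recommended"

# A 12-point callback-rate gap triggers the default nudge...
print(decide("callback_rate_gap", 0.12))
# ...but a manager is free to choose a different, documented response.
print(decide("callback_rate_gap", 0.12, override="pilot structured interviews next quarter"))
```

The shape of the mechanism is what matters here: the default does the nudging, and the override keeps the choice, and the accountability, with the person making it.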

Finding a way to harness leadership, culture and governance in recognition of our biases gives us a human operating system that ensures better decisions.

If we want to create the conditions for principled performance in our organizations, then HOW we work with biases is critical. We can neither ignore them nor regulate them away. We have to acknowledge that biases are an unavoidable part of being human, and hold values that give us the courage to surface them in ourselves and the compassion to challenge them in others. When we do so, we can collectively harness our humanity to increase the power of our decision-making.

References

Agerström, J., & Rooth, D. O. (2011). The role of automatic obesity stereotypes in real hiring discrimination. Journal of Applied Psychology, 96(4), 790.

Banaji, M. R., & Greenwald, A. G. (2013). Blindspot: Hidden biases of good people. Delacorte Press.

Haidt, J. (2006). The happiness hypothesis: Finding modern truth in ancient wisdom. Basic Books.

Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.

Rooth, D. O. (2010). Automatic associations and discrimination in hiring: Real world evidence. Labour Economics, 17, 523–534.

Schwartz, B. (2004). The paradox of choice: Why more is less. Ecco.

Seidman, D. (2011). How: Why how we do anything means everything. John Wiley & Sons.

Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. Yale University Press.
