How to avoid unconscious bias

Michael Bond
Common Collective
Dec 3, 2018

Intuitive judgements are not always good. Can we think our way around them?


EARLY in his career, the American psychologist Irving Janis joined an addiction clinic to study the behaviour of people trying to give up smoking. As the course progressed, he noticed that rather than cutting down, members of the group put pressure on each other to increase the number of cigarettes they smoked each day. They managed to convince themselves that heavy smoking was almost incurable, and that trying to reduce their tobacco intake was pointless; anyone who disagreed was ostracised. Meanwhile, the group became increasingly amiable and harmonious, and Janis concluded that this cohesiveness had become more important to its members than their own health, a phenomenon for which he later coined the term ‘groupthink’.

Most workplaces are nothing like addiction clinics, yet they are subject to the same kind of psychological effects — particularly in meetings. Group decision-making is popular with managers because it is democratic, and pooling collective knowledge seems like a smart thing to do. Unfortunately, if it is not carefully orchestrated, it often turns out badly, either because of groupthink, or because like-minded groups quickly become polarised in their thinking, or because members with valuable dissenting views are afraid to share them.

Flawed group dynamics are caused by what psychologists call unconscious biases — intuitive patterns of behaviour and thinking that operate outside our awareness. We’re all vulnerable to them, whether we’re thinking in groups or on our own. They kick in automatically, and likely evolved to help our early ancestors respond quickly and effectively to things that threatened them. Those ancestors lived in very different environments from ours, and the cognitive shortcuts they found so useful can sometimes trip us up (though in some situations they still serve us well). If you’d like to know more about what drives intuitive decision-making and why it so often overrides deliberate analysis, the psychologist Daniel Kahneman wrote an exhaustive book about it, Thinking, Fast and Slow.

Unconscious biases distort our decision-making in dozens of ways. We seek out information that confirms our existing beliefs (confirmation bias). We focus too heavily on the first piece of information we encounter (anchoring), a tendency often exploited by salespeople. We underestimate how long it will take us to complete a task, always imagining the best-case scenario (the planning fallacy). We assume that someone who is good at one thing is good at another (the halo effect). We make quick judgements about people based on general preconceptions (stereotyping). We value things — ideas, objects — more highly if they belong to us (the endowment effect). And we believe that past chance events change the likelihood of future, independent ones (the gambler’s fallacy).

Recently at Common, we’ve been thinking a lot about these cognitive quirks while working on a project with the e-learning consultancy Green Onyx, developing a programme that will help their clients recognise and deal with unconscious bias in organisations (if you’re interested, you can sign up to it here). Some unconscious biases are particularly prevalent in the workplace. Groupthink, polarisation and other social pulls are obvious ones to watch for. Gender bias is another. There are copious examples of this. Job ads in male-dominated industries often use stereotypically male wording, such as ‘leader’, ‘competitive’ and ‘dominant’, that may put off female applicants. In some parts of academia, peer reviewers critique papers by female researchers more harshly (this doesn’t happen when submissions are made ‘gender blind’). Many studies have shown that in group situations women are interrupted (by both genders) far more than men.

If you’re curious about your own biases on issues such as gender, sexual orientation and race — which a Guardian survey recently identified as a major concern in the UK — you can take Harvard University’s Implicit Association Test (IAT). (It’s worth noting that if you harbour unconscious biases — and most of us do — it doesn’t necessarily mean you’ll behave with prejudice in any given situation. In fact, there is considerable disagreement among academics about what the IAT actually says about a person.)

It can be difficult to overcome unconscious bias. Kahneman has said: “It takes an enormous amount of practice to change your intuition. Intuition rules decision-making, that is human nature and that is how it is going to be.” Some researchers claim that anti-bias training programmes don’t work in the long term because they don’t remove the underlying problem; likewise, tests such as Harvard’s IAT can lead people to think that their bias is innate and unchangeable. One training regime aimed at tackling gender stereotypes actually increased participants’ prejudices. Understanding the issue and learning how to recognise it are good first steps, but clearly they’re not enough to change behaviour.

With all that in mind, in our work with Green Onyx we’ve identified some approaches that academic studies suggest can make a difference. For example, in one successful intervention, participants were taught to actively reject any information that reinforced stereotypes (by declaring ‘that’s wrong!’ to themselves each time they came across it). Another well-tested method is to watch videos or play games that take the perspective of someone in a minority or disadvantaged group, which forces you to see the world through their eyes.

Ultimately, the best bet may be to prime the environment: establish a set of norms or principles in your organisation that make prejudice less likely, or design interventions that negate those troublesome mental short-cuts. A good illustration of effective design is the Applied digital recruitment platform created by the Behavioural Insights Team, which helps employers keep their recruitment strategy fair and free of bias (it ensures that candidates are selected on their talents rather than on gender or other attributes).

Or you could apply the tactic used in team meetings by Eric Schmidt, formerly executive chairman of Google. “What I try to do … is to find the people who have not spoken, who often are the ones who are afraid to speak out, but have a dissenting opinion,” he told McKinsey Quarterly. “I get them to say what they really think and that promotes discussion, and the right thing happens.” It worked for him.
