Bugs in your reason — common cognitive biases and what you can do about them

Daragh Byrne · Published in Pointer IO · Nov 19, 2015

This article is exclusive to Pointer — a reading club for developers.

Sign up at Pointer.io for our weekly newsletter.

The edge of reason

I’m a rationalist. I like logic. I love discovering or inventing coherent, consistent ideas about the world. Creating artifacts based on these ideas gives me great pleasure. That’s why I was drawn to the engineering world.

Reason is the foundation of creativity. It solves problems. Solving problems brings personal satisfaction (and, for many of us, paychecks).

Most engineers value a rationalist viewpoint. We’re pretty attached to it; for many of us, it’s a huge part of our identity.

But how valid is it? Are there bugs in our reason?

Simple observation suggests that many people struggle with rationality. You probably only have to look around your team, your organisation, your friends or family to see this!

Humans don’t always act in a rational manner.

So what’s a better model to understand human behavior? Why do we deviate from rationality, and what can we do to prevent this?

Decisions, decisions

Daniel Kahneman and Amos Tversky introduced the idea of cognitive bias to refer to a set of features of our reasoning processes that explain the way in which human reasoning often appears to deviate from rationality. You can read all about their research in Kahneman’s book Thinking, Fast and Slow — reviewed here.

A “bias” is a systematic deviation from perfect logic. Many biases arise because the mind uses shortcuts to provide a best-guess analysis of a situation, conserving mental energy.

The trade-off is sometimes predictive accuracy: think of an order-of-magnitude estimate as opposed to running a full-scale model.

Engineering is all about trade-offs — often they’re worth it, if you know what you’re doing. Cognitive biases tend to be unconscious though, operating at a level beneath deliberate thought.

I’ll spend the rest of this article explaining a number of common cognitive biases and how they might affect your engineering work.

Let’s start with confirmation bias.

Confirmation Bias

What is it?

Confirmation bias is the tendency to look for evidence that supports preconceived ideas or beliefs. We generally don’t want to accept that our beliefs about a situation are wrong (hello ego!), so we are biased against evidence that suggests otherwise.

How can you work with it?

It’s Friday afternoon. You’ve been working on a bug all day.

You’ve spent a few hours playing with the front-end code. You’ve convinced yourself that this code is at fault, based on a gut-feeling that turned into a hypothesis.

You’re so convinced of this that you’re blatantly ignoring the fact that a call to the server is returning a bad value, so you waste a lot of time on a wild-goose-chase.

Sound familiar?

Confirmation bias often shows up in overconfidence about the source of problems and proposed solutions.

We become very attached to the hypotheses we propose, ignoring evidence to the contrary. In the scenario above, you were convinced that the front-end code “had to be” the problem.

Watch out for times when you are thinking like this — you might be falling into the confirmation bias trap.

There are a few things you can do to counter confirmation bias.

Firstly, be ruthless with your own hypotheses.

Ask yourself whether you’re taking confirmation bias into account when evaluating your proposed solutions.

Start by asking why this is wrong, rather than why this is right.

Look for a second opinion but be careful not to “double up” on confirmation bias!

Secondly, analyse your analysis. Engineers are lucky enough to work with data, but the way we interpret it is obviously affected by our biases. Are you being completely honest about what your results are telling you? Are you ignoring a data point or result set that’s telling you something that goes against your theory?
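One concrete way to be honest about your data is to let a quick check settle the question instead of your gut. A minimal sketch in Python (the field names, expected types and payload are invented for illustration):

```python
# Hypothetical example: before blaming the front end, verify the data
# the server actually returned against the shape the front end expects.

def find_bad_fields(payload, expected_types):
    """Return (field, reason) pairs for fields that are missing or mistyped."""
    bad = []
    for field, expected in expected_types.items():
        if field not in payload:
            bad.append((field, "missing"))
        elif not isinstance(payload[field], expected):
            bad.append((field, f"expected {expected.__name__}, "
                               f"got {type(payload[field]).__name__}"))
    return bad

# Suppose the front end expects this shape...
expected = {"user_id": int, "balance": float, "currency": str}

# ...and this is what the server actually sent:
response = {"user_id": 42, "balance": "NaN", "currency": "EUR"}

print(find_bad_fields(response, expected))
# The evidence now points at the server, whatever your hypothesis was.
```

A two-minute check like this is cheap insurance against spending a Friday afternoon confirming the wrong theory.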

Thirdly, be humble about your solutions. Many of us take pride in producing code, designs, prototypes and other artefacts — so it can be hard to accept that they might be wrong. It’s more productive to accept this as a starting point — a little humility goes a long way.

Confirmation bias often coexists with what at first glance appears to be its opposite — negativity bias.

Negativity bias

What is it?

The negativity bias is a tendency to recall negative events more easily than positive ones, and to be more heavily influenced by information that is perceived as negative.

It has its roots in evolutionary psychology. I’ve written about this at length here — but to summarise, your mind is a survival device concerned mostly with keeping you safe, so it tends to concentrate on the things that might happen, rather than the reality of here and now.

It does this because it wants to be prepared to take action in the case that the worst happens.

Interesting situations can arise when our confirmation bias attempts to confirm hypotheses that are proposed by our negativity bias!

How can you work with it?

It’s happening again. I’m in another planning meeting with a team member who only sees the downsides in every proposed solution and team decision. It seems like we’re living in a world without upsides.

It’s starting to drag down morale, and worse, it’s becoming a habit for other team members too…

Negativity bias isn’t always a bad characteristic in an engineer. Working to anticipate the potential issues with a product or service is often extremely valuable. Too much negativity just isn’t good for a team though.

Recent findings in positive psychology indicate that deliberate focus on what’s good about a situation can counteract negativity bias.

I’ve worked mostly in agile environments, and I’ve always insisted that retrospectives start with “what we’ve done well” rather than “where are the problems”.

If negativity is eating your life at a personal level, the deliberate practice of gratitude can be a revelation.

It’s quite simple. At the end of each day, write down four things you were grateful for.

This can be something extremely basic, such as “I enjoyed that cheese sandwich”. Or “my ideas were well received at that meeting”. Or something more substantial.

Try it for a week, or a couple of weeks.

I did this over the course of an extremely difficult year. When I reviewed my gratitude journal, I had plenty of evidence that things were never as bad as my biased mind made them out to be.

Fundamental attribution error

What is it?

The Fundamental Attribution Error (FAE) is a social bias that essentially means we’re likely to judge someone’s actions as representative of their personality traits, rather than their circumstances (for example, coercion, or a pressing need they find themselves having to meet).

We’re not always very good at giving each other a break — even when we know the reasons why another person is behaving in a manner that annoys us! This can lead to friction, anger and arguments.

This 1967 paper demonstrated that participants tended to attribute the views expressed in a speech to the speechwriter’s own beliefs, even when they knew that the speechwriter had been randomly assigned one of two contradictory viewpoints to write about!

How can you work with it?

You’re furious at your development manager. He’s just given you news that the scope of the project is changing again! What an idiot, can’t he get his priorities straight, this is really screwing with your zen. What the hell is wrong with him?

Behind the scenes he’s dealing with some complex inter-departmental politics that he’s doing his best to shield you from. The scope changes are largely not his fault and there are a myriad of factors that he’s working against. But you still blame him personally, causing friction in your life.

It is easy to personalise managerial decisions that you disagree with — blaming the people, or the group, rather than the context.

Working in a team will always bring differences of opinion. Colleagues will make decisions or take actions that leave us scratching our heads — or worse, wrapped up in seething rage!

The FAE means we’ll inevitably personalise things. Developing emotional intelligence (the ability to work with your own emotional responses, and those of others) allows us to moderate our responses.

For me, mindfulness meditation is the tool that I’ve found most useful for helping me deal with tricky emotional states. Others find value in simply talking it out, or daily journaling.

Conclusion

It can be challenging to accept that your reasoning ability has systematic flaws, particularly when you are in a career where that ability is valued.

Acceptance of your limitations is the first step towards overcoming them though. You’re already building an edge by learning your own flaws, and how to work around them.

Have you seen any of these biases play out? I’d love to hear more in the comments below, or via daragh@thehappytechie.com.

I write about happiness for technical minds over at TheHappyTechie.com. I’m interested in how findings in positive psychology and neuroscience, as well as practices such as meditation, can lead to optimal living and working for technically minded individuals.

Come say hi! Or follow me @daraghjbyrne.
