Cognitive biases in software development. Part 1. Developers

Myroslava Zelenska
8 min read · Sep 7, 2018


Unite your two favourite topics — the mind and software development — and get an awesome post :)

This is the first part; the next posts are the second part, third part and fourth part.

Cognitive biases — what are they?

Cognitive biases are systematic errors in human thinking, a kind of logical trap.

In certain situations we tend to act on irrational patterns, even when it seems to us that we are acting according to common sense. To avoid drowning in an overabundance of information, the brain has to filter out a huge amount of it and decide, quickly and almost effortlessly, what is really important and should be brought to our attention. Some of the information we cut off turns out to be valuable and important. Sometimes we invent non-existent details purely out of assumptions and prejudices, and build up meanings and stories that never existed in reality. For the full catalogue, see Wikipedia's List of cognitive biases.

Some of the rapid reactions and decisions turn out to be dishonest, self-serving and unproductive. Let’s look deeper.

Cognitive biases developers fall for

Survivorship bias: This won’t work for everyone!

In World War II, the Hungarian statistician Abraham Wald was given the task of analyzing the damage to British bombers and advising where to add armour, given that most hits were on the wings and tail. The obvious answer seems to be to armour the places that were hit often. Wald thought about it and decided: no! Those places are already sufficiently protected, since a plane hit there could still fly back to base, while the planes hit elsewhere never made it back. So it is the places with no recorded hits that need armour — in this case, the cockpit and the fuel tank.
After all, we do not want to make the mistake of everyone who studies others’ success stories while forgetting the histories of failure — the survivorship error. Ask a programmer how they got their first job and you’ll hear a standard story: reading books, going through training, talking with a friend or teacher, sending out resumes, and passing interviews. But will it work in your case? You read about Jobs, Zuckerberg, Branson, Musk… but you completely overlook those who are far more numerous and whose stories matter far more — those who failed. We, like the British Air Force engineers, need to armour the vulnerable spots.

This bias also catches us when we need to find a solution. Hey, we just solved a similar problem, so let’s use the same solution, because it worked! Do you use the same design patterns over and over again? If so, you are probably looking at different problems through the same lens — the lens of the survivor.
What can you do against this effect?

  • Be careful when jumping to familiar solutions and someone else’s experience. Double-check that you won’t fall into this pit.
  • Communicate with other teams — maybe you are not taking something into account. Listen to your team at every ceremony — maybe they have already spotted your weak spots.

Fundamental Attribution Error: Probably they are not idiots?

Everything is very simple: we are inclined to explain the motives of our own actions by external causes, and other people’s actions by their personality traits. More precisely: our own failures are explained solely by a hard life, while other people’s failures are explained solely by those people being idiots, losers and generally worthless. With wins it is the opposite: your own win is only because you are smart, strong and ambitious; someone else’s win is only because they were lucky, had rich parents, or something like that.
When someone else writes a bug or takes the site down, it is because they are a negligent, crappy engineer. If you create a bug it is because you were tired, there weren’t enough automated tests, you were rushed, the requirements were poorly defined, or because it was a full moon.

Sometimes I can get really annoyed when looking at some code where I think “oh damn, this design won’t get us very far, what a stupid developer” — but wait! — how much do I really know about why the developer took that design decision? Maybe there’s an additional change to come which affects the design problems I spotted? Maybe he had valid reasons not to spend more time on it, like fixing a critical production bug instead? By being aware of this cognitive bias and working to focus on the situation instead of the personality of the human involved, we can reduce the negative impact that this bias has on our lives.
What can you do against this effect?

  • Be careful when attributing behaviour to personal characteristics. Talk to the person before jumping to conclusions.
  • Make sure people get in direct contact so they understand the context under which the other person was acting. Daily Standups, Retrospectives, Planning Meetings…

Confirmation bias: Notice your blind spots!

Ever wasted an hour chasing a bug where you “knew” what caused it, only to find out it came from a completely different area in your app? We tend to interpret information in a way that matches our existing beliefs. When you “know what you are looking for”, you are less likely to pay attention to information which contradicts your belief. You have to actively look for contradicting information, otherwise your brain tends to filter it out.

This bias most commonly occurs in manual testing. When we write some new code, we are biased towards testing the cases that we know will work. This allows us to spend a short amount of time testing (because everyone hates manual testing) and then proudly declare that “it works!”. This bias can result in poorly tested code, and the miserable practice of throwing code over the wall and having other people clean up the mess.
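
As a small, hypothetical sketch (the `parse_age` function and its tests are invented here, not taken from any real project), compare testing that only confirms what we expect with testing that actively tries to prove us wrong:

```python
import pytest  # assuming pytest is the test runner in use

def parse_age(value: str) -> int:
    """Parse a user-supplied age string into an int."""
    return int(value)  # happily assumes the input is always a clean, sensible number

# Confirmation-biased testing: only the cases we already "know" will work.
def test_parse_age_happy_path():
    assert parse_age("42") == 42
    assert parse_age("7") == 7

# Bias-countering testing: actively hunt for input that could prove us wrong.
def test_parse_age_rejects_garbage():
    with pytest.raises(ValueError):
        parse_age("forty-two")       # non-numeric input raises, as hoped

def test_parse_age_suspicious_values():
    assert parse_age(" 42 ") == 42   # int() happens to tolerate whitespace...
    assert parse_age("-5") == -5     # ...but a negative age sails straight through: a bug
                                     # the happy-path tests would never have surfaced
```

The happy-path suite passes and lets us declare “it works!”, while the contradicting cases are the ones that actually reveal where the code is fragile.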

This is one of the harder biases to get over in my opinion, because it means acknowledging our own limitations, and really stressing the fragile parts of the code that we write.
What can you do against this effect?

  • Realize that information which proves you wrong is much more valuable than information that confirms what you already know
  • Keep testing, and always question your assumptions
  • Do some “break the model”-thinking whenever something is easy to interpret: what other explanations exist for what I just observed? If you can’t find any, you almost certainly haven’t looked hard enough.

Sunk Cost: Don’t stick to shit just because it’s expensive shit!

Studies often use a classic example: you have paid for a cinema ticket and then notice that the movie is boring. Do you leave or stay? Most people stay, afraid to admit the money is lost — even though it is gone either way, and the only real question left is what to do with your evening. Using “we already invested so much time” as an argument is called the sunk cost fallacy: seeing all the money — or, here, effort — that was put into a decision, people tend to stick with it, even if, from a rational point of view, an alternative is clearly cheaper. Studies have shown this effect to be even stronger when you were the one responsible for the original decision.

I personally wonder if this might be especially relevant for consensus group decisions, where the emotional barrier to questioning a former decision might be even higher.
What can you do against this effect?

  • Actively ignore the previous emotional, monetary or time-wise investment you made
  • If you feel you’re still attached to that investment, pay special attention to what your critics are saying. They might see the situation more clearly.
  • Help to create an environment where reverting a decision or stopping a project is not seen as a personal failure. Environments that blame failure raise the likelihood of sticking to a bad decision.

The Bandwagon Effect: Sir yes sir!

The bandwagon effect describes how behaviours or beliefs spread among people, as fads and trends clearly do, with the probability of any individual adopting them increasing with the proportion of people who have already done so.

The most common area where I have seen this is when evaluating third party software. I’ve actually walked into meetings where I thought everyone had decided not to purchase a given tool, and then walked out planning to sign a contract. Once one person who is well respected outlines their opinion, it’s appealing for everyone else to hop on board.
What can you do against this effect?

  • Things like blind voting can work well in the right situation. If you have a big decision to make (like spending six figures on a third party tool), it makes sense to have all of the stakeholders write out their opinions in private first, without having their position influenced by other (potentially more senior) people.

Hyperbolic Discounting: I want it right NOW!

Given two similar rewards, humans show a preference for the one that arrives sooner rather than later. Humans are said to discount the value of the later reward by a factor that increases with the length of the delay.

Stated simply, this means that something less valuable today is more appealing than something more valuable that will come in the future. This is frequently how technical debt is incurred. Poor design decisions and shortcuts are sexy because they give you a small amount of value right now (not having to do the work to architect things properly), and you dramatically discount the value you would get in the future by doing it right the first time. It is very difficult for humans to project the long term cost of shoddy code in terms of time we will spend debugging, refactoring, and interpreting it.
What can you do against this effect?

The way around this is to deliberately stop and do an estimation exercise:

  • think about how long the refactor will take, and be extremely generous (e.g. double your first estimate)
  • think about how many people will be working on the code, and how often
  • figure out how long it would take to “pay off” the time you spend refactoring (a rough sketch follows below).
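
As a quick back-of-the-envelope sketch (all numbers below are made up for illustration), the “pay off” point is simply the generous refactoring estimate divided by the time the team saves each week afterwards:

```python
# Hypothetical estimation exercise for a refactor, with invented numbers.
first_estimate_hours = 20
refactor_cost_hours = 2 * first_estimate_hours   # be extremely generous: double it

developers_touching_code = 3                     # how many people work in this area
hours_saved_per_dev_per_week = 1.5               # time no longer lost to debugging and deciphering

weekly_savings = developers_touching_code * hours_saved_per_dev_per_week
break_even_weeks = refactor_cost_hours / weekly_savings

print(f"The refactor pays for itself after about {break_even_weeks:.1f} weeks")
# prints: The refactor pays for itself after about 8.9 weeks
```

If the break-even point lands comfortably within the expected lifetime of the code, the “more valuable later” option wins, however unappealing it feels today.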

Part II (to be continued)


Myroslava Zelenska

Geek project manager with nonstandard thinking. Passionate about neurology, intellect, the mind and everything about ‘how-this-damned-brain-works’.