How to Overcome Confirmation Bias in Technology Development

Jillian Abel
TMI Consulting, Inc.
Mar 29, 2019

Confirmation bias is the human tendency to seek out and interpret information in ways that confirm our existing beliefs.

These biases not only prevent us from fully observing, engaging with, and understanding our users’ needs; ultimately, they can lead to the creation of biased work products. We now know that confirmation bias affects technology development, and when we allow our biases to distort what we think we know, we take a risk. Not only might we fail to appeal to an entire group of people; in some cases, we can cause those populations serious harm.

Acknowledging our confirmation bias allows us to move beyond our interpretation of our users’ needs and shine a light on their actual needs and concerns. When we take the time to develop work products that consider the diversity of human expression, we create the possibility of building better and more inclusive products. These efforts don’t go unnoticed.

You’ve probably heard that inclusive design practices can expand your audience and increase retention rates. So the question becomes: how do we acknowledge confirmation bias and prevent it from negatively impacting our work?

Keeping confirmation bias out of your process doesn’t happen overnight; it requires daily practice. Awareness, however, is the first step. There are three key steps you can take to ensure that confirmation bias is not affecting your work.

Step 1: Check in with Yourself

  • What was the original goal?
  • What assumptions have you made?
  • Is this reflective of the research?
  • How are you picturing people interacting with your design? And why?
  • Who is that person, and is anyone missing from this picture?
  • What and who is the “norm”? And why?

Step 2: Reevaluate

  • Does the “norm” actually represent the population?
  • Have you considered people who are differently abled?
  • Could this be offensive or dangerous?
  • Could this be used to target someone? Are the odds really that small?
  • Is the imagery creating unintended exclusion?
  • Are you making a deliberate effort to notice individual differences?
  • Are you seeking opinions from people different from yourself?

Step 3: Reset

  • Who else can you discuss your findings with?
  • Can you test your hypothesis with another demographic before you launch?
  • Can you increase safety?
  • Do you need to consider expanding your demographic?
  • Who else can help you make sure that you don’t run into this problem?
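One concrete way to act on the testing question above is to break your evaluation metrics down by demographic group before launch, rather than reporting a single aggregate number. The sketch below is illustrative, not from the article: the group labels and records are made up, but the pattern (per-group accuracy instead of overall accuracy) is a standard way to surface a gap.

```python
from collections import defaultdict

def per_group_accuracy(records):
    """Compute accuracy separately for each demographic group.

    `records` is a list of (group, predicted, actual) tuples;
    the group labels here are purely illustrative.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical evaluation records: a single aggregate accuracy (75%)
# would hide the gap between the two groups.
records = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 0, 0), ("group_b", 1, 0), ("group_b", 0, 0),
]
print(per_group_accuracy(records))  # group_a scores far higher than group_b
```

If any group’s score lags well behind the aggregate, that is a signal to reset before you ship, not after.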

This process, however, should not be tacked onto the end of a project; it should be incorporated early and often throughout the design process. And there is more to inclusive and ethical design than accommodating “a few different perspectives”: inclusive design is about making technology convenient, accessible, sustainable, and useful to multiple populations.

In recent years, we’ve seen that creating technology without this perspective can be dangerous. For instance, in a 2019 study from the Georgia Institute of Technology, researchers found that self-driving cars are more likely to hit pedestrians with darker skin. Today, there are many cases of similar failures to recognize human diversity. Take, for instance, the facial recognition software used by law enforcement. Police officers can use this technology to apprehend individuals with warrants out for their arrest. But according to an FBI co-authored study, cited in The Perpetual Line-Up from the Georgetown Law Center on Privacy & Technology, many of these police facial recognition systems don’t work well on darker skin tones, which can lead to wrongful arrests. Across the board, facial recognition software has a much lower success rate with people of color, often because the test population data these programs are trained on is overwhelmingly white. In fact, a 2014 study analyzing the diversity of one popular open source dataset, ‘Labeled Faces in the Wild’, estimated the dataset to be 77.5% male and 83.5% white.
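Auditing a dataset’s composition, as that 2014 study did, can start with something as simple as tallying attribute proportions over the labels. A minimal sketch, using made-up per-image annotations rather than the actual Labeled Faces in the Wild metadata:

```python
from collections import Counter

def attribute_shares(annotations, attribute):
    """Return each value's share of the dataset for one annotated attribute."""
    counts = Counter(example[attribute] for example in annotations)
    n = len(annotations)
    return {value: count / n for value, count in counts.items()}

# Hypothetical per-image annotations standing in for real dataset metadata.
annotations = [
    {"gender": "male", "skin_tone": "light"},
    {"gender": "male", "skin_tone": "light"},
    {"gender": "male", "skin_tone": "dark"},
    {"gender": "female", "skin_tone": "light"},
]
print(attribute_shares(annotations, "gender"))     # {'male': 0.75, 'female': 0.25}
print(attribute_shares(annotations, "skin_tone"))  # {'light': 0.75, 'dark': 0.25}
```

A skewed tally like this is exactly the kind of early warning that the “reevaluate” questions above are meant to surface.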


A quick online search turns up many more examples of similar “diversity fails.” To address the effects of confirmation bias at a systemic level, there are things we can do not only to combat our own confirmation bias, but to support a culture of inclusive technology development.

1. Move beyond compliance, and work to implement inclusive design training standards.

2. Use diverse training datasets and, where relevant, work to add new data to improve your dataset.
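When collecting new data isn’t immediately possible, one standard technique for point 2 (not a prescription from this article) is to oversample underrepresented groups so each group contributes equally during training. A hedged sketch, with illustrative group labels and data:

```python
import random

def oversample_to_balance(examples_by_group, seed=0):
    """Resample each group (with replacement) up to the largest group's size."""
    rng = random.Random(seed)
    target = max(len(examples) for examples in examples_by_group.values())
    balanced = []
    for group, examples in examples_by_group.items():
        balanced.extend(examples)
        # Draw extra samples with replacement until the group reaches the target.
        balanced.extend(rng.choice(examples) for _ in range(target - len(examples)))
    return balanced

# Hypothetical imbalanced dataset: 4 examples of one group, 1 of another.
data = {"group_a": ["a1", "a2", "a3", "a4"], "group_b": ["b1"]}
balanced = oversample_to_balance(data)
print(len(balanced))  # 8 — both groups now contribute 4 examples each
```

Oversampling duplicates existing examples rather than adding genuinely new ones, so it is a stopgap; collecting more representative data remains the better fix.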


3. Start with accessibility as the norm, instead of a feature.


4. Always ask why.

Make sure to check in with your intentions, and refer back to the three steps above.

Additional Resources:

https://erasinginstitutionalbias.com/

https://www.smashingmagazine.com/2018/03/ethical-design-practical-getting-started-guide/

https://uxdesign.cc/what-my-color-blindness-taught-me-about-design-d3009a93ff9c

https://blogs.adobe.com/creative/5-things-need-know-diversity-inclusion-design/

https://uxdesign.cc/how-to-look-at-evidence-and-not-translate-it-into-your-own-agenda-9860171b7ba9

https://uxdesign.cc/designing-forms-for-gender-diversity-and-inclusion-d8194cf1f51
