Bias in UX research

These are my personal notes from the Google UX professional certificate.

Genís Frigola
15 min read · Jul 27, 2021

A bias is a tendency, inclination, or prejudice toward or against something or someone. Biases are often based on stereotypes, rather than actual knowledge of an individual or circumstance. Such cognitive shortcuts can result in prejudgments that lead to rash decisions or discriminatory practices. It’s like making up your mind about someone before you’ve really gotten to know them.

A bias is favouring or having prejudice against something based on limited information.

Biases can seriously impact your user research and negatively influence the design of your final product. We all have biases, and they’re often unconscious. While we can’t completely get rid of biases, we can be more aware of them and work to overcome them. In UX design, this is critical to product success and to your professional development.

We’ll examine six types of bias in UX research:

  • Confirmation bias
  • False consensus bias
  • Primacy bias
  • Recency bias
  • Implicit bias
  • Sunk cost fallacy

Confirmation Bias

This bias occurs when you start looking for evidence to prove a hypothesis you have. Because you think you already have the answer, you’re drawn to information that confirms your beliefs and preconceptions.

Let’s say you have the preconception that left-handed people are more creative than right-handed people. As you research, you’ll tend to gravitate toward evidence that supports this belief, and you’ll use it to build your case, even though it’s not necessarily true.

One of the most effective methods for overcoming confirmation bias during research is to ask open-ended questions when conducting interviews. An open-ended question lets the person being interviewed answer freely, instead of with a yes or no.

You also want to get into the habit of actively listening without adding your own opinions. That means you aren’t leading your interviewees toward the answer that you want them to give.

Another way to avoid confirmation bias is to include a large sample of users. Make sure you’re not just looking for a small group of people who fit your preconceived ideas. You want to have a big sample of users with diverse perspectives.

False consensus bias

The false consensus bias is the assumption that others will think the same way as you do. In UX research, the false consensus bias happens when we overestimate the number of people who will agree with our idea or design, which creates a false consensus. False consensus can go so far that you assume anyone who doesn’t agree with you is abnormal.

You can avoid false consensus bias by identifying and articulating your assumptions. For example, you might live in a community that often identifies with certain political beliefs. When you meet a new person, you might assume they share your political beliefs, because you both live in the same town. But that isn’t necessarily true. Finding a few people who do align with your beliefs and assuming they represent the entire community is a false consensus. That’s another reason to survey large groups of people.

Recency bias

Recency bias occurs when it’s easiest to remember the last thing you heard in an interview, conversation, or similar setting, because it’s the most recent. When talking to someone, you’re more likely to remember things they shared at the end of the conversation.

To overcome the recency bias, you can take detailed notes or recordings for each interview or conversation you have. This way, you can review what people said at the start of the conversation in case you don’t remember.

Primacy bias

Primacy bias occurs when you remember the first participant most strongly. Sometimes the first person you meet makes the strongest impression, because you’re in a new situation or having a new experience.

The primacy bias, like the recency bias, is another reason to take detailed notes or recordings, so you can review everything that happened, not just the memorable first impressions.

Recency and primacy biases also demonstrate why you should interview each participant in the same way. Consistency makes it easier to compare and contrast interviews over time, and it makes it more likely that you’ll remember the unusual and important moments that happen throughout your research.

Implicit bias

Implicit bias is a collection of attitudes and stereotypes we associate with people without our conscious knowledge. Implicit bias is also known as unconscious bias.

One of the most common forms of implicit bias in UX is when we only interview people within a limited set of identity profiles, such as race, age, gender, socioeconomic status, and ability.

These profiles are generally based on assumptions we have about certain types of people. For example, implicit bias might cause you to feel uncomfortable interviewing people whose life experiences are different from your own.

On the other hand, we might choose to interview people from typically excluded groups, but then ask potentially offensive questions because of our internalized stereotypes.

Both of these scenarios are problematic and lead to a lack of representation in our research and design process. The most important thing to note about implicit biases is that everybody has them.

To overcome our biases, we can reflect on our behaviours, and we can ask others to point out our implicit biases. That’s one of the best ways we can become aware of our biases.

Sunk cost fallacy

Sunk cost fallacy is the idea that the deeper we get into a project we’ve invested in, the harder it is to change course without feeling like we’ve failed or wasted time.

The phrase “sunk cost” refers to the time we’ve already spent or sunk into a project or activity. For example, you might think to yourself, I might as well keep watching this terrible movie because I’ve watched an hour of it already.

For UX designers, the sunk cost fallacy comes into play when working on a design. You might have invested hours into designing a new feature, but then learned that the feature doesn’t really address a user problem. It’s easy to keep working on a design that you’ve invested time into. But ultimately, you need to focus on work that positively impacts users.

To avoid the sunk cost fallacy, break down your project into smaller phases, and then outline designated points where you can decide whether to continue or stop. This allows you to go back based on new insights before the project gets too far along.

Now that you know about these biases, you might even start noticing them in your daily life. The more that identifying bias becomes a habit, the better you’ll get at avoiding bias in your design process.

Preventing bias in data collection

It’s important to note that everyone has biases. It’s just a natural part of being human. Being able to recognize your own biases and prevent them from affecting your work is what really matters. As a UX designer, you’ll need to know how to anticipate, identify, and overcome biases in your research, in particular.

Choose your words carefully. While conducting research, it’s important to use words that don’t lead the user in one direction or another. Of course, as a designer, you’re going to be partial to the designs you’ve created, and you’ll likely assume that users will appreciate them too. That’s why you designed them! But when asking users questions about their experience using your product, you don’t want them to answer in a particular way just to please you. Choosing leading words can cause the framing effect, where users make a decision or choice based on the way information was presented to them.

This is especially critical in usability studies. For example, imagine a participant is testing your designs. You ask the participant: “Do you like or dislike the improved layout of these buttons?” Because you used the word “improved,” the user will most likely reply positively. But, this isn’t very useful feedback because you framed the question in a way that led the participant to respond accordingly. To improve your product, you need honest feedback.

Instead, a better way to frame the same question is: “Explain how you feel about the layout of the buttons.” This phrasing allows the user to come to their own conclusions without any outside influence, which will give you better data about their thought process and experience.

Foster independent thinking. Group interviews can be affected by the bandwagon effect, or going along with the group’s opinion instead of thinking creatively, which can discourage open discussion by people who have an opinion that doesn’t align with the majority of the group.

For example, imagine you’re conducting research with a group of five participants. You ask each person in the group to share their thoughts one at a time about a particular product design choice, like the placement of a button on the home page. By the time the last person shares their thoughts, their feedback will be affected by all of the answers that were shared before them. To combat the bandwagon effect, ask participants to write down or record their thoughts before discussing as a group.

Avoid specific language. It’s important to be mindful about the types of questions you ask users and how those questions are framed. You’ll need to be careful to avoid confirmation bias, which is trying to find evidence to prove a hypothesis you already have.

Confirmation bias is particularly prevalent in online surveys. For example, imagine that you’re conducting an online survey with a large group of participants. One of your survey questions is: “How do you use our product?” As the designer, you have a few ideas about how you think people use your product, so you provide four options with specifically worded language that the participant has to choose from. If none of the options you’ve provided apply to the user, they can’t select “other” or skip the question, so they’ll be forced to choose one of the multiple-choice answers that doesn’t match their actual experience. That means you’ll end up with false information that skews your research data and potentially provides incorrect evidence for a hypothesis you already had.

Remember, in a survey, you want measurable results, known as quantitative data. You can reframe the question in your survey to ask participants to rate their experiences using the product, which is a more accurate way to gauge how they felt about using it.
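
For example, instead of a forced multiple-choice question, you could ask participants to rate a statement such as “The product is easy to use” on a 1-to-5 scale and then summarize the responses. Here is a minimal Python sketch of that summarizing step; the statement wording and the ratings are hypothetical, not from the certificate material:

    from collections import Counter

    # Hypothetical 1-5 ratings for the statement "The product is easy to use".
    ratings = [5, 4, 2, 4, 3, 5, 1, 4, 4, 3]

    average = sum(ratings) / len(ratings)   # central tendency of the responses
    distribution = Counter(ratings)         # how many participants chose each rating

    print(f"Average rating: {average:.1f}")
    print("Distribution:", dict(sorted(distribution.items())))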

Limit the guidance you give users. Everyone learns and thinks in different ways. When you’re conducting any type of UX research, you have to be cautious to avoid experiencing any false consensus, which is the assumption that others will think the same way as you do.

If you’re conducting a usability study, some of the participants will not follow the product’s user flow in the way that you might expect. For example, a user might click through the menu, select a folder, and then select a subfolder to complete a task you assigned them, when there’s actually a simple hyperlink on the homepage that could have saved them time. In addition, some participants may use assistive technology to navigate the product and might follow an entirely different flow.

It’s important to let participants follow their own paths through your product, without interrupting them. Interrupting a participant while they’re experiencing your product will deprive you of useful data that can help you understand how to improve your designs. Instead, ask participants to narrate or break down their user journey with your product, as they move through the flow. This will allow you to better understand their thought process as they navigate through your designs.

Consider users’ tone and body language. You’ll work with many different users and participants throughout your UX career, and part of your job will involve interpreting their nonverbal cues, like vocal tone and body language. To avoid experiencing implicit biases, which are based on the collection of attitudes and stereotypes you associate with people without your conscious knowledge, it’s important to clarify when you think you’re getting mixed signals from a participant.

For example, imagine you’re conducting a one-on-one interview, and the participant has their arms crossed over their chest. This can be interpreted as a sign of feeling defensive or insecure, which might contradict the positive feedback they are sharing verbally about your product. This is a great time to ask the participant questions, like “Is any of this making you uncomfortable?”, which can encourage them to explain that it’s cold in your office and they’re just trying to warm themselves up. Always ask questions if you’re unsure about the intention of a user’s tone or body language!

For this feedback process to work, however, it’s important to make sure participants are comfortable sharing their thoughts with you. Before the research begins, ask participants about themselves or make light conversation. Starting with easier questions can help reduce anxiety or awkwardness throughout the study.

Be careful of your own body language and reactions. You also have to be mindful of your own tone and body language while interacting with participants. Social desirability bias can happen when a participant answers a question based on what they think you want to hear. If you ask a participant a question, and they notice a visual or audible cue that suggests your own opinion about the question, they might answer in a way that they think will please you.

For example, imagine you’re describing a feature of the app you’ve designed that really excites you, and your tone of voice changes. If this happens, it’s likely that the participant won’t be honest about their negative opinions of the feature, since you’re so positive about it. If you want the data you collect to be useful, the user has to feel comfortable sharing their true, unfiltered feelings about the product. It’s your job to guide them through the process without accidentally influencing their answers. One way to do this is to reassure participants that their answers won’t hurt anyone’s feelings and that you really want to hear their honest opinions in order to improve your work.

Plan your research effectively. Tight deadlines are inevitable. But as a UX designer, it’s essential you get enough time to recruit the right users for your research. Availability bias occurs when you rush the user recruitment process or skip screener questions to attract a bigger pool of users, even if they don’t fit the qualifications or characteristics that you’ve already determined are present in your ideal user.

The research that you collect is vital to your product design process. So interviewing users that don’t fall under your personas won’t give you the data you need to improve your designs. If you’re having trouble recruiting the right users before your deadline, offer a better incentive for participating in your study, adjust your recruitment strategy, or ask your project manager for more time. Don’t just take any user who’s available.

Remain open minded. One more tip: When you’re conducting research, you have to work hard to treat all information equally to avoid both primacy bias, which is remembering the first user more than others, and recency bias, which is most easily remembering the last thing you heard. To help combat these biases in your own research, it’s helpful to space out the scheduling of interviews, ask your colleagues to join you during interviews to provide additional opinions, and take careful notes.

Combating bias as a UX designer

Although having biases is normal, it’s essential to try to eliminate bias from your research process to get the most accurate understanding of your users’ needs. Knowing the types of biases that exist and how you can avoid them will help you recognize when it’s happening, so you’re already off to a great start!

If you’d like to learn more about biases in UX research, check out this article on overcoming cognitive bias in user research from Design at NPR.

One tool that can help you identify and explore your own implicit biases is the Implicit Association Test (IAT), created by researchers at Harvard University. The IAT is not intended to be an exhaustive assessment of bias, but it can provide valuable insights that can help you better serve all users.

Take a Test (harvard.edu)

To go deeper into the topic

Source: Types of User Research Bias and How to Avoid It in Your UX Design (playbookux.com)

Other types of bias in UX research are:

  • CULTURE BIAS: This happens when a researcher interprets results based on their own cultural beliefs or attitudes rather than from a neutral point of view. It can also involve a moderator making involuntary suggestions to a participant which affects the way a participant answers.
  • SOCIAL DESIRABILITY BIAS: This type of bias occurs when participants answer what they think a moderator or business wants to hear. This bias could involve a participant giving similar answers to questions that seem alike. Similarly, they could also use their opinion of your brand to answer questions in a positive rather than a neutral, objective way.
  • THE HAWTHORNE EFFECT: This happens when participants are very aware that they are being observed. So, they focus more on what they are doing and try harder to solve any issue. But, this is not always what a real user would do.
  • AVAILABILITY BIAS: This type of bias happens when a researcher lowers the recruitment filters or avoids using screener questions so that they can source the required number of participants in a short amount of time. Similarly, stakeholders or sponsor companies could choose participants who they favour as they are more proactive. Both these instances could reduce the likelihood of getting objective insights.
  • WORDING BIAS: Also known as the framing effect, this form of bias takes place when a researcher frames a question in a certain way that suggests an answer.

Six tips to avoid user research bias in your UX design

  • Note down your assumptions before you begin the study. When conducting user research, be aware of any general and specific assumptions you have concerning a project. Use an assumptions map to list these out, together with input from the rest of your team, to avoid user research bias.
  • Choose participants who are representative of your target audience. It’s not possible to recruit the same number of participants for each usability study. Rather than focusing on numbers, determine how many participants you need based on the number of target personas you have developed for your brand. That way, the insights you collect apply to your entire target audience, which helps you avoid user research bias.
  • Banish user research bias by learning how to structure and write a user test script. When sourcing a user’s opinions, intentions and preferences about your product or service, your brand should ask open-ended questions that don’t present answers based only on your assumptions. In the same way, you can also present a task as a goal or a scenario so that you can learn more about how users interact with your website or app.

Users should never be pushed into confirming a specific outcome or problem. If you do this, you might not uncover any other issues. When writing your script, your wording should also be clear, neutral and straightforward. In this way, you can probe further into the mind of the user with follow up questions so that you understand what is important to them.

  • Collect a mixture of quantitative and qualitative metrics. Using quantitative metrics forces you to look objectively at insights gained from a study. Couple metrics like time-on-task and the System Usability Scale (SUS) with qualitative data such as sentiment analysis (see the SUS scoring sketch after this list). You can also use annotations and timestamps when watching your videos to cluster similar issues or problems together.
  • Gain additional perspectives from competitor research and other team members. Conduct a competitor analysis in addition to moderated or unmoderated studies of your own product or of similar brands. Discover what participants like or dislike about your competitors so that you can use this data as another point of reference for improving your user experience. Consider the perspectives of your team members by sharing any videos or insights from studies with them.
  • Avoid user research bias by listening carefully and watching your own body language. When moderating an interview, reduce user research bias by letting your participants talk more than you do. If you want to clarify something they’ve said, ask a follow-up question like “Why do you think so?” Learn to mask your emotional reactions so that participants are encouraged to reveal their true feelings about a product or service.
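
As a concrete illustration of one of those quantitative metrics, here is a minimal Python sketch of standard SUS scoring; the example and its data are my own, not part of the source material. Each participant rates the ten SUS statements from 1 to 5, odd-numbered items contribute their score minus 1, even-numbered items contribute 5 minus their score, and the total is multiplied by 2.5 to give a score from 0 to 100.

    # Minimal System Usability Scale (SUS) scoring sketch.
    # Assumes each participant answered the 10 standard SUS items on a 1-5 scale.
    def sus_score(responses):
        if len(responses) != 10:
            raise ValueError("SUS expects exactly 10 responses")
        total = 0
        for i, score in enumerate(responses, start=1):
            if i % 2 == 1:
                total += score - 1   # odd items are positively worded
            else:
                total += 5 - score   # even items are negatively worded
        return total * 2.5           # final score on a 0-100 scale

    # Hypothetical responses from three participants.
    participants = [
        [4, 2, 5, 1, 4, 2, 5, 2, 4, 1],
        [3, 3, 4, 2, 3, 3, 4, 3, 3, 2],
        [5, 1, 5, 1, 5, 1, 5, 1, 5, 1],
    ]
    scores = [sus_score(r) for r in participants]
    print(scores)                        # individual SUS scores
    print(sum(scores) / len(scores))     # average SUS score across participants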

Your brand will never be able to remove every trace of bias from your UX research. However, being aware that perceptions, attitudes and culture can affect the results of a study will help you look at the data more objectively. All team members should be involved in the UX research process so that together you can gather more accurate data for your product or service.
