How to get started with user research

Kira Chung
Published in VMware 360
8 min read · Dec 9, 2021

Co-author: Jen Mead

What is user research, and why do it?

User research forms a crucial part of the design process. In order to design a truly meaningful and relevant user experience, we need to first understand our users, their motivations, and their goals.

First, let’s demystify user research.

Simply put, research is all about solving for what you currently don’t know.

Think of it as an airport’s air traffic control tower. The tower knows what types of planes are on the runways, when they need to take off, where they’re coming from, and more. It carefully coordinates with pilots (your designers) so that they have everything they need to get their passengers where they’re going. Without research, your team is at risk of tackling the wrong problem, or of flying blind by basing solutions on assumptions alone.

All your work and time may be for naught if you build something that nobody ends up using.

There are three main reasons for doing user research:

  1. To align product and business strategy with the real needs and goals of users
  2. To set the foundation for initial design work and validate design direction
  3. To evaluate designs and address biases and assumptions

Research should also not be siloed.

Involve everyone who has interest and investment in the research goals, including product managers, designers, engineers, and more. Communicate the value of user research to internal stakeholders by creating an actionable but flexible research plan that demonstrates all that can be achieved with your research. We believe that everyone wins with research, so the more we invest in it, the more successful our users will be as they use our products.

Include your stakeholders in every step of the process, including kickoffs, planning, sessions, and presentations of key results. We’ve found this builds relationships with other members of the product team and increases the usage and visibility of research findings.

Research Process

It is important to plan a user research study based on your objectives so that relevant data can be collected using the appropriate research method at the right time.

It’s best to start your initiative by putting together a research plan. Outline your project from start to finish and identify its focus, goals, stakeholders, and method of execution. Spend time upfront to understand the various methodologies and best practices. In addition, think about how to streamline the process by creating templates and to-do lists for each method to help speed up cycles. Be prepared for questions like “What methodology is best?”, “How do we test this?”, or “How long will it take to conduct a test?”

Your process can be organized into four stages:

1. Discovery

  • What questions are you trying to answer?
  • What are the knowledge gaps you need to fill?
  • What are your assumptions?

2. Define / Plan

  • What do you want to come out of the study with?
  • Who should you recruit?
  • What research method is most conducive to the data you need to answer your questions?

3. Conduct the study

4. Analyze the results

  • Follow up on and answer your research questions
  • Validate or disprove any hypotheses
  • Transform insights into ideas, recommendations, or improvements

To be successful, choose the research method most relevant to the type of problem you’re looking to solve. Research falls into two primary groups — qualitative and quantitative. Either can fit into any part of the design process, but typically qualitative measures are used to discover users’ needs and goals, and quantitative measures are then used to help test your designs.

Common Methodologies

Because there are so many different types of research methods, we wanted to highlight some of the most common methods we’ve used and how they’ve helped us gather the insights we needed.

In addition, we’ll share some of what we’ve learned conducting these methods, framed as do’s and don’ts.

Surveys

A survey is an effective way of gathering information on a large scale from a targeted group of users. However, designing a survey isn’t simple. You need to think about the structure, number of questions, flow, and many other considerations. Therefore, it is essential to define the goals and objectives of the survey and stay on topic to generate meaningful data and maximize completion rates.

To create an effective and engaging survey, structure the questionnaire by grouping and ordering questions in a logical manner. Craft questions that are short and clear, and include the right mix of closed and open-ended questions. A great way to mix them is to follow a closed question with an open-ended one.

For instance, we recently conducted a survey to understand aspects of working remotely. One question asked participants to identify the challenges they face while working remotely: we provided several choices and asked them to rank the options from most difficult to easiest, followed by an open-ended question about any other challenges. That way, we were not only able to analyze the results easily and quantify the responses, but also to gather more context behind them.

Before sending out the survey to the actual audience, remember to test it with pilot users. This helps to make sure that your questions are clearly worded and unambiguous, and validates the estimated time for completion if you’ve included this info in the survey intro. If everything looks good from the pilot test, send out the survey and start watching the data pour in.

A survey can also be paired with another research method to help quantify results. For example, a usability test may be followed with a survey.

Survey Do’s and Don’ts

  • Do: Start with easy questions, gradually moving to complex ones.
  • Do: Ask only a single question per response. For example, instead of asking “How was your experience with features X and Y?”, split it into two questions.
  • Do: Pilot your survey by having others review it. They may see things that you’ve missed.
  • Do: Explain how participation in the survey will improve the product or how results will be used. Give participants a sense of how their time will help.
  • Do: Use incentives if you have them available. These may help increase your participation rate or increase the willingness for participants to take longer surveys.
  • Don’t: Make the survey too long. Think about how long it would take you to complete the survey yourself, and aim to keep it around 5 to 10 minutes. Longer surveys can result in higher drop-off rates.
  • Don’t: Use absolutes in your questions, such as “always”, “never”, “every.”

Usability Tests

In a usability test, participants are asked to perform a set of tasks using a product or design while facilitators observe their behavior. We’ve used this method to discover how customers use our products and to validate (or invalidate) our design assumptions.

Recently, a few designers on our team raised concerns about a potentially problematic workflow in one of our products and wanted to propose a different solution. We used moderated usability tests to assess the validity of their claims.

The designers created a prototype of what they believed was a more intuitive design and shared it with the research team to compare against the current one. We tested their assumptions by running sessions on both the current design (in the live product) and the proposed design (in a Figma prototype). We asked users to perform a common set of tasks (each participant tested a single design), and we collected observations on how they tackled each task.

Through the usability study we learned that new users were able to successfully complete more tasks in the proposed design than in the current one. They made fewer mistakes, and they felt more confident about how they completed the tasks.

We consolidated these insights and presented them to the product team and kicked off discussions on how to improve the workflow.

Usability Testing Do’s and Don’ts

  • Do: Invite your stakeholders to observe the study including designers, product managers, engineers, etc. Sometimes, seeing is believing.
  • Do: Run a pilot test to work out any issues. Practice makes perfect, and if all goes well, you can use it as a real session.
  • Do: Have a notetaker to jot down interesting insights. This will help you focus on facilitating the session, and the info will be useful during the analysis phase.
  • Do: Make the participants feel comfortable. It can be intimidating to be a participant even if you explain that it’s the design you’re testing, not them. You can begin with small talk if you like.
  • Do: Use gentle reminders to ask participants to think aloud. They may forget to do this throughout your session.
  • Do: Be okay with an awkward silence. Be patient, and don’t immediately jump in to help. Let them explore as if you weren’t there.
  • Don’t: Bias users by asking leading questions. For example, rather than asking “How much did you enjoy using feature X?”, try “Tell me about your experience using feature X.” Stay neutral and be conscious of your tone.
  • Don’t: Jump to conclusions or immediately think of solutions.
  • Don’t: Stare. It can be nerve-racking.

Interviews

Interviews give a facilitator direct, one-on-one insights from a user or stakeholder. They can be used to draw connections or comparisons between the experiences of multiple participants, and/or to discover the attitudes of a persona.

For example, in one of our studies, we conducted several interviews to understand the motivations and workflows of a particular retail persona. Our goal was to discover common workflows where our app could present useful quick actions at the moment they’re needed, in order to streamline users’ physical workflows.

We used this research to provide insights on what was top of mind for our users throughout their workday and where the team needed to focus our attention.

Interview Do’s and Don’ts

  • Do: Communicate the goals of the interview at the start. You don’t need to spill all your beans, but some general context will give your participant some insight into what kind of information you are looking for.
  • Do: Use a general set of questions across users. You want to be able to compare apples to apples. You can ask follow-up questions when applicable.
  • Don’t: Bias users by asking leading questions.
  • Don’t: Use hypothetical scenarios. What they say they might do might not be as trustworthy as what they actually do.
  • Don’t: Read directly from your script. Use your script as a guide but conduct your interview as more of a conversation. Follow up and dig deeper where the conversation flows.

This may go without saying, but the key to incorporating more research into your design process is simply to get started. Start somewhere, even if it’s a quick win or low-hanging fruit, and build on your experience as a researcher. It may take some time at the beginning, but know that it gets better each time. You’ll gain confidence and focus, and continue to positively impact your product team.

Thanks to the team who helped us put together this piece. Special thanks to Lakshmi for the inputs.
