4 Steps to Get the Best Results from User Testing

Practical guidelines based on countless user testing sessions

Daria Michalska
Netguru
9 min read · Aug 9, 2019


Photo by Thom Holmes on Unsplash

Products need to serve a purpose. Everything we do as designers, developers, managers, and founders should have clear objectives and aim to be useful and valuable to users and customers. It’s fair to say that one of the main goals of product design is to ensure the right product-market fit. And one of the best ways to verify a product’s usefulness is to conduct user testing sessions.

User testing is exactly what the name suggests: checking how end users interact with a given product or service. What’s really great about user testing is that the product itself doesn’t have to be live (though of course it can be). Instead of spending valuable resources on development without knowing whether end users will receive the product well, you can build a prototype and run user testing on that. This approach lets you validate the UX of the key scenarios and use cases with the product’s most important user groups before the actual product hits the market, potentially saving a lot of money and effort. Even if testers end up confused and dissatisfied, you’ll have all the information you need about what still has to be worked on and improved. Testing a product that’s already live has similar benefits and can set you on the right path to refining it. However, it requires careful planning and the right testing methods to make sure you get constructive and unbiased feedback.

The 4 main phases of User Testing

Basically, user testing can be divided into 4 main phases: Preparations, Recruiting Users, Conducting Sessions, and, finally, Data Analysis. In this article, we’ll dive into the details of each phase, so you can follow the whole process step by step and ensure you’re conducting the research like a real pro.

1. Preparations

First things first. No matter how eager you are to jump into showing the prototype to your users, it’s best to cover the basics and create an environment that will support the whole user testing process:

1. Create a document or a spreadsheet where you’ll keep all your prototype links, testing scenarios, session schedule, and so on. This will come in handy in the later phases.

2. If you want to validate and get user feedback for different options, prepare different sets of prototypes you’d like to test.

3. Give each prototype a unique name according to a pattern of your choice so that you don’t get lost when trying to find out which version should be shown to which user.

4. Prepare user testing tasks and scenarios, then choose the right user personas for them. Think about what exactly you want to test. It’s good to start your research by testing the most important functionalities of the product. Keep in mind that giving users certain tasks already makes them aware that those actions are possible in the first place. For instance, giving testers a task to add an item to the basket will make them look for an “add to basket” button. But would they notice such functionality without you providing them with a related task?

5. Choose testing-session software that suits your needs. Here are some recommendations that let you record both the users themselves and their interactions with the prototype:

Remote testing:

GoToMeeting, Skype, Lookback, UserTesting, Google Hangouts (Enterprise plan)

Onsite testing:

QuickTime, Lookback, any camera on a tripod (it can be your phone)

An example of a preparations document
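To make this concrete, here is a minimal sketch of what such a preparations tracker could look like if you kept it as a simple CSV file. All of the prototype names, links, scenarios, and personas below are illustrative placeholders, not part of the original checklist.

```python
# A minimal sketch of a preparations tracker kept as a CSV file.
# Every name, link, and field value below is an illustrative placeholder.
import csv

prototypes = [
    {
        "prototype_id": "checkout-v1",  # unique name following your chosen pattern
        "link": "https://example.com/prototypes/checkout-v1",
        "scenario": "Add an item to the basket and complete the checkout",
        "persona": "Returning customer",
    },
    {
        "prototype_id": "checkout-v2",
        "link": "https://example.com/prototypes/checkout-v2",
        "scenario": "Add an item to the basket and complete the checkout",
        "persona": "First-time visitor",
    },
]

with open("user_testing_prep.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(prototypes[0].keys()))
    writer.writeheader()
    writer.writerows(prototypes)
```

A spreadsheet works just as well; the point is simply to keep prototype versions, links, scenarios, and target personas in one place you can refer back to during scheduling and analysis.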

2. Recruiting Users

After you’ve got all the preparations covered, it is time to decide whether you’d like to proceed with moderated or unmoderated user testing. In the moderated approach, the researcher can communicate and interact with the testers, while unmoderated tests rely on users trying to complete the tasks they are given without any external influence. Both approaches have their pros and cons. Moderated user testing provides more control and can lead to the discovery of new insights from the users, but is more costly and prone to bias. On the other hand, the unmoderated approach allows for cheaper and easier tester recruitment, but it doesn’t give you any chance to ask valuable follow-up questions.

Setting up unmoderated sessions is relatively easy, as a third-party service will do all the hard work for you. In that case, you can give UserTesting, TryMyUI, or PingPong a go. However, when opting for moderated user testing, you might run into unexpected issues, such as difficulty finding the right testers or scheduling meetings across severe time-zone differences. Here are some tips on how to make the setup process a breeze.

1. The general rule of thumb says that you only need 5 users to test your prototype. However, it is important that all participants match the product’s target audience, so, depending on the project, finding the right testers might turn out to be very easy or quite hard. After all, in the case of moderated tests, the ball is in your court. While looking for the right users, don’t be afraid to make use of social media, classified ad websites, and forums. Do some research and try to pinpoint the best places to reach a particular target group.

2. For hard-to-find candidates, it’s best to prepare some reward for their participation — usually gift cards for Amazon or Spotify (or any other equivalent service) will work quite well. You can also consider sending some of your company’s swag as a token of appreciation.

3. When posting your call for user testing participants, remember to include the most important information:

  • Who are you looking for?
  • What is the subject of user testing?
  • What will the sessions look like?
  • Will the session be recorded?
  • How long will each session take?
  • In which time-zone are you located and what time-frame are you aiming for?
  • Is there any participant reward?
  • Do the participants need to prepare for user testing in any special way?

This will let all potential candidates know exactly what they are signing up for and will save you the trouble of finding out later that some of the testers don’t meet all the criteria.

4. If you’re planning to test more than one prototype with each user, prepare all the prototype links beforehand. It’s good to show the prototypes in a different order to each participant, so nobody is biased towards the version shown first (see the sketch after this list).

5. Get in touch with the candidates and schedule each user testing session using an online calendar, so you can smoothly manage all the meetings and easily refer to each one. It’s also great for keeping track of different time zones, as most calendars (Google Calendar, iCal) automatically adjust meeting times to each participant’s local time zone. Remember to include the following in the invitation: a description of the event, information on where and how the session will take place, how long it will take, and a note that it will be recorded.
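If you’d like to automate the rotation mentioned in point 4, here is a minimal sketch of counterbalancing the prototype order across participants. The prototype and participant names are illustrative placeholders.

```python
# A minimal sketch of rotating prototype order across participants,
# so that no single version is always shown first.
# Prototype and participant names are illustrative placeholders.
from itertools import permutations

prototypes = ["checkout-v1", "checkout-v2", "checkout-v3"]
participants = ["P1", "P2", "P3", "P4", "P5"]

orderings = list(permutations(prototypes))  # all possible viewing orders
schedule = {
    participant: list(orderings[i % len(orderings)])
    for i, participant in enumerate(participants)
}

for participant, order in schedule.items():
    print(f"{participant}: {' -> '.join(order)}")
```

With five participants and three prototypes, this simple rotation already ensures that each version is shown first to at least one person, which is the main thing you want to control for.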

3. Conducting Sessions

Found the ideal participants and got all the sessions scheduled? Perfect! Now it’s time for the real deal: the tests themselves. At this point you should have everything ready, but if you went the moderated route and will interact with participants directly, some pitfalls can still affect the objectivity of the tests. Asking leading questions and letting personal bias creep in are among the most common mistakes. The simple tips below will help you avoid them.

1. Start by thanking the participants for their willingness to take part in the user testing. Be transparent about your intention to record the sessions and about who the data will be shared with. To share the recordings with stakeholders, you’ll need your testers’ consent.

2. Explain the user testing scenario and your expectations regarding the course of the session. Tell participants you’d like them to share as many thoughts as they have and that this is still a work in progress, so their input is valuable. Don’t forget to turn on the recording!

3. Try to talk as little as possible during the testing. Some users are less willing to share their thoughts than others, so you may need to prompt them to get more information. There are a few interview probing techniques that can help you get the information you need; we’ll discuss them below.

4. When you want to learn about particular areas of the prototype that the user hasn’t commented on yet, ask open and non-leading questions. Below, you’ll find some good and bad examples of interview questions.

Leading vs objective questions:

  • Leading: “What do you like about using Twitter?” This will only get you answers about what’s good about Twitter, not the whole picture.
  • Objective: “What’s your experience with Twitter?” The respondent will share their thoughts on both the positive and the negative experiences they’ve had with Twitter.

Closed vs open questions:

  • Closed: “Do you use Slack?” This can be answered with a simple yes or no, so it can give you a very short answer.
  • Open: “How do you find using Slack?” Here we ask the interlocutor to share all of their experiences with the tool.

When a user asks you a question about the prototype, it’s better to respond with “what do you think?” first, rather than giving an answer right away. You want to learn what people think about or struggle with instead of guiding them through designs they might not understand.

4. Data Analysis

After all the hard work done so far, you still need to analyze the findings and prepare the final report with all the conclusions and recommendations; in the end, that’s the whole point of prototype testing! Again, even a seemingly simple process like documenting a user’s behavior and their experience with the prototype can present unexpected challenges. Here are several good practices that will help you avoid them.

1. It’s best to analyze the recordings with a partner or even more than one person to avoid including your personal bias while processing the information and writing down conclusions and theories behind user behavior.

2. Listen carefully to what people say and in what context they say it. Write down the most important statements and quotes for each prototype/screen.

3. Keep your documentation well organized, with all the prototype links and recordings available for each prototype. This comes in handy when sharing the documentation with other designers or stakeholders.

4. It’s qualitative data, but try to keep track of the numbers. If an issue is reported by every participant, it’s good to have the math on your side and a record of it (see the sketch after this list). It can also be useful to include numbers in the report, as they will support your findings and insights.

5. Prepare a summary report with all the insights from the user testing. It’s a good practice to include the key takeaways at the beginning of your report so that stakeholders who are not interested in reading the whole thing can easily pick up on the most important conclusions.
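As a small illustration of point 4, here is a sketch of keeping simple counts next to your qualitative notes, so you can say how many participants hit each issue. The issue labels and session contents are made up for the example, not real findings.

```python
# A minimal sketch of tallying how many participants ran into each issue.
# Issue labels and session contents are made-up examples, not real findings.
from collections import Counter

# One entry per session: the set of issues observed for that participant.
sessions = {
    "P1": {"missed the add-to-basket button", "confusing checkout copy"},
    "P2": {"missed the add-to-basket button"},
    "P3": {"confusing checkout copy", "unclear delivery options"},
    "P4": {"missed the add-to-basket button", "unclear delivery options"},
    "P5": {"missed the add-to-basket button"},
}

issue_counts = Counter(issue for issues in sessions.values() for issue in issues)

for issue, count in issue_counts.most_common():
    print(f"{issue}: reported by {count} of {len(sessions)} participants")
```

Counts like “4 out of 5 participants missed the button” make the report much harder to argue with than a quote alone.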

Applying all these good practices to your user testing process should give you an unbiased overview of your prototype’s performance and help you establish the next steps. As with everything, it’s likely that you’ll make some mistakes; we all do! Just remember to take a moment to reflect on how each session went and what could be improved. That way, you’ll become a more proficient user researcher with every session you conduct.

Originally published at https://www.netguru.com.
