11 tips to run successful usability testing sessions

Adrien Lassalmonie
Zeelo Design

--

This article is aimed at Product Designers and Product Managers who are new to usability testing, offering practical tips for organising and running successful face-to-face sessions. While not an exhaustive list, these are my go-to tricks, used time and time again over the past 10 years of practice.

Why I tend to favour face-to-face interviews (instead of scripted automated tests)

As many of you probably know already, Jakob Nielsen demonstrated that testing with 5 users is usually enough to uncover around 85% of the usability problems in a prototype, and 5 participants aren’t too hard to recruit in a short time. Most of the time I prefer to organise face-to-face interviews, as they give me direct access to our end users and I can control how the conversation flows, instead of relying on a scripted task flow on platforms such as usertesting.com. I’ve found that participants are all different and tests never fully go as you’d expect; for example, you might want to spend longer on a screen with someone who’s struggling, and dig deeper to understand their mental model. I can also use what I learn about them during the session to tailor the test and get better feedback.

Now onto the good stuff…

TIP 1

Gather insights during a warm-up chat

Let’s be honest, organising and running research can take a lot of your time, especially when it’s not your core job as a Designer or Product Manager, so I try to make the most of these sessions. Whenever I have enough time, I start the session by asking the participant a few questions about themselves to collect insights into their habits, what similar experiences they’ve already had, and how they think about and approach specific topics. It’s a time to listen and be curious, and it usually proves to be a goldmine of insights. I have often used these conversations to build persona profiles later on, or to influence a product roadmap. You can ask questions like:

  • Why do you use…
  • What have you already…
  • How often do you…
  • Who does…
  • What do you like/dislike about…

TIP 2

Plan a realistic task flow

To start planning your test, write down a list of everything your team needs to test with users. Now, looking at that list, can you think of a realistic, sensible flow that you could ask participants to go through? You don’t want to confuse them with unrelated tasks that require an understanding of completely different contexts. When that’s the case, consider splitting the test into several phases (each with its own task introduction), or consider narrowing the scope of the test.

TIP 3

Give context and define what is known

Put yourself in the shoes of your participant. In real life, what would a user know already about the service? What tasks have they already completed before? Where do they arrive from?

Give context about who they are and establish what they already know or have done before.

TIP 4

Test your conversation guide and prototype

You don’t want to find out during the tests that you forgot to implement a key screen in the prototype, that some links aren’t working, or that the questions in your guide are too leading, unclear or ambiguous. Aim to “test your test” with someone who doesn’t know the product too well, to get a candid opinion and give yourself an opportunity to fine-tune your questions and prototype ahead of “the big day”.

TIP 5

Use the 5 Ws (and 1H) to get users to elaborate

It’s inevitable: some users won’t be very chatty and will just want to get the test done ASAP. In these situations, I formulate my questions so that they can’t be answered with a simple “Yes” or “No” (which wouldn’t be very insightful).

The trick is to avoid “Do you…” and instead start your questions by How, What, When, Why, Where, Who… forcing your participants to elaborate without even realising it!

TIP 6

Use the “Seeing-Meaning-Doing” test structure

Time and again, I have found that this simple conversation structure helps me gather a lot of detailed feedback at each step of the test.

Here’s how it goes:

(User opens a new screen…)

  1. What are you seeing?
  2. What do you think it means?
  3. What would you be doing next?

Warning: this is not the most “realistic” way to test an interface (in real life you won’t be looking over users’ shoulders, asking them to verbalise what they’re seeing). Asking users to pay more attention to a screen than they usually would can skew the results of your test, so use this approach only when you feel it brings real value. Personally, I often get value from getting participants to verbalise their first impression of a page: what information jumps out at them, and what they make of it.

TIP 7

Mirror the participants’ questions before helping them

Very often, users will naturally ask for guidance during the test. They might be unsure about what they’re seeing or doing, and don’t want to verbalise their guess: “What does this button mean? Where do I go from there?”.

If you help them too quickly, you might lose an opportunity to learn from them. Unless the user has been stuck for a while and nothing more can be learnt from the situation, my advice is to reflect their questions back at them in a warm and friendly way.

For example, “I’m interested to hear what YOU think it means”.

TIP 8

Keep the test on track

If a user goes off track or starts talking about something irrelevant to the test, gently bring them back towards the purpose of the test. It can feel uncomfortable to interrupt a participant who’s trying to help, but your time with them is limited!

Jump in as they catch their breath and say something along these lines: “Okay, this is great feedback, but for the purpose of the test I’d like you to focus on…” (with a smile 🙂)

TIP 9

Don’t use words shown in the interface

This is an easy mistake to make. You don’t want to skew the results of your test by revealing to participants where they’re expected to tap next. Don’t say “Okay, now how would you Continue?” if the button says “Continue”. Find a synonym instead.

By writing up a testing script, you can plan ahead which words to use during the session so you don’t have to come up with synonyms on the spot.

TIP 10

Collect general product feedback

I like to collect general product feedback after the test’s task is complete. It’s a great way to finish off the session, collecting issues and insights about other areas of the service. I prefer to do this once the test is complete to avoid bringing unrelated issues into the testing task, which could skew the results.

A typical product feedback question goes like this: “Can you remember a recent time when you felt frustrated about…”

And finally…

TIP 11

Keep track of your participants

To speed up user recruitment for future research, I keep a log of our participants and the research they were involved in, and most importantly I leave a comment about how clearly they could articulate their thoughts during the session.


❤️ Follow me to be notified of future publications! I post regularly about Product design tips, Design Ops, Design management and User research. Thanks!
