Usability tests: common mistakes

Briefly, allow me to share six tips for better usability sessions

Humar Santos
4 min read · May 24, 2023

Hi! Today, let’s talk about a very common research method across product teams: the usability test!


A usability test is a qualitative research method for testing and validating solutions. The interviewer uses a prototype to identify usability issues while interviewing and observing the user as they attempt to complete a task or set of tasks. Normally, a note taker is also present to record insights, improvements, and feedback from the interviewee, allowing the facilitator to stay focused on the user.

The learnings I will share came from a lot of practice, intensive courses given by research departments, and reading excellent material available online for free; you just need to be curious!

Briefly, allow me to share six tips for better usability test interviews:


Avoid giving it all away in the intro

To handle this, I always provide a short teaser of context when inviting users to this type of session, without revealing the tasks they will perform, to avoid biasing them. In the session itself, the introduction serves to get to know each other and to set expectations for the user.

Hi [Person name]! Thank you for accepting our invitation to collaborate with us on these early experiments. My name is [name], I’m a [role] at [Company], and let me introduce you to my peers [present coworkers]. I’ll guide you during this session with questions and, later on, some tasks on a prototype to gather your feedback.

There are no right or wrong answers; feel free to speak your thoughts honestly while answering questions or doing the tasks, as we’re not testing you, but our own solution.

Avoid leading questions

This is the number-one mistake we fall into. I’ve done it myself a couple of times.

Do you agree that our login experience is easy for you?

“Do you agree” leads the user to answer “yes, I agree”, because people tend to be nice and avoid direct confrontation. Another leading question could be something like this:

How much did you like the designs?

Who told you that the interviewee liked the designs?

This can guide an entire team to build the wrong solution, based on nudged data. With that in mind, consider rephrasing the previous questions as:

  • Can you describe your last login experience? — This is an open question that lets them talk about the good and the bad, share their feelings, and explain everything that happened during the login. Just stay quiet and listen.
  • What are you thinking as you view this page? — You start with a very broad question to see whether the interviewee understands the purpose of the page, and it might surface something you weren’t expecting. As a follow-up to short answers, you could ask: “Can you describe the purpose of this page?”

Avoid closed questions

Something like “Is it correct?” will make the user answer “Yes” or “No”. You want to gather every possible detail from the session, so if you do get a bare “Yes” or “No”, ask a follow-up question: “Can you elaborate on that?”


Avoid long questions

If the interviewee asks, “Can you repeat that, please?”, the reason is normally that you asked several things at once. Prefer “Any recent feedback from your customers about transactions?” over this: “Speaking about transactions, any request that came to your table from your customers over the past 30 days, where you have been in daily contact with them by Microsoft Teams or email during your working hours?”


Prepare a script of questions

Being ready for the interview is critical. Prepare and digest the script of questions beforehand.

If it’s online, my suggestion is to always split the screen in two: one half with the notes, the other with the call. You might not ask all of the questions, as some may become obsolete or unnecessary as the call goes on.

How do I measure if it went well?

Did the interviewee complete the task? How well did they perform? Did they get stuck? Did I need to help them unblock the task? These questions, along with reading all the notes, helped me understand what went well, what still needs improvement, and finally, what didn’t work.
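If you track those questions per participant, the tallying is simple. Here is a minimal sketch (not from any specific research tool; the field names are hypothetical) of how you might compute a task completion rate and an assisted-completion rate across sessions:

```python
# Hypothetical per-session notes: did the participant complete the task,
# and did the facilitator need to step in to unblock them?
sessions = [
    {"participant": "P1", "completed": True, "needed_help": False},
    {"participant": "P2", "completed": True, "needed_help": True},
    {"participant": "P3", "completed": False, "needed_help": True},
]

total = len(sessions)
# Share of participants who finished the task at all.
completion_rate = sum(s["completed"] for s in sessions) / total
# Share of participants who needed facilitator help along the way.
assisted_rate = sum(s["needed_help"] for s in sessions) / total

print(f"Task completion rate: {completion_rate:.0%}")
print(f"Sessions needing facilitator help: {assisted_rate:.0%}")
```

Even with only a handful of participants, seeing these two numbers side by side helps separate “the flow works” from “the flow works only when I rescue people”.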

Last but not least, I always ask at the end:

“Any further feedback for us? We are almost ending our session, but feel free to talk to us (…)”

I personally avoid asking whether they liked the flow or experience, as measuring the prototype is my main goal. Asking this at the end usually brings constructive feedback that wasn’t captured during the session.

Was it useful for you? Let me know in the comments. Have a nice day!


Humar Santos

Senior Product Designer @Anchorage Digital :: Curious about things, and how there's always something to learn! Knowledge never ends.