How a quick usability test helped us validate an idea

Usability test | Henrique J. Tramontina

Screenshots of the 3 prototypes
  • Use our resources efficiently.
  • Build better products faster.
Internal test | Henrique J. Tramontina

Running quick usability tests

As mentioned before, this wasn’t the first time we’d conducted usability tests. These are a fixed component of our design process, allowing us to learn from user feedback, iterate and improve our products. This is how we run a quick usability test to validate a hypothesis:

Getting ready

First, we clearly identified the problem. This was our problem definition from the internal field test:

Craft: InVision’s prototype plugin for Sketch
  • We repeatedly tested the prototypes internally to check that all the interactions were working and that everything made sense in terms of the script we had written.
  • Other preparations: ensuring the phones were charged, organising the room for observation and interviews, grabbing a few drinks and snacks and also finding some company t-shirts for the users as a way to thank them for helping out.

It’s test time!

  • We planned 30 minutes per user. After a brief introduction and background questionnaire, we asked them to perform two tasks for each prototype. Using questions from the System Usability Scale we then asked them to evaluate each prototype. Later we positioned the results on the SUS curve to see which prototype performed better.
Prototypes in action
  • Our product manager was the observer, taking notes on all the answers, comments and feedback.
  • We shared a Google Meet link with our team, so those who were not fully involved at this point, like the developers, could at least listen in and hear the users’ reactions.
  • We always asked for open feedback and took notes on everything. As a side note, this isn’t the time to be offended if something doesn’t work the way you expect or the users don’t like the idea you’re proposing. After all, at the end of the day, they are the ones who will be using your application and your service.
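For reference, SUS scoring follows a fixed formula: each of the ten items is answered on a 1–5 scale, odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the sum is multiplied by 2.5 to land on a 0–100 scale. A minimal sketch of that calculation (the function name is ours, not part of any SUS tooling):

```python
def sus_score(responses):
    """Convert ten 1-5 Likert responses into a 0-100 SUS score.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are multiplied by 2.5 to reach 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))  # i is 0-indexed
    return total * 2.5
```

For example, a participant answering "strongly agree" (5) to every positive item and "strongly disagree" (1) to every negative item scores `sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])`, i.e. 100.0, while all-neutral answers (`[3] * 10`) score 50.0.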

Analysis and results

Normally, this can be really exhausting. Going through all the notes, recorded user interactions and audio files isn’t always fun. But it’s also exciting, because you tend to discover a direction for the problem you’re trying to solve. In our case, seeing the final results was quite interesting.

  • The current version received an SUS score of 90.83, while the newly proposed idea scored 85.42.
Our SUS results
  • The current version was more convenient.
  • One user mentioned it was better to be able to set the number of passengers right on the first screen before searching.
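Comparing the prototypes comes down to averaging each participant's SUS score per prototype and reading the means against the widely cited benchmark of 68 as an "average" SUS result. A small sketch of that comparison — the per-participant numbers below are made up for illustration, not our study's raw data:

```python
# Hypothetical per-participant SUS scores for two prototypes
# (illustrative numbers only, not the raw data from our test).
results = {
    "current version": [92.5, 87.5, 92.5],
    "proposed idea": [85.0, 82.5, 90.0],
}

BENCHMARK = 68.0  # widely cited average SUS score across studies

for name, scores in results.items():
    mean = sum(scores) / len(scores)
    verdict = "above" if mean > BENCHMARK else "at or below"
    print(f"{name}: {mean:.2f} ({verdict} the {BENCHMARK} benchmark)")
```

Both of our prototypes scored well above that benchmark, which is why the comparison between them, rather than against the baseline, drove the decision.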


In the example above, it took us less than two weeks from initially framing the problem to the final results.


We are the door2door design team, working together on the…
