Usability testing out in the forest
Usability testing is crucial to any user-centered design process. Putting a design together and getting opinions (other than your own) is vital. However, I find that the real insights come not from what users say, but from what they do, and sometimes you can only capture this by testing designs with real people.
As part of the user experience design team here at Wiggle (an online tri-sports shop), we use a variety of tools to understand users. One example is online usability testing videos. These are convenient and easy to set up: coded prototypes can be validated and vital feedback fed into the design process.
There is, however, a drawback. Having watched a number of videos from users around the world, I find the feedback, albeit useful, doesn't give me a clear idea of how they feel.
Asking our customers directly fills this gap. When they follow a user journey, we can see their facial expressions, witness any reason for hesitation and note down any questions they have during the session.
We design to meet user needs, and users today are experiencing Wiggle through mobile phones more than desktops, so we bring a range of devices to test with: phones, tablets and laptops, all connected over mobile WiFi to our hosted, coded (responsive) prototype.
How we run a usability test
In the past, I would usually leave sessions with a list of observations. While useful, this provided just some of the information we needed.
I recently watched a video by Danny Hearn, who explained some of the methods he’s used at John Lewis. He recommends having a test card and a learn card.
A test card states what we are testing; a learn card is what we fill in for each usability test. It can prove quite tricky to fill in during the session, but we can often fill in the gaps once the user has left.
Not just for us
Through these sessions we receive feedback on the whole Wiggle customer experience: feedback on the cycling event they've just completed, comments on the latest order they received, their shopping habits and anything else about them. Getting this feedback direct from customers is like oxygen for teams like ours, and anything we receive outside of our remit we can pass on to other areas of the business.
Testing designs with real users provides us with good-quality feedback that we can take back to the office and use to iterate our designs. And from my point of view, design isn't about more hours in front of a computer. Meeting users and doing this research teaches us to be more empathetic to our users, helps us understand their needs and motivates us to work harder to achieve truly great work.
Plus, every time we do it, we say “we should really do this more often!”