Tips for running moderated usability testing online

This is a shorter version of the post I wrote for dxw. There’s a bit more explanation on why we did it and more detail on different tools and things we tried out to run these sessions effectively.

What is it?

In this context, it’s usability testing when you and the participant are not in the same room. It is not unmoderated usability testing (I’m generally quite sceptical about that, though there’s a time and a place where it can give real value. Any research is better than no research!).

Why do it?

Moderated online testing is very effective when the participants are hard to reach, time is short and money is tight.

It also gives you access to a greater variety of users and can provide a more realistic setting, because people connect from their own devices in their natural environment.

People don’t need to take extra time to travel, so you might get better attendance (though it can also have the opposite effect: people feel less guilty about no-showing an online session than about standing up a real person).

Some challenges and shortcomings

User research that isn’t in person is simply harder to run. It’s so much more difficult to improvise, because you rely on technology throughout.

The depth of qualitative data you get is also different. Nonverbal cues are missing, so you can’t properly read body language beyond limited facial expressions. Moments of silence are hard to judge too: it could be a connection issue rather than the participant struggling with the task or forgetting to think aloud.

So I spent quite a lot of time thinking, planning and trialling.

Planning moderated online usability testing for Thames Valley Housing Association

Some tips:

  1. Automate the recruitment process, allowing participants to book, update and cancel their time slots (the longer post has more detail on tools).
  2. Get in touch with participants before the day to establish trust and confidence in something that is already ‘unknown and unnatural’ to most people.
  3. Select a tool that allows participants to test out the session beforehand (and remind them to do this). GoToMeeting is great for this!
  4. Brainstorm ‘what ifs’ and prepare a plan B for devices, browsers, locations, internet connections, no-shows, etc.
  5. Run pilot sessions from various devices and locations with someone who isn’t familiar with the tools and, ideally, is less confident with technology altogether.
  6. Give clear instructions on what to do if participants have difficulty connecting.
  7. Allow more time to introduce yourself and ‘break the ice’.
  8. If you can, complement online testing with in-person sessions to gain deeper contextual insight.

In the end we had only one no-show and one late cancellation. We ran six sessions in two days, which is plenty to catch most usability problems.

So with a little extra planning and trialling up front, this type of testing can run smoothly and add great value at little time and low cost.