5 lessons learned on how to do remote usability testing
Testing our ideas in product design is crucial. Without it we’d be lost and our ideas would be useless. However, testing can be time consuming and expensive, and it sometimes requires very specific participants with skills that aren’t found everywhere. Sometimes testing with people while they’re at home and you’re… well, somewhere else, can be a good option. This article is about how you can do it without too many hiccups.
Testing remotely – is it even that desirable? Shouldn’t we always try to test with our users face to face rather than over a crappy video link? You lose many of the subtle things in communication (such as body language or tone of voice) when you are not in the same room. In the best of worlds remote testing isn’t ideal, but in a recent Government project I was on it was the best option. I needed to get testing done within a fairly constrained timeframe and my target group was scattered all across the country – you sometimes just have to get it done by any means possible. Especially when that country is Australia and my closest testing participant was a 2 hour flight away.
This article is useful for anyone who wants some tips on how to do remote usability testing, or any other type of digital product design research, remotely.
The most important lesson is: you need a good Internet connection as a minimum! Make sure to have the best you can. In Australia this is a challenge, I’ve noticed. (Thank you NBN… or the Australian Gov’ment… or both!!) But give it your best try. In the Government example my home connection seemed to work better than the one in the office, believe it or not. I wasn’t sure if that was because of my connection, though, or because the participant changed. Check which one works better for you.
…and the longer version
The first thing you might ask is: why didn’t you just find participants in the same city if you’re testing for the Government? In this example our target group was extremely specific (we needed recently examined doctors within a year or so of going through their registration with the Australian Health Practitioners Regulator, or medical students close to finishing their final year of medical school – both with experience of registering with this agency). What we wanted to learn was also comparative, so previous experience was necessary. We couldn’t really go out on the street and get feedback from just anyone.
However if you have the opportunity to see your users face to face – DO IT!
Lesson 1: Remote testing makes everything at least twice as difficult – but not impossible
There are things that can go wrong in a regular user testing session: your laptop doesn’t work, the connection is crap, you have misplaced the passwords for the prototype, the participant is grumpy or unhappy during the interview. You have certain backups in place to make the test run smoothly. However, the number of things that can go wrong increases when it’s all done remotely. You don’t only have to manage all the technology and logistics on YOUR side – you now also have the participants, with all kinds of hardware and outdated software. You may also have observers who need to be able to watch the test, and they might not be in your location. This was the case in our Government example: I was in Sydney, the participant was in Queensland and the observer was in Canberra. The probability of something going wrong drastically increases as soon as you add another device into the mix.
Being aware of that sets you up better for the next lesson.
Lesson 2: Be a photographer – Develop something beautiful from the negatives.
Now we’ve learned that much can go wrong in a remote user testing session. To avoid being caught with your beard in the letterbox (a Swedish saying which means ‘caught off guard’) you have to think about everything that can go wrong and what you can do instead. Have backups for the technology you’ve chosen, and even prepare backups for what to do if you can’t test anything at all. Maybe your prototype test just has to become a conversation about the problems your participant is experiencing. Even if you do manage to start recording and it works smoothly, take notes on the side as a backup. Consider having a guided test, as we did in the Government example. If the test is on a conceptual level it might give you enough, instead of having the participant directly interacting with the prototype. In our case we got what we needed without having the participant clicking around on their own.
A study guide with options for when the problems occur is invaluable.
Lesson 3: Connect and pray to the NBN gods!
Again, make sure to have a good connection. (In this case my home connection was better, but I only learned that after the first test.) To make sure that your participants have a decent connection, you could even put it into the screener (the shopping list used by recruiters to find people to talk to: they have to have a decent connection. Check! They have to have Chrome installed. Check! They have to be in Sydney. Check!).
Again be prepared for things going awry: see the lesson about “Being a photographer”.
Lesson 4: Get the right tool for the job
At the time I had a few different tools to choose from and had various levels of experience using them. Skype was unreliable. Lookback was new at the time and a bit dodgy. Fuze was inflexible. Silverback I didn’t trust after having lost a couple of test sessions before the real test. Which tool is the best for remote testing, then? The answer is of course: it depends. But what I learned was to at least not use Fuze.
The major advice I would give is to look at the different alternatives you have and do a ‘pros and cons’-analysis. Maybe the least attractive option at a glance is the most useful one by comparison (Lookback ended up being my preferred option because of its connection to InVision).
Here are some possible alternatives for software tools:
- Lookback (lookback.io) is a web based tool for user testing. Lookback’s Live feature enables you to invite the participant to a test, which then guides them through a few screens to set up their camera and microphone, and it captures their screen during the test. This is great if the prototype you are testing can’t be shared with others but your participant needs to click through it. The session gets recorded and uploaded to the web tool, where you can comment on the video, share highlights, and export the face video or screen video. It also works on mobile as well as laptop. It was pretty good for what I needed it to do.
- Skype + Lookback. In my opinion Skype isn’t the best option, since it’s a pain to set up (yes, don’t argue with me!) and it shut down a few times during our tests (again I somewhat blame the NBN, but I chose to give Skype some of the honour). But it was in the end the best option we had at hand, because Skype has a free screen sharing option and all of our participants had Skype installed (we made sure of that in our screener). We then recorded the test through Lookback. Make sure you use a tool that is available to everyone. Since we were OK with testing our prototype as a guided tour, this was our best option.
- Fuze. Great and handy if you have access to it, but hard to invite people from outside your company. I had issues with the sound and video, AND you can’t download the videos locally. It’s probably good for something else, but not for us in the Government case.
- Phone. Simple, everyone has it. But it limits you when it comes to recording calls – you can’t record a call at the same time as you’re having it, and recording was a must have. There is also no way to test prototypes through regular calls, yet.
- Silverback (silverbackapp.com). Try it and get back to me. I still haven’t had one good experience using it.
- Google Hangouts. Wasn’t really an option because of the possible limitations with a Google account. Maybe this would work fine, I don’t know.
The most important thing is to make a list of the things the tool has to support: screen recording, face recording, raw format video download, ease of commenting during analysis, accessibility, etc. Rank them from least to most tradeable and go find a tool.
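If you like to keep things systematic, the ‘pros and cons’-analysis above can be sketched as a simple weighted scoring exercise. A minimal sketch, assuming illustrative 1–5 scores and weights – the tool names and criteria come from this article, but every number below is made up for the example:

```python
# Weight each criterion by how non-negotiable it is (5 = must have).
CRITERIA_WEIGHTS = {
    "screen_recording": 5,
    "face_recording": 4,
    "raw_video_download": 3,
    "easy_commenting": 2,
    "participant_setup": 4,  # how easy it is for the participant to join
}

# Hypothetical scores per tool for each criterion (1 = poor, 5 = great).
TOOL_SCORES = {
    "Lookback": {"screen_recording": 5, "face_recording": 5,
                 "raw_video_download": 4, "easy_commenting": 5,
                 "participant_setup": 3},
    "Skype":    {"screen_recording": 4, "face_recording": 4,
                 "raw_video_download": 1, "easy_commenting": 1,
                 "participant_setup": 4},
    "Fuze":     {"screen_recording": 3, "face_recording": 3,
                 "raw_video_download": 1, "easy_commenting": 2,
                 "participant_setup": 1},
}

def rank_tools(scores, weights):
    """Return (tool, weighted total) pairs, best first."""
    totals = {
        tool: sum(weights[criterion] * score
                  for criterion, score in criteria.items())
        for tool, criteria in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for tool, total in rank_tools(TOOL_SCORES, CRITERIA_WEIGHTS):
    print(f"{tool}: {total}")
```

Whether you do this in code, a spreadsheet, or on a whiteboard doesn’t matter – the point is that writing the criteria and weights down forces you to decide what you can trade off before you fall in love with a tool.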
Lesson 5: Stay clear of the FOMO-nster
Wanting to get the most out of your testing session is a good idea, but it can lead you down a road with too many variables, ultimately not getting anything tested properly. This is simple FOMO, which we are all victims of at times. It’s a general lesson for any testing, but a good one to highlight in this case.
Try to focus on testing a few things instead of everything.
In our Government example the prototype UI had polished aspects which weren’t really needed for the things we wanted to learn about, which were on a conceptual level – for example, the forms had validation. The prototype was prepared for a detailed usability test (which is quite far downstream in development). What we really needed to learn about was conceptual, which meant we didn’t necessarily need a detailed prototype, but it helped.
In summary…and some more things to think about
Here are some questions you can use as a starting point when you’re thinking about doing some testing, to help you cover the most important things. The list doesn’t cover everything, but it can get you started with thinking about your backups.
- What software will the participant need? Is the software free? Are they on a Mac or a PC?
- What do you want to learn? Are you testing the high-level idea or are you testing the details of a design? Probably the most important question to ask before any test – Remote or face-to-face. This will help guide the type of test needed.
- How will the findings be presented? Is it a short video summary or a report? Does it have downloadable formats? This might drive the decision on what software you use.
- Do we have someone who will observe the test? Where are they located? What are their technical limitations? This will help you choose the right type of software to use.
- Do we have a room booked? Do we have enough time between each test for uploading any video, or for unforeseen delays? Do you have all the additional documents handy (consent forms, login credentials, schedule, etc.)?
- Could you get support from someone else in the office to get you coffee? Always be at least two when testing.
- Does the participant have to sign a consent form to join your test? Think about whether you can do the signing as a sound bite or whether it needs to be printed and signed.
- Who is going to handle incentives (if at all) and all the communication with participants if anything changes? Are you using a recruiter or will you find all of your participants yourself?