How to Test Your App Anywhere?
Last week I wrote about the importance of testing everything. If you missed that article, check it out here:
One of the types of testing we did was usability testing. Usability testing is the process of testing how usable your app is: whether the placement of screen elements makes sense, whether the app flows correctly, and so on. The goal is to keep refining your app to make it simple, intuitive and easy to use.
Each test took roughly an hour to run with a participant, with more time before and after to set up and analyse results.
In this article, I’m going to write in-depth about how we actually conducted those usability tests.
Recruiting test participants is the most important part, and is something that should be done as early as possible. Luckily, because Rory and I had just quit our jobs to start this company, and hadn’t told anyone what we were working on at the time, there was a lot of intrigue among our friends, so we had enough people willing to help out.
How many people should you have for usability tests? Answer: Five. Any more and you start to exceed the maximum benefit-cost ratio of usability testing. (This rule-of-thumb comes from Jakob Nielsen.)
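Nielsen's rule of thumb comes from a simple model: if each participant uncovers a fixed proportion L of the usability problems in an interface (he estimated L at roughly 31% from his studies), then n participants together uncover 1 − (1 − L)ⁿ of them. A quick sketch of that curve (the 31% figure is Nielsen's published estimate, not something we measured):

```python
# Nielsen's model for usability testing: with n participants, the
# proportion of problems found is 1 - (1 - L)^n, where L is the
# share of problems a single participant uncovers (~0.31).

def problems_found(n, L=0.31):
    return 1 - (1 - L) ** n

for n in [1, 3, 5, 10]:
    print(f"{n:2d} participants -> {problems_found(n):.1%} of problems")
```

Five participants already surface roughly 85% of the problems, and each extra participant after that mostly re-finds issues you already know about, which is why the cost-benefit ratio falls off.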
Because of the lockdown in the UK, all our tests were conducted remotely. So the only requirements for our participants were for them to have:
- a laptop or desktop computer
- a good internet connection, so we could video call and screen share
- an hour of free time
If we had been testing in person, we might have used a test device, but remote testing made things easier: we could be flexible with arranging times, and Rory and I could both be on the call making notes.
Google Calendar and Google Meet
We use G Suite for everything, so booked and scheduled meetings with Google Calendar and conducted video calls with Google Meet. Google Meet is a bit less smooth than Zoom, but we had unlimited length meetings included in our plan, and recordings were automatically processed and saved to our shared Drive.
Starting a test session
Here’s how it worked:
- Facilitator and participant both join a video call
- Facilitator receives verbal consent to record video call
- Facilitator sends a link to Figma prototype
- Participant then opens the prototype and shares their screen
We love Figma. It’s free, you can invite unlimited viewers to a file, it’s lightweight, it works in a browser with nothing to install, and it has a robust prototyping mode.
An app prototype is like a glorified slideshow: you click on bits of the screen, and different screens transition or animate in.
Figma's prototyping is basic, so we couldn't have any complex transitions, but it's quick to get going. Since we were working in low fidelity for these usability tests, we weren't worried about things looking perfect.
We just had to share the link to a prototype and we were ready to go.
We had a mammoth Google Sheet which had our entire test script in, broken into three parts.
- Pre-test information
- Test tasks
- Post-test questionnaire
Here’s the full pre-test script.
Are you okay if we record this? [Start recording]
Thank you so much for taking the time to chat with us today. We're going to ask you to complete a set of tasks in our new app, Sound Off. Our goal is to see how easy or difficult you find it to use.
We are testing in low fidelity, which means none of the colours, logos, or final design is in the app. It's very grey on purpose, just so we can evaluate the flow.
Also, as it's only a prototype, not everything will function, and there are no sounds. However, please do pretend that it's a real app!
For each task, please speak as much as possible — think out loud through everything you experience. There are also no right or wrong answers.
The context is: You’ve just downloaded Sound Off, an app for recording what’s on your mind each day as a private voice note to yourself.
Do you have any questions before we start?
You can see how this sets up the context for the session.
Each of our tasks was either a ‘screen impression’ or a ‘do something.’
For a screen impression, we asked a participant for their initial impressions of a screen, what the purpose of the screen was, and where they’d click first.
For ‘do-something’ tasks, we’d ask them to actually carry out a task. Some examples of our tasks were to make a new recording or play a previous recording. The prototype was set up so the screens would move through the flow if they did the correct thing.
Our participants would share their screen, and we’d watch how they responded to a task. We could see if they looked puzzled, how long they took to figure something out, and where their mouse immediately moved to. We learnt a lot from watching people repeatedly click on the wrong thing, or take a while to figure out something. These were bits of the app we knew we could improve on.
We also broke our set of tasks into two sections, which respectively covered the recording flow and the listening-back flow of our app. This allowed us to reset the context. For the first half of the test, our users were asked to imagine that they’d just installed the app. For the second part, we asked them to imagine that they’d been using the app for a few weeks.
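Our script lived in a spreadsheet, but the shape of it (two sections, each a mix of the two task types) maps naturally onto a small data structure. This is purely illustrative, with task wording based on the examples above, in case you'd rather keep your script in code:

```python
# Illustrative structure for a usability-test script: two sections
# (recording flow, listening-back flow), each a mix of the two task
# types described above.

tasks = [
    {"section": "recording", "type": "screen_impression",
     "prompt": "What are your first impressions of this screen? "
               "What is it for, and where would you click first?"},
    {"section": "recording", "type": "do_something",
     "prompt": "Make a new recording."},
    {"section": "listening_back", "type": "do_something",
     "prompt": "Play back a previous recording."},
]

# Group prompts by section, so each half of the test can open by
# resetting the context (just installed vs. using it for weeks).
by_section = {}
for task in tasks:
    by_section.setdefault(task["section"], []).append(task["prompt"])

for section, prompts in by_section.items():
    print(section, "->", len(prompts), "task(s)")
```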
Adapting the test
After conducting four test sessions, we realised some things needed to change, so we updated the test before the remaining sessions.
We also duplicated our Figma test file each time we made changes, and when working on it between tests, so that we didn't break anything mid-session.
After the main part of the test was over, we finished with a short questionnaire. This included questions like:
- What was your overall impression of the Sound Off prototype?
- If you could make one significant change to this app, what change would you make?
- If you were to describe what Sound Off does to someone in a sentence or two, what would you say?
This gave participants one final chance to raise any comments they might have.
Everything was recorded in our aforementioned mammoth Google Sheet. Each row was a participant, and we documented all comments and behaviours in giant cells. Keeping everything documented like this meant we could go through the results and pull out key findings, without having to rewatch the recordings.
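If your results sheet follows the same one-row-per-participant layout, a tiny script can surface which tasks tripped people up most often. A minimal sketch, assuming a hypothetical CSV export with one column per task and free-text notes in each cell (the data below is made up for illustration):

```python
import csv
import io

# Hypothetical CSV export of a results sheet: one row per participant,
# one column per task, cells holding free-text observation notes.
raw = """participant,Make a recording,Play a recording
P1,clicked wrong icon first,no issues
P2,no issues,clicked wrong icon first
P3,clicked wrong icon first,no issues
"""

# Count how many participants hit a problem on each task, so the
# screens with repeated stumbles float to the top of the findings.
problem_counts = {}
for row in csv.DictReader(io.StringIO(raw)):
    for task, note in row.items():
        if task != "participant" and note != "no issues":
            problem_counts[task] = problem_counts.get(task, 0) + 1

print(sorted(problem_counts.items(), key=lambda kv: -kv[1]))
```

Real notes are messier than a "no issues" flag, of course, but even a rough tally like this points you at the parts of the app worth fixing first.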
Thanks for reading through this write-up!
If you’ve got any more questions or want advice about writing and conducting remote usability tests, feel free to reach out.
We’re always looking for beta testers as we work on the Sound Off app. If you want to help out with testing, drop a comment below and I’ll reach out!