User Research in the Covid Era

It’s 2020 and we’re locked down. Most of us can’t even leave the house to collaborate with colleagues, much less get out into the field to observe and understand our users.

I’m doing a series on how we’re trying to remain user-centric during this tough time — how we’re finding new ways to do user research and usability testing even as we’re kept apart.

Remote Testing in Manila

We’re pretty fortunate. Even as many places shut down transport services, we partnered with Toyota Mobility Foundation (TMF) to launch new transport services for healthcare workers in 3 new markets — all in a span of 2–6 weeks.

Bangkok, Manila, Jakarta — three of the toughest but most lucrative mega-cities in Southeast Asia. It was our privilege to work on such a meaningful project, but also a nail-biting experience. I felt like we were flying blind.

Before Covid-19 struck, we typically took 2–3 months to pull the trigger on launching in a new city. We would carefully work with our clients to study the current transportation solution, fly down to observe any existing transport services and conduct user research to understand behaviours and expectations in each market. This would be followed by a series of road tests to calibrate our speed maps and refine our service parameters (I’ll write about that another time).

With the TMF deployments, we trained the bus operators and drivers remotely, and trusted our partners on the ground to get things up and running and ensure that eligible users figured out how to use the service.

Usability Testing with Maze

We’re planning some new features and changes to the booking flow for our Just in Time product line and wanted to test them with our users in Manila.

We picked Manila because we know our user base there speaks English, which made it easier to skip translations for this first experiment.

My old-school brain was thinking we would push a notification linking to a sign-up sheet, arrange virtual meetings with various people, and get them to play with our Figma prototype. Basically, a safe-distancing-friendly version of how we typically conduct user testing.

Enter our awesome designer Leonard with his prototype on Maze, a tool for remote usability testing. You set up the prototype and send it to users; they follow the on-screen instructions to complete a series of tasks and/or answer questions which help you test usability.

Maze allows you to set up “Missions” which basically track task completion, measuring it against an ideal path which you set. It also allows you to set questions with screens to test user understanding.

At the end of the testing, Maze generates a report with heat maps, screen view time and other statistics from which you can gather insight into the overall usability of various flows, points of friction, and areas where users got tripped up.

For missions, it provides statistics on task completion (“Success”), splitting this into “Direct Success” and “Indirect Success” — the former indicates that users followed your ideal path and the latter indicates that they deviated (“Off-Path”) but managed to complete the task nonetheless.
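Maze computes these numbers for you, but if it helps to see the logic, here is a rough sketch in Python of how each tester’s path might be classified. The screen names and paths below are made up for illustration; this is not Maze’s actual export format.

```python
# Rough illustration of how "Direct" vs. "Indirect" success can be
# derived from the screens a tester visited. The screen names and the
# data below are made up -- this is NOT Maze's actual export format.

IDEAL_PATH = ["home", "pick_stop", "pick_time", "confirm"]

tester_paths = [
    ["home", "pick_stop", "pick_time", "confirm"],               # followed the ideal path
    ["home", "pick_time", "pick_stop", "pick_time", "confirm"],  # wandered, but finished
    ["home", "pick_stop"],                                       # gave up partway
]

def classify(path):
    if path == IDEAL_PATH:
        return "direct success"
    if path and path[-1] == IDEAL_PATH[-1]:
        # deviated along the way ("off-path") but still reached the goal screen
        return "indirect success"
    return "bounce / gave up"

for path in tester_paths:
    print(classify(path))
```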

Testing the Prototype

Leonard spent a lot of time designing the test missions and questions. I spent a lot of time testing his Maze prototype.

The first prototype worked perfectly on Leonard’s iPhone X, but on my iPhone 8 the bottom of the prototype was cut off by the mobile browser’s toolbars.

#responsive I know

Basically, we needed to account for the additional space occupied by the browser and adjust the dimensions of the prototype accordingly. After some adjustments, it came out looking alright.

We also had some issues with certain interactions which were limited by the fact that this was a prototype — for example, map interactivity such as drag and drop was not possible in the prototype.

Another issue was performance on older phones. I tried our test on my dad’s Samsung J3 and the prototype was painfully laggy and slow to respond; even on my iPhone 8, interactions with the time wheel stuttered slightly.

We were a little nervous and at one point even considered scrapping the remote testing exercise. In the end we decided to go ahead, figuring we would learn something from it no matter what — even if the conclusion was that either Maze or remote testing wasn’t working for us.

The Response

I was quite worried that no one would bother tapping on our notification and going through the 10 missions/questions which we’d set, so I was thrilled when we got 51 click-throughs and 39 individuals who spent enough time to go through all of the 5 missions and 5 questions.

We currently have 785 registered users in Manila, but only about 300 regularly use our service (the rest fall outside our service area, unfortunately). Depending on which denominator you use, that gives us roughly a 6% response rate (against all registered users) or 16% (against active users). To put this in context, the typical response rate for B2C surveys is 13%–16%, while mobile surveys typically get only a 3%–5% response.

I think our shameless click-bait title helped.

The Results

We got some good insight into which of our new flows had higher friction, and the heat maps were really useful in helping us understand where users were getting confused and what they expected to do to complete the task.

The other interesting insight was that users were learning through the repeated tasks. We had two similar booking tasks, one near the start and one near the end of the missions section; average time on task dropped by over 30% for the second task compared to the first. We also saw a decline in wrong taps, with only 6 across all users.
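Maze reports these averages directly, but if you want to sanity-check the arithmetic yourself (say, from exported timings), it’s only a few lines of Python. The per-tester timings below are made up for illustration:

```python
# Made-up per-tester completion times (in seconds) for two similar tasks.
first_task_times  = [48, 61, 55, 70, 52]
second_task_times = [31, 42, 36, 44, 35]

avg_first  = sum(first_task_times) / len(first_task_times)
avg_second = sum(second_task_times) / len(second_task_times)

improvement = (avg_first - avg_second) / avg_first
print(f"Average time dropped {improvement:.0%}")  # ~34% with this sample data
```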

Tips for getting the most out of Maze

We’re definitely not experts on Maze, but I thought I’d put down some simple tips in case you’re like us, and just getting started.

Think about what device your testers will likely be using

  • If you’ll be recruiting testers through a push notification, they’re going to click through straight on their mobile device, so your test needs to be designed to be mobile-friendly
  • If you’re emailing B2B testers, you’re much more likely to get users who open your test on a desktop, which opens up more possibilities for complex testing. It also has implications for how you set up your interactions: for example, a scrolling interaction on a time wheel is almost impossible to execute with a touchpad or mouse
  • Generally, you should try to use channels which increase the probability that users will perform testing on the device you believe they’ll actually be using

If testing is going to take place on mobile phones, you need to adjust the screen size

  • As we mentioned above, testing with Maze isn’t like testing in Figma directly — you need to adjust the dimensions of your prototype to account for the browser chrome and the screen sizes of various phones

Design your missions with the limitations of the platform and prototype in mind

Remember what you’re trying to test and find the balance between realism and practicality. For example, we removed an interaction which required users to edit an address. This involved tapping on the address bar, deleting/editing the address, and then tapping on the keyboard, which would automatically enter the new address (the prototype doesn’t have a working keyboard). We were pretty confident users knew how to complete this portion of the flow, and during our own pre-testing we found that, without a working keyboard, it was confusing and frustrating for testers since it defied their normal expectations. Basically, we risked wasting precious time and attention span testing a part of the flow which didn’t really need testing.

Design your tests knowing that users aren’t going to retain much information from your instructions

  • On a desktop, the instructions appear on the side of the screen while you’re completing the task
  • On mobile, the instructions appear separately from the mission screens which testers interact with
  • Maze has a neat feature where users can “pull up” instructions by tapping on a very subtle purple bar at the bottom of the screen
  • I understand why Maze made the bar subtle, but it rendered one of our tests difficult to interpret — we couldn’t tell if users were struggling with the booking process or struggling to remember what they were supposed to be doing
  • Maze does alert users to the feature at the start, but there will always be users who don’t register the instruction or forget, and just play around with the prototype to see what happens — that’s going to mess up your data and prevent you from figuring out where the real points of confusion and friction are
  • One way to overcome this is to test smaller portions of the flow for each mission so that users don’t have to remember too much information to complete the task successfully

Do we think we’ll use Maze in the future?

I was really impressed overall with the Maze platform. I find it extremely relevant to our times and useful for remote usability testing at scale.

Ironically though, it made me feel the loss of not being able to do usability testing in person even more. When users didn’t behave as we’d expected, we weren’t there listening to them talk through what they were doing and there was no way we could probe more deeply on why they were doing what they did.

Post-Covid, I can still see us using remote testing to home in on the areas we need to study more deeply through face-to-face testing or video calls. Alternatively, we might use it to do a final round of larger-scale validation before we start development.

If you’re on the fence regarding remote usability testing, I’d say it’s definitely worth a shot. Best of all, our first test fit comfortably within Maze’s free tier, so we got to try it out without even having to commit to a purchase (it’s $25/seat for the first paid tier).

You can read more about our new deployments in Manila and Bangkok here.

Lastly, if you enjoyed the article, please clap or leave a comment on what you’d like me to write about next. I have a few thoughts on future articles, including other ways we’ve continued user research remotely, as well as our adventures scaling across APAC (we’ve entered Australia, Vietnam, Indonesia, the Philippines, Thailand, and Japan in the last year).

Written by

Product Manager at SWAT Mobility — we’re working on changing transport so that it’s more efficient, affordable, and sustainable. linkedin.com/in/melanietanliwen
