Lessons in Mobile VR Usability Testing

Things to consider when conducting usability tests in mobile VR

Designing in virtual reality is a fascinating and terrifying experience all at once. There are few guidelines and seemingly a million possibilities. Your artboard isn’t just a rectangle; it’s a 360° space. You have a whole new dimension to worry about, and a new set of navigation options. Oh, and your design is strapped to someone’s face. Not the most comfortable of experiences.

Beyond that, people also have a huge range of experience with virtual reality. There are many people who haven’t even heard of it. Others who have heard of it may not have any first-hand experience with it. For Fulldive, a majority of our users come from searching “virtual reality” on Google Play. They usually don’t already own fancy, high-end headsets; more likely, they bought a cheap headset and are encountering VR for one of the first times in their lives.

Our aim at Fulldive is to introduce people to a range of VR media through curated videos and apps. To make sure we achieve this in a way that’s enjoyable, the design team put the Fulldive VR app through usability tests. Along the way, we learned some valuable lessons on how to user test in mobile VR. We hope this will be helpful if you’re also designing a mobile VR experience and looking to do some testing yourself!

Structuring the Study

The breakdown of experience we got from our screener survey.

Determine the level of experience

Aware that people’s wide range of expertise could skew our findings, we decided to screen for a specific experience level. In our screener survey, we asked people about their previous experience with VR. We found that having specific answers for people to choose from served our purposes best; a sliding scale of “familiarity” told us little about people’s actual experience. We decided to provide the following choices:

  • I have never used virtual reality
  • I have used virtual reality once or twice
  • I have used virtual reality several times
  • I use virtual reality on a regular basis

The focus on actual experience with VR was intentional, as we wanted to know what familiarity they had with VR navigation rather than knowledge of VR in general. We were looking for people’s experiences with being in a VR space. Therefore, differentiating between mobile VR and higher end experiences wasn’t necessary.
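Screening like this boils down to ranking respondents by the experience choices above and keeping the less experienced ones. Here is a minimal sketch in Python; the respondent names and the `select_participants` helper are hypothetical, and the response strings mirror the survey choices listed above.

```python
# Hypothetical sketch of screener filtering; the data below is made up.
EXPERIENCE_LEVELS = [
    "I have never used virtual reality",
    "I have used virtual reality once or twice",
    "I have used virtual reality several times",
    "I use virtual reality on a regular basis",
]

def select_participants(responses, max_level=1):
    """Keep respondents at or below a given experience level
    (0 = never used VR, 3 = regular user)."""
    rank = {level: i for i, level in enumerate(EXPERIENCE_LEVELS)}
    return [name for name, answer in responses if rank[answer] <= max_level]

responses = [
    ("Ana", "I have never used virtual reality"),
    ("Ben", "I use virtual reality on a regular basis"),
    ("Caro", "I have used virtual reality once or twice"),
]

print(select_participants(responses))  # ['Ana', 'Caro']
```

Raising `max_level` widens the pool, which is handy if you later want to compare novices against more experienced users.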

Given our user base, we selected people with less experience in VR. Before starting the tasks, we also asked people for more detail so we had context for their responses. One person we tested had never heard of mobile VR; her lack of familiarity with VR meant that the interface was more overwhelming to her than for some of the other people we tested. Her feedback gave our team some needed outside perspective on designing for people completely new to the space.

Decide what headset and controller to test on

Mobile VR serves a diverse group of people, who use a range of headsets and phone models. We considered how different phones and headsets may affect a user’s experience. One factor is whether their headset has a controller or built-in clicker. Another is whether their headset straps onto their head, or is handheld.

We decided to focus on the “worst case scenario,” so that we could ensure a smooth experience for the most people. This meant considering the case where users don’t have a controller. Instead, we had our testers use autoclick. To click, the person directs their gaze to an item and then waits for the reticle to load. As you can imagine, there are many issues that arise from this. The friction for navigation is higher, meaning people get tired faster. We also discovered that people accidentally click into things when they’re simply trying to look at something.

Loading animation on Fulldive for clicking without a controller.
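The dwell-to-click mechanism can be sketched in a few lines. This is an illustrative model, not Fulldive’s actual code, and the `DWELL_SECONDS` value is an assumption: a click fires only after the reticle has stayed on the same target for a full loading period, which is also why a dwell time that is too short produces the accidental clicks we observed.

```python
# Illustrative sketch of gaze "autoclick" (not Fulldive's real implementation).
DWELL_SECONDS = 1.5  # assumed reticle loading time; the real value may differ

class GazeClicker:
    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.target = None
        self.dwell_start = None

    def update(self, target, now):
        """Call once per frame with the item under the reticle (or None).
        Returns the target when a dwell completes, else None."""
        if target != self.target:
            # Gaze moved: restart the loading animation on the new target.
            self.target = target
            self.dwell_start = now if target is not None else None
            return None
        if target is not None and now - self.dwell_start >= self.dwell_seconds:
            self.dwell_start = now  # re-arm so we don't click every frame
            return target
        return None

clicker = GazeClicker()
clicker.update("menu_button", now=0.0)         # gaze lands on the button
print(clicker.update("menu_button", now=1.6))  # dwell complete -> menu_button
```

Note that merely glancing past an item restarts its timer, but lingering to read a label for longer than the dwell time still triggers a click; that tension is the usability issue described above.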

Consider the testing environment

From qualitative research with our users, we knew that many people used VR at home while sitting down. To simulate this, we had user testers remain seated while completing the tasks. The chair we provided could spin, something that we knew some users had at home. We did notice that people were hesitant to spin in the chair regardless! In the future we may want to use a stationary chair instead, to observe how users respond in those cases.

Conducting the Study

Set up screencasting and screen recording

A challenge unique to testing in mobile VR is that you can’t see the user’s screen the way you can when testing in other media. To make sure we could see the user’s experience in real time, we used a mobile screencasting program. For iPhones, connecting to QuickTime Player is an easy way of doing this. For Android phones, we found the program Airmore served our purposes.

We also used QuickTime Player to screen record so that we could look back at specific moments. The cool thing about mobile VR is that the reticle gives you an idea of where the user is looking on the interface.

Have a backup plan in case of technical difficulties

Using so many programs at once invited technical difficulties. Running both mobile VR and screencasting puts significant stress on a phone; when we tested our setup, the screencasting program would quit roughly every 10–15 minutes.

To account for this, we made sure we knew how to resolve connection issues quickly. We also came up with a set of talking points for when we had to ask the user to take off the headset: while the main facilitator talks with the user, the other facilitator can fix the connection.

Schedule breaks

Ideally, the VR experience would be smooth enough that breaks wouldn’t be necessary. Unfortunately, the reality of strapping a headset on still isn’t the most comfortable, regardless of the VR application. After running a test session, we realized that 40 minutes of completing tasks in the headset might be too much. To account for this, we set aside time for breaks during the session. At the beginning of a session, we reassured people that they could let us know if they felt uncomfortable. We also made sure to have water and snacks on hand. Taking a break from tasks is also a great opportunity to ask some qualitative questions.

Make your testers feel as comfortable as possible!

An interesting aspect of mobile VR is that the user can’t see you in the session. During the sessions themselves, we put additional focus into keeping our tone of voice friendly and neutral.

We realized it may be disconcerting for the user to hear our typing noises whenever they made a comment. In future sessions, we plan to use pen and paper for notes. Setting up a spreadsheet of the tasks with a system to mark completion rate can also cut down on the notes needed.

Our color coded feature completion spreadsheet for a session. We used this spreadsheet to compare how successful users were at various tasks. This helped us prioritize the features we needed to tackle.
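Rolling that spreadsheet up into per-task completion rates is straightforward. This is a made-up sketch (the task names and pass/fail data are invented, not our actual results) showing the kind of summary that helped us prioritize:

```python
# Hypothetical per-task results across four sessions (True = completed).
results = {
    "Find a video":     [True, True, False, True],
    "Add to favorites": [False, False, True, False],
    "Adjust settings":  [True, False, False, False],
}

def completion_rates(results):
    """Fraction of sessions in which each task was completed."""
    return {task: sum(outcomes) / len(outcomes)
            for task, outcomes in results.items()}

# Tasks with the lowest completion rate bubble to the top of the fix list.
for task, rate in sorted(completion_rates(results).items(),
                         key=lambda kv: kv[1]):
    print(f"{task}: {rate:.0%}")
```

Sorting by completion rate surfaces the features that most need attention, which is exactly how we used our color coded spreadsheet.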

Overall, we learned a lot from our users through usability testing. VR is a new space for many people, which makes testing more important than ever. Even with our limited resources and simple setup, we were able to glean many insights from our participants. If you don’t have the time to go through detailed usability tests like these, just throwing your design into a headset and looking at it can be very helpful!

We hope our experience gives you a good idea of what to consider when usability testing in mobile VR. We’re excited to see how VR user experience evolves as people continue experimenting with VR!

Many thanks to Grant Lin and Anna Nguyen! Want to try out Fulldive VR? We’re on Google Play.