Get Moving: Locomotion in Virtual Reality

Matt Marshall
Published in Pixel Tours
Jun 23, 2017 · 8 min read

Vast landscapes of impossible worlds are the golden promise of virtual reality: to be fully immersed in another place with endless possibilities in every direction.

When building VR applications it’s easy to let your imagination run wild with visions of holodecks and bounding over mountains. The reality is that VR is still solving one of the fundamental design questions behind its promise:

How do users move in VR?

VR designers everywhere are experimenting with this problem every day. If you’re exploring VR applications for your business, understanding the ongoing work on movement in VR can make the difference between a great user experience that embraces the constraints of the medium and a poor one that buys into the golden promise but fails to deliver it.

Before talking about the design of movement, let’s examine the major types of VR consumption and their limitations.

Room-scale VR

The Lamborghini of current-generation VR. Tethered to a high-powered PC, the user can walk around a small 3 m² virtual area, with the raw horsepower of a good PC creating high-fidelity imagery. Additional tracking units (a.k.a. lighthouses) provide millimetre-accurate positional tracking not only for the headset but also for the handheld controllers, allowing the user to interact with the world.

Being able to see the controllers mirrored realistically in virtual spaces helps the user interact with the world.

The often-talked-about HTC Vive and Oculus Rift live in this category.

Mobile VR

Being able to throw a seemingly boring smartphone into a Google Cardboard is a great way to awaken an unsuspecting victim to the virtual revolution. In fact, it’s how we’re introducing VR into some of the world’s biggest architecture firms as a powerful sales tool.

Without positional tracking, most mobile VR experiences are what we call 3-DoF: three degrees of freedom. The user can look around within the world but not move through it: you can look down towards a ledge, but you cannot lean over it.
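
To make the distinction concrete, here’s a minimal sketch of how a 3-DoF headset differs from a positionally tracked one. The types are illustrative, not any particular SDK’s API: only the orientation ever reaches the virtual camera on a 3-DoF device.

```typescript
// Illustrative sketch: 3-DoF vs. 6-DoF head tracking.

interface Pose {
  rotation: [number, number, number, number]; // quaternion from the IMU
  position?: [number, number, number];        // only present on 6-DoF devices
}

interface Camera {
  rotation: [number, number, number, number];
  position: [number, number, number];
}

function applyHeadPose(camera: Camera, pose: Pose, sixDof: boolean): void {
  // Both classes of device report orientation, so looking around always works.
  camera.rotation = pose.rotation;

  // Only a positionally tracked (6-DoF) headset updates the camera's position;
  // a 3-DoF headset leaves it pinned, so leaning over a ledge does nothing.
  if (sixDof && pose.position) {
    camera.position = pose.position;
  }
}
```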

Later in 2017, this will change very quickly as Google implements more of their groundbreaking Project Tango technology into the Daydream platform. Mobile VR will start to transition into the next category:

Free Roam

Inside-out positional tracking uses computer vision to build a model of the environment and track the position of the headset. Because this doesn’t require outside tracking stations, users can wander through real spaces that are larger than their room-scale counterparts.

This might seem somewhat like that golden promise of VR, but the threat of obstacles in “meatspace” (the real world) interferes with immersion. This is why the free-roam experience better lends itself to Mixed Reality (MR) or Augmented Reality (AR), where objects in the real world become interactive in the virtual world.

Now that we’ve captured the flavours of current generation VR, there’s one glaring problem with the golden promise of infinite wondrous landscapes:

All the immersive power of VR is still bound by the walls and furniture of the real world and the limitations of technology.

While this might seem like a bit of a letdown for VR, there is still immersive power in being taken to another world. Considering the constraints is the starting point for a compelling user experience.

Exploring virtual worlds is still central to what makes virtual reality innovative and exciting, so let’s talk about all the creative ways VR is getting users to explore those worlds.

Gamepads

Virtual reality has been and will continue to be influenced by the gaming industry. It took a long time for games to refine the first-person controller, and now all the rules have changed with VR’s new interaction model.

Many VR experiences borrow the familiar first-person movement achieved with a traditional video game controller. The player moves their head freely about the room, but locomotion is achieved the same way video games have been doing it for a decade: by nudging a joystick.
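
For the curious, here’s a rough sketch of what that joystick locomotion boils down to. The names and axis conventions are assumptions rather than any engine’s real API: the stick input is rotated by the headset’s yaw so “forward” means “the way I’m looking”, then applied at a steady walking speed (constant velocity tends to be gentler on the stomach than ramped acceleration).

```typescript
// Rough sketch of head-relative thumbstick locomotion; flip signs to match
// your engine's axis convention.

interface Vec3 { x: number; y: number; z: number; }

const WALK_SPEED = 1.5; // metres per second

function gamepadLocomotion(
  rigPosition: Vec3,      // where the player's tracked space sits in the world
  headYawRadians: number, // horizontal facing of the headset
  stickX: number,         // thumbstick axes, each in [-1, 1]
  stickY: number,
  dtSeconds: number       // time since the last frame
): Vec3 {
  const forward = -stickY; // pushing forward reports a negative Y on most pads
  const strafe = stickX;

  // Rotate the (strafe, forward) vector around the vertical axis by the yaw,
  // so the player walks where they are looking.
  const sin = Math.sin(headYawRadians);
  const cos = Math.cos(headYawRadians);
  const dx = (strafe * cos + forward * sin) * WALK_SPEED * dtSeconds;
  const dz = (forward * cos - strafe * sin) * WALK_SPEED * dtSeconds;

  return { x: rigPosition.x + dx, y: rigPosition.y, z: rigPosition.z + dz };
}
```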

While novel at first, the controller presents some user experience problems:

  1. The user cannot see the controller, which is still a complex device for many and frustrating to casual users.
  2. Locomotion from the controller feels unnatural and can induce nausea in sensitive users, particularly when the camera accelerates, decelerates, or turns.

It’s worth examining why traditional locomotion is necessary for the experience and whether the gains in immersion outweigh the friction points above. It’s already unnatural to wear a headset on your face, and operating a controller you can’t see can overwhelm casual users who don’t hail from the gaming demographic.

Fortunately, VR has moved aggressively into custom controllers with position tracking, so the user can see virtual representations of them in the world.

Teleportation

Teleportation comes in many flavours, but the mechanic has been commonplace since Valve unleashed The Lab. Using a handheld controller, the player can point and teleport across the landscape. Usually, this is indicated with a long arcing tether towards a new glow-tastic spot on the ground.
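
A hedged sketch of how that arcing pointer can be computed, assuming the ground is simply the plane y = 0: treat the arc as a projectile launched from the controller, sample it forward in time, and use the point where it meets the ground as the teleport destination. Real implementations raycast each segment against level geometry and validate that the landing spot is actually standable.

```typescript
// Sketch of the parabolic teleport arc; constants are illustrative.

interface Vec3 { x: number; y: number; z: number; }

const ARC_SPEED = 8;    // initial "launch" speed of the arc, in m/s
const GRAVITY = -9.8;   // pulls the arc back down so distant spots stay reachable
const TIME_STEP = 0.02; // seconds per sample along the arc
const MAX_STEPS = 200;  // give up if the arc never comes down

function teleportTarget(controllerPos: Vec3, controllerForward: Vec3): Vec3 | null {
  let pos = { ...controllerPos };
  const vel = {
    x: controllerForward.x * ARC_SPEED,
    y: controllerForward.y * ARC_SPEED,
    z: controllerForward.z * ARC_SPEED,
  };

  for (let i = 0; i < MAX_STEPS; i++) {
    // Step the virtual projectile forward; the sampled points double as the
    // glowing tether the user sees.
    vel.y += GRAVITY * TIME_STEP;
    pos = {
      x: pos.x + vel.x * TIME_STEP,
      y: pos.y + vel.y * TIME_STEP,
      z: pos.z + vel.z * TIME_STEP,
    };

    if (pos.y <= 0) {
      return { x: pos.x, y: 0, z: pos.z }; // landing spot = teleport destination
    }
  }
  return null; // no valid target within range
}
```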

Some experiences modify the teleporter to suit the gameplay. The stealth-driven Budget Cuts lets you preview the area around your destination before you pull the trigger, so you can make tactical decisions before you commit.

The player’s orientation is retained and the transition is aided by a “blink” fade to help the brain retain spatial context. Cloudhead Games’ The Gallery evolves this idea a bit further, previewing your room volume and where the play-area bounds will land.
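
The blink itself is simple to reason about: fade to black, move the tracked play space so the chosen spot lands under the user’s feet, fade back in, and never touch orientation. The sketch below uses illustrative names (including a placeholder fade function), not a specific engine’s API.

```typescript
// Sketch of the "blink" teleport transition.

interface Vec3 { x: number; y: number; z: number; }

interface PlayerRig {
  rigPosition: Vec3; // origin of the tracked play space in the world
  headOffset: Vec3;  // head position relative to that origin, from tracking
}

// Placeholder fade; a real app would animate a black overlay over `seconds`.
async function fadeScreen(toOpacity: number, seconds: number): Promise<void> {
  await new Promise((resolve) => setTimeout(resolve, seconds * 1000));
}

async function blinkTeleport(rig: PlayerRig, target: Vec3): Promise<void> {
  await fadeScreen(1, 0.1); // quick fade to black hides the jump cut

  // Move the rig rather than the head: the head keeps its offset inside the
  // play space and its orientation, so the user arrives standing on the
  // chosen spot with spatial context intact.
  rig.rigPosition = {
    x: target.x - rig.headOffset.x,
    y: target.y,
    z: target.z - rig.headOffset.z,
  };

  await fadeScreen(0, 0.1); // fade back in at the new location
}
```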

Teleportation is widely used to work around the burden of physical walls and space, but it does present substantial UX hurdles:

  1. Users don’t have existing design grammar with the teleportation mechanic. Substantial onboarding scaffolding is required to make users comfortable. This is especially true for non-gaming users in business applications.
  2. The teleportation interface breaks immersion by highlighting the limits of the medium — often referred to as “hypermediation” or “alienation effect”, but best known as “breaking the fourth wall”.
  3. The design patterns for this technique are still emerging and the implementation varies from application to application.

Teleportation is a viable way to traverse large terrains, and waypoint-based teleportation also allows the experience to be curated around compelling or important areas. The user retains some freedom, but the experience stays directed enough to be useful in business applications that need to consider the casual user.

The Diegetic Room

Diegetic rooms make the physical space (meatspace) of the user a tangible piece of the virtual world, much like diegetic sound in film.

This can take the form of a floating platform, an elevator, or some sort of fixed space like a seated cockpit or an office cubicle. Locomotion is driven by the player piloting that virtually motorized object as a vehicle for their own meatspace (or the room doesn’t move at all).

These interactions achieve reasonable immersion because large-scale locomotion happens by directly interacting with the world through objects like elevator buttons, while smaller movements are the result of movement in meatspace. All controls are part of the virtual world, meaning there is no need for “non-diegetic” (foreign) interfaces to intervene when moving from point A to B.
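
In code terms, the idea reduces to parenting the player’s tracked play space to a moving object. The sketch below assumes a simple elevator-style platform with illustrative names: the in-world button sets a target floor, and the platform carries the rig with it while head tracking still handles small movements inside the room.

```typescript
// Sketch of diegetic-room locomotion: the platform carries the play space.

interface Vec3 { x: number; y: number; z: number; }

interface Platform {
  position: Vec3;
  targetFloorY: number | null; // set when an in-world elevator button is pressed
  speed: number;               // metres per second
}

// The button is an object inside the virtual world, so no non-diegetic UI is
// needed to trigger large-scale movement.
function pressFloorButton(platform: Platform, floorY: number): void {
  platform.targetFloorY = floorY;
}

function updatePlatform(platform: Platform, rigPosition: Vec3, dt: number): void {
  if (platform.targetFloorY === null) return;

  const delta = platform.targetFloorY - platform.position.y;
  const step = Math.sign(delta) * Math.min(Math.abs(delta), platform.speed * dt);

  // Move the platform and the player's rig by the same amount each frame.
  platform.position.y += step;
  rigPosition.y += step;

  if (Math.abs(platform.targetFloorY - platform.position.y) < 1e-4) {
    platform.targetFloorY = null; // arrived at the requested floor
  }
}
```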

These kinds of motorized interactions are also very familiar to users: flying a plane with a joystick, driving a car with a steering wheel, or pushing elevator buttons. Teleportation, by contrast, is an interaction we don’t encounter in everyday life and has to be learned when stepping into VR.

The downside is that not every experience allows for flying a plane or using an elevator. This technique also hinders immersion by creating the sensation of being trapped in a glass elevator.

World Manipulation

Another approach is to directly manipulate the world itself by sculpting it to the size and position you want. You achieve locomotion by shifting the world around you rather than by relocating your own meatspace representation.
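
One way to think about the mechanic, sketched below with illustrative names: while the grip is held, the whole world root follows the controller’s motion one-to-one, like dragging a map, so your hand pulls the landscape past you instead of relocating your body.

```typescript
// Hedged sketch of grab-and-drag world manipulation.

interface Vec3 { x: number; y: number; z: number; }

interface WorldGrabState {
  grabbing: boolean;
  lastControllerPos: Vec3 | null;
}

function updateWorldGrab(
  state: WorldGrabState,
  worldRootPosition: Vec3, // transform every scene object hangs off
  controllerPos: Vec3,
  gripHeld: boolean
): void {
  if (gripHeld && state.lastControllerPos) {
    // The world follows the hand one-to-one, like dragging a map.
    worldRootPosition.x += controllerPos.x - state.lastControllerPos.x;
    worldRootPosition.y += controllerPos.y - state.lastControllerPos.y;
    worldRootPosition.z += controllerPos.z - state.lastControllerPos.z;
  }

  state.grabbing = gripHeld;
  state.lastControllerPos = gripHeld ? { ...controllerPos } : null;
}
```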

The user will feel like a supreme being, but will also experience the fourth-wall effect. However, it works remarkably well in applications like Google Earth VR or Tilt Brush. Map-dragging is an interaction already familiar to users of Google Maps for quickly navigating across the Earth, and drawing with brushes is familiar to anyone who has used Microsoft Paint.

This locomotion category is quite intuitive for tools in VR. The design doesn’t need to meet immersive goals as those feelings will emerge naturally from creating and exploring within the virtual environment.

And for a bit of extreme world manipulation, check out Google Earth VR’s approach to the day/night cycle.

Astral Body

Why just limit things to world manipulation when VR allows for the bending of reality itself?

Astral Body (and other movement mechanics in this group) transitions between points of view to perform different functions, much like an “out-of-body experience”. Interact directly with the environment in first person, then switch to a third-person camera to pick up your avatar and move them to a different part of the environment.
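
As a rough illustration (not any particular title’s implementation), the mechanic can be modelled as a view-state toggle: interact in first person, pop out to an “astral” camera to reposition the avatar, then drop back into the body.

```typescript
// Sketch of an astral-body view toggle; all names are illustrative.

interface Vec3 { x: number; y: number; z: number; }

type ViewMode = "first-person" | "astral";

interface AvatarState {
  mode: ViewMode;
  avatarPosition: Vec3; // where the body stands in the world
  cameraPosition: Vec3; // where the user currently views from
}

function toggleAstralView(state: AvatarState): void {
  if (state.mode === "first-person") {
    // Pop the camera out above the scene, leaving the avatar where it stands.
    state.mode = "astral";
    state.cameraPosition = {
      x: state.avatarPosition.x,
      y: state.avatarPosition.y + 10,
      z: state.avatarPosition.z + 10,
    };
  } else {
    // Drop back into the body wherever it was last placed.
    state.mode = "first-person";
    state.cameraPosition = { ...state.avatarPosition };
  }
}

// While in the astral view, "picking up" the avatar is just setting a new
// position before returning to first person.
function placeAvatar(state: AvatarState, newPosition: Vec3): void {
  if (state.mode === "astral") {
    state.avatarPosition = { ...newPosition };
  }
}
```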

It’s clever and we want to play SimCity in VR this way right now, but it does present significant UX obstacles. It can be difficult for non-gamers who now need to learn multiple interaction languages based on different points of view and how those systems relate to each other.

It’s a big ask for casual users and requires substantial onboarding scaffolding, as well as testing of that scaffolding.

Hardware Locomotion

Hardware innovation could solve VR’s locomotion problem, but not yet.

In our work with Yulio, we’ve found that even room-scale rigs like the HTC Vive can be intimidating to deploy in the real world as sales and collaboration tools without constant technology support. Adding locomotive hardware amplifies that intimidation factor and makes sales and design teams less likely to adopt VR in their workflow.

What about Mobile VR?

Mobile VR occupies the largest install base through products like Google Cardboard and Samsung Gear VR. As Daydream and Project Tango-like technologies become more common in everyday smartphones, locomotion mechanics will mirror their room-scale counterparts.

In the meantime, locomotion in mobile VR is a clumsy affair. Diegetic rooms like cockpits and elevators are effective because they creatively negate the locomotion problem.

In our UX Design work with Yulio, we implemented a gaze-based hotspot navigation mechanic that was simple for the user to learn, and allowed sales teams to focus on telling the story while using VR to convey space and feeling to increase impact.
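
Purely as an illustration of the pattern (not Yulio’s actual implementation), a gaze-based hotspot mechanic can be reduced to a dwell timer: when the user’s gaze stays on a hotspot for a moment, the viewer travels to it. The angular check below is a stand-in for a proper ray hit-test, and the names and constants are assumptions.

```typescript
// Sketch of gaze-based hotspot navigation with a dwell timer.

interface Vec3 { x: number; y: number; z: number; }

interface Hotspot { id: string; position: Vec3; }

const DWELL_SECONDS = 1.5;   // how long the gaze must rest before moving
const GAZE_CONE_DEGREES = 5; // how close to the view centre a hotspot must be

function angleBetween(a: Vec3, b: Vec3): number {
  const dot = a.x * b.x + a.y * b.y + a.z * b.z;
  const len = (v: Vec3) => Math.hypot(v.x, v.y, v.z);
  return (Math.acos(Math.min(1, Math.max(-1, dot / (len(a) * len(b))))) * 180) / Math.PI;
}

// Returns the hotspot to travel to once the dwell timer completes, or null.
function updateGazeNavigation(
  viewerPos: Vec3,
  gazeDirection: Vec3,
  hotspots: Hotspot[],
  dwell: { targetId: string | null; seconds: number },
  dt: number
): Hotspot | null {
  // Find a hotspot near the centre of the user's view, if any.
  let gazed: Hotspot | null = null;
  for (const h of hotspots) {
    const toHotspot = {
      x: h.position.x - viewerPos.x,
      y: h.position.y - viewerPos.y,
      z: h.position.z - viewerPos.z,
    };
    if (angleBetween(gazeDirection, toHotspot) < GAZE_CONE_DEGREES) {
      gazed = h;
      break;
    }
  }

  // Reset the timer whenever the gaze moves off the current target.
  if (!gazed || gazed.id !== dwell.targetId) {
    dwell.targetId = gazed ? gazed.id : null;
    dwell.seconds = 0;
    return null;
  }

  dwell.seconds += dt;
  return dwell.seconds >= DWELL_SECONDS ? gazed : null;
}
```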

Conclusion

There is plenty of innovation still to be done in locomotion. The variety of experimentation, paired with the lack of established design patterns for VR, makes for a difficult user experience, especially when users are already grappling with nausea and other side effects of VR.

Ask yourself what the core immersive experience needs to be and how much locomotion is truly needed to support that experience. As users become more familiar with virtual reality, the locomotion toolkit will expand.

Thinking about user movement early in the design process can help constrain the blue sky dreaming that the golden promise invites, and create applications that make a true impact for your users.

What is Pixel Tours?

Pixel Tours is a product design and strategy consultancy based in Toronto. We bring sensible UX and technology intelligence to complex digital products from web to mobile to virtual/mixed reality.

Interested in working with us? Say hello!
