Designing VR Tools
The Good, the Bad, and the Ugly
VR is a double-edged sword. While many experiences can feel like a digital wonderland, one misstep can create a nausea-inducing assault on the senses. Which one will you create?
At a recent Designers + Geeks talk, Jody Medich and Daniel Plemmons talked about some of the discoveries our team has made (and the VR best practices we’ve developed) while building VR experiences with the Oculus Rift and the Leap Motion Controller.
Here are some of the essential insights they want you to know about designing VR tools and experiences:
0:00. As designers, we’re in the same boat as we were in 1994 with web design, because we’re still trying to figure out the basic principles involved. We don’t know what the equivalent of the blue underlined link is yet.
1:58. There are two types of products emerging: minimum viable products (e.g. Google Cardboard and the DK2), which focus strongly on one aspect of the experience, and big bets (e.g. HoloLens and Valve) that use multiple sensors to anchor content to the real world.
3:35. What is AR/VR bad at?
- Legible text. Due to refresh rates and lensing effects, the eye finds it hard to move comfortably along lines of text. Rendering text on concave surfaces, bumping up anti-aliasing, and increasing contrast all help. This is getting better with the next generation of headsets.
- Tactile feedback. Fortunately, sound effects and visual animations can help to replace the feeling of touch.
- The real world. Moving around in a real-world space can be dangerous with no real-world positional cues, so it’s essential for developers to think about building experiences that limit real-world locomotion. Again, this is being advanced with the next generation of devices.
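The "concave surfaces" tip above can be sketched in code: instead of laying text on a flat plane, place each glyph along an arc centered on the viewer so every character sits at the same distance from the eye. The function name and parameters below are illustrative, not from the talk; a real implementation would position character quads in your engine of choice.

```python
import math

def arc_layout(num_glyphs, radius=2.0, arc_deg=60.0):
    """Place glyphs along a horizontal arc centered on the viewer.

    Returns a list of (x, z, yaw_deg) tuples: each glyph sits on a
    cylinder of the given radius and is yawed to face the origin,
    so every character is equally distant from the eye.
    """
    if num_glyphs == 1:
        angles = [0.0]
    else:
        step = arc_deg / (num_glyphs - 1)
        angles = [-arc_deg / 2 + i * step for i in range(num_glyphs)]
    positions = []
    for a in angles:
        rad = math.radians(a)
        x = radius * math.sin(rad)
        z = radius * math.cos(rad)  # viewer looks down +z
        positions.append((x, z, -a))  # rotate glyph back toward the viewer
    return positions
```

Because every glyph is the same distance from the eye, the user's gaze sweeps along the line without refocusing, which is exactly what flat panels in VR get wrong.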
10:14. What is AR/VR good at?
- Space. We’re now able to put digital objects into the real world, and the implications are staggering. Our brains are designed at a fundamental level to work in a 3D world, and VR can take advantage of this deep level of understanding.
- Spatial relationships. Human beings understand how different objects relate to each other by their relationships in 3D space. Jody shows how NASA scientists are using 3D techniques to create maps of Martian landmarks that change how cartographers think about the space.
- Multitasking. There’s a massive productivity gain in letting users spatially organize their tasks. This can lead to a 40% improvement in productivity, and VR offers an even larger canvas.
- Simulations and situational therapy. Elite Dangerous is an amazing example of how we can be dropped into normally inaccessible environments. VR can also help people recover from trauma by creating safe simulated spaces.
- Fake limbs. Human spatial cognition is surprisingly flexible, and VR can bend your mind in powerful ways. One study involved getting people to adapt to having a third arm.
18:15. Best practices for VR design
- Don’t make me sick! Latency, taking camera control from the user, moving horizon lines — these are all things that can cause sim sickness.
- Virtual safety goggles. Objects should always be a comfortable distance away, so the user’s eyes can adjust and they don’t feel like they’re about to get stabbed.
- It’s more like designing a room full of tools than a screen with buttons. Thinking about human ergonomics and 3D affordances is essential to creating a smooth experience.
- Iterating on Arm HUD. Our UI Widget for Unity had some interesting challenges along the way. (Get the full story in our blog post.)
- Affordances. This concept from industrial design is a key problem for VR designers. (More on that in this infographic.)
- Depth is an illusion. Lighting cues, shadows, blurring the background, and other tricks from the cinema trade can help reinforce the sense of 3D space.
- Use sound. Since VR designers don’t have control of the camera, we need to help guide users towards UI elements. Sound in VR is also a great way to reinforce the sense of space and the perceived responsiveness of buttons and other widgets.
- Pay attention to the user. We can combine different sensors to gain a new understanding of the user’s intentions.
- Use human scale. This helps immerse the user in the scene and take advantage of the cognitive powers we mentioned earlier. (You can play with this rule, though — sometimes it can be fun to be Alice in Wonderland.)
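The "virtual safety goggles" rule above is easy to enforce mechanically: before showing an object, clamp its distance from the head into a comfortable band while keeping its direction unchanged. This is a minimal sketch; the near and far limits below are assumed values for illustration, and real projects should tune them against their headset vendor's comfort guidelines.

```python
def clamp_to_comfort_zone(pos, near=0.75, far=3.0):
    """Clamp a position (relative to the head, in meters) so it never
    sits closer than `near` or farther than `far` from the user's eyes.

    The direction toward the object is preserved; only the distance
    along that ray is adjusted.
    """
    x, y, z = pos
    dist = (x * x + y * y + z * z) ** 0.5
    if dist == 0.0:
        return (0.0, 0.0, near)  # degenerate case: push straight ahead
    clamped = min(max(dist, near), far)
    scale = clamped / dist
    return (x * scale, y * scale, z * scale)
```

Running every spawned UI element through a check like this keeps content out of the "about to get stabbed" zone without changing where the user sees it.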
42:50. Ease people into AR/VR
We have profoundly powerful devices, and it’s possible to create wonderful applications or an assault on the senses. Even experienced users should be eased into the experience (e.g. skydiving demos that start you off riding in an airplane, looking at a wall a few feet away).
45:40. We all know what blue, underlined text means
But “the hyperlink” isn’t something hardwired into our brains. It’s a common piece of our visual language that has built up over time. Right now, we’re just starting to build these commonly accepted cues. Progressive disclosure, as we see in the early parts of most games, is a key technique for VR designers.
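Progressive disclosure as described above can be sketched as a simple gate: each affordance becomes visible only after the user demonstrates the previous one, the way games unlock mechanics in their opening levels. The stage and milestone names here are invented for illustration; the pattern, not the names, is the point.

```python
class ProgressiveDisclosure:
    """Reveal UI affordances one at a time, gated on user milestones.

    Stage names are hypothetical examples. Each new capability
    unlocks only after the user has performed the previous one,
    mirroring the tutorial structure of most games.
    """

    STAGES = ["look_around", "point", "grab", "full_menu"]

    def __init__(self):
        self.unlocked = 1  # start with only the first affordance visible

    def complete(self, milestone):
        """Record a milestone; unlock the next stage if it's the current one."""
        idx = self.STAGES.index(milestone)
        if idx == self.unlocked - 1 and self.unlocked < len(self.STAGES):
            self.unlocked += 1

    def visible(self):
        """Return the affordances the user should currently see."""
        return self.STAGES[: self.unlocked]
```

The key design choice is that completing a later milestone out of order does nothing: users can't skip ahead into an interface they haven't been taught to read yet.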
More from Jody and Daniel:
- What Would a Truly 3D Operating System Look Like? The next giant leap in technology involves devices and interfaces that can “speak human,” with 3D operating systems that let us unlock our brainpower.
- Introducing Planetarium: The Design and Science Behind Our VR Widgets Showcase. As an amateur astronomer, there are a lot of limits to what you can see through inexpensive telescopes. But what if you could explore the sky in VR?
- Quick Switch is Here! Swipe Between VR and the Real World in Unity. Our Unity assets include Prefabs that make it easy to integrate Quick Switch functionality into any Unity VR application.
- Rethinking Menu Design in the Natural Interface Wild West. Over the years, traditional menu design has developed along the lines made possible by hardware. Here’s what we’ve learned building menus for Leap Motion.
Originally published at blog.leapmotion.com on May 15, 2015.