This year I made my first foray into AR/VR, and one of the things that’s most intrigued me is what UI/UX will look like in a future where there are no more “screens.”
When computers and smartphones were invented, screens were created as a way for us to interface with the virtual world. Virtual objects lived in their own space, disconnected from our reality.
Now, we’re entering a new age of AR/VR/XR where virtual objects are superimposed on top of the real world and feel like physical objects that we can interact with.
With that, we’re on the verge of a revolution in UI design.
Traditional 2D UI on the screen doesn’t work anymore.
Nearly every game you play on desktop has UI elements overlaid on top of the game. Since the monitor stays fixed on the desk, all you have to do is turn your head to see your health bar or ammo count.
Now imagine being in VR — the display is mounted to your head. Wherever you turn your head, the screen moves with you. That’s pretty annoying if the health bar is at the corner. Sure, you could swivel your eyes to look at it, but as humans we’re more naturally inclined to turn our entire heads when we want to see something.
Therefore, the first step into designing UI for VR is to put aside everything you know about game UI and rethink the process.
Place UI elements in world space.
In VR, you must pretend that there is no screen between you and the virtual world. You are in a place where the real and virtual worlds coexist. Whatever you can do in the real world should also work in the virtual one, so treat UI elements as if they were physical objects.
A common practice in VR games is to spawn a tablet on one hand that you can look at or interact with using the other hand.
You can also place a console on the floor with a display and buttons.
Remind yourself over and over again that you are in a 3D world and that there is no “screen.” Oscar Falmer, for example, realized that in a 3D world there’s no point in scrolling because content is not confined to a small smartphone screen. Instead, you could spread it out in 3D space.
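The hand-mounted tablet above can illustrate what “world space” means in practice: the panel is re-anchored to the tracked hand every frame, not to the headset. Here is a minimal sketch of that anchoring math; all of the names and the offset value are hypothetical, not any engine’s actual API.

```python
# Sketch: anchoring a "tablet" UI panel to the player's off-hand in world
# space instead of pinning it to the display. Names and the offset constant
# are hypothetical illustrations, not a real engine API.

def add_vec(a, b):
    return tuple(x + y for x, y in zip(a, b))

def scale_vec(v, s):
    return tuple(x * s for x in v)

def tablet_pose(hand_position, hand_forward, hand_up, offset=0.08):
    """Place the panel slightly above the palm (offset in meters), facing
    the same way the hand does, so it behaves like a physical object."""
    position = add_vec(hand_position, scale_vec(hand_up, offset))
    return {"position": position, "forward": hand_forward}

# Each frame, re-anchor the panel to the tracked hand, not the headset:
pose = tablet_pose(hand_position=(0.3, 1.2, 0.5),
                   hand_forward=(0.0, 0.0, 1.0),
                   hand_up=(0.0, 1.0, 0.0))
```

Because the panel’s transform is derived from the hand pose rather than the head pose, turning your head leaves it where it is, exactly like a physical tablet would behave.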
There are exceptions.
Health, for example, requires immediate attention from the player. Some VR games place the health bar smack-dab in the middle of your view, the way crosshairs are normally drawn — very obstructive.
A better approach is a fading red or blood-like texture that surrounds your view based on how much damage you’ve taken. This tried-and-true method is intuitive and not obstructive.
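One simple way to drive such a vignette is to map its opacity to missing health plus a short-lived flash on fresh damage. The sketch below is my own illustration of that mapping; the function names and tuning constants are assumptions, not taken from any particular game.

```python
# Sketch: driving a screen-edge "blood vignette" from damage instead of a
# fixed health bar. Names and tuning constants are hypothetical.

def vignette_alpha(health, max_health, recent_damage, flash_weight=0.5):
    """Opacity of the red edge overlay: a persistent base level from
    missing health, plus a brief flash proportional to damage just taken."""
    missing = 1.0 - health / max_health          # 0.0 at full health
    flash = min(recent_damage / max_health, 1.0) * flash_weight
    return min(missing + flash, 1.0)

full = vignette_alpha(100, 100, 0)    # nothing drawn at full health
hurt = vignette_alpha(30, 100, 20)    # heavily reddened view edges
```

The flash term would normally decay over a fraction of a second, leaving only the persistent low-health tint, so the player’s attention is grabbed without anything ever sitting in the center of their view.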
Desynchronize the avatar for positive reinforcement.
Most VR games have fully synchronized avatars, meaning however you move your hands, the avatar that you embody in the virtual world moves its hands in the exact same way.
If you want to achieve avatar embodiment, that sounds like the best approach, right?
Well, after a lot of playtesting, Survios actually learned that players found very little satisfaction in their in-game actions because the visual feedback did not meet their expectations. In a VR boxing game, for example, players felt underwhelmed that they were in the body of a professional boxer but punched like somebody with no boxing experience.
Let’s look at the following two VR boxing games.
In Thrill of the Fight, your avatar is fully synchronized with your actions. Wherever you move your hand, the avatar’s hands move with you. If you stop your punch mid-swing, the avatar also stops mid-swing.
Now look at Creed: Rise to Glory. Survios employs a tactic they call Phantom Melee Technology. Depending on the action taken, the avatar temporarily desynchronizes from the player. If you take a swing at the enemy, your avatar’s hands won’t stop mid-flight to match the true positions of your own hands; instead, they follow through like a punch swung by a real boxer and make a full-force impact. After the action is over, the avatar’s hand retracts and synchronizes back with the player’s. The end result is extremely satisfying and lets players feel truly immersed, as if they were real boxers.
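That desync-and-resync loop can be sketched in one dimension: during the committed swing the avatar’s hand holds its follow-through target, then blends back to the player’s tracked hand. The structure below is my own illustration of the idea, not Survios’ actual implementation, and all timings are made up.

```python
# Sketch of desynchronize-then-resync: during a committed punch the avatar
# hand follows its own trajectory, then lerps back to the tracked hand.
# Timings and names are hypothetical, not Survios' code.

def avatar_hand_x(tracked_x, swing_target_x, t,
                  follow_time=0.15, blend_time=0.2):
    """1-D avatar hand position at time t (seconds) after a punch commits:
    hold the full-extension target, then lerp back to the player's real
    (tracked) hand position over blend_time."""
    if t < follow_time:                      # follow through to full extension
        return swing_target_x
    blend = min((t - follow_time) / blend_time, 1.0)
    return swing_target_x + (tracked_x - swing_target_x) * blend
```

Note that the function is continuous at the hand-off point (blend starts at 0), so the player never sees the avatar hand teleport, only smoothly catch back up to where they are actually holding the controller.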
Here’s another good example from Survios’ Sprint Vector. When you attempt to touch an interactable object, like a ledge, the avatar’s hand automatically snaps to the object so that it doesn’t clip through, even if the player’s hands are physically too far ahead. In doing so, the player is given visual, reinforcing feedback that lets the player think “cool, I did the right thing.”
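Snap-to-object behavior like this is often just a distance check between the tracked hand and the nearest grab point. The sketch below is a hedged illustration of that idea under assumed names and an assumed snap radius, not Sprint Vector’s actual logic.

```python
# Sketch: snapping the rendered avatar hand to a nearby grab point so it
# never clips through geometry. Threshold and names are hypothetical.

def displayed_hand_pos(tracked_pos, grab_point, snap_radius=0.12):
    """Render the hand at the grab point when the tracked hand is within
    snap_radius (meters); otherwise render it where the player holds it."""
    dist = sum((a - b) ** 2 for a, b in zip(tracked_pos, grab_point)) ** 0.5
    return grab_point if dist <= snap_radius else tracked_pos

near = displayed_hand_pos((0.0, 1.0, 0.1), (0.0, 1.0, 0.0))  # snaps to ledge
far = displayed_hand_pos((0.0, 1.0, 0.5), (0.0, 1.0, 0.0))   # stays tracked
```

The design trade-off is that the displayed hand briefly lies about where the player’s real hand is, and, as with the punch follow-through, players read that small lie as positive reinforcement rather than as an error.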
At the moment, there are no accepted practices. What’s going to be the universal quit button for AR/VR experiences? Nobody knows yet, so designers and developers are getting creative.
Owlchemy Labs introduced the popular “exit burrito” concept in their game Job Simulator: you pick up a burrito and hold it to your face to take a bite, which quits the game. It’s not immediately obvious, but it’s fun, piques the player’s curiosity, and is in some ways intuitive.
A similar concept is used in a VR helicopter simulator, where the player can pick up a bucket and barf into it to quit the game.
Whatever technique you use needs to make sense.
As fun and as creative as the exit burrito is, it doesn’t make much sense if it’s used in a sci-fi or western game.
Chris said that when he played the tutorial for Thrill of the Fight, the trainer talked way too much. So purely out of curiosity, he walked up to the trainer and punched him in the face.
Guess what. It worked! The trainer was muted.
As with the exit burrito, it wasn’t perfectly clear, but the solution fits the theme of the game and could be figured out on the spot.
Accepted practices will take time to develop.
When Nintendo first released the N64, players were baffled by the three hand grips. I personally remember going to Target when I was a little kid to play the demos — I held the controller by the far-left grip and stretched out my thumb to reach the middle joystick. It was so uncomfortable. Once I realized I could hold the middle grip, I was absolutely mind-blown.
Fast forward a couple decades and now all controllers are nearly identical with two grips, a joystick and D-pad, triggers, and (almost) the same button layout on the right side.
Like video game controllers, UI/UX in AR/VR/MR will take time to develop. In the past few years, VR has made huge advancements. AR is next, and eventually we’ll achieve full immersion with MR.