Gear VR Experience Design Handbook
Note: VR is a constantly evolving platform & best practices are still being defined as the ecosystem develops. This guide was created for the NYC VR Game Jam in April 2016. For Unity development tips, check out this article
1:1 design is about making everything you design in VR as close as possible to how a user would interact with it in real life. If your player is sitting in real life, treat that as a design constraint: should they also be sitting in your game? If your game is designed around full 360° movement, then you should probably prompt them to stand up, or at least sit in a swivel chair.
However, this 1:1 matching of the real and the virtual is limited by current technology. The Gear VR doesn’t have hand controllers or positional tracking, so you will need to emulate actions like shooting, throwing, or walking with the Gear VR touchpad or a Bluetooth game controller.
Esper did a very good job of using the touchpad on the side as a constraint for the game design. The user touching their temple to move things with their mind was clearly inspired by the way we’ve been shown mind readers practice their craft.
There’s still plenty of room for novel experimentation in VR, specifically in how to move people around a space when they can’t walk on their own. At the moment there seem to be three main strategies for navigation in mobile VR.
- Waypoints In this strategy the user looks at a beacon in the space, taps the touchpad, and then floats to the beacon. This is best illustrated in Land’s End, the VR game from ustwo. Movement from one beacon to the other should be graceful and constant. No sudden movements or tilting of the horizon.
- Teleportation The player looks at a new point in space and taps the touchpad. There’s a quick “blink” and the player is in their new position. This has become one of the most common standards so far, probably because it avoids motion sickness and is fast for moving around larger spaces.
- Gaze movement The player is either constantly moving, or moves when the touchpad is pressed. The player moves forward in the direction they’re looking. The Night Cafe is a good example of a VR experience that uses this style of navigation. This style is good for a game with a wandering exploratory nature.
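The waypoint style above calls for movement that is “graceful and constant,” with no sudden accelerations. A minimal sketch of that idea, in Python with hypothetical names (`step_toward` is our own helper, not any Unity or Oculus API):

```python
import math

def step_toward(position, target, speed, dt):
    """Move `position` toward `target` at a constant speed each frame,
    so the camera glides to the beacon with no sudden velocity changes."""
    delta = [t - p for p, t in zip(position, target)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist <= speed * dt:
        return list(target)  # close enough: snap to the beacon and stop
    scale = speed * dt / dist
    return [p + d * scale for p, d in zip(position, delta)]

# Glide from the player's position to a gazed-at beacon, frame by frame.
pos = [0.0, 0.0, 0.0]
beacon = [3.0, 0.0, 4.0]                 # 5 units away
for _ in range(100):                      # 100 frames at dt=0.1, speed=1.0
    pos = step_toward(pos, beacon, speed=1.0, dt=0.1)
```

The key design point from Land’s End carries over: constant velocity (rather than an easing curve that accelerates) and an unchanged horizon are what keep this style comfortable.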
If you want to do mobile VR game development, the Gear is, in our opinion, THE place to do it at the moment. Sure, there are tons of Cardboards out there…but they have pretty big UX limitations: a single-button interface & an overall subpar experience.
Mobile VR will never be PC VR. But that’s not all bad. There are significant advantages to being on a mobile-based platform.
To name a few:
- No wires!
- Much cheaper for the end user
- Portability (Take your game around with you and show it off / game test it)
- Connect to friends very easily (sms, twitter, etc are typically already authenticated on the phone and ready to be tapped into)
Remember that you are blindfolding your user & then completely taking over what they can see. As such, traditional 2D or 3D UI should be rethought. Think about how you can use the world & its objects to progress the story. Teach players your game mechanics by having them actually do things. Use text and panels sparingly. People don’t learn how to cross the street from signs…they learn from watching others & doing it themselves.
Reticles can be implemented in many different ways, but they are vital to getting a player to understand how to interact in game. It’s been fairly standard in mobile VR to use an “always on” reticle, but more sophisticated methods of reticle display have emerged and are clearly going in the right direction.
For example, Land’s End needs the player to look at targets not only to move, but to solve puzzles. However, it only shows the reticle when it is within a small radius around the target. Furthermore, when the reticle is inside this target area, a very small arrow helps direct the player even further toward the bullseye.
By removing the reticle from view when it’s not actively being used for interaction, the player is given an unobstructed view to take in the beautiful landscapes in the game.
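That proximity-gated reticle can be sketched as a small pure function. This is our own illustration of the behavior described above (the names `reticle_state` and `show_radius` are hypothetical, not from any game's code):

```python
import math

def reticle_state(gaze_point, target_center, show_radius):
    """Return (visible, hint): the reticle is only visible when the gaze
    point is within `show_radius` of a target, and while visible, `hint`
    is a unit vector pointing toward the bullseye (the small arrow)."""
    dx = target_center[0] - gaze_point[0]
    dy = target_center[1] - gaze_point[1]
    dist = math.hypot(dx, dy)
    if dist > show_radius:
        return False, None               # far from any target: hide it
    if dist == 0:
        return True, (0.0, 0.0)          # dead on the bullseye
    return True, (dx / dist, dy / dist)  # arrow toward the center

# Gazing 1 unit right of a target with a 2-unit show radius:
visible, hint = reticle_state((1.0, 0.0), (0.0, 0.0), show_radius=2.0)
```

The payoff is exactly the trade described above: outside the radius the view stays unobstructed; inside it, the player gets both a reticle and a directional nudge.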
A shooting game like Gunjack does the opposite: when you’re controlling a huge gun turret, it makes sense to have a targeting reticle in your face at all times.
In short, reticles should either enhance the user’s understanding of the controls or get out of the way & let them see the world you created!
Don’t forget about Sound!
Sound design has historically (& sadly) been low on the totem pole as far as game development is concerned. It’s very easy to grab a bunch of assets from the Unity Asset Store and just make everything go “pew-pew” at the last minute. But when sound is done well, it sticks: many traditional games have sounds & music that are burned into our subconscious for all time.
In VR, sound can make or break immersion. If a sound isn’t correctly spatialized, doesn’t fit within the space, or is simply poor quality, it will pull your player out of the story you’re trying to tell. Budget time for sound to be properly designed & placed. If the headset is the blindfold, the headphones are the earmuffs.
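To give a feel for what “correctly spatialized” means at its simplest, here is a very rough stereo sketch in Python: gain falls off with distance and pan follows the source’s angle relative to where the listener is facing. This is a toy model of our own (in practice you would use an engine’s built-in spatializer, e.g. Unity’s 3D audio, rather than hand-rolling this):

```python
import math

def spatialize(listener_pos, listener_yaw, source_pos, ref_dist=1.0):
    """Toy stereo spatialization on a 2D (x, z) plane: inverse-distance
    gain clamped at ref_dist, equal-power pan from the relative angle."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dz)
    gain = min(1.0, ref_dist / dist) if dist > 0 else 1.0
    # Angle of the source relative to the listener's forward (+z at yaw 0).
    angle = math.atan2(dx, dz) - listener_yaw
    pan = math.sin(angle)                 # -1 = hard left, +1 = hard right
    left = gain * math.sqrt((1.0 - pan) / 2.0)
    right = gain * math.sqrt((1.0 + pan) / 2.0)
    return left, right

# A source directly to the right (+x) of a listener facing +z:
left, right = spatialize((0.0, 0.0), 0.0, (1.0, 0.0))
```

Even this crude model shows why spatialization matters: a sound that stays centered while its source sits off to your right is exactly the kind of mismatch that breaks immersion.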
Some great mobile VR games
Keep Talking and Nobody Explodes : Super smart game design that came out of another hackathon. It designs around the fact that only one person can use the headset at a time…yet turns that constraint into a multiplayer game.
Esper/Esper 2 : Really nice puzzle game that intelligently utilizes the touchpad on the Gear VR.
Gunjack : The team that made EVE built this very nice arcade shooter. It looks amazing and is a fun and simple experience.