As a kid, I wanted to be Ender Wiggin. Not because I wanted to save the world like the boy-wonder protagonist in Ender’s Game, but because I wanted to play what sounded like the best game ever: Two teams face off in a zero-gravity arena aboard a space station, with the objective of passing through the other team’s goal.
This summer, I finally got to play. There’s a new virtual reality game called Echo Arena that is a close replica of the game described in the book. Instead of passing through the other team’s goal, you throw a disc through it. And instead of stunning opponents with a gun, you punch them in the head. I’ve spent hours careening around in zero gravity, only to bump into my couch and remember I’m actually in my living room.
It’s a piece of the future I never expected to see in my lifetime. But the even more surprising thing is how well it works. It feels natural to slingshot myself off a block, trigger my thrusters for small trajectory adjustments, and calculate the future position of a wildly ricocheting disc.
If VR has taught me one thing, it’s how adaptable the human mind is. VR drops users into previously impossible situations where they can teleport, fly, and create matter out of thin air. For the most part, our brains don’t seem to mind. But that adaptability relies on carefully crafted design and close attention to users’ emotional and physical states.
As natural as it feels to sail through space in Echo Arena, it’s actually a tricky feat to make a virtual reality flying experience that doesn’t end with users feeling like they want to puke. Flying in loops in a massive EVE: Valkyrie spaceship battle made my stomach do flips like I was on a roller coaster. Windlands, a game where you use grappling hooks to swing between trees and explore an animated world, almost made me swear off aerial games for good.
Game developers know these are the early days of VR, and they are paying attention to users’ experiences. Michael Buckwald, CEO of Leap Motion, explains it like this:
We, as humans, have an innate understanding of the physics we encounter every day in the real world. Most people can pick up a basketball and have a general sense of how to throw it toward the hoop, plus what path it will take once it’s airborne. A physicist understands the mathematics behind a basketball shot, but that doesn’t make them a better basketball player.
Bringing those innate skills — which Buckwald calls direct natural interaction — into VR is the surest way to keep people immersed and confident. You can even apply them to things people can’t do in the real world, as is the case in Echo Arena.
The team behind Echo Arena at Ready at Dawn Studios, which also created a companion game called Lone Echo that follows your life as a space robot, knew they wanted to make a game unlike anything that already existed. They watched videos shot aboard the International Space Station and noticed the ease with which astronauts pulled themselves from room to room, according to Ru Weerasuriya, the studio’s chief creative officer and president.
The team began to design an experience that would simulate what it feels like to be an astronaut, but they ended up taking some liberties. They found it was more important to monitor when users’ bodies and minds became uncomfortable with the VR experience than to exactly replicate an astronaut’s movements.
Part of the answer is to measure the distance between the human eye and hand and stay true to that in virtual reality. If you’re looking at a water bottle, you can close your eyes, reach out, and still grab the bottle. We naturally know the relationship between our eyes and hands, and replicating it gives us confidence that we can move and act in VR and expect a reasonably predictable outcome. That means you can’t change someone’s body too much in VR, or you risk a disconnect between their real body and what’s happening in the virtual world.
“If you were to build a giant robot you can control, you wouldn’t want to play the robot. You would want to play the avatar that controls the robot,” says Weerasuriya. “As long as the thing you are controlling is more akin to you and not something that is so vastly different than you are, then you are already making the body comfortable in believing that what you built is something your brain should be comfortable with.”
Even if you get the look and feel of an avatar exactly right, it also matters how you make it move. Echo Arena calls for flying through an arena at high speeds. After a turnover, players need to make a 180-degree turn as quickly as possible. They can physically spin their body around and speed after the disc (that’s how I end up running into my couch), or they can use the joystick on their handheld controller to shift their field of view a few degrees at a time. The latter is safer and often faster.
The jerky joystick shift appears in all sorts of VR games. Our bodies don’t mind a disconnect in forward-back movement, but they are very sensitive to turning that doesn’t match up with the real world. It turns out that the little jerky shifts in field of view are fast enough that the inner ear doesn’t register the change or trigger discomfort. Continuous turning to the left or right, which is actually closer to how we turn in the real world, causes the inner ear to kick in.
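This snap-style turn can be sketched in a few lines. The angle step and joystick threshold below are illustrative assumptions (a common comfort default is around 30 degrees), not values from Echo Arena or any shipping game:

```python
# Minimal sketch of "snap" turning: rotate the camera in discrete jumps
# rather than continuously, so the inner ear never perceives sustained
# rotation. SNAP_ANGLE and STICK_THRESHOLD are assumed values.

SNAP_ANGLE = 30.0        # degrees shifted per snap
STICK_THRESHOLD = 0.7    # how far the joystick must be pushed

def snap_turn(current_yaw, stick_x, already_snapped):
    """Return (new_yaw, snapped) for one input frame.

    current_yaw     -- player's yaw in degrees
    stick_x         -- joystick x axis, -1.0 (left) to 1.0 (right)
    already_snapped -- True while the stick is still held from the last
                       snap, so one push produces exactly one jump
    """
    if abs(stick_x) < STICK_THRESHOLD:
        return current_yaw, False      # stick released; re-arm the snap
    if already_snapped:
        return current_yaw, True       # wait for release before snapping again
    step = SNAP_ANGLE if stick_x > 0 else -SNAP_ANGLE
    return (current_yaw + step) % 360.0, True
```

Called once per input frame, this produces exactly one discrete jump per push of the stick, which is why the shift feels instantaneous rather than like motion.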
Echo Arena does break one piece of advice its creators received: No fast movement.
Players can travel at 10 to 11 meters per second. The key to keeping them comfortable is keeping everything else in the arena stationary. The walls, blocks floating in the air, and goals all remain in place. Linden Lab, a San Francisco–based company famous for creating a social virtual world called Second Life, has found similar results in VR by blurring the corners of users’ vision.
“You need to be careful that there is not that much happening in your peripheral vision,” says Weerasuriya. “If you are in the middle of space and boost yourself at even 1,000 meters a second and there is nothing to really relate that information to your brain, you don’t feel that sick. But then again, if you are in a tunnel and going at that speed, you definitely feel sick.”
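Linden Lab’s corner blurring and Weerasuriya’s point both come down to the same trick: suppress peripheral motion cues when artificial speed is high. A simple way to express that idea (the speed range and field-of-view limits here are assumptions for illustration, not either studio’s actual values) is a vignette that narrows as speed rises:

```python
# Sketch of a comfort vignette: shrink the unobstructed field of view as
# artificial speed increases, hiding the peripheral motion cues that can
# trigger motion sickness. All constants are illustrative assumptions.

FULL_FOV = 1.0        # 1.0 = vignette fully open
MIN_FOV = 0.4         # fraction of the view left clear at top speed
COMFORT_SPEED = 2.0   # m/s below which no vignette is applied
MAX_SPEED = 11.0      # m/s, roughly Echo Arena's top player speed

def vignette_openness(speed):
    """Map current speed (m/s) to how open the vignette is (0..1)."""
    if speed <= COMFORT_SPEED:
        return FULL_FOV
    # Linear ramp from fully open down to MIN_FOV at top speed
    t = min((speed - COMFORT_SPEED) / (MAX_SPEED - COMFORT_SPEED), 1.0)
    return FULL_FOV - t * (FULL_FOV - MIN_FOV)
```

The empty tunnel Weerasuriya describes is the extreme case: with nothing in the periphery to begin with, the vignette is effectively always open and even 1,000 meters per second can feel fine.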
Bjorn Laurin, vice president of product at Linden Lab, was driving over the (real) Golden Gate Bridge when he found himself doing something strange: raising his hand and attempting to teleport.
The gesture comes from Sansar, a Second Life–esque universe built specifically for virtual reality. The only way to get around many parts of the universe is to walk. Or you can take a shortcut by teleporting.
Spend long enough in VR and many of the gestures and controls start to feel as real as those used outside a VR headset. Earlier this summer, I was at Leap Motion’s ultramodern office in San Francisco’s SoMa neighborhood. But I was also standing on a dark virtual plane inside a VR headset. By pinching and pulling at the air, I could create blocks and other shapes. I sent them up into the air and then back to the ground. Matter was born and bent at my command.
From Microsoft Paint to The Sims, technology has always given us a way to create something out of nothing. But VR is the first medium where developers are melding magic with real-world physics. It no longer feels like mixing and forming pixels. It feels real.
Right now, high-end VR headsets like Oculus Rift and HTC Vive use handheld controllers to bring users’ hands into the virtual world. Play long enough and they feel natural. They melt away and leave you with what feel like real digital hands. Meanwhile, Leap Motion is working on a hand-sensing system that lets you accomplish the same thing without any controllers at all. Just put your hands out and grab virtual objects without having to push any buttons.
It’s tempting to treat gesture controls as an entirely new language. Adding more gestures adds more things you can communicate to your VR headset. But Leap Motion has found that simpler is better. By using gestures that almost everyone on earth already knows—pinch, grab, throw—they’re easier and more natural to learn.
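A gesture like the pinch is typically detected by thresholding the distance between the thumb and index fingertips, with a little hysteresis so the grip doesn’t flicker when the fingers hover near the boundary. The sketch below is a generic version of that idea; the two thresholds are assumptions, not Leap Motion’s actual values:

```python
import math

# Sketch of pinch detection from tracked fingertip positions. Two
# thresholds (hysteresis) keep the pinch state from flickering on and
# off near the boundary. Threshold values are illustrative assumptions.

PINCH_START = 0.02   # meters: begin the pinch under 2 cm
PINCH_END = 0.04     # meters: release only once past 4 cm

def update_pinch(thumb_tip, index_tip, was_pinching):
    """Return True if the hand is pinching this frame.

    thumb_tip, index_tip -- (x, y, z) fingertip positions in meters
    was_pinching         -- pinch state from the previous frame
    """
    d = math.dist(thumb_tip, index_tip)
    if was_pinching:
        return d < PINCH_END    # stay pinched until fingers clearly separate
    return d < PINCH_START      # require a clearly closed pinch to start
```

Because the gesture maps onto something everyone already does with real objects, there is nothing to teach: the threshold logic stays invisible and the pinch just feels like grabbing.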
Buckwald says that’ll be even more important when augmented reality becomes widespread. Unlike virtual reality, which covers users’ entire field of view, augmented reality lays a virtual layer over the real world. It could replace all existing electronics — and change time and space, according to Buckwald. Users will need to be able to reach out and interact directly with their new mixed reality.
“There is a sense of us being able to adapt to new things and over time get more and more comfortable with it,” says Weerasuriya. “It’s absolutely true that our brain has the capacity to extrapolate much further than what we even think today.”