Virtual, Meet Reality: AR Will Change The Way We Experience The World
Today Apple unveiled its latest devices. But for those paying close attention, a few references were made to a recently announced, under-the-hood developer feature that’s bound to be much more significant over time. It’s called ARKit, and it’s going to set the course for a radically different future where people and technology interact in exciting new ways.
“Reality” is our massive, messy, physical three-dimensional world. We are hardwired for this world — our brains adapted to sensing, interacting with, understanding and navigating it. Yet nearly all of our digital interactions (which account for an increasingly large percentage of our time) happen on flat, 2D screens. The content and experiences we rely upon to work, communicate, entertain and express ourselves are all trapped inside the tiny portals we carry in our pockets, like little alternate universes we visit through a 5" window.
iOS 11 sets the stage for a completely different future. For the first time, hundreds of millions of phones will suddenly be able to project pieces of those digital universes back out into the real one all around us.
This excellent piece by Matt Miesnieks dives into the technical details of how ARKit, which relies on a technique called Visual Inertial Odometry, actually works. In short: using little more than a built-in camera and gyroscope, plus a dash of algorithmic wizardry, your phone will be able to seamlessly pinpoint and track a real position in 3D space and then overlay digital content using that position as an anchor. The real world suddenly becomes the canvas for an infinite array of information.
Beautiful AR apps already exist, like those that let you explore celestial bodies in the night sky.
The Pokémon Go craze turned the real world into a treasure hunt for imagined virtual creatures.
What’s different now is that every developer will be able to build these experiences without specialized skills. In classic Apple fashion, the company took something complex and made it unbelievably easy. A huge barrier to entry can now be crossed with just a few lines of code.
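To make that concrete, here is a minimal sketch of what those few lines look like in practice, assuming iOS 11 with ARKit and SceneKit. The class name and cube placement are illustrative, not from any Apple sample; the point is that world tracking and anchored 3D content come almost for free.

```swift
import ARKit
import UIKit

// Illustrative view controller: starts world tracking and anchors a
// virtual cube one meter in front of where the camera began.
class ARDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // All of the Visual Inertial Odometry machinery hides behind
        // this single configuration object.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)

        // A 10 cm cube, positioned 1 m in front of the session origin.
        // ARKit keeps it pinned there as the user moves the phone.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                            length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0, -1)
        sceneView.scene.rootNode.addChildNode(cube)
    }
}
```

That's essentially the whole app: no computer-vision code, no sensor fusion, no calibration.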
Kevin Kelly, the brilliant technology philosopher, has spent years studying the history and pondering the future of technology. One of his most salient points is about ubiquity. In his book, What Technology Wants, he asks: “In the course of evolution every technology is put to the question…What happens when everyone has one?” Here’s what I think happens:
First Order Effects
- Shopping for physical products will change. AR lets our brains work the way they were supposed to, by seeing an object as it would appear in the real world. This is especially important for items like furniture, art, and clothing, where “fit” in a space (or on a body) is critical. Since many products are already designed with 3D software, a few simple steps will make it easy to visualize them in the physical world and close the imagination gap for customers.
- Physical locations will become more interactive. Imagine seeing a piece of history right before your eyes, overlaid on the present day. Or uncovering a secret message hidden in an alley just for you (taking the idea of Geocaching to an entirely new scale). This brings new opportunities to retailers and other physical venues, which can create exclusive experiences linked to their actual location. Consumers will come to expect more information and richer experiences everywhere they go.
- Many experiments will be tried. AR is an entirely new way to experience the world. Constraints on capability and the huge number of environmental variables will require a new way of thinking about design. Companies will create amazing demos that never amount to more than a failed feature. Others will get lucky with an unexpected hit that flips our common understanding of how to build successful applications. (Follow Made With ARKit on Twitter for a glimpse of the possibilities.)
Second Order Effects
- Our phones will begin to not just see, but understand the world around us. In order for ARKit to function, it needs to determine some basic things about a scene, like what surfaces are flat and how the scene is lit. With the advent of depth sensors and machine learning, scene understanding comes next. Contextually-aware apps will not just project over a scene, but interact with it in much richer ways.
- Humans will be holoported. The depth cameras coming to our phones to help them understand the world will also let them capture it in 3D. This is already possible with expensive professional setups, but will be available on devices in the near future. ARKit lays the groundwork to project captured people into nearly any environment. Lady Gaga or your Aunt Linda will literally be able to show up on your living room floor.
- AR on your phone sets the stage for AR on your face. ARKit is just a glimpse of what’s to come. There’s no denying some of the UX challenges in designing for this medium, especially when it requires a user to take an object out of their pocket and hold it up in front of the world. As interactions are refined and consumers accept and expect AR experiences in their daily lives, it will become a natural progression to desire those experiences more seamlessly. Several hardware limitations make this challenging today, but don’t be surprised in a few years when Apple launches its first pair of glasses. The company will already have products in your ears (AirPods) and tracking your hands and heart rate (Watch). And soon your senses will be sent signals that continuously blur the line between what’s real, and what’s not. This sci-fi future is much closer than you might think.
The race is on. Shortly after Apple’s announcement at WWDC this summer, Google answered with a preview of ARCore. Facebook announced AR Studio. Microsoft has been working on its own head-mounted AR experience in the form of HoloLens for several years now. No matter what, it’s going to be interesting.
My partner Josh likes to say we have 0% certainty of what our next investment will be, but 100% confidence in where it will come from: the problems and possibilities being explored at the boundaries of our own portfolio. No fewer than eight of our companies are eagerly experimenting with this tech, with applications ranging from consumer commerce to enterprise training to entertainment that rivals Hollywood studios. Needless to say, we’re paying close attention.