Apple released another part of their AR headset

…and no one noticed.

Neil Gupta
9 min read · Dec 12, 2019

Everyone is waiting for Apple to release their augmented reality headset, but it’s possible Apple has been slowly building their augmented reality headset for years now and distributing pieces of it to us over time.

In 2016, Tim Cook told analysts, “AR can be really great. We have been and continue to invest a lot in this. We are high on AR for the long run.” At the end of that year, Apple released their first-generation AirPods. Since then, Apple has gone on to sell 50M+ units of the audio and voice component of an augmented reality headset they will one day release.

There are a lot of definitions for augmented reality, so let’s start with the simplest one I can think of: augmented reality is about adding a layer of magic to the world we experience.

In The Beginning

You could say sunglasses were the first augmented reality devices. They took us from being blinded by the sun to being able to see! Or maybe prescription glasses were the first AR. Some of us may take it for granted, but giving people with poor vision the ability to see with a couple of curved lenses is pure magic. And then there are transition lenses, which can even react to the environment! The thing is, expectations for technology change over time. So let's modify our definition slightly so it feels more in line with what we envision today:

Augmented reality is about adding a (digital) layer of magic to the (physical) world we experience.

Most of what has captured the media's and the public's attention has been things that augment only what we see: gender swap filters, Pokémon GO, and trying on digital objects at home before you buy.

Snapchat lets users digitally try before they buy via its augmented reality features

It's important to note that augmented reality is not just an additional layer on top of what we see; it's a layer on top of everything we experience: what we see and hear and feel and taste and smell. You know what was adding digital magic to our physical experiences long ago? A record player.

Digital experience superimposed on the physical world, but where’s the augmented reality?

If the only time you heard music was when you went to a live performance, this piece of technology had to feel like pure magic. But again, expectations change with time, and sitting next to a record player doesn't really feel like the augmented reality future we're all picturing. So, yes, augmented reality is about adding a layer of digital magic on top of the world we naturally experience, but practically speaking there are three components of AR that have to keep pace with society's expectations: form factor, user interface, and the experiences generated.

AR = Form Factor + UI + Experience

Augmented reality promises convenience and technology that either disappears or is fashionable. It's clear that record players don't meet this bar. How about the Sony Walkman? It certainly comes close: the form is there in terms of portability, but we still had to carry physical cassettes. The UI was lacking, with only limited track controls, and the experience itself was extremely limited because you only had as much music as the number of cassettes you felt like carrying around with you. I do concede that no technology has come close to matching the magic of making or receiving a mixtape of songs recorded off the radio for, or from, a crush. But in almost all other ways, the Walkman was cool but not enough.

And Then It Clicked

Then came the iPod, and it changed the game. It improved on the Walkman's form because there was no more separate physical media to carry. And yes, MP3 players already existed, but none had solved the formula for:

Form Factor + UI + Experience

For form, they were portable but had to sacrifice storage to get there, and the UI was just uninspired track controls and a simple display with no haptic feedback. Steve Jobs (or possibly Jon Rubinstein, then Apple's senior VP of hardware) famously used Toshiba hard drives to solve the form problem and let users store 5 GB, or about 1,000 songs, on the device, making the iPod feel like effectively infinite music storage in a portable form factor. Apple also applied their world-class design skills to the UI, letting users choose and control music in a way that felt delightful with each haptic click of the wheel. Delightfully choosing from an effectively infinite source of your favorite music satisfies the experience part of the equation as well. It all came together to feel like magic. It was society's first taste of a modern augmented reality experience, and we loved it.

The power of form factor + UI + experience

Again, portable music players already existed, but Apple's ability to package form, UI, and experience into something magical is what separated their product from the rest of the devices, and it remains the source of their excellence today.

Sidebar: I say AR is about adding a layer of magic, and magic only works if you believe it’s real. When you consider that one of the goals for augmented reality is to digitally recreate something so perfectly that you believe it’s physically there, audio is way ahead of visuals. I can put on a headset and it will sound like the Beatles are really right there, but you’re going to have a hard time showing me any technology that makes it really look like John, Paul, George, and Ringo are in the room with me.

Top of Mind

Wireless headphones have been around for a while. Wired wireless headphones (two earbuds tethered to each other) have been around for a while. Even truly wireless earbuds, where it's just two buds you stick in your ears, had been around for a few years before Apple started making them. So why are AirPods so amazing? Let's apply our (form + UI + experience) equation again.

The form worked because they made something distinctly Apple. Sure, it was ridiculed at first, but Apple's brand is so strong that anything they produce will inevitably become a fashionable flex. Still, while the form worked, it wasn't a massive improvement over existing solutions. The UI leverages the #voicefirst revolution and continues to improve in terms of how a user can access Siri without relying on the phone (other devices just don't seem to pair with Siri as naturally or smoothly), but I believe it was the experience where Apple really differentiated themselves, because they recognized the massive pain points with existing solutions. Apple nailed the way AirPods seamlessly pair with the phone, and introduced a new feature that pauses the music when an earbud is removed. This is not something other headphone makers missed; it's exactly where Apple's prowess in technical design and ownership of the vertical stack separates them from any potential competitor.

Exploded view of AirPods Pro, engineer's perspective (source: iFixit teardown)

The custom H1 and W1 chips Apple put in their AirPods are what allow them to pair with your phone the moment you take them out of the case, before they even reach your ears. A proximity sensor detects whether a bud is in your ear and sends a command to pause the music when it isn't. The experience of having AirPods connect without you even thinking about it, and having the music pause when you remove an earbud, is a huge step toward a full-fledged augmented reality experience where the technology starts to feel invisible.
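To make that concrete, here is a minimal sketch of the auto-pause logic in plain Swift. Everything in it (the `BudState` enum, the `AutoPauseController`, the sensor callback) is hypothetical and purely illustrative; it is not Apple's firmware or any public API, just the shape of the behavior: pause the moment a sensor says a bud left your ear, resume when both are back in.

```swift
// Hypothetical sketch: not Apple firmware, just the shape of the behavior.

enum BudState {
    case inEar
    case outOfEar
}

final class Player {
    private(set) var isPlaying = false
    func play()  { isPlaying = true;  print("playing") }
    func pause() { isPlaying = false; print("paused") }
}

final class AutoPauseController {
    private let player: Player
    private var left: BudState = .outOfEar
    private var right: BudState = .outOfEar

    init(player: Player) { self.player = player }

    // Called whenever the proximity/optical sensor reports a change.
    func sensorDidUpdate(leftBud: BudState, rightBud: BudState) {
        let wasWearingBoth = (left == .inEar && right == .inEar)
        left = leftBud
        right = rightBud
        let isWearingBoth = (left == .inEar && right == .inEar)

        // Pause as soon as either bud leaves an ear; resume when both return.
        if wasWearingBoth && !isWearingBoth && player.isPlaying {
            player.pause()
        } else if !wasWearingBoth && isWearingBoth && !player.isPlaying {
            player.play()
        }
    }
}

// Put both buds in, then take one out.
let controller = AutoPauseController(player: Player())
controller.sensorDidUpdate(leftBud: .inEar, rightBud: .inEar)    // prints "playing"
controller.sensorDidUpdate(leftBud: .inEar, rightBud: .outOfEar) // prints "paused"
```

The logic itself is almost trivially simple; the magic is that Apple's custom silicon and ownership of the whole stack make it feel instant and invisible.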

Where you might look for Apple to improve is the user interface: Siri is still widely considered the least advanced of the voice assistants. However, I am not convinced people are looking for complex queries or conversations, nor have we established enough trust in AI for assistants that can make phone calls on our behalf. We primarily use voice assistants for music, messages, and maps. Siri is truly good enough, even if she's not great.

Still, AirPods, even with hands-free voice assistants, seamless pairing, and auto-pause, didn't feel like augmented reality devices. The “experience” of seamless pairing and auto-pause quickly began to feel like commoditized functionality. Again, society's expectations change quickly! So what makes something feel like augmented reality, then? The folks at AWE have a phrase I like: augmented reality gives us superpowers. While AirPods gave us convenience in form factor and UI, AirPods Pro added something new for users to actually experience: active noise cancellation and transparency modes. AirPods Pro let us change how we hear the world around us. They give us superpowers.

AirPods Pro are augmented reality devices for our ears.

With noise cancellation and active transparency mode, AirPods Pro no longer solely provide digital information superimposed blindly onto my physical world. Now they provide a layer of digital information that responds to and changes my experience of the physical world in a way that feels invisible.
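If you want a feel for what those modes are doing, here is a toy model in Swift. This is not Apple's signal processing (the real thing involves multiple microphones, adaptive filters, and very tight latency budgets); it just shows the core idea: some outside noise leaks past the seal, and the earbud can inject either an inverted copy of that noise to cancel it, or a boosted copy to let the world back in.

```swift
// Toy model, not Apple's DSP: what reaches your eardrum under each mode.

enum ListeningMode {
    case noiseCancellation  // inject anti-noise that cancels the leak
    case transparency       // inject the outside world on top of the leak
    case off                // inject nothing; you just hear the leak
}

func atEardrum(music: [Double], outsideNoise: [Double], mode: ListeningMode) -> [Double] {
    let sealLeakage = 0.3            // fraction of outside noise the ear tip lets through
    let injectedGain: Double         // what the bud adds from its own microphones
    switch mode {
    case .noiseCancellation: injectedGain = -sealLeakage       // cancel the leak
    case .transparency:      injectedGain = 1.0 - sealLeakage  // restore the world to full volume
    case .off:               injectedGain = 0.0
    }

    var output: [Double] = []
    for (m, n) in zip(music, outsideNoise) {
        output.append(m + n * sealLeakage + n * injectedGain)
    }
    return output
}

// A constant hum of 0.5 riding under a short snippet of music.
let music = [0.10, 0.20, 0.15, 0.05]
let hum   = [0.50, 0.50, 0.50, 0.50]

print(atEardrum(music: music, outsideNoise: hum, mode: .noiseCancellation)) // just the music
print(atEardrum(music: music, outsideNoise: hum, mode: .transparency))      // music + the world
print(atEardrum(music: music, outsideNoise: hum, mode: .off))               // music + the leak
```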

This is magic.

This is augmented reality.

Exploded view of AirPods Pro, consumer's perspective

Soon, active noise cancellation and transparency will also become commoditized, and they won't feel like magic anymore. What might future audio AR experiences look like? We've already seen some hints via Apple's patents, and I'll dive into them in future posts in this series, including an analysis of traditional speaker company Bose's effort to create a developer ecosystem for its Bose AR headsets.

Enjoy the journey, sure, but also get to your destination.

Now, I did start this article by saying that Apple released another part of their AR headset, not just that AirPods Pro are augmented reality devices. We will see Apple's augmented reality headset arrive the same two ways Hemingway described going bankrupt: gradually, then suddenly.

It is my belief that Apple will not release an all-in-one headset, but will instead provide magical augmented reality experiences through a suite of wearables: a pair of lightweight glasses that work with your AirPods Pro for audio and voice, your Apple Watch for gestures, and your iPhone for connectivity and computing power. The glasses will only need a display, cameras, eye-tracking hardware, and batteries, allowing them to be lightweight and long-lasting, to be easily taken on and off without removing you from your wearable ecosystem, and to be available in many stylish and customizable options from Apple or its partners.

This is the essence of the wearables era — synergistic value that comes from having multiple wearables that can work together. Purists will say we need an all-in-one headset for augmented reality to be real. Their AR future is far off. From my perspective, there is already magic everywhere.

TL;DR

Interested in audio AR and wearable computing? Next week I will begin posting a multi-part series on these topics. We'll start by defining some terms you're going to see more frequently in the coming months, then look at the approaches other tech companies like Google and Amazon are taking, how traditional speaker companies like Bose and Sony are trying to stay relevant, audio AR apps and ecosystems for various industries, and how we might soon have a completely different relationship with all of the sounds around us. Follow me on Twitter and LinkedIn to get notified when the first article is out!

Neil Gupta is a Venture Partner at Indicator Ventures and President of NMG Consulting, where he spent the last year consulting for Bose on their Bose AR initiative. This is not a sponsored post.
