How Apple prepared for its Augmented Reality headset

Putting the pieces together for a new product in the Apple ecosystem

Jonny Daenen
The Visual Summary
7 min read · May 30, 2023


As Apple’s new Augmented Reality product launch approaches, let’s explore the pieces Apple has put in place over the past few years to make it happen.

In this post, I’ll try to connect the dots between facts and speculative ideas, resulting in my personal view of what this new product could become. We’ll look at possible Hardware, Operating System features, Ecosystem Integration, and Apps, and see which of these pieces could be used to create the new product.

Hardware

Over the past decade, Apple has introduced many hardware components that might find their way into a new augmented/virtual reality product.

Overview of potential hardware components that could form the basis of the new XR headset.

Apple Silicon, a low-power but high-performance family of chips, is a solid foundation for an XR system. The recent growth in computational, graphical, and AI/ML capabilities will make it a key component of the upcoming headset.

More “standard” components, such as advanced mic arrays and the camera system, have already been through many iterations. Complemented by more recent additions such as LiDAR and the TrueDepth camera, they could form the foundation of a solid input-capturing mechanism.

The advanced head tracking technology available in the AirPods Pro currently powers a somewhat gimmicky Spatial Audio experience. Together with the high-sensitivity sensors (gyroscope and accelerometer) introduced for crash detection in the iPhone 14 and Apple Watch Series 8, it could unlock its true potential: measurements precise enough for an amazing XR experience.

Head tracking technology and the introduction of high-sensitivity crash detection sensors could unlock a solid XR experience.
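Much of this is already within reach today: CoreMotion exposes the AirPods’ head pose through CMHeadphoneMotionManager. Here’s a minimal, illustrative sketch (it assumes head-tracking-capable AirPods and the motion permission; how a headset would fuse this data is anyone’s guess):

```swift
import CoreMotion

// Minimal sketch: read head orientation from AirPods via CoreMotion.
// Requires NSMotionUsageDescription in Info.plist and supported AirPods.
let headphoneMotion = CMHeadphoneMotionManager()

func startHeadTracking() {
    guard headphoneMotion.isDeviceMotionAvailable else { return }
    headphoneMotion.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Pitch, roll, and yaw are reported in radians.
        print("pitch: \(attitude.pitch), roll: \(attitude.roll), yaw: \(attitude.yaw)")
    }
}
```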

Beyond that, we have user input components such as the Digital Crown, the Action button on the Apple Watch Ultra, Touch Control on the AirPods Pro to adjust volume, and the smaller Touch ID button on the iPad. These hardware mechanisms could provide a tactile basis for interacting with the headset.
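To give an idea of how little glue code such tactile input needs, here’s a minimal watchOS-style SwiftUI sketch of Digital Crown input; a headset equivalent is pure speculation on my part:

```swift
import SwiftUI

// Minimal watchOS sketch: bind the Digital Crown to a volume value.
struct CrownVolumeView: View {
    @State private var volume = 0.5

    var body: some View {
        Text("Volume: \(Int(volume * 100))%")
            .focusable() // the Crown only drives the focused view
            .digitalCrownRotation($volume, from: 0.0, through: 1.0, by: 0.01,
                                  sensitivity: .medium, isContinuous: false,
                                  isHapticFeedbackEnabled: true)
    }
}
```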

As for sensors, beyond head tracking, we could see advanced eye tracking or even a more accurate temperature sensor. I’m not expecting much more, as biosensors will most likely remain the focus of the Apple Watch, which takes measurements all day long rather than only during a specific session.

Finally, a headset needs power. We’ve seen MagSafe battery packs for the iPhone, which could provide an easily swappable power mechanism that lets users roam freely.

Operating System

The operating system is the part that handles user input and output and provides the core components for building applications.

Potential Operating System features to provide a good experience to both users and app developers.

Here, I expect the return of existing functionality such as Notifications, Spatial Audio, and potentially Visual Lookup and Live Text. The latter two could support Accessibility features, an area where Apple is quite strong but not widely recognized.

But what about input? Apple could present something I’d like to call “AirTouch” to capture user input. Minority Report-style gestures might finally become a reality, without uncomfortable gloves. An in-air keyboard seems the bare minimum. At the same time, the Apple Watch’s AssistiveTouch already offers gesture detection that lets you “pinch” and “clench” your hand. Extending this to an XR experience seems like a no-brainer, provided there are no technical limitations that make the interaction too laggy or unreliable. Maybe that’s something for the next iteration?

Minority Report-style gestures might finally become a reality, without uncomfortable gloves.
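To make the idea concrete, here’s a minimal sketch of pinch detection built on Apple’s existing Vision hand-pose API. “AirTouch” is my own hypothetical name, and the 5% fingertip threshold is an arbitrary assumption:

```swift
import CoreGraphics
import Vision

// Minimal sketch: detect a thumb-index "pinch" in a camera frame using
// Vision's hand-pose request. Threshold and single-hand limit are assumptions.
func detectPinch(in pixelBuffer: CVPixelBuffer) -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    guard (try? handler.perform([request])) != nil,
          let hand = request.results?.first,
          let thumbTip = try? hand.recognizedPoint(.thumbTip),
          let indexTip = try? hand.recognizedPoint(.indexTip),
          thumbTip.confidence > 0.5, indexTip.confidence > 0.5
    else { return false }

    // Fingertip distance in normalized image coordinates (0...1).
    let distance = hypot(thumbTip.location.x - indexTip.location.x,
                         thumbTip.location.y - indexTip.location.y)
    return distance < 0.05 // assumed pinch threshold
}
```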

I’m also hoping for some XR-specific features. Think of “Big Screen”: a large virtual screen for watching video content, complemented by the existing Picture-in-Picture feature. A Heads-up Display would also be awesome for quickly checking the current temperature, your heart rate, and other metrics you choose to track. Maybe we’ll even see a Dynamic Island-like integration that shows when your food delivery will arrive?

Ecosystem Integration

Apple already has a tightly integrated ecosystem, and over the last few years, it has clearly shown its ambition to provide an experience that spans multiple devices.

Apple could integrate several key components from their existing ecosystem into their new headset to unlock a next-level seamless user experience.

A feature I hope to see is an extended version of Universal Control. Imagine creating a new virtual screen and moving your mouse beyond your device. Or reclaiming the screen real estate taken by Stage Manager by moving the window overview outside your screen. If technically possible, this could at some point offer a whole new way of interacting with our devices.

If technically possible, this could offer a whole new way of interacting with our devices.

We’ve also seen major steps in capturing our surroundings and ourselves. The TrueDepth camera system can turn you into a moving Memoji, another gimmicky feature that could bring FaceTime to a whole new level here: I’m thinking hologram-like functionality. The same holds for the rear camera system, which can use LiDAR to detect depth. Today, the iPhone can already track what your hands are doing on your desk (virtual keyboard, anyone?). Oh, and Continuity Camera allows other devices to use the iPhone’s camera seamlessly. Battle-tested on the Mac, ready for its real role.
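To illustrate the depth capture that already ships today, here’s a minimal ARKit sketch that reads per-pixel LiDAR depth. It assumes a LiDAR-equipped device; how a headset would consume this stream is speculation:

```swift
import ARKit

// Minimal sketch: stream per-pixel LiDAR depth with ARKit.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth is only available on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of 32-bit float distances in meters.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        print("Depth frame: \(CVPixelBufferGetWidth(depthMap))x\(CVPixelBufferGetHeight(depthMap))")
    }
}
```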

I expect the upcoming headset to integrate seamlessly with Find My, both to locate the headset itself and, more importantly, to detect things in your environment with Precision Finding. If accurate enough, AirTags might even serve as beacons that improve the overall environment tracking. Who knows?
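Precision Finding is built on Ultra Wideband, which Apple already exposes to developers through the NearbyInteraction framework. A minimal ranging sketch could look like this; note that the peer’s discovery token must be exchanged out of band (omitted here), and AirTags as headset beacons are pure speculation:

```swift
import NearbyInteraction

// Minimal sketch: Ultra Wideband ranging to a peer device.
final class UWBRanger: NSObject, NISessionDelegate {
    let session = NISession()

    // The peer's token must be obtained out of band (e.g., over a network).
    func startRanging(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            // distance is in meters; direction (when available) is a unit vector.
            if let distance = object.distance {
                print("Peer at \(distance) m")
            }
        }
    }
}
```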

Finally, I’m hoping for deeper integration with Apple TV. The apps are already built for big screens and (most of them, at least) have a simpler input model and navigational structure. Imagine popping apps out of our TV, interacting with them in the air, and watching content in HDR with Spatial Audio powered by HomePods. If that experience is comfortable rather than straining, our big TV screens might be on their way out.

Apps

As for apps, we’ve seen several things pop up over the years. In 2019, Apple showed demos of an AR game being played on a table. In 2021, Apple Maps got “immersive walking directions”, a feature that overlays your directions on top of your surroundings. Both are constrained by the iPhone and iPad’s limited screen real estate, but they allowed Apple to experiment with maturing technology. I expect some games and Apple Maps to be available on the new device from the start.
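That table-top pattern is already easy to build with today’s frameworks. Here’s a minimal RealityKit sketch that anchors a placeholder object to a detected horizontal surface; the box is a stand-in, not Apple’s actual demo content:

```swift
import ARKit
import RealityKit

// Minimal sketch: anchor a placeholder "game piece" to a detected table.
func addTabletopContent(to arView: ARView) {
    // Anchor to any horizontal plane (a table, the floor, ...).
    let anchor = AnchorEntity(plane: .horizontal)

    // A 10 cm box with a simple material, standing in for game content.
    let box = ModelEntity(mesh: .generateBox(size: 0.1),
                          materials: [SimpleMaterial(color: .blue, isMetallic: false)])
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```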

While several apps are clear candidates for an XR system, we could envision some recent additions making it to the new OS in the future too.

Measure, the app that allows you to, indeed, measure objects in the real world, is an ideal fit too. As is the Translate app, which now supports Auto-translate and Translate Camera. Just as Google Lens does, text in your view could be replaced in real time with another language.
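The reading half of that pipeline already exists in Vision’s text recognition. Here’s a minimal Live-Text-style sketch; the translation step is left out, since Apple offered no public translation API at the time of writing:

```swift
import Foundation
import Vision

// Minimal sketch: recognize text in an image, Live-Text style.
func recognizeText(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top candidate string for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```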

Beyond these more obvious apps, the future might bring a few new experiences. Imagine the “Race against” Workouts feature where you compete against your augmented self. Or the Apple Fitness+ instructors popping up in front of you, making it easier to mimic their moves. In the long run, our in-car experience might change radically, with a heads-up display powered by CarPlay integration.

Imagine “Race against” Workouts where you compete against your augmented self.

The Freeform app, launched in 2022 to support creative brainstorming, could open up more creative options in the future. For example, it could allow 3D painting or sculpting without expert tools or 2D limitations. And finally, the Weather app could project information onto the sky, including star information.

To infinity … and beyond

In this post, my goal was to sketch a conceptual view of what Apple’s upcoming XR headset might look like. Looking at what Apple has done over the past decade with its hardware, operating systems, ecosystem integrations, and apps, we can imagine how these building blocks could shape a new XR product and how it might evolve.

Some of the ideas I presented might seem a bit far-fetched, but I challenge you to share your own crazy ideas in the comments below. Imagine what this new product could do for us over the next decade.

Apple’s “Reality Pro” Concept by Jonny Daenen.

The contents of this post are based on my personal view and are purely speculative. I am not affiliated with Apple and have no access to internal information.



Written by Jonny Daenen

Data Engineer @ Data Minded, AI Coach @ PXL Next - Unleashing insights through data, clouds, AI, and visualization.
