Apple’s AR play isn’t just a device — it’s their ecosystem

Why Apple’s approach to AR has to be fundamentally different from its competitors

For decades, Apple has proudly claimed to have spearheaded the future of personal computing. It’s no wonder — Apple has played a key role in shaping today’s technology landscape, and their achievements have stood the test of time.

Now a new wave of personal computing stands on the horizon: VR/AR. These technologies have made great progress in recent years, and there's growing consensus that they represent the future of computing. Almost all major tech companies have launched new VR/AR hardware, but not Apple: their participation has been confined largely to mobile AR.

So what’s going on? When is Apple coming out with a device of their own? What will it look like? What will it do?

To understand Apple’s approach, you must first look at their AR play within the context of their ecosystem.

Apple doesn’t want to simply launch a new headset — they want to complete a decade-long puzzle and connect all parts of their already existing ecosystem into a single, immersive computing platform — you.

Their ecosystem is their AR play. And all that it’s missing is a display.


One Ecosystem To Rule Them All

Out of all tech companies in the world, Apple has the most all-encompassing consumer ecosystem available. It includes everything from smartwatches to personal computers, and many of these devices are already on your body:

  • In your hands and pockets: extremely powerful computers with world-class cameras, depth sensors and a growing array of computer vision technologies, soon to support new 3D cameras.
  • On your wrists: powerful smartwatches that capture your biometric data 24/7.
  • In your ears: a powerful set of context-aware earbuds (AirPods).

While these are all standalone devices, they’re all connected to one another, each one interfacing with a different portion of the user’s body.

These also happen to be many of the essential components of an immersive computing platform. So for a moment, let's imagine we have a set of AR glasses with all of the necessary sensors: what could each of these already-existing parts become?

  • The iPhone — Mobile Processing Power & 5G: The heart of the operation, it receives sensor data from the glasses and provides the computing power necessary for AR. It also gives users always-on internet connectivity (soon to be 5G) and continues to function normally as a phone, providing a touch-based interface for the AR apps that require it.
  • The Apple Watch — Biometrics & Input: It continues to provide reliable user biometrics and an alternative touch interface, serving as a foundation for hand-tracking and neural control interfaces in the future.
  • The AirPods — Contextual, 3D Audio: They provide context-aware 3D audio and an always-on voice interface for software and A.I. assistants.

Notice how much of Apple's existing ecosystem can be repurposed into a powerful, interconnected AR platform, and how each of these elements also works perfectly outside an AR context, offering great value to AR users and non-AR users alike.

This grants Apple an unparalleled ability to smoothly transition itself and its customers into the world of AR without having to design a single device that does it all. By the time the full AR ecosystem arrives, most users will already own most of its parts and have years of experience using them. There'll be practically no change of habit required once the glasses arrive, just the added convenience of an AR interface.

Apple hasn't been sitting out of the VR/AR game so much as slowly releasing and iterating upon the necessary pieces of their upcoming immersive platform. And the fact that most of this immersive ecosystem is perfectly useful outside an AR context is essential, especially given that initially we won't be using AR for more than a few hours a day.


Closing Thoughts

One thing's certain: Apple isn't sitting out of the immersive game. Their strategy is already underway, and many of its initial pieces are on your body right now. Every seemingly minor ARKit update, Apple Watch release and new iPhone camera adds another piece to that decade-long puzzle.

Apple will take its time. Historically, they always have. They have the most complete consumer ecosystem available (and a kick-ass dedicated retail distribution network), so to them it’s essential that all of those pieces come together.

Ultimately, Apple’s final AR product offering won’t just be a set of glasses — but an interconnected ecosystem that can itself become a single, immersive computing platform. One that’s an extension of you and your body — whether you’re wearing glasses or not.

Now all we need is the display.


Thanks for reading!

Lucas Rizzotto is an award-winning Immersive Experience Designer, Artist and Creator.

You can follow him on Instagram, Twitter or Facebook, or contact him through his website.

You can also sign up to his mailing list and support him on Patreon.

This story is published in The Startup, Medium’s largest entrepreneurship publication followed by +424,678 people.
