3 Teases of Apple’s AR and VR Headset We Saw During WWDC 2021

Apple has quietly laid the foundation for its next big thing and you probably didn’t notice

Anupam Chugh
Big Tech Talks
4 min read · Jun 17, 2021


Apple AR headset sneak peek during WWDC 2021. Images from Iconscout (Rock n Roll 3D and FaceTime Logo), edited by the author.

When Apple first announced the date for WWDC 2021, we were greeted with a bunch of Memoji characters staring at a MacBook.

It was a reference to Apple SVP Craig Federighi’s iconic MacBook stare from last year’s event. But it also teased Apple’s long-rumored and much-awaited AR glasses and VR headset.

The hint was certainly intriguing, as most of the Memoji characters were wearing glasses. Understandably, many expected a hardware toolkit for developers as a bare minimum this year.

However, all that pre-event buzz ended in disappointment as WWDC 21 wrapped up without a single mention of Apple’s AR/VR plans.

But the technology arriving with iOS 15 does give us a sneak peek into how Apple’s mixed-reality plans will play out across its ecosystem.

FaceTime’s Spatial Audio and SharePlay Interactions

Sound is a crucial factor in creating immersive virtual experiences. So I cannot stress enough how important Spatial Audio is to Apple’s mixed-reality ambitions.

A year ago, Apple unleashed the power of Spatial Audio on AirPods Pro by leveraging head-tracking data. With iOS 15, the iPhone maker brings that technology to FaceTime video calls too.
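FaceTime’s own audio pipeline is private, but the public building blocks already exist. Here’s a minimal sketch of positional audio using AVAudioEngine’s environment node; the position values are placeholders I picked for illustration:

```swift
import AVFoundation

// Minimal sketch: position a mono voice source in 3D space.
// FaceTime's actual pipeline is private; this uses the public
// AVAudioEnvironmentNode, which applies HRTF-based spatialization.
func makeSpatialVoiceEngine() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // 3D spatialization requires a mono source format.
    let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
    engine.connect(player, to: environment, format: mono)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Render with head-related transfer functions and park the
    // caller's voice slightly to the listener's left.
    player.renderingAlgorithm = .HRTFHQ
    player.position = AVAudio3DPoint(x: -1.0, y: 0.0, z: -0.5)

    try engine.start()
    // player.scheduleFile(...) and player.play() would start playback.
    return engine
}
```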

From a distance, spatial awareness might seem like a gimmick to boost FaceTime calls. But as soon as you add in the possibility of holograms in Apple’s future headsets (yes, it’s doable on Apple Glass), Spatial Audio becomes a game-changer.

Imagine the number of things you could do by using head turns as gestures in VR-based social interactions.
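Developers can already read AirPods head motion through CoreMotion. A minimal sketch, assuming AirPods Pro or Max are connected; the gesture mapping itself is left as an idea:

```swift
import CoreMotion

// Minimal sketch: read head orientation from AirPods Pro/Max.
// A sustained yaw change could be mapped to a "head turn" gesture.
let headphoneManager = CMHeadphoneMotionManager()

if headphoneManager.isDeviceMotionAvailable {
    headphoneManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let yaw = motion.attitude.yaw     // left/right head turn, in radians
        let pitch = motion.attitude.pitch // up/down nod
        print("yaw: \(yaw), pitch: \(pitch)")
    }
}
```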

The other FaceTime feature, SharePlay, lets you share your screen and media with complete synchronization across devices.

Today, holding a phone at arm’s length for a FaceTime movie party might seem like overkill (or an arm-kill?). But think how interactive it would be to share virtual screens on Apple Glass. Multiplayer games for VR headsets will only get better and more collaborative from here.
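On the developer side, SharePlay is exposed through the new GroupActivities framework in iOS 15. A minimal sketch of a custom activity; the MovieNight type and its metadata are hypothetical placeholders:

```swift
import GroupActivities

// Minimal sketch: a custom SharePlay activity.
// "MovieNight" and its title are hypothetical placeholders.
struct MovieNight: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Movie Night"
        meta.type = .watchTogether
        return meta
    }
}

// Kick off the activity for the current FaceTime call.
Task {
    do {
        _ = try await MovieNight().activate()
    } catch {
        print("SharePlay activation failed: \(error)")
    }
}
```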

Live Text and AR Maps Lookup

Live Text is another feature introduced with iOS 15. By pointing your camera at a piece of text, you can instantly copy it to the clipboard. It also works when you want to search for text in your Photos app. In fact, Spotlight introduces a search integration to quickly index and visually look up text in your Photos.
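Live Text itself is a system feature, but you can build a similar capture flow with Vision’s text recognizer, available since iOS 13. A minimal sketch, assuming you already have a CGImage from the camera:

```swift
import Vision

// Minimal sketch: recognize text in an image, similar in spirit
// to Live Text. `cgImage` is assumed to come from a camera frame.
func recognizeText(in cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the top candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```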

More than a ploy to make Apple less reliant on Google Lens, the Live Text feature paves the way for your next-generation eyeglasses. Think how easily it could turn you into a multilingual person.

With the improved Translate app, Live Text will only get better at translating the text in front of your eyes. For this to work in a real-world scenario, I reckon AirPods would play a huge role in transcribing speech to live text.

Similarly, maps will probably find their way into the first version of Apple Glass. With iOS 15 bringing richer 3D navigation details, an AR lookup for street view, and more 3D city landscapes, it’ll get even more interesting to view your favorite routes in a real-world environment (and hopefully have your destination recognized accurately).
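ARKit already hints at how this could work: since iOS 14, ARGeoTrackingConfiguration lets you pin content to real-world coordinates in supported cities. A minimal sketch; `arView` and the coordinate below are hypothetical placeholders:

```swift
import ARKit
import CoreLocation

// Minimal sketch: anchor AR content to a real-world coordinate.
// `arView` (a RealityKit ARView hosting the session) and the
// coordinate are placeholders for illustration.
ARGeoTrackingConfiguration.checkAvailability { isAvailable, _ in
    guard isAvailable else { return }

    arView.session.run(ARGeoTrackingConfiguration())

    // Roughly Apple Park; a destination marker could render here.
    let destination = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
    arView.session.add(anchor: ARGeoAnchor(coordinate: destination))
}
```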

Hand poses

Machine learning and computer vision might not have delivered any major changes during WWDC 21. However, it’s hard to ignore the stage time that hand-gesture tracking and poses received at the event. As evidence, just take a look at this screengrab:

Screengrab from a WWDC session

iOS 15 introduces two new built-in model-training classifiers, namely the Hand Pose and Hand Action classifiers. So training customized hand-tracking models using Apple’s no-code Create ML tool just got a whole lot easier. What’s more, Vision’s face-detection requests can surface head-pose data too.
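At inference time, Vision’s hand-pose request (available since iOS 14) supplies the joint data those classifiers consume. A minimal sketch that detects a pinch from the thumb and index fingertip distance; the 0.05 threshold is an assumption you’d tune:

```swift
import Foundation
import Vision

// Minimal sketch: detect a pinch gesture from hand-pose joints.
// The 0.05 distance threshold (in normalized image space) is an
// assumption, not an Apple-recommended value.
func detectPinch(in cgImage: CGImage) throws -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first else { return false }
    let thumbTip = try hand.recognizedPoint(.thumbTip)
    let indexTip = try hand.recognizedPoint(.indexTip)

    // Ignore low-confidence joints.
    guard thumbTip.confidence > 0.5, indexTip.confidence > 0.5 else { return false }

    let distance = hypot(thumbTip.location.x - indexTip.location.x,
                         thumbTip.location.y - indexTip.location.y)
    return distance < 0.05
}
```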

Before you wonder about the chances of a camera-powered Apple Glass, let me clear the air: we probably won’t get a camera sensor, given the privacy and surveillance implications it would carry.

However, you can still leverage head poses in front of your iPhone camera. According to AppleInsider, the Cupertino-based company already has a patent for monitoring head-wearable electronic devices.

From touchless scrolling of your phone’s display to adjusting controls and opening apps via finger tracking, there are so many gestures you could perform when using Apple Glass and an iPhone in combination. And with VR, the possibilities for interacting with elements in a scene are limitless.

But, when is it coming?

More than the teasers, it’s the mystery that has been killing Apple fans for years now. Will we see an AR or a VR headset first? Some believe it might be a mixed-reality headset. And why has it been delayed for so long?

Thankfully, Tim Cook clears up a bit of the mystery: “We try to fail internally instead of externally because we don’t want to involve customers in the failure. We develop things and subsequently decide not to ship [them].”

Smart glasses haven’t seen widespread adoption among the masses so far. Google Glass was a disaster. So it makes sense for Apple to do the groundwork first by giving developers the tools to build AR apps.

Hopefully, we’ll get a sneak peek at Apple Glass sometime late this year.
