What Apple’s WWDC means for Spatial Computing

Thijs Morlion
In The Pocket Insights
6 min read · Jun 23, 2020

It was that time of the year again. No, I’m not talking about a friendly, white-bearded old man, mistletoe and presents by the tree, but about Apple’s WWDC. As the AR Lead at In The Pocket, these are exciting times to live in. Apple is one of the main driving forces in today’s technology landscape, and last Monday they treated us as if it were indeed Christmas.

Spatial Computing is a human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.

“They didn’t even mention AR…”, one of my colleagues remarked, somewhat disappointed, after watching the WWDC keynote. As a matter of fact, he was totally right. Apple didn’t mention it at all. Or did they? A lot is moving in the world of spatial computing. For people who are not familiar with the term, I’d love to give you a quick heads-up. It was coined by Simon Greenwold, who described ‘spatial computing’ as a human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces. The broader public is probably more familiar with Augmented and Virtual Reality, which are excellent applications of Spatial Computing, but it goes much further than that. It is actually a beautiful marriage of many different technologies: computer vision, 3D scanning, artificial intelligence and sensor fusion. Thanks to the combination of these technologies, devices start to understand the world around us. Moreover, we, as humans, can start to interact with these devices in a new, more natural way, but that is food for a whole other blog post.

Because spatial computing combines so many different technologies, we need to look a bit deeper into this month’s WWDC. I’ll highlight four key elements that are of great importance for our spatial future: on-device speech recognition, App Clips, spatial audio, and a very big focus on privacy.

On-Device Speech Recognition

It’s been almost a decade. The launch of Siri on the iPhone marked the start of having a personal assistant on that small device that fitted in your pocket. You just had to summon her by saying “Hey Siri” and you could ask her any question, which she would answer to the best of her abilities. To be able to understand human speech, in several languages no less, the audio samples had to be sent to the cloud for analysis. This often raised privacy concerns. A couple of years ago, both Amazon and Google had to weather a storm when they were accused of misusing voice samples of people talking to their personal home assistants.

So why do we so desperately need speech recognition? Well, one of the devices that might change the future for good is AR glasses: a pair of glasses capable of recognizing the world around you and giving you specific, helpful information when you are in a specific place and context. The most natural way of interacting with such a device is speech. You can imagine that as people use this technology more and more, it has to be privacy-proof. To tackle this issue, recognition has to happen on the device itself instead of somewhere in the cloud, and preferably on a device as small as a pair of AR glasses.
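Apple actually already ships a building block for this: since iOS 13, the Speech framework lets an app insist that recognition stays on the device. A minimal sketch (the function name is mine, and error handling is omitted for brevity):

```swift
import Speech

// Transcribe an audio file entirely on-device, never sending audio to the cloud.
func transcribeOnDevice(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        // The key line: require that the audio never leaves the device.
        request.requiresOnDeviceRecognition = true

        recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```

The trade-off is that the on-device model supports fewer languages and may be slightly less accurate than the server-side one, but for a privacy-first wearable that seems like the right default.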

App Clips

The second, and personally I think a very neat feature, is App Clips. Not having the right app installed at the moment you need it is a situation you are probably familiar with. It takes time to search the App Store for it and wait for it to download, and by the time it is finally installed, you probably don’t need it anymore. With App Clips, this annoying situation will be a thing of the past. An App Clip is a small part of an existing app that can be launched via e.g. a QR code or a webpage. Instead of searching for and downloading a full app, you just use the small part of it you need, in the specific context you need it.

This has spatial computing written all over it. Let’s imagine we have one of those AR glasses again. Our glasses will understand the world around us and the context we’re in. To serve us to the best of their ability, they will use these App Clips, or perhaps even App Services, to extend their functionality. All of this needs to happen quickly and smoothly. It is also a plus that you, as the end user, don’t have to look for a specific app in the App Store. No, your glasses will do all the heavy lifting for you.
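On the developer side, an App Clip is launched from a URL (encoded in a QR code, NFC tag or link) that arrives as an NSUserActivity. A rough sketch of how a clip could read its invocation context (the `spot` query parameter is a hypothetical example):

```swift
import UIKit

// In the App Clip's scene delegate: pull the invocation URL out of the
// incoming user activity and route to the right micro-experience.
func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
    guard userActivity.activityType == NSUserActivityTypeBrowsingWeb,
          let url = userActivity.webpageURL,
          let components = URLComponents(url: url, resolvingAgainstBaseURL: true)
    else { return }

    // Hypothetical parameter: which parking meter, table or station was scanned.
    let spotID = components.queryItems?.first(where: { $0.name == "spot" })?.value
    print("App Clip invoked for spot: \(spotID ?? "unknown")")
}
```

Because the clip knows exactly which physical thing triggered it, it can jump straight to the relevant screen, which is precisely the kind of instant, context-bound interaction a pair of AR glasses would need.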

Spatial Sound

Spatial computing is not only a visual thing; it also applies to audio. Recently, we saw one of the pioneers in this field, Bose, shut down its Bose AR division. That was unfortunate, because spatial sound will play a major role in building our spatial future. I was delighted to see that Apple is integrating spatial audio into its AirPods.

When you are having a VR experience, it is quite clear that surround sound is very important for being immersed in the experience. Hearing is a very important sense for us humans, and it is spatial by nature. Knowing where a sound is coming from can save your life. Although we don’t need to watch out for predators anymore, it is still a very useful sense to have. Implementing spatial sound in technology can and will enhance the blend we are making between our real lives and the digital layer we are mixing them with.
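For developers, positional audio has actually been available on Apple platforms for a while through AVAudioEngine’s environment node; what the new AirPods feature adds on top is head tracking. A small sketch of placing a sound in 3D space around the listener (the asset name is a placeholder):

```swift
import AVFoundation

// Play a mono sound as if it comes from two metres to the listener's right.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// Player -> environment -> output; the environment node does the 3D mixing.
engine.connect(environment, to: engine.mainMixerNode, format: nil)
engine.connect(player, to: environment, format: nil)

environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
player.position = AVAudio3DPoint(x: 2, y: 0, z: 0) // two metres to the right

// "alert.caf" is a placeholder asset name.
if let url = Bundle.main.url(forResource: "alert", withExtension: "caf"),
   let file = try? AVAudioFile(forReading: url) {
    try? engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}
```

Move the player’s position around over time and the sound appears to orbit you, which is exactly the effect a digital layer on the real world needs.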

Privacy

Last but not least, the elephant in the room when we are talking about spatial computing: privacy. Over the past decade, companies have built complete business models around their customers’ data. Data really was the new gold. In the last couple of years, several scandals came to light. Think of Facebook and Cambridge Analytica. Earlier in this blog post, I mentioned Amazon and Google apparently not taking privacy too seriously with their voice assistants.

At last year’s WWDC, Apple showed us Sign in with Apple, a system in which the user’s email address is obscured from the developer while the user can still use the login functionality of that specific app. It was a delight to see Apple’s focus on privacy again at this year’s WWDC. Privacy is also one of the reasons Apple chose to use LiDAR in its iPad Pro, and probably later in its AR glasses as well. As we enter the era of spatial computing, we should all be very aware of its implications for our privacy. Only companies that take this very seriously will prevail. Personally, I’m glad to see that Apple, as one of the big tech giants, is taking up its responsibility. Hopefully, more will follow.

The era of Spatial Computing is near

My colleague was right: Apple did not mention AR in its keynote. But it did mention plenty of technologies that AR, and by extension spatial computing, needs to really thrive. All of these are an indication of something bigger Apple is aiming for. My humble guess would be that they are preparing for the AR glasses I mentioned a couple of times throughout this blog post. They are combining a lot of technologies, making them smaller, and most of all doing all this while taking their customers’ privacy into account. I can’t wait to see what the future holds. By the end of this year or the beginning of next, we will probably know more.
