Spatial Computing, a revolution 16 years in the making

Othman L.
5 min read · Jun 8, 2023


With Apple’s latest announcement, spatial computing took center stage. This new technological paradigm is set to transform our digital experiences. Long before today’s big event, Apple had discreetly laid the foundations of a radical transformation. In this first chapter, we will cover how spatial computing came to be, through the lens of Apple’s innovation history. This series will explore how spatial computing is set to permeate our reality by going beyond vision and embracing multimodality. Developers, legislators and users should seize this new way of computing, as it has the potential to upend a decade of digital trends while instilling new life into local economies and social circles.

Threading the fabric of the spatial digital world

While Apple’s embrace of spatial computing might seem like an inflexion point, evidence for it was hiding in plain sight. And it goes beyond the new Vision Pro headset’s ambitions.

Behind the scenes, a complex framework is ready to support spatial computing experiences. Photo by FLY:D on Unsplash

In the last decade, we have seen a confluence of spatial innovations that will operate as the backbone of the spatial digital world.

In 2007, the first iPhone was already endowed with a sense of local spatial awareness through its accelerometer, a true miracle of microelectromechanical systems (MEMS) that could sense tilt angle with respect to gravity; the iPhone 4 added a gyroscope in 2010, measuring angular rotation velocity. In 2008, Apple introduced global spatial awareness with GPS functionality in the iPhone 3G. In fact, spatial awareness became so critical that a dedicated motion coprocessor appeared alongside the iPhone 5s in 2013. At a fraction of the power cost, the M-series coprocessor allowed continuous monitoring of the sensors to keep track of the device’s position and orientation.
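As a rough sketch of the idea (not Apple’s sensor-fusion pipeline, and with a sign convention chosen here for illustration), tilt can be estimated from a static accelerometer reading, since at rest the sensor measures the gravity vector in the device’s own frame:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate pitch and roll (radians) from a static accelerometer
    reading, which at rest measures gravity in the device frame."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

# A device lying flat on a table reads roughly (0, 0, 1) g
# under this convention, giving zero pitch and zero roll:
pitch, roll = tilt_from_gravity(0.0, 0.0, 1.0)
```

A gyroscope complements this by reporting angular velocity, which can be integrated over time to track orientation between such gravity-based fixes.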

Since then, spatial awareness has been relentlessly improved across a whole range of devices. Most notable are the introduction of dual-band GPS and even more precise inertial motion sensors in the Apple Watch Ultra. They paved the way for unprecedented tracking precision in urban areas, and for a new backtracking function based on dead reckoning (i.e. integrating velocity over time to update the device’s position, even when the GPS signal is lost).
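Dead reckoning itself is simple to state. A minimal sketch, assuming idealised 2-D velocity samples at a fixed time step (real systems fuse noisy inertial data, which this deliberately ignores):

```python
def dead_reckon(start, velocities, dt):
    """Integrate (vx, vy) velocity samples over time step dt to update
    a 2-D position estimate when no GPS fix is available."""
    x, y = start
    track = [(x, y)]
    for vx, vy in velocities:
        x += vx * dt
        y += vy * dt
        track.append((x, y))
    return track

# Walking east at 1.5 m/s for ten one-second samples
# carries the estimate 15 metres from the last known fix:
path = dead_reckon((0.0, 0.0), [(1.5, 0.0)] * 10, dt=1.0)
```

In practice the estimate drifts as sensor errors accumulate, which is why backtracking works best over the relatively short gaps between GPS fixes.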

A new frontier. Photo by Tyler Lastovich on Unsplash

The last piece of the hardware puzzle came to light in 2019, with the release of the iPhone 11. Apple introduced the U1 chip with Ultra Wideband (UWB) radio technology, allowing localisation down to the centimeter. This complements the ability of Bluetooth 5.1 to act as an indoor GPS system. It opened the door to the Find My network, a user mesh that leverages Bluetooth, UWB and GPS capabilities at a global scale to locate every Apple device in space. Users can even attach AirTags — low-energy spatial beacons — to any real-world object in order to locate it anywhere on Earth, as long as other users are within its range.
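The geometry behind such ranging is worth a sketch. Given distances to a few beacons at known positions, a receiver can recover its own position by trilateration; the toy solver below linearises the circle equations in 2-D (purely illustrative, not the U1 chip’s actual pipeline, which works from time-of-flight measurements):

```python
import math

def trilaterate(anchors, dists):
    """Solve for a 2-D position from ranges to three beacons
    by subtracting circle equations to get a linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Beacons at three corners of a room, true position (2, 1):
anchors = [(0, 0), (4, 0), (0, 3)]
dists = [math.dist((2, 1), a) for a in anchors]
x, y = trilaterate(anchors, dists)
```

With clean ranges the solver recovers the position exactly; real UWB ranges are noisy, so production systems solve an over-determined least-squares version of the same idea.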

Finally, Apple has meticulously mapped the world, using satellite imaging and car-mounted cameras to bring an enriched 3D update to Maps. This coincided with the addition of LiDAR to iPhones and improvements in integrated machine-learning acceleration, allowing on-device visual landmark recognition to further refine localisation.

It is now clear that this new approach to computing was crafted with striking intentionality, a good omen for its success.

Since 2021, Apple Maps has featured a three-dimensional city experience. Source: Apple.com

Behold ARKit, the new substrate of spatial computing

These hardware innovations created the perfect storm for the new spatial computing backbone: ARKit. It combines all of the previous hardware innovations to reconstruct your device’s position in space (from your location on Earth down to your location within a room) and the position of other objects around you. Its promise is simple: app developers communicate only with this interface, without needing access to the sensitive sensor data that was processed on device to reconstruct these positions. Apps only need to know the device’s 3D coordinates, along with a four-component (quaternion) representation of its orientation.
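That four-component orientation is a quaternion, a compact encoding of 3-D rotation. A minimal sketch of how a pose consumer would use one, assuming a unit quaternion in (w, x, y, z) order (conventions vary between frameworks):

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z),
    using the expanded form of q * v * q_conjugate."""
    w, x, y, z = q
    vx, vy, vz = v
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

# A 90-degree rotation about the z-axis sends the x-axis to the y-axis:
half = math.radians(90) / 2
q = (math.cos(half), 0.0, 0.0, math.sin(half))
rotated = quat_rotate(q, (1.0, 0.0, 0.0))
```

Four numbers for orientation plus three for position is all an app needs to place content correctly, which is exactly why exposing only the reconstructed pose is such an effective privacy boundary.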

16 years of hardware innovations come together to form ARKit, the fabric of Apple’s digital space

ARKit operates with an intuitive logic and abstracts away the complicated details. World anchors let developers pin virtual objects to precise locations in space, such as on a living-room table. ARKit can remember maps, update them, or create a new one when it detects a novel environment. Because it can recognise a previously established map, any anchored object reappears where you left it whenever you return to the space where it lives. This grants virtual objects the property of persistence, making them seem more tangible than ever. Other features, such as hand tracking and surface detection, allow these objects to interact with the space and with the user’s movements.
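The persistence logic can be captured in a toy model: anchors keyed by the map they live in, restored whenever that map is recognised again. (Illustrative only; ARKit’s real API serialises `ARWorldMap` objects rather than storing named dictionaries like this.)

```python
class AnchorStore:
    """Toy sketch of anchor persistence: virtual objects pinned to
    named world maps reappear when the same map is recognised."""

    def __init__(self):
        self.maps = {}  # map_id -> {anchor_name: (x, y, z)}

    def place(self, map_id, name, position):
        """Pin an object at a position within a given world map."""
        self.maps.setdefault(map_id, {})[name] = position

    def relocalize(self, map_id):
        """Return anchors for a recognised map, or start a fresh one."""
        return self.maps.setdefault(map_id, {})

store = AnchorStore()
store.place("living-room", "chess-board", (0.4, 0.75, -1.2))
# Coming back later, recognising the same map restores the object:
anchors = store.relocalize("living-room")
```

The key design choice is that persistence belongs to the map, not the app session, which is what makes an anchored object feel like it genuinely lives in the room.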

Users can also explore objects anchored in the outside world. Imagine walking back home on your birthday and being notified that a “pin” (an AR object: a message, a picture, a song) was left at your doorstep by one of your friends. The interaction is more enticing because they were physically there to drop it. It’s like receiving a postcard instead of a text message.
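The triggering side of that experience is essentially a geofence check. A minimal sketch, assuming pins carry latitude/longitude coordinates and a notification radius (the names and the 25-metre default are inventions for illustration):

```python
import math

def near_pin(user, pin, radius_m=25.0):
    """Toy geofence: true when the user is within radius_m metres of
    a dropped pin, positions given as (lat, lon) in degrees."""
    lat1, lon1 = map(math.radians, user)
    lat2, lon2 = map(math.radians, pin)
    # Haversine great-circle distance on a 6,371 km sphere
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    dist = 2 * 6371000 * math.asin(math.sqrt(a))
    return dist <= radius_m

# A birthday pin fires at the doorstep, not a few streets away:
doorstep = (48.8566, 2.3522)
at_door = near_pin(doorstep, doorstep)
far_away = near_pin((48.8600, 2.3600), doorstep)
```

GPS alone gets a user near the pin; the anchor-and-map machinery described above is what then places the object at the exact doorstep.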

This alone may make today’s event sound like great news to Meta, Snap and other social network companies. More on that in another post.

Atlas carrying the globe. Photo by Afif Ramdhasuma on Unsplash

Today’s announcement marks not the start of a journey, but the culmination of an arduous quest. While the Vision Pro is a technological gem that will elevate AR experiences to unprecedented levels of realism, it rests on the back of an even more impressive platform. One that is coming to every device.

This story sets the stage for a grander purpose: exploring how spatial computing will transform devices beyond the virtual confines of AR headsets, examining its societal impact, and asking how we can steer it to help technology deliver on its unmet promises.

Part 2: BeyondVision | Spatial Computing comes to the entire Apple ecosystem


Othman L.

Computational neuroscientist and tech enthusiast. I welcome new innovations with a holistic view and cautious optimism.