The Evolution of Spatial Computing: From Geographic Systems to Apple’s Vision Pro

Alexander Glukhov
Leta Capital
Feb 29, 2024

Moving beyond buzzwords, Apple’s “Spatial Computing” signals a meaningful change. It steps away from dystopian metaverse concepts and cumbersome mixed reality, focusing instead on blending technology seamlessly with our environment to enhance reality rather than escape it. Apple presents this approach as a distinct shift. But what does the term actually mean, and how does it differ from the rest of today’s technological jargon? Let’s figure it out!

The Rise of Spatial Computing

The term “Spatial Computing” apparently originated in the field of geographic information systems in the 1990s, where it was centered on the scale of continents and cities and on so-called geospatial information. During the 1990s and early 2000s, Spatial Computing began to be used on a smaller, human scale. In 2003, Simon Greenwold defined the term as

“human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces”

In 2010, “Spatial Computing” was used by MIT Media Lab alumnus John Underkoffler in his TED talk “Pointing to the future of UI”.

His talk comes closer to what Apple means by Spatial Computing, although he didn’t mention headsets or anything of the kind. Instead, Underkoffler discussed his MIT research on tangible interfaces, emphasizing the integration of input and output in physical space, and showcased prototypes of “luminous rooms” where users interacted with virtual objects using hand gestures. Underkoffler envisioned a future of immersive interfaces integrated into the environment and accessible to everyone, which is ideologically consistent with today’s understanding of the term “Spatial Computing”.

However, until 2023, when Apple announced the Apple Vision Pro, “Spatial Computing” had nothing to do with any kind of virtual reality headset.

Apple’s Interpretation of Spatial Computing

  • Beyond VR/AR: Spatial computing encompasses a broader range than just virtual or augmented reality, focusing on the seamless integration of digital elements with the physical world.
  • User-Centric: Interactions are natural and intuitive, combining hand gestures, eye movements, voice commands, and spatial awareness so that the interface adapts to the user’s movements for a richer, more immersive experience (a minimal sketch of this interaction model follows this list).
  • Privacy Focus: User data and control are paramount; Apple prioritizes privacy-centric experiences, in contrast with metaverse visions built primarily around shared social interaction.
  • Gradual Integration: Apple emphasizes gradual adoption through existing devices, such as spatial video recording on iPhone and integration with the Mac, rather than a sudden jump into a fully virtual world.
  • Ecosystem Expansion: Spatial computing is not just about a headset, but about creating a cohesive ecosystem across various Apple devices and services.
  • Long-Term Vision: Apple sees Spatial computing as a transformative technology, potentially as impactful as the iPhone, and is committed to its long-term development.
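
To make the user-centric, gradually integrated model above more concrete, here is a minimal visionOS sketch using SwiftUI and RealityKit. It is an illustrative example rather than Apple’s reference code; the app, view, and window names (SpatialDemoApp, GlobeView, the “globe” window identifier) are hypothetical. The sketch shows how a conventional 2D window and a volumetric 3D scene coexist in the user’s real surroundings, and how the system’s look-and-pinch tap gesture targets a 3D entity.

```swift
import SwiftUI
import RealityKit

// Hypothetical demo app: a flat window and a 3D volume share the user's space.
@main
struct SpatialDemoApp: App {
    var body: some Scene {
        // A familiar 2D window, floating in the room like a Mac window.
        WindowGroup {
            ContentView()
        }

        // A volumetric window for 3D content placed alongside it.
        WindowGroup(id: "globe") {
            GlobeView()
        }
        .windowStyle(.volumetric)
    }
}

struct ContentView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        VStack(spacing: 16) {
            Text("Spatial computing, not a separate virtual world")
            Button("Open 3D globe") {
                openWindow(id: "globe") // spawns the volumetric scene next to this window
            }
        }
        .padding()
    }
}

struct GlobeView: View {
    var body: some View {
        // RealityView hosts RealityKit entities inside SwiftUI.
        RealityView { content in
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.2),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            // Make the entity eligible for look-and-pinch input.
            sphere.components.set(InputTargetComponent())
            sphere.generateCollisionShapes(recursive: false)
            content.add(sphere)
        }
        // The system tap gesture: the user looks at the sphere and pinches.
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Rotate the tapped entity a bit around its vertical axis.
                    value.entity.transform.rotation *= simd_quatf(
                        angle: .pi / 8,
                        axis: [0, 1, 0]
                    )
                }
        )
    }
}
```

Nothing in this sketch replaces the user’s room with a virtual one: the flat window, the 3D globe, and the physical environment share the same space, which is exactly the framing Apple is promoting.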

Why Spatial Computing Is Not the Metaverse

Spatial computing integrates digital elements into our real-world surroundings, enriching our physical interactions. In contrast, the Metaverse envisions a futuristic internet where people connect in a shared virtual space, focusing on social experiences rather than merely enhancing our physical environment with digital components.

Spatial Computing focuses on practical applications to enhance daily tasks, improve productivity, and offer new ways to interact with the physical world through technology. It’s used in navigation, education, manufacturing, and healthcare, providing tools that supplement reality with helpful information or functionalities. The Metaverse, however, is more about creating a comprehensive digital experience that can mirror or extend beyond real-life activities, offering a platform for virtual communities, economies, and immersive entertainment.

Why Spatial Computing Is Not VR

Spatial computing involves integrating digital elements into the real world through devices such as AR headsets, allowing interaction with both physical and digital objects. In contrast, VR immerses you in a completely virtual environment. Spatial computing encompasses technologies like AR and MR, whereas VR represents just one type of immersive experience within the broader realm of spatial computing. Therefore, they are not the same; spatial computing is a broader concept, with VR being a specific (I would even say tiny) part of it.

A Departure from Metaverses and Mixed Realities

The VR/AR market is still facing challenges after the initial excitement and hype around metaverses. So, instead of using conventional industry jargon that now carries unfortunate baggage, Apple is choosing to introduce and popularize terminology championed by the company itself. And while Spatial Computing is not the same thing as the Metaverse or VR, there are no major differences between Spatial Computing and Mixed Reality; Apple simply wanted to make Spatial Computing part of its ecosystem (who doubted that Apple would actively build its headset into an ecosystem? :) ). This strategic decision sets Apple apart, positions the tech giant as a trendsetter in shaping the discourse around VR technology, and simplifies the task of promoting Apple Vision Pro: no more controversial metaverses and web3, just a genuinely useful device that will change the way we work with computers.

A Glimpse Into the Future

Given that the first version of Apple Vision Pro costs $3,499, it is truly a “Pro” device for geek early adopters. But it is already clear that this is groundwork for the future, and the company is working on the next versions of the device.

The mid-term future of Spatial Computing is not only about making the headset cheaper and adding more applications from Apple and third-party developers, but also about merging the physical and digital worlds into a single immersive environment.

This concept is similar to the idea of “phygital,” which represents the blending of physical and digital elements in our reality. By championing novel terminology and embracing the future of Spatial Computing, Apple is poised to shape the way we interact with technology and further blur the lines between the physical and digital realms.

If you want to learn more about phygital, please see our annual reports:

  • State of Phygital 2021
  • State of Phygital 2022
  • State of Phygital 2023

Share your insights below: why do you think Apple opted for Spatial Computing over Mixed Reality? And what possibilities do you envision for the future of Apple Vision Pro? Curious to read your comments!

Do you run an innovative tech startup? We are investing in early-stage revenue-generating software startups and would love to hear from you! You can reach us at info@leta.vc or fill in the form here.

Follow our Medium blog https://medium.com/letavc and be the first to get our useful tips, insights, lists and news.

Alexander Glukhov
Analyst at LETA Capital, a late Seed/Series A VC investing in tech startups globally. aglukhov@leta dot vc