Fieldnotes from the Metaverse — The Metaverse Roadmap

Dirk Songuer
Fieldnotes from the Metaverse
10 min read · Mar 30, 2022

This article makes the jump from the 1990s into the early 2000s. In 2006, an interdisciplinary group of people was invited to participate in the first public long-term forecast study to map the Metaverse, creating one of the most inclusive models of the space, which is still widely used today.

Setting the scene

2006 was one year before the first iPhone was announced; Nokia, Motorola, and BlackBerry feature phones ruled the market. While these phones were connected to the internet and could send messages and e-mail or surf the Web, they offered only simplified versions of these capabilities.

Google had only recently overtaken Yahoo as the most visited website. MSN and MySpace were the biggest social media platforms. YouTube had launched only one year prior, Twitter in July 2006. Social media was primarily seen as “persistent messaging”, not yet as content creation.

World of Warcraft was released in 2004 as a theme park experience, hitting eight million subscribers by the end of 2006 and starting the MMORPG craze of the late 2000s. Star Wars Galaxies and EverQuest 2 were still prominent MMOs at the time. Second Life had been released in 2003 and was gaining popularity as the prominent sandbox platform, with around two million users.

And during 2006, an interdisciplinary group of people was invited by the Acceleration Studies Foundation to participate in a public ten-year forecast and visioning survey to map the Metaverse.

The Metaverse Roadmap

This was the first public long-term forecast and envisioning of the Metaverse space. The list of invited participants included futurists, researchers, designers, developers, product managers, writers and more from a wide selection of fields.

The team started with the following premises:

What happens when video games meet Web 2.0? When virtual worlds meet geospatial maps of the planet meet pervasive web video? When simulations get real and life and business go virtual? When you use a virtual Earth to navigate the physical Earth, the internet swallows the television, and your avatar becomes your online agent?

They evaluated these questions in the context of emerging 3D web technologies, new types of apps and app distribution on mobile platforms, and emerging markets like India and China fueling global technology adoption and distribution, looking at potential technology, business, and social impacts.

The result was the Metaverse Roadmap Overview, published in 2007. It is an extensive report that treats the Metaverse as a potential duality: both a particular set of technologies and experiences, and a model for thinking about life in a blend of physical and virtual worlds.

Metaverse Roadmap Overview, 2007

The report looks at the landscape of the emerging Metaverse and divides it along two fundamental perspectives: world-focused vs. identity-focused, and real vs. constructed worlds.

  • Intimate / Identity means experiences focused on the identities or actions of an individual or object, acting in self-interest
  • External / World means experiences focused on the world at large, providing information and control of the world around the user
  • Augmentation / Real Worlds are experiences that layer new information and capabilities onto the perception of the physical world
  • Simulation / Constructed Worlds are experiences that model parallel realities, offering constructed, simulated worlds

Note that the Metaverse Roadmap does not look at the modality of an experience. It might be text-based, 2D, 3D, abstracted or realistic. It might be experienced on a mobile device, PC, console, headset, or wearable.

It also does not define the form or purpose of an experience. It might be a toy, a game, a tool, or a sandbox. It might be for fun, entertainment, competition, work, or learning and exploration.

Combining these two perspectives gave them the four key areas of their Metaverse landscape:

The four Metaverse areas, as per Metaverse Roadmap, 2007
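To make the combination concrete, here is a minimal sketch of how the two perspectives pair up into the four areas. This is my own illustration in TypeScript; the type and value names are hypothetical and not taken from the report:

```typescript
// Hypothetical names; the report itself only defines the two perspectives and the four areas.
type Focus = "intimate" | "external";          // identity-focused vs. world-focused
type Relation = "augmentation" | "simulation"; // real world vs. constructed worlds

type MetaverseArea =
  | "lifelogging"        // intimate + augmentation
  | "augmented-reality"  // external + augmentation
  | "virtual-worlds"     // intimate + simulation
  | "mirror-worlds";     // external + simulation

function classify(focus: Focus, relation: Relation): MetaverseArea {
  if (relation === "augmentation") {
    return focus === "intimate" ? "lifelogging" : "augmented-reality";
  }
  return focus === "intimate" ? "virtual-worlds" : "mirror-worlds";
}

// Example: a simulated world focused on the world at large
console.log(classify("external", "simulation")); // "mirror-worlds"
```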

Virtual worlds: These are scenarios where virtual representations of individuals explore virtual, simulated worlds.

Today, we first think about online 3D-based games where player avatars run around on fictional maps, interacting with each other — exploring, fighting, socializing, collaborating. For the Metaverse Roadmap, this quadrant includes all modalities, styles, and purposes:

  • MUDs: Text-based, roleplaying & character exploration, narrative storytelling (see Fieldnotes from the Metaverse — Multi-User Dungeons)
  • Minecraft: 3D on monitor, mobile and VR, highly abstracted, sandbox, world building, emergent stories
  • Gather: 2D on monitor, abstracted, exploration, socializing, collaboration
  • Fortnite: 3D on monitor and mobile, stylized, game, PvP, sandbox
  • Call of Duty: 3D on monitor, realism, game, PvP, competitive
  • AltspaceVR: 3D in VR, abstracted, meeting spaces, events, collaboration, shared experiences
  • Microsoft Mesh: 2D and 3D on monitor, mobile, AR and VR, real to abstracted, meetings, collaboration, shared presence

Mirror Worlds: These scenarios create virtual representations of the real world, essentially “reflections” of the physical, that actors can interact with.

Today we usually talk about Digital Twins: virtual representations of real “referents” in virtual dimensions. There is a direct connection between the real and the virtual, to the point where changes in one propagate to the other.

According to the Metaverse Roadmap, this can cover real products, services, or even environmental data that gets “virtualized” to be used within virtual worlds or experiences. The data can bubble up into mobile or stationary experiences, on monitors, phones, or headsets:

  • Bing Maps / Google Maps: A virtual representation of the real world where users can explore their vicinity or navigate parts of the world where they are not currently present
  • Foursquare: Virtual representation of real locations, including their metadata
  • Lyft: A virtual representation of available mobility capacity & demand
  • Amsterdam Schiphol Airport APIs: Virtual representation of the Amsterdam Schiphol airport, including in- and outbound flights, wayfinding, and occupancy
  • Philips Hue Lights: Virtual representation of light bulbs and the environmental data they influence
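The point that changes propagate between the real and the virtual can be illustrated with a minimal digital-twin sketch, loosely inspired by the Hue example above. This is my own illustration; names like LightTwin and pushToDevice are hypothetical:

```typescript
// A minimal digital-twin sketch: the virtual entry mirrors a physical referent,
// and changes propagate in both directions. All names are hypothetical.
interface LightState {
  on: boolean;
  brightness: number; // e.g. 0..254, as a typical smart bulb reports it
}

class LightTwin {
  constructor(public readonly id: string, private state: LightState) {}

  // Physical -> virtual: a bridge reports the real bulb's current state.
  syncFromPhysical(reported: LightState): void {
    this.state = { ...reported };
  }

  // Virtual -> physical: a change made in the mirror world is pushed back out.
  async setBrightness(
    brightness: number,
    pushToDevice: (id: string, state: LightState) => Promise<void>
  ): Promise<void> {
    this.state = { ...this.state, brightness };
    await pushToDevice(this.id, this.state);
  }

  get current(): LightState {
    return this.state;
  }
}
```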

Augmented Reality: These are scenarios that augment the real world for the actors within it.

Today we sometimes think of Augmented Reality as being linked to headsets; however, the Metaverse Roadmap included every modality that somehow augments (or annotates) reality, for example fixed heads-up displays (HUDs) in cars, mobile phones, see-through AR glasses, or pass-through Augmented Virtuality devices:

  • Google Maps AR Navigation: Ability to navigate with heads-up information on a heads-up-display, monitor or mobile phone
  • Swarm: Ability to digitally annotate, bookmark and share the physical space around the user via a mobile phone
  • Pokémon Go: Interacting with fictional, game-controlled characters as well as real players in the real environment via a mobile phone
  • Snapchat AR filters: Experiential changes to the perception of the real world, through a mobile phone or headset
  • Dynamics 365 Guides: Using a headset to see step-by-step holographic instructions for training purposes and live procedures on the actual objects

Lifelogging: And finally, these are scenarios that augment the identity of individuals and objects within the real world.

In 2006, “lifelogging” became a thing. Today we are familiar with the way we present and augment various aspects of our personality on different virtual platforms, especially on social media.

The model looked at real-time streaming and uploaded content, as video, image, audio, or text, on a phone, PC, or TV, driven by private individuals, professionals, or social media influencers:

  • Twitter: Broadcast short thoughts & opinions to a large, typically anonymous audience
  • Facebook: Share various parts of your life as bite-sized updates with selected friend circles
  • LinkedIn: Share specific, work-related updates and opinions to a related professional audience
  • Teams / Slack: Communicate, share, work together with a select circle of collaborators
  • Twitch.tv: People sharing their experiences live with an audience, for example playing a game, watching a show, or just talking and interacting with viewers

Impact

The Metaverse Roadmap introduced a model that separated the Metaverse as a concept from representation and modality, thus including text-based, audio, video, 2D and 3D experiences on any type of screen, headset, or indeed sensory device.

This breaks with the collective understanding that a “Metaverse” must be a real-time rendered 3D experience in VR.

By ignoring modality and purpose, the Metaverse Roadmap pulls many different scenarios and platforms into the Metaverse and is thus one of the most inclusive models to this day. It can locate experiences like AltspaceVR, Fortnite, Minecraft, Roblox AND Google Earth, Foursquare, Uber AND Pokémon Go, HUDs, Dynamics 365 Guides AND Facebook, Peloton, and Teams within Metaverse space.

Their argument was that the technical and conceptual approaches to creating any experience are the same across all quadrants. This was supported by the observations of virtual world designers, from MUDs to MMORPGs to early social media platforms, all struggling with the same technical and social issues.

Technical implications

The only difference between Virtual Worlds and Mirror Worlds from a technical point of view is that one is using fictional maps and the other is using real maps. One is created by an artist with creative freedom, the other is scanned, re-created or reproduced from reality — from a data perspective, both are the same.

The same is true for actors on the maps. Technically, a monster in a MUD is the same as a Zombie in Minecraft, is the same as a restaurant in Foursquare, is the same as a car in Lyft, is the same as a flight leaving Schiphol airport — they are all just spawns on a map, or rather: Database entries with context-related metadata — one of which might be location.
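A rough way to picture that claim (my own sketch, not from the report): very different “spawns” reduce to the same record shape, differing only in their metadata and in whether their coordinates are fictional or real. The names below are illustrative only:

```typescript
// One record shape for very different "spawns"; all names are illustrative.
interface MapEntity {
  id: string;
  kind: string; // "monster", "restaurant", "vehicle", "flight", ...
  location?: { lat: number; lon: number } | { x: number; y: number; z: number };
  metadata: Record<string, unknown>; // context-related attributes
}

const mudMonster: MapEntity = {
  id: "orc-17",
  kind: "monster",
  location: { x: 12, y: 0, z: 42 }, // fictional map coordinates
  metadata: { hitPoints: 30, hostile: true },
};

const restaurant: MapEntity = {
  id: "venue-998",
  kind: "restaurant",
  location: { lat: 52.52, lon: 13.405 }, // real-world coordinates
  metadata: { rating: 8.7, openNow: true },
};
```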

The only difference between Mirror Worlds and Augmented Reality from a technical point of view is that the latter is location-bound to the perspective of the current user. The map is the same, just scaled to 1:1 size and the virtual camera is aligned to the viewport of the observer. In Augmented Realities the users act as themselves, moving around in an augmented world, drawing in the “reflection data” from Mirror Worlds.

And the only difference between Augmented Reality and Lifelogging is that one looks outwards, a user looking into an augmented world, and the other looks inwards, the world looking at a specific augmented individual. Technically, it is a camera switch.
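To make the camera argument concrete, here is a deliberately simplified sketch (again my own, not from the report) in which the same world data is viewed through a different camera configuration per area:

```typescript
// Same world data; only the camera configuration differs per area.
// Deliberately simplified and purely illustrative.
type ViewArea = "mirror-worlds" | "augmented-reality" | "lifelogging";

interface CameraConfig {
  scale: number;                               // map scale relative to reality
  anchoredTo: "free" | "observer" | "subject"; // what the camera is bound to
  direction: "outward" | "inward";             // looking at the world, or at a person
}

function cameraFor(area: ViewArea): CameraConfig {
  switch (area) {
    case "mirror-worlds":
      return { scale: 0.001, anchoredTo: "free", direction: "outward" };    // bird's-eye map view
    case "augmented-reality":
      return { scale: 1, anchoredTo: "observer", direction: "outward" };    // 1:1, aligned to the user's viewport
    case "lifelogging":
      return { scale: 1, anchoredTo: "subject", direction: "inward" };      // the world looking at an individual
  }
}
```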

You can think of all these as the same type of contextual transparent multi-user systems. The assumption that the Metaverse Roadmap made was:

Technical knowledge and approaches are the same or at least transferable between the stated Metaverse areas.

Conceptual implications

While the contexts and intents of actors differ between the Metaverse areas, the actors stay the same. It is fair to assume that a user might start in one quadrant and, as the scenario unfolds, switch to another one. This might not even require leaving a platform, as a single platform might serve experiences in different quadrants (see above).

Users will navigate between areas based on context and intent, similar to a customer journey

From an experience design perspective, the question becomes: “For a specific actor, task, and context within this Metaverse, what is the best area, modality, and representation to facilitate or amplify a specific experience? And how does this change as the desired scenario unfolds?” Just as with a regular digital customer journey, we assume that a process can have many different touchpoints, and the same will be true for Metaverse scenarios.
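One way to picture this, as my own sketch with a hypothetical example journey, is an ordered list of touchpoints, each tagged with the area and modality that best serves that step:

```typescript
// A scenario as an ordered list of touchpoints across Metaverse areas.
// The names and the example journey are hypothetical.
interface Touchpoint {
  step: string;
  area: "virtual-worlds" | "mirror-worlds" | "augmented-reality" | "lifelogging";
  modality: "mobile" | "monitor" | "headset";
}

const concertNight: Touchpoint[] = [
  { step: "discover the event via a friend's post", area: "lifelogging", modality: "mobile" },
  { step: "check the venue and plan the route", area: "mirror-worlds", modality: "mobile" },
  { step: "navigate to the venue on foot", area: "augmented-reality", modality: "mobile" },
  { step: "join the virtual after-show party", area: "virtual-worlds", modality: "headset" },
];
```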

The other assumption is that the Metaverse can be defined this broadly because experiences in every area follow the same rules for social dynamics: the people using them are the same, and they use them in the same way.

The Metaverse Roadmap suggests that fundamentally neither representation nor modality matters in terms of virtuality design. The degree of realism or abstraction of an individual or the environment does not seem to matter as much as we think in terms of how people act within these experiences and how experiences can be crafted.

Like with technology, the Metaverse Roadmap stated:

Approaches, methods, and mechanics to design experiences are the same or at least transferable between the stated Metaverse areas.

And if technical and conceptual approaches are the same or similar, does it even make sense to artificially limit the Metaverse to a specific modality and technology? The Metaverse Roadmap argued against this, pointing to the Internet and the Web, which also transcended a singular modality, evolving into a ubiquitous digital layer.

Afterthoughts

I love the Metaverse Roadmap. It’s a refreshingly open perspective in a field that is unfortunately contested, full of gatekeeping, where the term “Metaverse” is claimed by many groups for their own purposes, driven by individual incentives.

The model allows us to map existing scenarios and platforms, locating them along the Metaverse areas, to then imagine how they evolve as they expand into or take over other areas.

This is often an aha moment for people who want to start learning about the Metaverse, as it brings together many of the different perspectives and claims, making sense of the many (and, according to the Metaverse Roadmap, needlessly restrictive) definitions of the term.

Locating modern platforms along the model, my subjective version

Such exercises help to look at expansion paths for individual platforms and their potential trajectories. They are also useful for identifying white spaces and areas not yet widely explored, uncovering potential opportunities.

The Metaverse Roadmap as a model helps us to find common ground in designing, developing, and operating Metaverse experiences. This is perhaps the biggest impact of the Metaverse Roadmap: A suggestion that no platform within this broad definition needs to figure things out all on its own. An encouragement that we can learn from each other, across fields, across scenarios, across paradigms. And a suggestion that by working together we can offer better, safer, more secure, inclusive, engaging, and fun experiences across the Metaverse.

Back to series index

For the Metaverse Roadmap, I want to leave you with this talk by Raph Koster, who was one of the original participants in 2006. At GDC 2017, he elaborated on the roadmap, how he sees it today, and what we have learned since then.

About the series

The term “Metaverse” is currently claimed by many groups, driven by different incentives. Some groups attach the term to specific technologies (for example VR, AR, XR, Digital Twins or Blockchains), others see it as a future vision or narrative (sometimes dystopian, sometimes utopian). Some groups talk about the coming Metaverse, others argue that it already exists.

“Fieldnotes from the Metaverse” is a series that discusses the history, visions, perspectives, and narratives of the Metaverse: specific milestones, their immediate impact, and how they shaped the discussion going forward. The goal is a holistic and inclusive view of the Metaverse space, separating visions, signals, trends, and hype.
