From Virtual Reality to Customized Reality — Why headsets are NOT the future.

We assume the path of technology will be linear, even though history teaches us that it moves in spurts of genius. The HTC Vive and Oculus Rift headsets we use today to experience virtual reality will be as far removed from the next iteration as the mouse is from Alexa as a user interface.

The big theme here is that we are unduly fixated on how headsets, haptic gloves, or any other body attachment will allow us to experience alternate realities, when it is just as likely that the physical world will adapt to us, and that reality will be decided only at the point of interaction between our senses and that world.

To start, there is no reason why a VR or AR experience has to be generated by a headset. The only requirement for an “alternate” reality is that we see something that is not ALWAYS physically there. That could be the result of a chemically enhanced state delivered by a designer drug, or something we see while dreaming. Dreaming is already the closest thing to reality we can experience, so why should it surprise us if it could one day be tailor-made into a controllable experience?

Inevitably, many of the altered realities we encounter in the future will involve some mixture of light and optical trickery. But that is not a requirement. Imagine a giant foam that you can walk around and through. If the characteristics of that foam can be programmed to represent a vignette of the ocean, then that is an encounter with an alternate reality.

In this way we see that overlays are not the same as independent exhibitions or self-contained experiences. One can imagine gigantic bubbles floating over a cityscape, each one containing its own mini-world created through programmable nano-particles that change color, hue, and texture on command.

It wouldn’t be too much of a stretch to envision a multitude of media that could take on enhanced properties. If the walls of a skyscraper project a landscape on Mars, and it looks real to a pedestrian strolling by, how is that different from viewing the same landscape through a headset? In both cases, what you see is not permanent. When observing the building, one could say that it is more “real” because many people see the same Martian landscape at once, while only one person sees the hologram generated by a HoloLens. But that distinction doesn’t hold up to even the most basic test. What if everyone in the world wore a HoloLens programmed to show the same thing? Then multiple HoloLens wearers would experience the same landscape on the wall at the same time, regardless of how it is being rendered.

These kinds of thought experiments raise the distinct possibility that, as reality becomes customizable, people who create realities vastly different from our own will effectively inhabit a different world even when they are sitting right next to us.

How would this play out? Imagine that you and your friend are both standing next to the Martian landscape that is projected onto, or programmatically embedded into, the wall of the building in front of you.

When you push your hand against the wall, it offers no resistance. You walk through it like a ghost. The particles rearrange themselves dynamically so that you feel no friction, no contact. When your friend reaches out to the same wall, her hand rests firmly against it, just as she expected. She leans her back against it; it is cold and firm. How is this possible?

This is a building that behaves differently depending on the person interacting with it. It adapts to the reality of the observer. We’ll skip over the questions of structural integrity and the like; it’s the future, so let’s assume someone figured that out. In this scenario, what is real: the firm wall or the permeable wall?

We can make the same argument about sound. If I walk through a forest and hear classical music but you hear chirping birds instead, is your reality more legitimate than mine?

This issue of superimposed realities stems directly from abandoning the headset as the filter and instead allowing the physical world around us to react differently based on our preferences.

You might think that this task is monumental, perhaps even impossible, since two people couldn’t possibly experience separate physical realities at the same time! You might ask: “What if you and your friend are running through the woods together? Both soundtracks would need to play at the same time. How do you ensure that you hear only the soundtrack that belongs to you?” Great question!

We can ask the same about the Martian landscape on the wall. What if your friend leans on it while you walk through it? How could that be? It can only be either passable or solid at any one moment. Technically, such superimposed states are possible because of the underlying quantum mechanical nature of all physical objects, but it’s too much of a stretch to assume we will create quantum probabilistic states at that scale in the near future.

Thus, I envision a world with a reality spectrum, not very different from the light spectrum, in which a range of realities is projected or embedded into objects, and they commit to a particular cross section of that spectrum only when we physically interact with them. Some kind of brokering system would need to exist to make sure I don’t hear your music and you don’t fall through my false wall, but it’s not too difficult to conceive of an identifying mechanism through which objects can determine the reality you want to experience. It could be as simple as a line of code on your smartphone, as sketched below.
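To make the brokering idea a little more concrete, here is a minimal sketch, assuming a hypothetical handshake in which a programmable object asks a broker which reality a nearby person has opted into before committing to a physical state. Every name in it (RealityProfile, RealityBroker, ProgrammableWall) is invented for illustration; nothing like this API exists today.

```python
# Hypothetical sketch of a "reality broker": a programmable object asks the
# broker which reality a nearby person has opted into, then commits to the
# matching physical state only at the moment of interaction.
# All names here are invented for illustration -- this is not a real API.

from dataclasses import dataclass, field


@dataclass
class RealityProfile:
    """A person's declared preferences -- the 'line of code on your smartphone'."""
    person_id: str
    preferences: dict = field(default_factory=dict)  # e.g. {"wall": "permeable", "forest_audio": "classical"}


class RealityBroker:
    """Maps each person to the cross section of the reality spectrum they opted into."""

    def __init__(self):
        self._profiles = {}

    def register(self, profile: RealityProfile) -> None:
        self._profiles[profile.person_id] = profile

    def resolve(self, person_id: str, channel: str, default: str) -> str:
        """Return the state an object should commit to for this person and channel."""
        profile = self._profiles.get(person_id)
        if profile is None:
            return default  # strangers get the consensus reality
        return profile.preferences.get(channel, default)


class ProgrammableWall:
    """An object that stays uncommitted until the point of interaction."""

    def __init__(self, broker: RealityBroker):
        self.broker = broker

    def on_touch(self, person_id: str) -> str:
        # Only when someone touches the wall does it ask the broker which
        # version of itself to present to that particular person.
        return self.broker.resolve(person_id, channel="wall", default="solid")


if __name__ == "__main__":
    broker = RealityBroker()
    broker.register(RealityProfile("you", {"wall": "permeable"}))
    broker.register(RealityProfile("friend", {}))  # friend keeps the default

    wall = ProgrammableWall(broker)
    print(wall.on_touch("you"))     # -> "permeable": you pass through like a ghost
    print(wall.on_touch("friend"))  # -> "solid": she leans against it, cold and firm
```

The point of the sketch is the ordering: the object commits to nothing until the moment of contact, and it is the broker, not the object, that knows whose reality applies.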

I personally hope that the era of devices that filter our senses is only temporary. It’s too difficult to trick the human body into thinking it is truly somewhere else when simply removing a device brings you right back to reality. If the physical world itself changes as we interact with it, that isn’t virtual reality at all; that is a new reality altogether.