VR<=>AR<=>xR

Karan Singh
Feb 18, 2018 · 3 min read


The line between Augmented and Virtual Reality is blurring. The various strains of augmented, virtual and mixed reality are inevitably collapsing into a collective brand, xR. My personal favourite, TR or Transcendent Reality, is a cynically spiritual mix of technology and Zen.

It is easy to imagine headset VR as AR and vice versa: users and their environment, scanned and rendered live and in situ in 3D, enable a virtually constructed Augmented Reality; equivalently, an AR headset with increasing opacity occludes the real world, leaving the viewer with an entirely Virtual Reality. Most AR headsets today have a percentage of VR mixed in anyway, by virtue of the light lost to their construction. Conceptually, though, a headset is but a tunnel-visioned view of AR and VR. xR in general relates to media for information transport and presentation between the physical and digital worlds, and the richer the sensory IN and sensory OUT, the better. Mark Billinghurst has a nifty article laying out the space of AR, VR and xR based on “A taxonomy of mixed reality visual displays” by Milgram and Kishino, two fine gentlemen with whom I was fortunate to contemplate a virtual kampai while sipping real sake in 1994. But I digress.

A Kinect and a robot arm recreate physical aspects of AR in a Rift in VR.

This article is about the more practical aspects of meaningfully adapting applications across the many flavours of xR. The arguments below are presented in the context of AR and VR and the immersive internet browser JanusVR, but much of the discourse is generally applicable to xR.

Physics

Imperative to adapting a VR application to the physical reality implicit in AR is the word “physics”. Successful physics in computer graphics is less about accuracy and more about plausibility. Any VR environment adapted to AR should have “physics” that can be perceptually reconciled with our physical reality. For example, any object, unless perceived as capable of flight or levitation, should fall or seem meaningfully anchored to the environment. The tighter the coupling between real and virtual in AR, the greater the need for an underlying physics engine.
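As a minimal sketch of this plausibility-over-accuracy rule, consider objects tagged as anchored or capable of flight being exempt from gravity, while everything else falls until it rests on the tracked ground plane. The types and tags here are hypothetical, not JanusVR's API:

```typescript
// Hypothetical xR object with the perceptual tags discussed above.
interface XrObject {
  name: string;
  y: number;          // height above the tracked ground plane (metres)
  vy: number;         // vertical velocity (m/s)
  anchored: boolean;  // meaningfully anchored to the environment
  canFly: boolean;    // perceived as capable of flight or levitation
}

const GRAVITY = -9.8; // m/s^2; plausibility, not accuracy, is the goal

// One "plausible physics" step: exempt objects stay put, the rest fall
// and come to rest on the ground plane tracked by the AR hardware.
function step(obj: XrObject, dt: number): XrObject {
  if (obj.anchored || obj.canFly) return obj;
  const vy = obj.vy + GRAVITY * dt;
  const y = Math.max(0, obj.y + vy * dt); // clamp to the ground plane
  return { ...obj, y, vy: y === 0 ? 0 : vy };
}
```

The point is not the integrator but the tagging: an untagged virtual ball dropped onto a real table should behave the way a viewer expects a ball to behave.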

VR portals to webspaces in JanusVR (left) and a fan designed in AR (right), adapted to their reality.

Avatars and the Environment

The “skybox” in VR should semantically swap with the physical world in AR.

Any notion of a ground or geometry that supports the locomotion of avatars should be replaceable by a physical equivalent: a floor, stage or table tracked in 3D using the AR hardware.

Objects in the environment should be tagged with associations that facilitate their layout in xR. For example, objects can be geospatially linked to a real-virtual environment, or to an avatar’s proprioceptive space such as tools around hands or a view-centric heads-up display. These object associations can be dynamic, like a tool that is picked up or put down. The physical world and avatars tracked in AR then provide the geospatial reference for these virtual objects.
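These association tags can be sketched as a small discriminated union: an object is linked to the world (geospatial), to a body part (proprioceptive), or to the view (heads-up display), and picking up or putting down a tool simply re-parents it. All names here are illustrative assumptions:

```typescript
// Hypothetical reference frames for virtual objects in xR.
type Frame =
  | { kind: "world"; position: [number, number, number] }  // geospatial link
  | { kind: "avatar"; part: "leftHand" | "rightHand" }     // proprioceptive space
  | { kind: "view" };                                      // heads-up display

interface Item { name: string; frame: Frame; }

// Dynamic re-association: a picked-up tool attaches to the avatar's hand...
function pickUp(item: Item, hand: "leftHand" | "rightHand"): Item {
  return { ...item, frame: { kind: "avatar", part: hand } };
}

// ...and a put-down tool rejoins the world frame at a tracked position.
function putDown(item: Item, where: [number, number, number]): Item {
  return { ...item, frame: { kind: "world", position: where } };
}
```

In AR, the tracked physical world and the tracked avatar supply the concrete coordinates behind each `Frame`; in VR, the virtual environment does.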

Constraints and conflicts

A layout engine should handle the placement of objects, subject to the object associations and constraints between virtual objects and the real world. Layout is best interleaved with or incorporated into the physics engine.
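One way to picture this interleaving, as a rough sketch with hypothetical types, is to run each physics step and then let layout constraints project objects back onto valid placements, such as keeping a virtual lamp on a tracked real table top:

```typescript
type Vec3 = [number, number, number];
interface Body { pos: Vec3; vel: Vec3; }

// A layout constraint projects a body onto a valid placement.
type Constraint = (b: Body) => Body;

// Plain physics integration under gravity.
function integrate(b: Body, dt: number): Body {
  const vel: Vec3 = [b.vel[0], b.vel[1] - 9.8 * dt, b.vel[2]];
  const pos: Vec3 = [
    b.pos[0] + vel[0] * dt,
    b.pos[1] + vel[1] * dt,
    b.pos[2] + vel[2] * dt,
  ];
  return { pos, vel };
}

// Constraint: rest on a real table surface tracked at height tableY.
const onTable = (tableY: number): Constraint => (b) =>
  b.pos[1] < tableY
    ? { pos: [b.pos[0], tableY, b.pos[2]], vel: [b.vel[0], 0, b.vel[2]] }
    : b;

// Layout interleaved with physics: integrate, then apply constraints.
function stepWithLayout(b: Body, dt: number, constraints: Constraint[]): Body {
  return constraints.reduce((acc, c) => c(acc), integrate(b, dt));
}
```

Projecting after each step keeps the two engines loosely coupled, which is what makes it practical to swap the constraint set when the same scene moves between VR geometry and AR-tracked surfaces.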

One can always come up with scenarios where the real and virtual world conflict in ways that cannot be reconciled with a layout engine. For example, a real hand can continue to push into a virtual object anchored to the real world, where the interpenetration cannot be physically resolved. A conflict resolution engine should address such scenarios, by altering the space-time appearance of the scene. For example, interpenetrating solid objects could transition state to liquid or gas, to enable visual plausibility.
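The interpenetration example can be sketched as a tiny conflict resolver. The thresholds and state names below are illustrative assumptions, not measured values: a real hand always wins against a world-anchored virtual object, so the resolver softens the object's apparent state instead of fighting the physics:

```typescript
type Matter = "solid" | "liquid" | "gas";

// A world-anchored virtual object, with the measured depth (metres) by
// which a tracked real hand currently interpenetrates it.
interface AnchoredObject { matter: Matter; penetrationDepth: number; }

function resolveConflict(obj: AnchoredObject): AnchoredObject {
  if (obj.penetrationDepth < 0.02) return obj; // no or tolerable overlap
  // Irreconcilable overlap: alter the object's space-time appearance so
  // the scene stays visually plausible (illustrative thresholds).
  const matter: Matter = obj.penetrationDepth < 0.1 ? "liquid" : "gas";
  return { ...obj, matter };
}
```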

IoT

The geospatial and URL addresses of things from the “Internet of Things” today are disparate, but they should be meaningfully connected for AR applications where physical sensors and IoT objects are also part of the viewer’s physical reality.
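A minimal sketch of such a connection, with entirely hypothetical names and URLs, is a registry that links each thing's URL address to its geospatial pose, so an AR viewer can resolve the physical lamp it is looking at to its IoT endpoint:

```typescript
// Hypothetical registry entry tying a thing's URL to its tracked pose.
interface Thing { url: string; position: [number, number, number]; }

const registry: Thing[] = [
  { url: "https://example.com/iot/lamp", position: [1, 0.8, 2] },
  { url: "https://example.com/iot/fan",  position: [4, 1.0, 1] },
];

// Resolve the nearest registered thing within `radius` metres of a
// gaze or touch point in the viewer's tracked physical space.
function nearestThing(p: [number, number, number], radius: number): Thing | undefined {
  let best: Thing | undefined;
  let bestDist = radius;
  for (const t of registry) {
    const d = Math.hypot(
      t.position[0] - p[0],
      t.position[1] - p[1],
      t.position[2] - p[2],
    );
    if (d <= bestDist) { bestDist = d; best = t; }
  }
  return best;
}
```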

Many issues will be domain-specific; the above are just a few general thoughts and guidelines that have helped us design cross-reality experiences and adapt applications across the various flavours of xR.


Karan Singh

Professor of Computer Science at University of Toronto, co-director of graphics and interaction lab DGP. CTO & co-founder of JanusVR.