Prototyping a native AR experience

Using Unity and HoloLens Voice + Gesture Commands to Experience Our Design Concept

When designing for augmented reality, it’s frustrating to prototype mid- to high-fidelity artifacts: our design tools simply aren’t there yet when it comes to helping us create a close-to-native experience. For this reason, we ventured into a little Unity and HoloLens coding to simulate the experience of using holograms to support or complement a conversation.

Using the HoloToolkit for Unity as well as the Holographic Academy tutorials as a starting point, we modified some code and downloaded a few free 3D assets to create a simple app that let us select holograms from a list and permanently add them to a scene using natural gestures.

Selection and Placement

Using Vuforia image targets, we can cycle through a list of holograms that appear and disappear depending on which image target we direct our gaze towards. We can then select a hologram by air tapping it and place it in space by dragging it; another air tap stops the dragging and releases the object. One can always air tap again to move the hologram elsewhere.

Holograms appear using Vuforia image targets. Air tapping on a hologram keeps it in the scene.
Air tap and drag to place a hologram in a scene
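
For reference, here is a minimal sketch of the kind of tap-to-place behavior described above. It is loosely modeled on the tap-to-place pattern from the HoloToolkit and Holographic Academy samples rather than our exact project code, and the class and field names are hypothetical. The script assumes the hologram has a collider so the gaze raycast can hit it.

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input; // UnityEngine.VR.WSA.Input on older Unity versions

// Attach to a hologram that has a collider so the gaze raycast can hit it.
public class TapToPlaceHologram : MonoBehaviour
{
    private GestureRecognizer recognizer;
    private bool isBeingPlaced;                 // toggled by air taps
    private const float PlacementDistance = 2f; // metres in front of the user

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += OnTapped;
        recognizer.StartCapturingGestures();
    }

    private void OnTapped(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        // A second air tap releases the hologram wherever it currently is.
        if (isBeingPlaced)
        {
            isBeingPlaced = false;
            return;
        }

        // A first air tap starts dragging, but only if the gaze ray hits this hologram.
        RaycastHit hit;
        if (Physics.Raycast(headRay, out hit) && hit.transform == transform)
        {
            isBeingPlaced = true;
        }
    }

    void Update()
    {
        // While being placed, the hologram simply follows the user's gaze.
        if (isBeingPlaced)
        {
            Transform cam = Camera.main.transform;
            transform.position = cam.position + cam.forward * PlacementDistance;
        }
    }

    void OnDestroy()
    {
        recognizer.TappedEvent -= OnTapped;
        recognizer.Dispose();
    }
}
```

Keeping the dragged hologram at a fixed distance along the gaze ray keeps the sketch simple; the HoloToolkit version instead raycasts against the spatial mapping mesh so holograms land on real surfaces.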

Scaling and Rotation

We implemented a voice command to switch modes and rotate an object using just hand movements.

Rotation using hand movements
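
As a rough illustration of how this mode switch can work, the sketch below pairs Unity’s KeywordRecognizer with the HoloLens navigation gesture: saying “Rotate” flips a flag, and horizontal hand movement then spins the hologram around its vertical axis. The class name, keywords, and sensitivity value are our own assumptions, not the project’s actual code.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;  // KeywordRecognizer
using UnityEngine.XR.WSA.Input;    // GestureRecognizer, navigation gestures

public class RotateWithHand : MonoBehaviour
{
    public float rotationSensitivity = 10f; // degrees per update at full hand offset

    private KeywordRecognizer keywordRecognizer;
    private GestureRecognizer gestureRecognizer;
    private bool rotateMode;

    void Start()
    {
        // "Rotate" enters rotation mode; "Done" leaves it.
        keywordRecognizer = new KeywordRecognizer(new[] { "Rotate", "Done" });
        keywordRecognizer.OnPhraseRecognized += args => rotateMode = args.text == "Rotate";
        keywordRecognizer.Start();

        // The navigation gesture reports a normalized hand offset (-1..1)
        // from where the hold started.
        gestureRecognizer = new GestureRecognizer();
        gestureRecognizer.SetRecognizableGestures(GestureSettings.NavigationX);
        gestureRecognizer.NavigationUpdatedEvent += OnNavigationUpdated;
        gestureRecognizer.StartCapturingGestures();
    }

    private void OnNavigationUpdated(InteractionSourceKind source,
                                     Vector3 normalizedOffset, Ray headRay)
    {
        if (rotateMode)
        {
            // Horizontal hand movement spins the hologram around its vertical axis.
            transform.Rotate(0f, -normalizedOffset.x * rotationSensitivity, 0f);
        }
    }

    void OnDestroy()
    {
        keywordRecognizer.Stop();
        keywordRecognizer.Dispose();
        gestureRecognizer.NavigationUpdatedEvent -= OnNavigationUpdated;
        gestureRecognizer.Dispose();
    }
}
```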

In a similar vein, we used another voice command to switch modes and scale holograms using just hand gestures.

Scaling using hand movements
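
Scaling follows the same pattern. The sketch below (again with hypothetical names and values) uses the navigation gesture’s vertical offset so that raising or lowering the hand grows or shrinks the hologram, clamped to a sensible range.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;
using UnityEngine.XR.WSA.Input;

public class ScaleWithHand : MonoBehaviour
{
    public float minScale = 0.1f;
    public float maxScale = 5f;

    private KeywordRecognizer keywordRecognizer;
    private GestureRecognizer gestureRecognizer;
    private bool scaleMode;

    void Start()
    {
        // "Scale" enters scaling mode; "Done" leaves it.
        keywordRecognizer = new KeywordRecognizer(new[] { "Scale", "Done" });
        keywordRecognizer.OnPhraseRecognized += args => scaleMode = args.text == "Scale";
        keywordRecognizer.Start();

        gestureRecognizer = new GestureRecognizer();
        gestureRecognizer.SetRecognizableGestures(GestureSettings.NavigationY);
        gestureRecognizer.NavigationUpdatedEvent += OnNavigationUpdated;
        gestureRecognizer.StartCapturingGestures();
    }

    private void OnNavigationUpdated(InteractionSourceKind source,
                                     Vector3 normalizedOffset, Ray headRay)
    {
        if (scaleMode)
        {
            // Raising the hand scales the hologram up, lowering it scales it down,
            // clamped so the model never vanishes or fills the room.
            float factor = 1f + normalizedOffset.y * 0.02f;
            float scale = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
            transform.localScale = Vector3.one * scale;
        }
    }

    void OnDestroy()
    {
        keywordRecognizer.Stop();
        keywordRecognizer.Dispose();
        gestureRecognizer.NavigationUpdatedEvent -= OnNavigationUpdated;
        gestureRecognizer.Dispose();
    }
}
```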

Using the prototype

Ultimately, even though the interactions were very basic, they allowed us to simulate conversations and experience what it’s like to build holographic scenes using simple gestures, with no menus or UI overlays. The ability to scale, rotate, and place holograms was very satisfying, and we were able to create a diversity of scenes, with each team member generating different results. On the other hand, since our gesture code wasn’t perfect, it was at times still a little cumbersome to obtain the exact manipulation result one wanted. While a native prototype like this did not answer all of our design questions, it was useful to complement the more traditional 2D artifacts (video prototypes, mockups) with an interactive tool.