[DevUp: Force-Directed Graph VR] Hand-Held User Interface, Audio Feedback, Downloadable Demo
Today’s update includes some major user experience improvements. Two previously-separated user interfaces are now consolidated into a single “hand-held” panel. Sound effects and other audio feedback help the demo come alive. Throw in some visual enhancements (including a drastic increase in text quality), and the demo is really starting to come together.
You can see (and hear!) all of this in the video below:
Downloadable Demo
Before we dive into this, don’t forget, you can download the latest demo. It requires a Vive or Oculus VR headset. For either headset, you can use a Leap Motion Controller as your 3D input device. The demo also allows input from Vive or Oculus Touch controllers (when no Leap Motion is connected).
VR User Interface
In my previous DevUp, there were three user interfaces: the pop-up interface attached to each node, the main menu panel locked to a radial track around the graph, and a Hovercast menu for scene-level controls.
My primary concern was that, with three different interfaces, there was too much going on. Locking the main menu to a radial track didn’t feel right, either. It worked well enough when the graph was small, and the panel orbited around the content, but didn’t make sense with “room scale” graphs.
My solution was to consolidate the latter two interfaces, keeping the best features of each. The new “hand-held” menu panel retains the main menu’s tabbed, rectangular format. It also incorporates Hovercast’s open/close capabilities and its attachment to the hand.
The interface follows the position of the hand, but not the rotation. This is a break from the Hovercast technique, and most of the other VR user interfaces that I’ve seen recently. Rather than locking the orientation to a hand, wrist, or forearm, I chose to keep the panel facing toward the user. The result is a slightly “floating” feel, with a flexible connection between hand and panel.
A key reason for this: I didn’t want the reliability and accuracy of the interface interactions to be overly dependent upon the stability of the input device. Hand and controller tracking is imperfect, and this is compounded when tracking is simultaneously responsible for the cursor and the menu. It can be difficult for a user to interact with an interface even when it is completely stable, so locking that same interface to an imperfect, moving, rotating tracking point can lead to inaccuracy and frustration.
Aside from the tracking itself, this approach also reduces the “kinesthetic” burden for the user. While aiming for a button or dragging a slider, the user is not simultaneously responsible for keeping the orientation of their left hand completely steady. Instead, the menu always faces the user, and its positional smoothing absorbs minor jitters and movements. This may be less important for brief interactions (like a quick color change in Tilt Brush), but is convenient for use-cases where an interface is meant to stay visible and stable for longer periods.
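As a rough illustration of that behavior (a sketch with hypothetical names, not the demo's actual code), the panel follows the hand's position with some smoothing, while its rotation always turns to face the user's head:

```csharp
using UnityEngine;

// Hypothetical sketch: follow a hand transform's position (with smoothing),
// but ignore its rotation, keeping the panel facing the user's head.
public class HandHeldPanel : MonoBehaviour {

	public Transform HandAnchor;   // e.g. the left palm or controller
	public Transform HeadCamera;   // the user's head/camera transform
	public Vector3 WorldOffset = new Vector3(0, 0.08f, 0);
	public float FollowSpeed = 8;  // higher = tighter coupling to the hand

	void LateUpdate() {
		// Follow the hand's position only; the smoothing absorbs minor jitters
		Vector3 target = HandAnchor.position+WorldOffset;
		transform.position = Vector3.Lerp(
			transform.position, target, FollowSpeed*Time.deltaTime);

		// Always face the user, so hand rotation never tilts the panel
		// (which way "facing" points depends on how the panel mesh is oriented)
		Vector3 awayFromHead = transform.position-HeadCamera.position;
		transform.rotation = Quaternion.LookRotation(awayFromHead, Vector3.up);
	}

}
```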
An unexpected feature of this new hand-held interface: it retains some ability to interact with both hands. I found that the left hand doesn't need to be demoted to a simple "menu holder". Instead, the left hand's rotational freedom allows it to tilt inward toward the menu panel, selecting the "tab" items along the panel's left side. Also, either hand can open/close the menu panel. I anticipate that new users may forget about (or not realize) the left hand's capabilities, leading to some early accidental interactions. For me, after using it a few times, the "left hand for tabs" interaction felt natural; reaching across the panel with my right hand is no longer a consideration.
As a final point, I should note that these interfaces are created and arranged using Hover UI Kit, a development tool I've built for exactly that purpose. In fact, this demo is primarily meant to show Hover UI's capabilities, including how quickly developers can prototype interfaces, the interactive items with heavy visual feedback and consistent interactions, and the automatic compatibility with various 3D input devices (like Leap Motion hands or Vive controllers). Hover UI is an open-source project, free to use for open-source (and most non-commercial) projects.
Audio Feedback
I spent a lot of time working with audio (writing, recording, mixing music) in my teens and early twenties. I enjoyed that, but have not done too much with audio since. Fortunately, audio plays a huge role in the VR experience, and it gives me an excuse to dive back into my audio tools. That said, I’m certainly no expert in sound design or production — let’s just consider all the audio in this demo to be “placeholders” that worked well enough for my needs.
After some searching (check out freesound.org), I decided to use an old-school “808” drum kick sound for collisions between your hand and the graph nodes. Using Unity’s audio API, each collision dynamically sets a volume that increases with the hand’s velocity, and a pitch that deepens as the size of the node increases.
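As a rough sketch of that idea (hypothetical component and field names, not the demo's actual code), the per-node collision handler looks something like this:

```csharp
using UnityEngine;

// Hypothetical sketch: plays a kick sample when a hand or controller
// collides with this node -- louder for faster hands, deeper for larger nodes.
[RequireComponent(typeof(AudioSource))]
public class NodeCollisionAudio : MonoBehaviour {

	public AudioClip KickClip;     // e.g. an "808" kick sample
	public float NodeSize = 1;     // set by the graph when the node is built
	public float MaxNodeSize = 4;  // size that maps to the deepest pitch
	public float MaxVelocity = 3;  // hand speed (m/s) that maps to full volume

	private AudioSource vSource;

	void Awake() {
		vSource = GetComponent<AudioSource>();
	}

	void OnCollisionEnter(Collision pCollision) {
		// Volume rises with the speed of the incoming hand/controller
		float speed = pCollision.relativeVelocity.magnitude;
		float volume = Mathf.Clamp01(speed/MaxVelocity);

		// Pitch deepens as node size increases (1.2 for tiny, 0.6 for large)
		float sizeT = Mathf.Clamp01(NodeSize/MaxNodeSize);
		vSource.pitch = Mathf.Lerp(1.2f, 0.6f, sizeT);

		vSource.PlayOneShot(KickClip, volume);
	}

}
```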
There’s also a subtle, per-edge sound that plays while an edge is stretching and contracting. Here, the volume increases with the rate-of-change of the edge’s length, and the pitch changes depending on the direction of that change (growing longer gets a higher pitch). The subtlety is important here… too much volume makes the graph sound like a bunch of whales are swimming through. Maybe I just need better source audio, like a creaky, rope-stretching sound. Any ideas?
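Here's a similar sketch for the per-edge sound, again with hypothetical names:

```csharp
using UnityEngine;

// Hypothetical sketch: a subtle, looping "stretch" sound per edge.
// Volume follows how quickly the edge length changes; pitch shifts up
// while the edge grows and down while it shrinks.
[RequireComponent(typeof(AudioSource))]
public class EdgeStretchAudio : MonoBehaviour {

	public Transform NodeA;
	public Transform NodeB;
	public float MaxLengthRate = 2; // length change (m/s) that maps to full effect

	private AudioSource vSource;
	private float vPrevLength;

	void Start() {
		vSource = GetComponent<AudioSource>();
		vSource.loop = true;
		vSource.Play();
		vPrevLength = Vector3.Distance(NodeA.position, NodeB.position);
	}

	void Update() {
		float length = Vector3.Distance(NodeA.position, NodeB.position);
		float rate = (length-vPrevLength)/Time.deltaTime; // positive = growing
		vPrevLength = length;

		// Keep it subtle: even a fast-changing edge stays fairly quiet
		float strength = Mathf.Clamp01(Mathf.Abs(rate)/MaxLengthRate);
		vSource.volume = strength*0.2f;

		// Growing edges sound higher, shrinking edges sound lower
		vSource.pitch = 1+Mathf.Clamp(rate/MaxLengthRate, -1, 1)*0.3f;
	}

}
```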
I’ll briefly tie this back into some thoughts on data-viz in VR. As I discussed in the video, VR offers very interesting opportunities to feel and experience data, rather than simply seeing it. Audio feedback could play an important role. The sounds during stasis, touching, striking, and stretching (and the range, volume, pitch, reverb) could all be characteristics determined by certain axes of a dataset. Perhaps this would be best suited for minor, less-important axes. But still, imagine walking through a room-scale graph, where swatting your hand against a cluster of graph items could reveal something interesting about them — not via text or popup or drill-down action, but via sound.
Perhaps less interesting, but still useful: I also added a simple sine-wave tone for the “selection” progress of all Hover UI items, and a “whoosh” sound (from Fragmental 3D) for the opening/closing of the menu interfaces.
For the “selection” progress, I initially grouped the sound effect (the Unity AudioSource, etc.) into each Hover UI item. This created lots of extra audio components in the scene, and I decided to simplify. The cursors in Hover UI receive information about highlight/selection progress (based on the items the cursor is affecting), so I was able to replace dozens of per-item audio components with just one for each cursor.
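The per-cursor version ends up looking roughly like this sketch; how the cursor feeds its selection progress into the component is simplified here, and the names are placeholders rather than the actual Hover UI API:

```csharp
using UnityEngine;

// Hypothetical sketch: one audio component per cursor, driven by the
// cursor's current selection progress (0..1) rather than per-item sources.
[RequireComponent(typeof(AudioSource))]
public class CursorSelectionAudio : MonoBehaviour {

	[Range(0, 1)]
	public float SelectionProgress; // updated each frame from the cursor's data

	private AudioSource vSource;

	void Awake() {
		vSource = GetComponent<AudioSource>();
		vSource.loop = true;
		vSource.Play();
	}

	void Update() {
		// Silent when nothing is selecting; ramps up as selection progresses
		vSource.volume = SelectionProgress*0.5f;

		// A rising tone: pitch climbs as the selection nears completion
		vSource.pitch = 1+SelectionProgress;
	}

}
```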
A last, rushed note on audio: with all the node and edge audio components, I have gone beyond the (default) maximum simultaneous sounds that Unity allows. I was hesitant to raise this maximum, as the Unity docs say that this has performance implications. When you move many graph items at the same time, you can hear some faint clicking sounds — I assume these are audio clips that Unity stops, mid-playback, in favor of louder and/or higher-priority audio clips. If you have thoughts on this, please let me know!
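For reference, here's a minimal sketch of how those voice limits can be raised at runtime through Unity's AudioSettings, with the caveats that resetting the configuration stops any audio that's already playing, and that higher limits can cost CPU:

```csharp
using UnityEngine;

// Hypothetical sketch: raising Unity's simultaneous-voice limits at runtime.
public class AudioVoiceConfig : MonoBehaviour {

	void Awake() {
		AudioConfiguration config = AudioSettings.GetConfiguration();
		config.numRealVoices = 64;     // default is 32; more voices = more CPU
		config.numVirtualVoices = 512; // voices tracked, but not always audible
		AudioSettings.Reset(config);   // note: stops currently-playing audio
	}

}
```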
Visual Enhancements
This update includes a handful of minor visual enhancements: redesigned items in the "Colors" menu, tweaked color/transparency values for the interfaces, and a brief glowing effect on the nodes when you hit them.
I also came across Unity's VRSettings.renderScale property, which has a major impact on text quality. My understanding is that this property increases the effective resolution of the rendered graphics, which, of course, can have significant performance implications.
When running the demo, Unity's initial "Configuration" popup window shows a "Graphics Quality" selector. This includes choices from 50% to 200% (which correspond to renderScale values of 0.5 to 2). For comparison, today's DevUp video was recorded at 200%, while the previous video used 100%. When viewing the demo in an actual headset, the text is mostly legible at 100%, but it looks much clearer at the higher quality settings.
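Applying that setting is essentially a one-liner; this sketch assumes the launcher's percentage value gets passed in, and uses the VRSettings property mentioned above:

```csharp
using UnityEngine;
using UnityEngine.VR;

// Hypothetical sketch: map the launcher's "Graphics Quality" percentage
// (50% to 200%) onto Unity's VR render scale (0.5 to 2.0).
public class RenderScaleConfig : MonoBehaviour {

	public int QualityPercent = 100;

	void Start() {
		// Higher values render at a larger resolution, then downsample,
		// which sharpens text at a real performance cost.
		VRSettings.renderScale = QualityPercent/100f;
	}

}
```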
The End
If you have the hardware, please try out the demo. Does this approach to VR user interface work for you? Does it feel powerful, limiting, frustrating, intuitive? Do you find that a slight reduction in interaction speed (via the “hover” interaction) is an acceptable trade-off against accidental interactions? I’d really like to hear your feedback on the overall experience, but also any specific issues or ideas for improving it.
—
Hey there! I’m Zach Kinstner, working via my one-man company, Aesthetic Interactive. I explore the burgeoning world of VR/AR/UI/UX by implementing apps, dev tools, demos, and other experiments.
None of my explorations are very helpful if I don’t show and explain them, so I post videos and articles. If you like what I’m doing, you can follow me here or on Twitter. And maybe, just maybe, hit that “share” button.