Hand-Tracked Gestures for Immersive Data — Game Changer or Gimmick?

Jason Tam · Published in BadVR · May 29, 2020
Looks cool — but what does hand-tracking really bring to the table? 🤔 Source: Oculus

If you’ve been following the BadVR team, you’ll know we are committed to sharing the revolutionary benefits of using immersive (VR/AR) technology to visualize data. I mean, who wouldn’t be excited to promote the ability to see millions of individual data points in an accessible and intuitive way? Building the future would amp anyone up, and our team is especially excited to build and share the future of data interfaces.

However, designing and building immersive data interfaces is no easy task! There aren’t many established best practices, use cases, or standards to follow. We’re truly working at the frontier of user experience design: boldly going where few have gone before, attempting to determine the best ways to deliver immersive data experiences. It has required a lot of open-minded creative thinking, plenty of trial and error, and many, many user feedback sessions! During this process, we’ve discovered some very interesting things about how users engage with immersive experiences, how they process and ingest data within these experiences, and how to design them to be widely accessible. In this article, we’ll touch on one of the most contested and controversial items in our design toolset: hand-tracked gestures!

Some of the latest VR controllers on the market. (Left to Right: HTC VIVE Controllers, Valve Index “Knuckles”, Oculus Touch Controllers; Source: RoadToVR)

Today, a majority of VR and AR solutions are controlled by dedicated handheld hardware. Although most of these controllers are well-designed and easy to use, they are an intermediary step toward an ideal control schema. Advancements in technology have brought us far from where we started and will eventually (hopefully?) allow us to easily and intuitively manipulate our surroundings with the best controllers we have: our own hands.

Leaps and Bounds

Modern-day hand-tracking began with Leap Motion, a company founded in 2010 whose first product was a dedicated “block” that allowed you to manipulate virtual objects on your desktop. Although it wasn’t initially made for VR, tinkerers eventually found a way to modify and attach it to VR headsets. But there was a catch: it was clunky. Really clunky. Since the technology had not yet been integrated into headsets, testers would actually use 3M tape to “MacGyver” the block onto the front of an Oculus Rift! Though not the most elegant solution, the initial experience was nevertheless magical and demonstrated the proof of concept. Being able to see and track your hands within an immersive environment was groundbreaking!

A “DIY” solution: attach a Leap Motion to an HTC Vive! True innovation requires such crazy solutions! 🤪 Source: Reddit

However, advancements in VR don’t happen overnight. Countless hours of trial and error eventually got us to the user-friendly and consumer-ready VR products we have today. Many forget that for years, the only way to effectively add hand-tracking to your VR experience was to ‘Frankenstein’ two independent systems together, hoping to get a glimpse of the magic.

Fast forward a few years, and thankfully all those hours of trial and error and all that R&D funding eventually paid off. Headsets with an entire suite of hand-tracking sensors built in started appearing on the market. The catch? They were available only at a premium price, severely hindering widespread adoption. What the market really needed was a budget all-in-one device for the masses. Enter Oculus Quest: the next step in visualization hardware (or, as our CEO called it, VR’s iPod). At a more affordable $400-$500 price, the Quest was both accessible and feature-rich. Not only did it have the expected (and widely promoted) head-tracking and six degrees of freedom with controllers, it had (drum-roll please!) hand-tracking!

With hand-tracking hardware now readily available to price-conscious consumers, a bevy of different applications and modifications began to appear on the scene. Hand-tracking had arrived, and it was ripe for the picking!

Hand Gestures vs Controllers: Which Is Better?

It’s extremely interesting to see the various dedicated and hybrid controllers available now. Controllers have traditionally been handheld and remote-like in nature, with various buttons and joysticks, or even a full touch pad. Valve’s Index controllers are especially interesting, as they are designed around the concept of trackable finger gestures, using finger movements as inputs. While experimentation with this type of hybrid implementation is important, these sorts of controllers are not especially practical in the field. But they are an important stepping stone toward the goal of full hand-gesture support.

Valve’s “Knuckles” controllers had grip sensors (colored shapes, above) that made finger gestures possible! 👆🤞🖖 Source: Valve

In terms of gestures vs. controllers, it’s best to approach the solution with redundancy: whatever the user can do with a controller, they should be able to do with their hands as well. This way, users become familiar with gesture-based interfaces and reap their usability benefits, but keep their controller as a fallback if a gesture doesn’t work. Since gesture technology is still ‘new,’ it’s important to give users the ability to fall back on their more stable and well-worn controllers.
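
To make that redundancy concrete, here is a minimal sketch of an input-abstraction layer in which controller buttons and hand gestures resolve to the same set of application actions. The names (AppAction, InputFrame, resolveActions) and the particular gesture-to-action mappings are hypothetical and not taken from any specific SDK; they simply illustrate the pattern of treating the controller as the stable fallback.

```typescript
// Sketch of the redundancy idea: both controller buttons and hand gestures
// resolve to the same abstract actions, so either input path works.
// All names here are hypothetical, not part of any specific SDK.

type AppAction = "select" | "grab" | "toggleMenu";

interface ControllerState {
  triggerPressed: boolean;
  gripPressed: boolean;
  menuPressed: boolean;
}

interface HandState {
  pinching: boolean;   // e.g. thumb tip close to index tip
  fistClosed: boolean; // all fingers curled
  palmUp: boolean;     // palm facing the user
}

interface InputFrame {
  controller?: ControllerState; // undefined if controllers are set down
  hand?: HandState;             // undefined if tracking is lost this frame
}

// Map whichever input is available onto the same set of actions.
// Controller input wins when both are present, acting as the stable fallback.
function resolveActions(frame: InputFrame): AppAction[] {
  const actions: AppAction[] = [];

  if (frame.controller) {
    if (frame.controller.triggerPressed) actions.push("select");
    if (frame.controller.gripPressed) actions.push("grab");
    if (frame.controller.menuPressed) actions.push("toggleMenu");
    return actions;
  }

  if (frame.hand) {
    if (frame.hand.pinching) actions.push("select");
    if (frame.hand.fistClosed) actions.push("grab");
    if (frame.hand.palmUp) actions.push("toggleMenu");
  }

  return actions;
}

// Example: a hands-only frame where the user is pinching.
console.log(resolveActions({ hand: { pinching: true, fistClosed: false, palmUp: false } }));
// -> ["select"]
```

The key design choice is that the rest of the application only ever sees actions, so adding, removing, or swapping input devices never touches application logic.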

We at BadVR fully believe the immersive industry is poised to move entirely toward hand-tracking, and that gesture-based interfaces will eventually dominate. The only bottleneck at this point is workable, reliable tracking technology. Once tracking is consistent and dependable, controllers will be a thing of the past. As such, we design all of our experiences to utilize both inputs, readying ourselves, our products, and our users for a truly gesture-first future.

Are Hand-Gestures Really That Intuitive?

The technology behind gesture-based inputs is definitely still a work in progress. It’s easy to think that the hardware is the hold-up, but truth be told, most of the difficulty in implementation comes from software. It relies on predictive algorithms to determine the location of your hand and to anticipate what your hand might do next. This isn’t an easy thing to do well, let alone to do well reliably.
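
As a rough illustration of what “predictive” means here (and only an illustration; real tracking runtimes use far more sophisticated models), the sketch below smooths noisy per-frame hand positions and extrapolates a frame ahead under a simple constant-velocity assumption. The class name and parameter values are hypothetical.

```typescript
// Toy illustration of why hand-tracking leans on prediction: raw per-frame
// positions are noisy and occasionally drop out, so the signal is smoothed
// and extrapolated slightly ahead. Exponential smoothing + constant velocity;
// not any vendor's actual algorithm.

type Vec3 = { x: number; y: number; z: number };

class HandPredictor {
  private position: Vec3 | null = null;
  private velocity: Vec3 = { x: 0, y: 0, z: 0 };

  constructor(private smoothing = 0.5) {} // 0 = trust history, 1 = trust raw sample

  // Feed in the latest raw sample from the tracker (dt in seconds).
  update(raw: Vec3, dt: number): void {
    if (this.position === null) {
      this.position = { ...raw };
      return;
    }
    const a = this.smoothing;
    const prev = this.position;
    const next = {
      x: prev.x + a * (raw.x - prev.x),
      y: prev.y + a * (raw.y - prev.y),
      z: prev.z + a * (raw.z - prev.z),
    };
    this.velocity = {
      x: (next.x - prev.x) / dt,
      y: (next.y - prev.y) / dt,
      z: (next.z - prev.z) / dt,
    };
    this.position = next;
  }

  // Guess where the hand will be `lookahead` seconds from now,
  // e.g. to cover one dropped frame or a bit of render latency.
  predict(lookahead: number): Vec3 | null {
    if (this.position === null) return null;
    return {
      x: this.position.x + this.velocity.x * lookahead,
      y: this.position.y + this.velocity.y * lookahead,
      z: this.position.z + this.velocity.z * lookahead,
    };
  }
}

// Example: two samples 1/60 s apart, then predict one frame ahead.
const p = new HandPredictor(0.6);
p.update({ x: 0.00, y: 1.20, z: -0.30 }, 1 / 60);
p.update({ x: 0.01, y: 1.21, z: -0.30 }, 1 / 60);
console.log(p.predict(1 / 60));
```

Even a toy like this exposes the core trade-off: more smoothing means steadier hands but more perceived lag, and tuning that balance reliably is part of what makes good tracking so hard.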

BadVR’s vision of hand-tracking… hand-tracked conferences! Groundbreaking work from our team! 👌🖖 Source: BadVR

That being said, the goal of our immersive interface design is to replicate the input schema commonly found in popular science-fiction movies, since that is the context in which users are most familiar with gesture controls. As such, hand gestures work best for simple, easy actions, such as reaching out to manipulate an object or turning something on or off. More complex and intricate workflows still need a more solid and stable controller-based interface. Our goal at BadVR is to design products accessible to non-technical users, so we want to ensure that gestures are used in an intuitive manner until, eventually, industry-wide standardized gesture controls are documented and defined. The most useful input in any immersive user experience is the one that requires the least learning, and we believe gestures open up a new world of super-simple controls.
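
A pinch is a good example of the kind of simple, low-learning gesture we mean. Below is a minimal, hypothetical sketch of how one might detect it from two fingertip positions supplied by whatever tracking runtime you use; the distance thresholds are illustrative, not tuned values from our products.

```typescript
// Sketch of a "simple and easy" gesture: a pinch, detected as the thumb tip
// and index tip coming within a small distance of each other.
// Thresholds are illustrative only.

type Vec3 = { x: number; y: number; z: number };

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

class PinchDetector {
  private pinching = false;

  constructor(
    private startThreshold = 0.02, // metres: fingertips ~2 cm apart to start
    private endThreshold = 0.04    // wider release threshold avoids flicker
  ) {}

  // Returns true while the user is pinching; hysteresis keeps it stable.
  update(thumbTip: Vec3, indexTip: Vec3): boolean {
    const d = distance(thumbTip, indexTip);
    if (!this.pinching && d < this.startThreshold) this.pinching = true;
    else if (this.pinching && d > this.endThreshold) this.pinching = false;
    return this.pinching;
  }
}

// Example: fingertips 1 cm apart -> pinch starts; 5 cm apart -> pinch ends.
const pinch = new PinchDetector();
console.log(pinch.update({ x: 0, y: 1, z: -0.3 }, { x: 0.01, y: 1, z: -0.3 })); // true
console.log(pinch.update({ x: 0, y: 1, z: -0.3 }, { x: 0.05, y: 1, z: -0.3 })); // false
```

The hysteresis (a tighter threshold to start the pinch than to end it) is what keeps a gesture like this from flickering on noisy tracking data.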

Hand Tracking and Data Visualization — A Match Made In Heaven

Smartphones with multi-finger support helped us all become comfortable with the idea of using our hands and fingers to navigate through interfaces; with VR, we’re just translating these gestures to a multi-dimensional space. 📱🤳😎 Source: Unsplash

If you look at the history of input systems for data visualization, you’ll find it all started with the humble keyboard and mouse. Although the software and hardware were primitive at the time, they helped eventually find and define input standards. Designers kept building upon the mouse and keyboard, addressing bandwidth and latency issues and eventually streamlining the computing experience as a whole.

Then everything was sidelined by the release of smartphones and the ensuing mobile computing revolution. With the introduction of high-quality touch-screen technology, users learned that they could manipulate data with their hands to arrive at a better understanding and to discover insights. Mobile gesture standardization took time, but soon swiping, holding, and tapping became second nature.

Today, a similar phenomenon is happening with hand-tracked gestures. They are still in an experimental phase, and it will take time to standardize them. However, developers and designers are hard at work discovering and documenting best practices for immersive interface design, and they will develop the new standards for how we interact with immersive spaces. What an exciting mission!

Gestures will revolutionize the way we interact with our data and bring us closer to the science fiction future we all dreamed of as children. The mere act of physically touching and manipulating data itself will be a huge leap forward when it comes to working with, and analyzing, future datasets.

The Future of Hand-Tracking

VR development is rapidly moving toward a controller-less future. The goal is to use our hands freely and intuitively, with only a head-mounted display serving as both tracker and display. This eliminates the need for additional controllers or external tracking devices. Today’s solutions already move us toward this future: Oculus Quest devices recently received updates that add “hands only” control experiences, removing the need for controllers entirely. However, given the complexity of many immersive experiences and the newness of hand-tracking technology, it’s just not possible to drop controllers completely. Not just yet, anyway!

Tony Stark (Iron Man) demonstrating what the future of immersive, gesture-based interfaces may look like! 🤖🔮 Source: Marvel

The final nail in the coffin for controllers will be haptics: the ability to touch objects and feel textures in a virtual environment. This next-generation technology will provide a mind-blowing new way to interact within immersive spaces and fundamentally transform the way we work with data. Currently, we lose all sense of tactile feedback with hand gestures. Adding haptics to gesture controls will be like the difference between playing air drums and actually playing a real drum kit! Haptics offer an exciting new level of immersion that many have already embraced, with some products utilizing extreme solutions like full-body haptic suits. For now, though, BadVR is waiting for more elegant, less bulky solutions to be developed, but we are totally excited to be a part of this gesture-driven, haptic-enhanced future!

So, let’s answer the question: hand-tracking, game-changer or gimmick? For us, the answer is overwhelmingly in favor of the former. Developers are already replicating controller inputs with hand gestures, including support for hand-based input in newer software by default and retroactively adding hand-gesture support to older solutions. The technology for consistent and reliable tracking has arrived, available for widespread adoption in affordable and effective hardware paired with greatly improved predictive tracking software. And all of this is just the tip of the iceberg. We can only wait for the exciting future ahead of us to see what else hand-tracking has to offer. And all of us at BadVR are proud to be a part of that thrilling process!

Precise and intuitive immersive gestures are now possible! THE FUTURE IS NOW, folks! 👌🎉 Source: Engadget

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

If you want to know more about this awesome, immersive, gesture-controlled future, BadVR is leading the way and we’d be happy to chat with you! Our team strongly believes in putting usability and accessibility first in all of our immersive data analytics solutions, so if you’re overwhelmed by the idea of immersive data, or think you’re not tech-savvy enough, think again. We have data solutions for everyone, from every background, for every dataset! BadVR is here to make YOUR data easy.

Contact us today via the links below:

BadVR’s Twitter

BadVR’s LinkedIn

Curious about incorporating immersive data visualization into your remote workflow? Then reach out for a demo of BadVR’s platform! We’re excited to chat with you and your team today! info@badvr.com

BadVR’s Request Demo Form
