Gestural Communication in VR–2

Anastasiia Ku
Inborn Experience (UX in AR/VR)
Feb 16, 2020

Continuing my exploration of gestural communication in VR, in this post I look closer at gestural interaction types and at the metaphors different experiences use to make those tasks simple and intuitive.

Flicking by Chris Hildreth

VR interaction is a form of human-machine interaction in which users are able to interact with the virtual environment. Both human and machine process information in which the physical position of elements in 3D space is relevant. [Wikipedia]

Most commonly occurring gestural tasks in VR interfaces include object manipulation tasks and UI commands:

  1. Object manipulation tasks include commands such as selecting an object, scaling, rotating, creating, deleting, editing, etc. Some of these directly match actions we perform in real life (grabbing, rotating), while others would be impossible in the real world but are necessary in a computer-based application (scaling, deleting). [Chris Hand]
  2. UI commands in XR include tasks generated by the real user within the virtual user interface, and include events like confirmation / cancellation, switching on / off, opening / dismissing, etc.

Good examples of UI commands in VR include giving a thumbs up to proceed to the next step in the onboarding tutorial in Paint VR, or the air tap and hand bloom gestures Microsoft HoloLens uses to communicate commands.

HoloLens bloom, a gestural command to open the menu

With hand tracking devices offering true direct object manipulation and gestural communication, we can explore further how to map real-world gestures to the most commonly occurring virtual commands.

The benefit of such communication is that it has the potential to be very intuitive, expressive and natural. It can also be enhanced by tactile or haptic feedback for the users to feel the virtual objects the way they would feel the same objects in real life.

Object manipulation in Blocks

Commands and gestures

To improve the learnability and acceptance of gestures, some experiences borrow commonly accepted interaction metaphors from both 2D interfaces and real life. Within their HoloLens experience, Microsoft labels such gestures "instinctual interactions".

Some gestures can also be borrowed from sign languages. That said, using a sign language wholesale for interactive gestures in VR is generally not recommended: each language contains a large number of fairly complex signs, which would likely create a steep learning curve for users. [Dan Saffer]

In either case, whichever gestures you choose for your experience, they have to be tested with your target users to confirm they are clear, easy to use and easy to remember.

Below, I listed some UI commands that may more naturally match certain gestures we use in everyday life:

  • Selecting — through pointing with index finger, pinching to create a ray, hovering over, moving a hand / finger into the shape
  • Move closer / further — through grabbing and pulling / pushing the air, beckon with finger or palm
  • Collecting / holding / picking up an object — through grabbing, pinching, closing a palm into a fist over a selected object
  • Switching off / on — through waving a hand left and right, clapping, dismissing with one wave, snapping
  • Confirming — through thumbs up, ok sign, nodding (Paint VR)
  • Cancelling — through shaking no, thumbs down, open palm as a stop sign, crossing hands, waving down
  • Pausing — through raising a fist, gathering fingers together, waving down
  • Deleting — through throwing away, flicking, waving off
  • Scaling up / down — through pinching with fingers, stretching with two hands
  • Scrolling — through waving off left, right, up or down; pointing left, right, up or down; grabbing and pulling air left, right, up or down
  • Opening menu / starting experience / etc. — through waving hello, bloom gesture, opening a palm face up, opening a palm face down to project a scene underneath
  • Closing — through waving goodbye, gathering fingers together, closing a palm into a fist
  • Organising UI elements / objects — through selecting and pointing directions; selecting, grabbing and moving
  • Communicating confusion — through shrug, shaking head, raising hand
  • Communicating frustration / struggle / inability to proceed with experience — through facepalming, crossing hands, closing a palm into a fist
  • Communicating satisfaction / success — through dabbing, clapping, shaking a fist in the air
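As a rough illustration, a mapping like the one above can be expressed as a simple lookup table that a gesture recogniser might feed into. This is a hypothetical sketch: the gesture and command names are mine, not taken from any real hand-tracking SDK.

```python
from typing import Optional

# Illustrative gesture-to-command table, following the pairings listed
# above. A recogniser would emit a gesture label; the UI layer looks up
# the corresponding command.
GESTURE_TO_COMMAND = {
    "point_index": "select",
    "pinch_ray": "select",
    "thumbs_up": "confirm",
    "ok_sign": "confirm",
    "palm_stop": "cancel",
    "thumbs_down": "cancel",
    "bloom": "open_menu",
    "wave_hello": "open_menu",
    "close_fist": "close",
    "throw_away": "delete",
    "two_hand_stretch": "scale",
}

def command_for(gesture: str) -> Optional[str]:
    """Return the UI command for a recognised gesture, or None if unmapped."""
    return GESTURE_TO_COMMAND.get(gesture)

print(command_for("thumbs_up"))  # confirm
```

Keeping the mapping in one table like this also makes it easy to swap gestures during user testing without touching the rest of the interaction code.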

Hovercast VR Menu

Depending on the experience, other object manipulation tasks might be more suitable for implementation.

What experiences with gestural interfaces have you tried? Were they easy to understand and learn? Did any of them blow your mind? :)
