VR and Embodied Interaction Design
What is the VR version of the “touch gesture reference guide”?
You’ve probably seen one variation or another of gesture icons for touch interactions, such as this set from Luke Wroblewski. They’re basically shorthand for annotating interaction design, and a reference guide for what interactions are possible.
When you think of the smartphone, you think of touch as the big new interaction paradigm. With VR, it’s embodied interaction.
The more I dive into interaction design for VR, the more I see this platform as an amazing playground for exploring new human-computer interactions. It feels like you should be able to capture the combinations in a Venn diagram, except there are too many variables to fit into a flat representation.
I wanted to be able to play and explore with different sets of interactions for VR, so I started to create cubes. Each cube is one facet of interaction, with doubled-up cubes for paired parts like our arms.
I wasn’t entirely sure what each cube should include, or whether some cubes actually need more than six sides (they do). But I printed them out anyway and created version 1.0 to start exploring with. It felt natural to arrange them like our body and really get into the mindset of embodied human-computer interaction.
As a mind-stretching design exercise, you can take one or more cubes, roll them, and then ask the question: “How might we ____ with ____?” Or, while working on an interaction challenge, grab a cube and turn it around in your hand to see if one of the facets triggers an idea to break you out of a rut. Especially if that rut is thinking about interaction as a traditional 2D interface.
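If you don’t have the physical cubes handy, the rolling exercise can be sketched in a few lines of code. This is only an illustrative stand-in: the face labels below are assumptions (the post only names a few, like “blink”, “comfort look up/down”, and “reach towards…”), not the actual v1.0 faces.

```python
import random

# Hypothetical cube faces -- illustrative stand-ins, not the real v1.0 set.
CUBES = {
    "gaze": ["blink", "dwell", "glance", "focus", "track", "avert"],
    "neck": ["nod", "shake", "tilt", "comfort look up/down", "turn", "lean"],
    "arm": ["reach towards...", "wave", "point", "grab", "push", "pull"],
}

def roll(cube_name):
    """Roll one cube: return a randomly chosen face."""
    return random.choice(CUBES[cube_name])

def how_might_we(cube_a, cube_b):
    """Fill in the 'How might we ____ with ____?' prompt from two rolls."""
    return f"How might we {roll(cube_a)} with {roll(cube_b)}?"

print(how_might_we("arm", "gaze"))
```

Each run produces a different prompt, which is the point: the randomness is what breaks you out of the 2D-interface rut.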
I love how you can tangibly represent the complexity or depth of interaction available on different VR platforms by adding or removing cubes. For example, all cubes on the table for a premium platform with two controllers, 6DOF, room scale, and so on. Or a subset of cubes for a 3DOF headset.
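That add-or-remove-cubes idea maps naturally onto set operations. Here’s a minimal sketch, where the cube names are my own assumptions about what a full set might contain:

```python
# A platform's interaction depth as a subset of the full cube set.
# Cube names are assumptions for illustration, not the actual v1.0 cubes.
ALL_CUBES = {"gaze", "neck", "left arm", "right arm",
             "left controller", "right controller", "locomotion", "voice"}

PLATFORMS = {
    # Premium: two controllers, 6DOF, room scale -> every cube on the table.
    "premium 6DOF": ALL_CUBES,
    # A 3DOF headset: head rotation and gaze only, no positional
    # tracking and no tracked controllers.
    "3DOF headset": {"gaze", "neck", "voice"},
}

def available_cubes(platform):
    """Return the cubes 'on the table' for a platform, sorted for display."""
    return sorted(PLATFORMS[platform])

print(available_cubes("3DOF headset"))  # → ['gaze', 'neck', 'voice']
```

Removing a cube from the set is the digital equivalent of taking it off the table.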
You can also grab a subset of cubes to create an interaction using just those elements. For example, let’s say I wanted to capture the “I Dream of Jeannie” interaction.
This is what that might look like using the VR interaction cubes:
- Gaze cube turned to “blink”
- Neck cube on “comfort look up/down”
- Both arm cubes on “reach towards…”
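The arrangement above is really just a mapping from each cube to the face it’s turned to. A minimal sketch of that idea, assuming the cube and face names from the list (the structure itself is my own illustration, not part of the cube system):

```python
# A captured interaction: each cube mapped to the face it's turned to.
# "Both arm cubes" becomes two entries, one per arm.
genie_interaction = {
    "gaze": "blink",
    "neck": "comfort look up/down",
    "left arm": "reach towards...",
    "right arm": "reach towards...",
}

def describe(interaction):
    """Render a cube arrangement as a readable one-line summary."""
    return ", ".join(f"{cube}: {face}" for cube, face in interaction.items())

print(describe(genie_interaction))
```

Swapping one face in the mapping is the same move as turning one physical cube: a small change that describes a completely different interaction.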
I started adding cards to augment the cubes and explain further what each one is doing. The head is using its ability to look up and down to do a nod. And the arms are reaching out to bring each hand to the opposite elbow.
I noticed that this is a VR variation of Brad Frost’s atomic design: each cube is an atom, each card a molecule, and each collection of cubes and cards an organism.
In v2.0 of the VR interaction cubes and cards, I’ll be refining them at the atom and molecule level to work better for this type of design exploration. 👏 Clap if you like the idea of this interaction design tool and want to see more of it!
In the meantime, for more inspiration on embodied interactions in VR, check out the work of M Eifler 👍