Built for Google Cardboard / Daydream and Gear VR.
Gaze to Color
How might we make prototyping basic mobile VR interactions (on-gaze + on-trigger) more accessible from within VR?
When developing interactive objects for VR, even a simple change (such as the color a button turns when a user looks at it) requires a fair amount of effort to code, test, and iterate on, especially when developers have to switch between their HMDs and desktops to test each build. The objective of this prototype is to address this issue by letting users choose, in a matter of seconds, what happens to a target object on-gaze or on-trigger. While the demo itself only shows a target object changing color on-gaze, the same model works for transformations (position, rotation, scale) as well. In addition, the same effects can be assigned to on-trigger interactions (via the on-click button shown).
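The core idea — binding a user-selected effect to an object's gaze or trigger event instead of hard-coding it — can be sketched roughly like this. This is a minimal illustration, not the prototype's actual implementation (which would live in a VR engine); all names here are hypothetical:

```python
class Target:
    """Stand-in for an interactive VR object (hypothetical sketch)."""
    def __init__(self):
        self.color = "white"
        self.scale = 1.0
        # Each event holds a list of effects chosen from the in-VR menu.
        self.handlers = {"on_gaze": [], "on_trigger": []}

    def bind(self, event, effect):
        # "effect" is any callable the user picked in VR —
        # iterating means re-binding, not re-coding and re-building.
        self.handlers[event].append(effect)

    def fire(self, event):
        for effect in self.handlers[event]:
            effect(self)

# Effects selected in VR rather than hard-coded:
def turn_red(obj):
    obj.color = "red"

def grow(obj):
    obj.scale *= 1.5

button = Target()
button.bind("on_gaze", turn_red)   # the demo's color change
button.bind("on_trigger", grow)    # same model covers transforms

button.fire("on_gaze")
print(button.color)   # red
button.fire("on_trigger")
print(button.scale)   # 1.5
```

The point of the design is that the effect catalog (colors, translations, rotations, scales) and the event slots (on-gaze, on-trigger) are orthogonal, so any effect can be attached to any event without touching code.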
How might we construct in-VR menus that clearly present the hierarchy of options while staying compact?
Cascading menus work well on laptop and desktop monitors, where they mirror the layout of files and folders, while mobile interfaces use different menu styles to cope with limited screen real estate. In both cases, however, users clearly understand where they are (the current option tier) and where they came from (the previous tier). In VR, cascading menus take up an unreasonable amount of space, while replicating mobile menu styles leads to a loss of context and overall confusion. The objective of this prototype is an in-VR menu that lets users clearly understand the hierarchy of options while focusing on one set of relevant options at a time.
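One way to model this — shown here as a rough sketch with hypothetical names, not the prototype's actual code — is a stack of chosen tiers: the stack is the "where you came from" breadcrumb that stays visible for context, while only the current tier's options are rendered in front of the user:

```python
# A nested dict stands in for the option hierarchy.
MENU = {
    "Color": {"Red": {}, "Blue": {}},
    "Transform": {"Position": {}, "Rotation": {}, "Scale": {}},
}

class TieredMenu:
    """Hypothetical sketch: one visible tier plus a breadcrumb of prior choices."""
    def __init__(self, tree):
        self.tree = tree
        self.path = []  # previous tiers, shown compactly for context

    def current_options(self):
        # Walk the breadcrumb down to the active tier.
        node = self.tree
        for choice in self.path:
            node = node[choice]
        return list(node)

    def select(self, option):
        self.path.append(option)

    def back(self):
        if self.path:
            self.path.pop()

menu = TieredMenu(MENU)
menu.select("Transform")
print(menu.path)               # ['Transform']  <- context kept visible
print(menu.current_options())  # ['Position', 'Rotation', 'Scale']
menu.back()
print(menu.current_options())  # ['Color', 'Transform']
```

Rendering only `current_options()` keeps the menu compact, while drawing `path` as a small trail preserves the hierarchy cues that cascading menus provide on the desktop.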
How might we take advantage of the virtual space to make ordering food online a more intuitive and rewarding process?
As longer mobile VR experiences are released, users will spend longer stretches in VR and will be less inclined to remove their HMDs to order (real) food. Imagine pausing an experience and spending less than a minute in VR to have food delivered to your doorstep. The objective of this prototype was to create a virtual pizzeria that lets users build their own pizzas in front of their eyes. For consumers, the advantages include seeing exactly how large a “large pizza” is, how thick a “thick crust” is, and so on. For restaurants, it is an opportunity to bring their brands into the virtual space: when ambiance and decor are such integral parts of eating at a restaurant, why not bring them to the experience of ordering food online?
If you enjoyed reading about these experiments, you might be interested in some Room-scale VR projects as well. As always, feedback, comments, and questions are welcome: sagarram [at] usc [dot] edu.
Originally written in 2016, re-published in 2018.