This weekend marked my foray into the world of VR design. Working in Unity 5, I scripted up some basic movement controls, and used the Cardboard SDK and the Unity Remote iOS app to project the Unity output to my iPhone for viewing in Google Cardboard. Here are a few takeaways from my first virtual build.
In Unity, I felt ready to jump in and start building the world — constructing walls, adding objects, applying textures. As it turned out, I actually spent a good amount of time at the outset working to achieve what felt like a lifelike scale for my environment.
Forums tell me that Unity’s in-editor units are meant to roughly correspond to the metric system (i.e. 1 unit ≈ 1 m). Establishing the height of my main camera was the first step here. I opted for 1.5 units, as I found the feeling of being slightly “dwarfed” more exciting and magical than the feeling of being slightly “big” for the environment.
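As a rough illustration of that first step, here is the kind of setup script I mean, written as a Unity C# MonoBehaviour. The class and field names are my own, not from any library; treat this as a sketch assuming 1 Unity unit ≈ 1 meter.

```csharp
using UnityEngine;

// Hypothetical setup script: positions the main camera at a fixed
// eye height, treating 1 Unity unit as roughly 1 meter.
public class CameraScaleSetup : MonoBehaviour
{
    // 1.5 units: slightly below average standing eye height,
    // chosen here for that slightly "dwarfed" feel.
    public float eyeHeight = 1.5f;

    void Start()
    {
        Vector3 pos = Camera.main.transform.position;
        Camera.main.transform.position = new Vector3(pos.x, eyeHeight, pos.z);
    }
}
```

Exposing the height as a public field means it can be nudged in the Inspector between test runs, which matters when scale is something you tune by feel.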
When working from scratch to build an environment, the question of scale is a recurring one. Each object, structure, or terrain formation ends up “feeling” very different based on small tweaks to its height, width, and depth — and the effects of these tweaks depend on initial camera scale settings as well. My first build gave me an appreciation for the additional challenge of achieving balanced visual hierarchy when designing in 3D space.
In traditional 2D screen design, lighting conditions can be expected to appear more-or-less consistent across screens. Monitor calibration may affect how colors are displayed, but a 2D UI designer working in Illustrator or Sketch doesn’t need to concern themselves with the lighting conditions in the viewer’s environment. My photographer friends are always well-attuned to lighting conditions in the real world, but in my 2D UI design work, it’s not something I’ve had much reason to encounter.
When designing for VR, I found I had to pay much more attention to lighting than I initially expected. This is a no-brainer for those with experience in 3D design. The interplay between ambient, directional, and point lighting has a profound effect on the mood and character of an environment. As I found, intentional use of non-reflective surfaces or hard shadows can create an artificial yet extremely pleasing visual effect. Lighting settings are not simply defaults to be tweaked; they are a set of choices to be made intentionally by the designer. As any film or stage lighting professional will attest, light is a cornerstone, not an afterthought, and it fundamentally shapes the overall experience of the user in a VR setting.
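To make the ambient/directional/point distinction concrete, here is a hedged sketch of a Unity C# script that sets up all three at runtime. The object names, colors, and angles are illustrative choices of mine, not values from the build described above.

```csharp
using UnityEngine;

// Hypothetical lighting sketch: flat ambient fill, a directional "sun"
// with hard shadows, and a warm point light. Names and values are
// illustrative, not prescriptive.
public class MoodLighting : MonoBehaviour
{
    void Start()
    {
        // Ambient light: a flat fill so unlit faces aren't pure black.
        RenderSettings.ambientLight = new Color(0.25f, 0.25f, 0.3f);

        // Directional light: its angle and intensity set the scene's
        // "time of day"; hard shadows read as stylized and artificial.
        GameObject sunObj = new GameObject("Sun");
        Light sun = sunObj.AddComponent<Light>();
        sun.type = LightType.Directional;
        sun.intensity = 0.8f;
        sun.shadows = LightShadows.Hard;
        sunObj.transform.rotation = Quaternion.Euler(50f, -30f, 0f);

        // Point light: local warmth radiating from a focal object.
        GameObject lampObj = new GameObject("Lamp");
        Light lamp = lampObj.AddComponent<Light>();
        lamp.type = LightType.Point;
        lamp.color = new Color(1f, 0.85f, 0.6f);
        lamp.range = 8f;
    }
}
```

Even in a toy scene like this, swapping hard shadows for soft ones, or cooling the point light's color, noticeably changes the mood, which is exactly why these feel like design decisions rather than defaults.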
Beyond scale and lighting, my first experience designing for VR left me with a distinct impression of how much intentionality is required in the 3D design process. Any UI designer is familiar with the concept of working from scratch, and has an appreciation for how a positive visual experience depends on many intentional decisions that eventually fade into the background: margin/padding, line-height, textual hierarchy, and so on.
It’s not that 3D design necessarily requires more intentionality; it’s just that in 3D space, there are many more parameters along which this intentionality is required. Light interacts with textures; scale interacts with camera position; movement fluidity interacts with shadow; these are relatively new considerations for UI designers making the move into 3D interfaces.
Forward-thinking UI designers should start studying the disciplines of lighting design, sculpture, interior design, architecture, and physics. The vocabulary of these disciplines, as well as their tenets, all tie into the types of decisions that UI designers for VR/AR will need to make on a regular basis.
A Note on Rapid Prototyping
Image-quality compromises aside, Unity Remote served as an excellent rapid prototyping tool for Cardboard. In a more traditional workflow, I would build the Unity project for iOS, then load it onto my device via Xcode for each iteration. Unity Remote allowed me to make small tweaks and adjustments and immediately see the results in my Cardboard headset.