Designing User Interfaces for Virtual Reality
An introductory look into spatial design and typography
This article is based on a lecture series from September 2016 discussing the implications of designing interfaces without physical constraints and the design challenges inherent in a new platform.
Over the last ten years, I’ve designed digital products for every type of screen size and input, from mobile phones to twenty-foot interactive displays. While different screen sizes have their benefits and drawbacks, they all followed similar design fundamentals: a two-dimensional surface, constrained borders and click-based methods for input. Last year, our agency started designing applications and experiences for virtual reality. Within the first week, we realized that designing for virtual reality would be vastly different from any type of screen we’d used before.
Our early design tests were unusable and content often felt “out of reach” for the users. Research into methods of designing interfaces for VR came up short, and most basic resources are still lacking today. The big players in VR — Google, Disney, Microsoft — all have different methods of interacting with objects and few strides have been made in text legibility or navigation.
The first experiments at the agency were for a client project, so we had to move quickly and come to answers that could work across device types. Our client was building a platform for the music industry to view 360° videos — concerts, music videos, interviews, etc. — on both mobile and PC VR systems. We were tasked with creating the brand and the interface across these different systems. To create the interface, we needed to understand how people would watch videos in VR, and how they should interact with the application before, during and after the video plays.
Before moving into designing within virtual space, we sketched out all of the content we’d need on a given screen and found that the information would all be really light: titles, a little information about the content, and controls for the video. With videos and photos as the entire background, users didn’t need to read a lot of content to understand where they were — they just needed signs pointing them in the right direction. With such light content, we began to think of the interface as way-finding signage, akin to environmental or spatial design.
This model, in which interfaces within a three-dimensional space are treated as posters, wall art, or signs, changed everything for us. We could place things around the user at different sizes and let them look around the space to navigate the application, just like navigating a large building or city block.
Our research turned to the history of road signs and urban navigation to understand how scale, distance and contrast affect the viewer and their ability to process information. We came to view virtual reality as a “real” place where items occupy physical space within the scene. From looking at the typography of highway signs throughout the twentieth century, we gravitated towards type that was meant to be viewed and understood quickly in short bursts.
Along with other sans serif type used in signage, like Freeway, Clearview and Highway Gothic, we also looked towards the original high-speed type: DIN. DIN seemed like a great choice for these early projects as it was both practical and extremely modern. Its grotesque letterforms use space economically, and the aesthetic adds a strong foundation to any physical space.
Working with DIN allowed us to create simple menus and “hang” them throughout our scenes. The type worked at both small and large sizes without showing too much personality, and made the scene feel physically real. It also allowed us to maintain a modern and simple aesthetic for navigational buttons on each of the screens. While DIN may not be the only choice for every project, it is a strong default for the platform and allows for personality and brand to shine through in other areas of the application.
Our final issue with the typography was the scale and distance to the user. With a two-dimensional screen, you choose a size for the type and expect the user to move closer or further away to read the content. With virtual reality, the distance has to be defined and the type needs to be scaled accordingly. If the type is too close, the user has to cross her eyes. If the type is too far away, the content might not be seen as important.
While the scale of the type is affected by distance, we could still control what the default should feel like. Type that is 10px at 1ft away should feel roughly the same as type that is 100px at 10ft away. But with three-dimensional depth, you can tell that the type is further away. The real issue is understanding the distances at which people are traditionally comfortable reading and interacting with objects. While books and laptops are used close up, road signs and way-finding are much further away. In designing the interface, we wanted people to interact with menus, but we didn’t want to overwhelm them with the interface crowding their personal space.
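One way to reason about this scaling is angular size: the apparent size of type depends on the ratio of its height to its distance. A quick JavaScript sketch (not from our production code; units are arbitrary as long as height and distance match) shows why 10px at 1ft and 100px at 10ft feel roughly the same:

```javascript
// Apparent (angular) size of an object in degrees, viewed head-on
// from a given distance. Height and distance must share a unit.
function angularSizeDeg(height, distance) {
  return 2 * Math.atan(height / (2 * distance)) * (180 / Math.PI);
}

// Keep type feeling the same size at a new distance by scaling its
// height proportionally -- the 10px-at-1ft / 100px-at-10ft rule.
function heightAtDistance(baseHeight, baseDistance, newDistance) {
  return baseHeight * (newDistance / baseDistance);
}

// 10 units at 100 units away subtends the same angle as
// 100 units at 1000 units away -- the ratio is what matters.
console.log(angularSizeDeg(10, 100));   // ≈ 5.725°
console.log(angularSizeDeg(100, 1000)); // ≈ 5.725°
```

Because the ratio is identical, the subtended angle is identical; only stereo depth cues tell you the farther text is bigger and further away.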
In the end, we settled on type that was meant to feel like it was about two inches tall (standard for way-finding) at a distance of twelve feet (about the size of a comfortable room). You can test out our early demos on your phone with Google Cardboard and see the areas that we’re still working on today. While the demos are built using A-Frame, the real versions are developed cross-platform with Unity. As we continue to refine the typography, distance, transparency and scale of content within our projects, we’re starting to develop what the defaults should be for virtual reality. Just like the web in the 1990s and smartphones in the 2000s, VR needs to have defaults that people can use or modify instead of starting from scratch with every project. If we’ve discovered one default so far, it’s that sans-serif grotesque type makes the most sense as the standard for virtual reality due to its legibility and functionality within a three-dimensional space.
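A convenient way to express a size-and-distance pairing like this is the distance-independent millimeter (dmm), a unit described in Google's VR design guidelines: 1 dmm subtends the same angle as 1 mm viewed from 1 m. The conversion below is a small illustrative sketch, not part of any project code, showing that two inches at twelve feet works out to roughly 14 dmm:

```javascript
// Convert a physical height/distance pair to distance-independent
// millimeters (dmm): 1 dmm = the angle of 1 mm viewed from 1 m.
const IN_TO_MM = 25.4;   // inches to millimeters
const FT_TO_M = 0.3048;  // feet to meters

function toDmm(heightInches, distanceFeet) {
  return (heightInches * IN_TO_MM) / (distanceFeet * FT_TO_M);
}

// Two inches tall at twelve feet away:
console.log(toDmm(2, 12).toFixed(1)); // "13.9" -- about 14 dmm
```

Expressed this way, the default can be re-applied at any comfortable distance: multiply 14 dmm by the distance in meters to get the physical height in millimeters.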
Just like the early days of mobile-app design, everything being released is still new and doesn’t fully reflect what the landscape will look like in a few years after wider adoption. Over the last six months, we’ve seen the entire community grow by leaps and bounds — better experiences continue to be released and more people are getting to try them out every day.
For any questions about the article or thoughts about your own project, you can reach me on Twitter or at Emerson Stone. Along with client projects, we’re also working on our first game for the Vive/Oculus with some friends in Boulder. If you have a Vive and would like to test it out before release, please let me know.