Over the weekend, I participated in GameJam 2015 and built a simple app called Starry VR with Unity 4.6 (~solo). My goal was to build a fun game and work through some of the initial challenges of building apps in VR. Briefly, I’d like to share my design thinking and what I was able to accomplish in a weekend with no 3D graphics background, having only done the Roll-a-Ball Unity tutorial going in.
Starry VR was originally conceived as a music synchronization game designed for Google Cardboard. Users “catch” stars to bring constellations to life: the goal is to catch each star just as the path between stars is about to connect. At the end, users can view all the constellations in the galaxy.
http://bonnieyu.com/video/starryVRUnity.mp4 — a final build showing on Desktop
http://bonnieyu.com/video/starryVRMobile.mp4 — an early build drawing a constellation recorded on Android
Design Thinking & Challenges
This was an interesting challenge for two reasons. One, I’d never created a product in a 3D environment and didn’t know how to use Unity, which meant learning the basics: how to place and size objects. Two, there are no set UX patterns in VR. While the core UX questions end up being the same, the design space feels very open.
How do users “catch” the stars?
The first problem to solve is helping users understand what they are pointing at and what they are selecting. Mobile apps don’t have the concept of a cursor: users are trained to touch anywhere on the screen to select an item, and because they can see their hands move, they get instant feedback about where they are about to select. VR is closer to desktop in that you don’t get that physical feedback about location, so for Starry VR I followed a model similar to desktop apps and gave the user a cursor that maps to their head movement.
Should users click on the star once their cursor goes over it? The main difference between my VR cursor and a desktop mouse cursor is that users don’t click. Part of the reason is that the Google Cardboard magnetic clicker isn’t very reliable; half the time I’ve used it, I’ve been more worried about breaking it by pulling too quickly or too hard. I opted to have the user simply move the cursor over the star with their head movement as the “catch” mechanism.
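In Unity this check would typically be done with a raycast from the camera, but the core idea, deciding whether the head-pointed cursor is resting on a star, reduces to comparing the gaze direction with the direction to the star. Here’s a minimal, engine-agnostic Python sketch; the function name, coordinate handling, and the 2° angular threshold are illustrative assumptions, not code from Starry VR:

```python
import math

def gazing_at(forward, star_pos, cam_pos=(0.0, 0.0, 0.0), max_angle_deg=2.0):
    """Return True if the gaze direction `forward` points at the star.

    The star counts as "under the cursor" when the angle between the
    camera's forward vector and the direction to the star is within
    a small threshold (an assumed 2 degrees here).
    """
    # Vector from the camera to the star
    to_star = [s - c for s, c in zip(star_pos, cam_pos)]
    dist = math.sqrt(sum(d * d for d in to_star))
    fwd_len = math.sqrt(sum(f * f for f in forward))
    # Angle between the two vectors via the dot product
    cos_a = sum(f * d for f, d in zip(forward, to_star)) / (dist * fwd_len)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= max_angle_deg
```

In the game loop you’d call this every frame with the camera’s forward vector; the first frame it returns True for an uncaught star would trigger the catch.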
While it didn’t make the final product, I had started designing a game menu that used a similar gaze “click”: users would activate a button by hovering over it for 2 seconds. There are other alternatives, such as a head gesture or a voice command, which I hadn’t yet explored given the limited time of the GameJam. I do think voice could be a very interesting alternative or supplement to the gaze click: a voice command for “click” could relieve the anxiety of accidentally activating a button, or of always having to rest the cursor away from buttons.
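That dwell interaction boils down to a small timer that accumulates hover time each frame and resets the moment the gaze leaves the button. Here’s a hedged, engine-agnostic Python sketch (in Unity this logic would live in a component’s per-frame update, fed the frame’s delta time); the class and its names are illustrative, not from the actual project:

```python
class DwellButton:
    """Activates once the gaze cursor has rested on it for `dwell` seconds."""

    def __init__(self, dwell=2.0):
        self.dwell = dwell        # seconds of continuous hover required
        self.hover_time = 0.0     # accumulated hover time this attempt
        self.activated = False

    def update(self, is_hovered, dt):
        """Call once per frame; returns True on the frame the button fires."""
        if is_hovered:
            self.hover_time += dt
            if self.hover_time >= self.dwell and not self.activated:
                self.activated = True
                return True       # fire the button's action exactly once
        else:
            self.hover_time = 0.0  # glancing away resets the timer
        return False
```

A nice property of the reset-on-glance-away behavior is that a brief accidental pass over the button never fires it, which is exactly the anxiety the dwell delay is meant to address.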
An Initial Sketch
Other design questions came up along the way: How big should the stars be? How close should objects be to the camera? How do I give users feedback? Where should I place menus?
Having built this initial prototype, I can now really play around with different interactions and quickly test new designs to answer these questions.
I had a lot of fun doing the GameJam, and it was a great learning experience. Unity proved to be a great tool, with one of the lowest learning curves among game engines, a huge array of video tutorials, and a large community. I’d highly recommend that anyone building apps for VR (whether you’re a UX designer or a dev) play around with it. There are many UX questions still being asked in the VR/AR space, and I think that’s what makes this space fun to get in on ☺