UI / UX design in Bubble Labs

design considerations

When designing Bubble Labs I spent a lot of time deciding what would make for an intuitive design. My game was meant for VR, and that meant not starting with over-complicated controls or mechanics. Having now demoed my game to over 50 people, I have seen the common traps, and designed haptics, UI, and visual cues around these tests. I started with haptics, as it was the most accessible, and began making my first prototypes.

haptic feedback

The Vive hand controllers have really precise haptics, allowing easy control over both pulse timing and strength. The core call is simple:

controllerDevice.TriggerHapticPulse((ushort)Mathf.Lerp(0, 3999, strength));

Even though there was a simple solution, I initially over-complicated things by writing functions for Long, Burst, and Delayed Pulses in my script, without really thinking about what I would use them for.

The problem I wanted to solve through haptics was conveying trigger sensitivity. So, I designed a way to send pulses of varying delay and strength according to how far the trigger was pressed on the controller. In my game, trigger depression correlated directly with the velocity of the bubble particles, so it made sense to have strength increase as you press further. Okay cool, so we have some sort of feedback for how far we are pressing, but how do we handle the delay?
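Concretely, this boils down to a small helper. This is a simplified sketch, not my exact script: `controllerDevice` is the SteamVR controller reference from the snippet above, and the method name mirrors the `GradientPulse` call that appears later.

```csharp
using UnityEngine;

public class HapticsManager : MonoBehaviour
{
    // The SteamVR device reference, assigned elsewhere in the script.
    public SteamVR_Controller.Device controllerDevice;

    // Fire one pulse whose strength tracks how far the trigger is pressed.
    public void GradientPulse(float triggerAxis)
    {
        // 3999 microseconds is the maximum duration TriggerHapticPulse accepts.
        ushort strength = (ushort)Mathf.Lerp(0f, 3999f, triggerAxis);
        controllerDevice.TriggerHapticPulse(strength);
    }
}
```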

To make things clearer, I wanted a varying delay between pulses to distinctly separate the two firing speeds, fast and slow. Because each of my bubble particles had a unique ID, I was able to pass this ID through a modulo with the rate of emission multiplied by the trigger axis value and a constant:

if (i % (rate * Math.Round((currentTriggerAxis.x * 15.0f), MidpointRounding.AwayFromZero)) == 0){ hapticsManager.GradientPulse(currentTriggerAxis.x); }

Because the SteamVR controllers output a value from 0.0f to 1.0f, I could easily pass the trigger value as a fraction to get a short time between pulses. The result was something I did not expect to work right away: as the user pulled the trigger further, the time between pulses increased, because the modulo fired less frequently. Even now, writing this in retrospect, it seems odd, but let me explain why it worked brilliantly. The increased time between pulses actually implied the distance the particles would shoot, while the strength still reminded the user of the push.
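Written out more readably, the gating looks roughly like this. It is a sketch of the same logic, with a guard added for the near-zero trigger case, where the rounded step would otherwise be zero:

```csharp
// For each emitted particle with ID `i`, decide whether to pulse.
// `rate` and the constant 15 are tuning values; the rounding keeps the
// step an integer so the modulo behaves predictably.
int step = rate * (int)System.Math.Round(currentTriggerAxis.x * 15.0f,
                                         System.MidpointRounding.AwayFromZero);
if (step > 0 && i % step == 0)
{
    hapticsManager.GradientPulse(currentTriggerAxis.x);
}
```

A larger trigger value means a larger step, so the condition passes for fewer particle IDs and the pulses spread out, which is exactly the behavior described above.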

After implementing this, I saw an immediate change in users noticing the difference between effects. I implore readers to consider unorthodox uses of haptics to convey information.

menu design

The second challenge was how to handle the menus in my game. I was already using the touchpad on the bubble gun, so it made sense to use the other controller's touchpad to interact with the menu. There are still some flaws in this design for beginners, as some will accidentally press the touchpad without knowing it is a menu. However, the design I landed on works pretty well for being simple while still being visually obvious about what it does:

I chose to use iTween for my menu transitions and an enum button selection to handle my presses. iTween also made it really easy to shrink the active item down so as not to interfere with the menu selection. Also, each of the menu items has a LookAt script so that all of the items face the player and remain readable.
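The LookAt behaviour is only a couple of lines. A sketch, assuming the player's view is the main camera:

```csharp
using UnityEngine;

public class LookAtPlayer : MonoBehaviour
{
    void Update()
    {
        // Point the item's forward axis away from the camera so
        // world-space text renders facing the player, not mirrored.
        transform.rotation = Quaternion.LookRotation(
            transform.position - Camera.main.transform.position);
    }
}
```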

The actual iTween implementation was not hard. All I did was pass a bool along with the gameObject for each menu item's isScaled state. I converted the value into an int and used iTween's ScaleTo to scale up to standard size, or down to 0, as the bool changed.
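In sketch form, assuming iTween's `ScaleTo(GameObject, Vector3, float)` overload (the method name here is simplified from my script):

```csharp
using UnityEngine;

public class MenuScaler : MonoBehaviour
{
    // Scale a menu item up to full size when active, down to zero when not.
    public void SetScaled(GameObject item, bool isScaled)
    {
        // bool converted to 0 or 1, then used as the uniform target scale.
        int target = isScaled ? 1 : 0;
        iTween.ScaleTo(item, Vector3.one * target, 0.3f);
    }
}
```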

Also, the selection is never tied to the animation, so the option can be selected quickly and easily without looking.

tooltips and user onboarding

I designed my game from the ground up to be easily demo-able. The ability to stay in or leave the experience at the user’s leisure was important to me, and that meant streamlining my on-boarding process. A minimal yet elegant solution was to use dismissable tooltips. A simple line renderer connected to canvas elements with text proved to be a good way to teach controls and the functions associated with them in my world. Here’s a gif of my tooltips in action:

The ability to dismiss the tips at any time gave new users a way to recall how a button is used, and left experienced users with no hindrance to their play. Also, I z-positioned the text to match the actual position of the elements, giving depth to the diagram. The depth conveyed the position of the actual buttons and allowed new users to find buttons like the trigger more easily. While this might seem obvious to more experienced users, first-time Vive users often needed the reminder.
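A minimal sketch of one tooltip, assuming a world-space canvas label and a LineRenderer on the same object. The names here are illustrative, not taken from my project:

```csharp
using UnityEngine;

public class Tooltip : MonoBehaviour
{
    public Transform anchor;   // the button on the controller model
    public Canvas label;       // world-space canvas holding the tip text
    LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
    }

    void Update()
    {
        // Keep the line connecting the button to its floating label.
        line.SetPosition(0, anchor.position);
        line.SetPosition(1, label.transform.position);
    }

    public void Dismiss()
    {
        // Hide both the label and the connecting line on dismissal.
        label.gameObject.SetActive(false);
        line.enabled = false;
    }
}
```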

in retro

I’m really glad I chose to take so much time polishing the design of my game, as it became very clear through the initial tests that there would need to be some form of tutorial. I experimented with the idea of a directed tutorial, but the ability to put people in and out of the experience with ease made it really streamlined to demo and to captivate users quickly. Hopefully you learned a few things about UI/UX, and feel free to post comments asking about specific aspects of my game.
