UX PoC in AR/VR #2: Manipulation Gizmo

Alexia Buclet
Jan 9, 2023 · 11 min read


The second episode of a series sharing Opuscope’s work on building the best possible User eXperience (UX) in Augmented Reality (AR) & Virtual Reality (VR), thanks to Proofs of Concept (PoCs). Maybe they will help other professionals on their AR/VR journey.

To get more context about how we made PoCs, you can check out the first episode:

Now let’s see the Manipulation Gizmo’s case.

Minsar Studio’s requirements

Minsar Studio was an AR/VR app that let creative professionals easily create and share immersive experiences.

Among our target users, we had 2 major profiles:

  • Creatives specialized in 2D, who wanted to jump into the 3D and immersive design adventure.
  • Creatives specialized in 3D, who wanted to create their own immersive experiences without knowing how to code them.

The manipulation gizmo is what you use to manipulate scene elements: mainly move, scale and rotate them.

Our manipulation gizmo needed to suit both of our targeted profile types: it had to be understandable by 2D Designers, who weren’t used to 3D manipulation gizmos, and to fit the expectations and habits 3D Designers had built with the many tools they already used.

Plus, we needed to find the right balance between being light enough visually not to disturb the scene too much, and being big enough to be easily usable with a cursor that wasn’t that convenient on some headsets (let’s say it: the HoloLens).

Finally, as the manipulation gizmo was used in an immersive creation environment, it had to be usable from anywhere around the scene element.

Previous Manipulation Gizmo

The first manipulation gizmo we made on Microsoft HoloLens (Opuscope — 2017)

For years, we had a manipulation gizmo (inspired by the Microsoft HoloLens’ one as we started with this platform). It visually evolved a little bit, but that’s all. It had some issues:

  • It was quite visually heavy.
  • Handlers weren’t always easy to use.
  • Features were limited, only offering scale, rotate, and move on all 3 dimensions at once.
  • It didn’t show any values on the performed manipulations.

With the PoCs, we thought it was time to invest some time to make it the best immersive manipulation gizmo we could!

Manipulation gizmo on Oculus Quest, before the PoC (Opuscope — 2019)

Benchmark

To do so, we started by taking a look at what already existed.

Manipulation gizmos are used in all 3D software (Blender, Unity, Maya, etc.). We wanted ours to look familiar to match their regular users’ expectations. However, those tools are mainly used on computers: the cursor is way more precise than a headset’s or a touchscreen device’s. Plus, the user isn’t immersed in the 3D scene, so they’re less likely to feel oppressed by it.

Several types of manipulation gizmos are used in AR/VR tools (Reality Composer, Maquette, Dreams, Holograms, Actiongram, Layout, HoloSketch, Medium, Quill, etc.), without a clear standard, compared to 3D computer software. As usual, we tried them to get do’s and don’ts for our case.

Here are the PoC’s goals and how we tried to reach them.

Usable from anywhere

As the creator could be anywhere in the scene, we wanted the manipulation gizmo to be accessible wherever the creator was and whatever the scene element was (small, big, flat, etc.).

We needed to set some rules to handle all possible cases.

Minimum size

The creator could import anything into Minsar Studio (3D models, pictures, videos, etc.). The manipulation gizmo was displayed around scene elements, so we faced problems when they were very small (globally or on a dimension). Handlers would collide or be too small to be triggered.

Thanks to the PoC, we set a minimum size for the manipulation gizmo. It’s an invisible virtual box attached to the element, representing the minimum size needed to manipulate an element whatever its scale and distance. It uses the same proportions as its related element.

An important point in VR: for the virtual box to always look the same size to the user, it is scaled according to its distance from the camera.
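The rule above can be sketched in a few lines. This is a minimal, illustrative Python sketch (the function names, the linear scaling model, and the reference distance are my assumptions, not Minsar Studio’s actual code):

```python
def apparent_size_scale(distance_to_camera, reference_distance=1.0):
    """Scale factor that keeps the box's apparent size constant:
    an object twice as far away must be twice as big to cover the
    same visual angle, so the factor grows linearly with distance."""
    return max(distance_to_camera, 1e-6) / reference_distance

def min_gizmo_box(element_proportions, min_size, distance_to_camera):
    """Invisible virtual box: the element's proportions, normalized so
    the longest edge equals the minimum usable size, then scaled so it
    looks the same size at any distance."""
    scale = apparent_size_scale(distance_to_camera)
    longest = max(element_proportions)
    return tuple(min_size * p / longest * scale
                 for p in element_proportions)
```

For example, a 2:1:1 element seen from twice the reference distance would get a box twice as big in world space, so it still looks the same size to the user.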

The virtual box to define the manipulation gizmo minimal size, only visible here for the PoC’s purpose (Opuscope — 2021)

This virtual box was used to position the manipulation gizmo handlers.

Always reachable handlers

To make sure the user could always easily trigger the manipulation gizmo’s handlers, we decided to put them around the scene element but at a certain distance (compared to the previous gizmo). No conflict was possible between the scene element and its related handlers.

Manipulation handlers around the scene element in the PoC (Opuscope — 2021)

Handlers were displayed when a scene element was selected. We then decided to render them above everything else (the scene element included). It defies the physics rules, magic made possible in VR for the good of the UX!

Also, handlers’ positions could vary depending on the perspective on the scene element, to make sure they were all reachable. We had to choose when to reset their positions. Doing it in real time, as the user moved, was too disturbing. To make a smooth transition between the 2 positionings, we decided to update them only when the creator selected the element, released it (or one of its handlers) after a move, or teleported.

Limit visual overload

In the former manipulation gizmo, many handlers with the same use were displayed at the same time. With the new version, we decided to limit them to the bare minimum: one handler per rotation axis and 2 for scale.

For the scale handlers, we showed the closest and the farthest one from the camera’s bottom left corner. This way, it was quite easy to reach one of them from the top, bottom, left, or right of the element.
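This corner-picking rule is easy to express. A minimal Python sketch (hypothetical names; the real selection ran on the virtual box’s corners in camera space):

```python
import math

def pick_scale_handlers(corners, reference_point):
    """Of all the box corners, show only two scale handlers: the one
    closest to and the one farthest from a reference point (here, the
    camera's bottom-left corner)."""
    ordered = sorted(corners, key=lambda c: math.dist(c, reference_point))
    return ordered[0], ordered[-1]
```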

Rotation handlers were displayed on the rotation axis, near the closest edge from the user to be easy to trigger.

Depending on the perspective, handlers could hide each other. Therefore, if a rotation handler was too close to a scale handler, we hid the rotation handler and showed another one for the same axis near another edge. Conflicts were avoided, just like that!
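The conflict-avoidance rule boils down to trying candidate edges in order. A hedged Python sketch (the candidate list, gap threshold, and names are assumptions for illustration):

```python
import math

def place_rotation_handler(candidate_edges, scale_handlers, min_gap):
    """Pick the first candidate position (same axis, different edges)
    that stays far enough from every visible scale handler; fall back
    to the default edge if none qualifies."""
    for pos in candidate_edges:
        if all(math.dist(pos, s) >= min_gap for s in scale_handlers):
            return pos
    return candidate_edges[0]
```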

Show manipulation values

With the former manipulation gizmo, the user couldn’t know how far they had dragged the element, or how much they had scaled it.

To solve this, we displayed values for each manipulation: move, rotate, and scale.

Move values

For the move, we showed the distance from the center of the element to its new position. Since the center could be occluded by the element, we rendered it on top of everything else. The user couldn’t miss it during the gesture.

The value was displayed above the cursor’s grabbing point to make sure the creator saw it, in a textbox for it to be legible whatever the background was (as the background could be anything).
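The displayed value itself is just the distance covered since the start of the gesture. A tiny Python sketch (the label format is my assumption):

```python
import math

def move_distance_label(start_center, current_center):
    """Distance the element has traveled since the start of the
    gesture, formatted for the textbox above the cursor's
    grabbing point."""
    return "%.2fm" % math.dist(start_center, current_center)
```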

The move distance displayed in the Unity Editor version of the PoC (Opuscope — 2021)

Rotate values

We used the same rule to display the rotation values on top of everything else during the gesture, and the same textbox.

The value was displayed above the rotation handler the user was manipulating.

The rotation value displayed in the Unity Editor version of the PoC (Opuscope — 2021)

Scale values

The hardest one! Since experiences made with Minsar Studio were supposed to be lived at a human scale, we decided to display the physical size of the element, not a percentage of its original size. That meant displaying 3 values, one for each edge (X, Y, and Z). Doing so raised several challenges.

At first, we wanted to display the same textbox as for the move and rotate values, with the same style as a piece of furniture’s description, for example: “2x0.5x1.2m”. It was simpler to display since the system was already in place, but not convenient at all for knowing which value corresponded to which edge.
We decided to position them on their edge to visually link them.

Then, the question was: should we display them horizontally to match reading habits? But then, with the perspective, the user could mismatch them.
We chose to display them along the edge to prevent any mismatch.

At first, we used the same textbox, but with 3 of them it was visually cumbersome. To fix this, we designed a dedicated style for the values: easily legible on any background, but less oppressive. We displayed white text with a dark outline. The PoC also helped us define the font size.

Finally, we iterated on the position along the edge to prevent values from conflicting with the handler and in most cases between each other. We also made them face the creator. Only the rotation on the X-axis of the edge was enabled to keep the visual link between the edge and its value.

Unlike the rotate and move values, we didn’t show the gesture’s value (the difference between its start and its end), but the actual size of the element. This value can be interesting for designers even when not performing the gesture. That’s why we chose to also display it when hovering over one of the scale handlers. The creator could then easily get the information, without it always being displayed when they didn’t really need it.
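The per-edge physical size is simply the element’s local size multiplied by its current scale, one label per axis. A minimal Python sketch (hypothetical names and label format):

```python
def edge_size_labels(local_size, scale):
    """One label per edge (X, Y, Z) giving the element's physical,
    world-space size, instead of a single '2x0.5x1.2m' string."""
    return tuple("%.2fm" % (s * k) for s, k in zip(local_size, scale))
```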

The scale values shown in the Oculus Quest 2 version of the PoC (Opuscope — 2021)

I hope you enjoy the appearance animation we made 😉

PoCs are the opportunity to pay attention and take care of every detail.

Offer advanced features for experts

Identify rotation axes

3D creators need to quickly identify which rotation handler corresponds to which axis (X, Y, and Z). They are used to this feature in the 3D tools they work with.

We applied the same rule as these tools: changing the handlers’ color according to their axis.

We wanted to find an alternative solution for colorblind people but didn’t find any at the time. Since the rotation handler was positioned according to its axis, it still could be used easily.

Local / Global

The default rotation we offered was based on the element. It means that the rotation axes we used were the ones of the element.
However, advanced 3D tools provide their users with both local (the element as a reference) and global (the world space as a reference) modes to make all their creation wishes come true.

We worked on a way to display the rotation handlers based on the world space axes. We chose to put an invisible bounding box, aligned with the world axes, around the scene element to position the rotation handlers.
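Such a world-aligned box is the axis-aligned bounding box of the element’s transformed corners. A minimal Python sketch of the idea (names are mine; in practice this would run on the element’s 8 world-space corners):

```python
def world_aligned_box(world_corners):
    """World-axis-aligned bounding box enclosing the element's
    transformed corners; it positions the global rotation handlers."""
    xs, ys, zs = zip(*world_corners)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))
```

For instance, a square rotated 45° around Z gets a bigger, world-aligned box around its rotated corners.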

The global bounding box to position the rotation handlers on the Oculus Quest 2, only visible here for the PoC’s purpose (Opuscope — 2021)

It had a minimum size version as well.

The user could switch between the local and global references thanks to an option in the menu.

Assisted move

The regular manipulation gizmo was designed to be easily used by 3D novices and experts. It enabled the user to move the element on 3 axes. However, 3D Designers also need advanced options to move the element on 1 or 2 axes to be more accurate. At first, we tried to do everything in the same manipulation gizmo, but it was really too much. That’s why we created a dedicated gizmo we called “assisted move”. The user could switch from the regular one to the assisted move one in the menu.

Since the assisted move gizmo was specialized in moving the element, we offered the possibility to move on 1, 2, or 3 axes. It wasn’t easy to define how: since, in the regular version, the user could move the element by dragging and dropping it directly, new handlers were needed.

First of all, where to put the handlers? It may be tempting to say “let’s put them to the sides of the element”. This way, there can’t be any conflict between the element and the handlers. False good idea alert! Scene elements can be of any size and proportion. Putting their handlers to the sides would result in having them really far apart from each other if the element is big. The user would have a hard time using them and may take some time to find them.

We chose to display them over the element, with shapes familiar to the ones 3D designers are used to in their regular tools.

Assisted move gizmo in Minsar Studio on Oculus Quest 2 (Opuscope — 2021)
  • A small white square at the center of the element to move it on the 3 dimensions. Since the assisted move gizmo was displayed over the element, we disabled the possibility to move it by dragging and dropping it to favor the gizmo’s manipulation.
  • 3 arrows, one for each dimension, to move it on only 1 axis. An intuitive way to follow the axis line. We colored them to match the same axes colors as for the rotation. Expert users could then easily recognize them.
  • 3 faces, one for each combination of 2 dimensions, to move the element on only 2 axes. An intuitive way to follow the invisible surface created by the combination of 2 axes. As for the arrows, it’s easy to anticipate the movement, even for 3D noobs.
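Under the hood, all three handler types come down to the same operation: projecting the raw drag vector onto the allowed axes. A hedged Python sketch (illustrative names, unit axes assumed):

```python
def constrained_move(drag, allowed_axes):
    """Project the raw 3D drag vector onto the allowed axes: one unit
    vector for an arrow handler, two for a face handler, three for the
    center square (free move)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    out = [0.0, 0.0, 0.0]
    for axis in allowed_axes:
        d = dot(drag, axis)
        for i in range(3):
            out[i] += d * axis[i]
    return tuple(out)
```

With an arrow handler on X, a diagonal drag only moves the element along X; with a face handler on XY, the drag’s Z component is discarded.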

During manipulation, all other handlers disappeared to keep the creator focused on what they were doing. As for the regular manipulation gizmo, the distance was shown.

A 2 dimensions handler being dragged in Minsar Studio on Oculus Quest 2 (Opuscope — 2021)

For the arrow, we also displayed the axis line for the user to see where the element would go if they moved it.

A 1 dimension handler being dragged in Minsar Studio on the Oculus Quest 2 (Opuscope — 2021)

Note: helper feature in the menu

Even with a great gizmo, manipulating an element in an immersive space may not always be easy. That’s why we provided the user with 2 ways to revert their changes:

  • “Undo/Redo” buttons to roll back 1 manipulation at a time.
  • a “Reset” button to make the element go back to its default scale and rotation.
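Both buttons fit a classic undo/redo stack over manipulation snapshots, with “Reset” restoring the default scale and rotation (but not the position). A minimal Python sketch under those assumptions, not Minsar Studio’s actual implementation:

```python
class ElementHistory:
    """Tiny undo/redo stack over (position, rotation, scale) snapshots,
    plus a reset to the default scale and rotation."""

    def __init__(self, initial):
        self.default = initial
        self.undo_stack = [initial]  # current state is always the last entry
        self.redo_stack = []

    def push(self, state):
        """Record a finished manipulation; a new branch clears redo."""
        self.undo_stack.append(state)
        self.redo_stack.clear()

    def undo(self):
        if len(self.undo_stack) > 1:
            self.redo_stack.append(self.undo_stack.pop())
        return self.undo_stack[-1]

    def redo(self):
        if self.redo_stack:
            self.undo_stack.append(self.redo_stack.pop())
        return self.undo_stack[-1]

    def reset(self):
        """Back to default scale and rotation; position is kept."""
        pos, _rot, _scale = self.undo_stack[-1]
        _dpos, drot, dscale = self.default
        state = (pos, drot, dscale)
        self.push(state)
        return state
```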

Problems we didn’t solve

Let’s be honest, we couldn’t solve everything we wanted, even with a dedicated PoC 😔

One issue that followed us through the years was: how to manipulate an element while the user is inside of it.

We wanted to offer a solution for the user to be able to use handlers from the inside. We had technical problems and render conflicts and couldn’t find a satisfying solution.

PoCs are great to try things, and iterate on them, but sometimes you don’t find the answer to your issues…


Alexia Buclet

French UX Designer & Cognitive Psychologist since 2010, I worked at Ubisoft, Adobe, Aldebaran Robotics and Opuscope (AR/VR). Currently freelance in impact tech!