Touching Holograms
How the Microsoft Mixed Reality Studios explored more immersive interactions
“Can I flick it?”
We were often asked questions like this when users tested our immersive interaction prototypes on HoloLens 2. Hologram interactions based on simulation and emergence, rather than on individually scripted features, can lead to interesting test scenarios.
The honest answer was, “I don’t know, but try it.”
What if you could treat holograms just like real objects?
You wouldn’t need to explain to users the specific steps of interacting with a virtual object. You could just ask them to “pick it up” or “put it over there,” and they would do just that.
We could communicate and play with virtual objects fluently, using our eyes and hands in the physical language we are already used to.
I saw this happen when we introduced a self-declared technophobe to our physical interaction prototype on HoloLens 2 for the first time. She smiled and played with a virtual cube, quite literally dancing with the holograms for several minutes.
With articulated hand tracking, eye-gaze tracking, and an increased field of view, HoloLens 2 opens new opportunities for users and interaction designers alike. As designers, we aimed to use these inputs and outputs to push immersion further, allowing people to interact and play with holograms in the same physical way that they do with real objects.
Digital hand twin
How can we make interactions more natural? We started with some laws of nature and built the interactions from the bottom up. When you break it down, interactions like grab, push, and throw are all made up of the same physical concepts: momentum, collision, friction, and gravity.
Real-time physics engines are a key part of many video games, from the Half-Life series to Gang Beasts and Totally Accurate Battle Simulator. Just as these games use physics engines to handle open-ended interactions between virtual objects, we can use physics engines to model interactions between a user and a virtual object.
Because articulated hand tracking on HoloLens 2 gives us the pose of a hand, we can construct a physically simulated twin of the hand in the virtual world by replicating the positions, velocities, and momentum of its different parts. When this virtual hand interacts with virtual objects, the physics engine simulates the outcome, applying momentum, collision, friction, and gravity. These elements add up to a grasp, a throw, a flick, or any other hand-object interaction you can think of.
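To make that concrete, here is a minimal plain-Python sketch of the velocity-matching loop. The Vec3, JointBody, and drive_joint names are illustrative stand-ins, not the Studio’s actual code or a real engine API:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

    def __sub__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

    def scale(self, k: float) -> "Vec3":
        return Vec3(self.x * k, self.y * k, self.z * k)

@dataclass
class JointBody:
    """One tracked hand joint, mirrored as a rigid body in the physics engine."""
    position: Vec3
    velocity: Vec3

def drive_joint(body: JointBody, tracked: Vec3, dt: float) -> None:
    """Velocity-match the simulated joint to its freshly tracked pose.

    Teleporting the body to the tracked position would defeat collision
    response. Instead, set exactly the velocity the engine needs to carry
    the body there over the next step, so when the virtual hand meets a
    virtual cube the engine sees real momentum and can resolve the push,
    grab, or flick on its own.
    """
    body.velocity = (tracked - body.position).scale(1.0 / dt)
```

Setting velocity rather than position is the important choice: the engine still owns the motion, so collisions along the way transfer momentum to whatever the hand touches.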
In video games, the physics engine is a completely controlled virtual environment. Mixed Reality is different: there is both a simulated physics engine and a physical reality, and the two need to interact. An important law in physics simulation is Newton’s third law of motion, which states, “For every action in nature, there is an equal and opposite reaction.” We cannot create these reactions in the physical world, however, because we cannot apply real force to the user’s hand. The result is a missing sense: the sense of touch, also known as haptic feedback.
Compensating for missing senses
No matter how realistic we make these experiences look, people still can’t actually feel holograms. HoloLens 2 does not simulate the sense of touch, a key part of hand interactions with objects. Touch provides constant feedback while you handle an object, so its absence can make holograms seem ghostly to interact with.
To design around this, we must over-communicate with the senses that we do have access to: vision and sound. When a user touches and releases an object, we play a sound, and the object lights up to strongly communicate this to the user.
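As a sketch of that pattern, with HologramFeedback and play_sound as hypothetical stand-ins for the real rendering and spatial-audio systems:

```python
from dataclasses import dataclass

@dataclass
class HologramFeedback:
    glow: float = 0.0  # 0 = idle, 1 = fully lit

def play_sound(clip: str) -> None:
    print(f"[audio] {clip}")  # stand-in for a spatial-audio call

def on_contact_changed(fb: HologramFeedback, touching: bool) -> None:
    # With no haptics, mark both the start and the end of contact
    # through the channels we do have: a sound cue and a glow change.
    play_sound("touch_begin" if touching else "touch_end")
    fb.glow = 1.0 if touching else 0.0
```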
Physical interactions between objects are two-way, with both objects influencing each other. We cannot make virtual objects physically influence your hand through touch, but we can use light to show a relationship between the object and your hand.
Holding a bright object will cast light onto and through your hand, giving you more feedback about the interaction between your hand and the object. Adding this subtle effect to the virtual hand has a surprisingly strong effect on the realism of the interaction, giving information about depth, proximity, and direction. This lighting effect blurs the line between digital and physical, as the hand you are now looking at is a composite of the lighting from both the user’s real environment and virtual objects.
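On device this effect lives in a shader, but at its core it is the standard diffuse (Lambertian) point-light term: brighter when the object is close and when that part of the hand faces it. A rough plain-Python sketch, with illustrative helper names:

```python
import math
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def sub(a: Vec3, b: Vec3) -> Vec3:
    return Vec3(a.x - b.x, a.y - b.y, a.z - b.z)

def dot(a: Vec3, b: Vec3) -> float:
    return a.x * b.x + a.y * b.y + a.z * b.z

def hologram_light_on_hand(hand_point: Vec3, hand_normal: Vec3,
                           obj_pos: Vec3, intensity: float) -> float:
    """Diffuse point-light contribution of a glowing hologram at one
    point on the hand mesh, falling off with the square of distance.
    hand_normal is assumed to be unit length."""
    to_light = sub(obj_pos, hand_point)
    dist_sq = max(1e-6, dot(to_light, to_light))
    inv_len = 1.0 / math.sqrt(dist_sq)
    direction = Vec3(to_light.x * inv_len, to_light.y * inv_len, to_light.z * inv_len)
    return intensity * max(0.0, dot(hand_normal, direction)) / dist_sq
```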
Some users reported that holding a particularly bright, red-glowing hologram and seeing its effect on their skin made their hand feel warm, even though they knew it could not really be heating up.
While these designs improve the experience of handling virtual objects, there is still some ghostliness to the interaction, as there is no haptic feedback.
Bending the rules to accommodate users
Now that these virtual objects behave in a more physical way, testing with users revealed some unwanted physical interactions creeping in, too. Walking large distances to pick up objects and reaching down to the floor to collect dropped items are real-world annoyances that we do not need to tolerate in the virtual world. For users with different abilities, these types of interactions range from annoying to impossible.
We could mitigate the negative effects of dropping a virtual object by disabling gravity. But without gravity, objects float away from the surfaces you place them on, like on the International Space Station.
To soften the annoyance of accidentally dropped objects, we introduced the concept of “surface gravity”: objects are subject to gravity only when there is a surface below to fall onto. When there is no surface, objects float in the air where you leave them.
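A minimal sketch of the rule, assuming a hypothetical surface_below raycast against the spatial mesh; the damping constant is an arbitrary tuning value:

```python
from dataclasses import dataclass
from typing import Callable

GRAVITY = 9.81  # m/s^2

@dataclass
class Body:
    height: float      # vertical position, metres
    y_velocity: float  # vertical speed, m/s

def step_surface_gravity(body: Body, dt: float,
                         surface_below: Callable[[float], bool]) -> None:
    """Gravity applies only when there is somewhere to land.

    surface_below(height) is assumed to raycast down through the
    spatial mesh (floor, table, sofa) and report whether it hit.
    """
    if surface_below(body.height):
        body.y_velocity -= GRAVITY * dt              # fall normally
    else:
        body.y_velocity *= max(0.0, 1.0 - 4.0 * dt)  # drift to a mid-air stop
    body.height += body.y_velocity * dt
```

An object released above a table falls onto it; one released with nothing below simply drifts to a halt where it was let go.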
During testing, one user threw an object up high, where it floated until they attempted to climb onto a precarious bar stool to reach it. This highlighted a problem with distant objects: the person was putting themselves in physical danger for a virtual one.
To improve the experience of reaching faraway objects, we introduced a “telekinesis” gesture, which allows people to summon distant objects to their hand. While telekinesis is not part of real physical interaction, it is a gesture many people have seen or imitated from popular media like Star Wars and Matilda. Here, we use a combination of eye-gaze and hand tracking to let the user confidently move an object without touching it.
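A rough sketch of how the two inputs could combine, with illustrative types and tuning constants rather than the Studio’s actual implementation: eye gaze picks the target within a small cone, and while the pinch is held the object eases toward the hand:

```python
import math
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def sub(a: Vec3, b: Vec3) -> Vec3:
    return Vec3(a.x - b.x, a.y - b.y, a.z - b.z)

def dot(a: Vec3, b: Vec3) -> float:
    return a.x * b.x + a.y * b.y + a.z * b.z

def normalized(a: Vec3) -> Vec3:
    length = math.sqrt(dot(a, a))
    return Vec3(a.x / length, a.y / length, a.z / length)

@dataclass
class Hologram:
    position: Vec3

def gaze_pick(eye_pos: Vec3, gaze_dir: Vec3,
              objects: Sequence[Hologram],
              max_angle_deg: float = 5.0) -> Optional[Hologram]:
    """Return the hologram nearest the eye-gaze ray, within a small cone."""
    best: Optional[Hologram] = None
    best_cos = math.cos(math.radians(max_angle_deg))
    for obj in objects:
        cos = dot(normalized(sub(obj.position, eye_pos)), normalized(gaze_dir))
        if cos > best_cos:
            best, best_cos = obj, cos
    return best

def summon(target: Hologram, hand_pos: Vec3, dt: float, rate: float = 8.0) -> None:
    """While the pinch is held, ease the gazed-at object toward the hand.

    Exponential smoothing rather than a snap, so the motion reads as a
    deliberate summons the user can interrupt by releasing the pinch.
    """
    t = min(1.0, rate * dt)
    step = sub(hand_pos, target.position)
    target.position = Vec3(target.position.x + step.x * t,
                           target.position.y + step.y * t,
                           target.position.z + step.z * t)
```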
Emergent interactions
The immersion found in this prototype is a glimpse into a world of tangible Mixed Reality interactions, and simulated physicality could change how we use and understand every digital experience.
Imagine, when shopping online, you could hold the blanket you are viewing, throw it over your sofa to see how it looks, then auto-fold it and drop it into your shopping basket.
How could the feeling of ownership change if you could hold a music album in your hands, put it on your shelf, throw it to your speaker to play, or burn a mixtape and hand it to a friend?
What new art and play could emerge if our digital products are more open to undesigned interactions, like record scratching on vinyl or folding a document into a paper plane?
How could we change how we educate, if you could teach gravity by pushing a miniature planet into orbit around a sun, or unroll the Magna Carta and hold it in your hands?
With our prototypes, we contribute to a vision of more immersive, playful, and open-ended interactions, one that cultivates people’s inquisitive and creative nature as they explore and interrogate virtual experiences. It is the beginning of an exciting journey to discover how these new kinds of interaction come to life across diverse and multifaceted user experiences and use cases.
This exploration was created by Microsoft Mixed Reality Studios — Lighthouse Studio, London, who partner with Microsoft customers to create Mixed Reality experiences.