Mystery of Mid-air Touch

Shaoyan Huang
7 min read · Aug 13, 2020

What can you feel in an MR experience?

More and more industries, across manufacturing, healthcare, education, retail and advertising, have started using Mixed Reality technology to solve problems in real time. According to the Global Opportunity Analysis and Industry Forecast report, the global Mixed Reality market is expected to reach $5,362.1 million by 2024, so the future of Mixed Reality looks very promising. Mixed Reality is a technology that visualises the virtual and real environments together, and in which physical and digital objects interact in real time. For example, players can use their hands to grab virtual objects, such as a gun or a ball, and perform actions with them, or they can anchor virtual furniture in their room to visualise it in place. Its ability to break down the wall between the real world and the virtual world offers users immersive and fully engaging experiences.

Interactive techniques

As mentioned above, immersing users in the mixed reality environment is the main goal when developing MR applications. To mimic a real-life experience in an MR world, visuals are not the only aspect we should focus on: touch, smell, taste and sound are key contributors to an immersive experience as well, touch in particular. When people see something in the real world, they subconsciously want to interact with it, to touch it, pick it up or rotate it to have a look. If players can only see the fascinating virtual content but cannot interact with it, we can never call it an immersive experience. Besides the ability to manipulate digital items, a sense of touch is essential in an MR experience as well.

Currently, users can interact with digital objects with the assistance of controllers, wearable gear such as VR gloves, tracking sensors worn on the hand, arm or fingers, or wireless sensors.

Here we want to talk about a novel technique called mid-air haptic interaction. This new technology allows players to manipulate virtual objects in the air without wearing any extra devices, while receiving corresponding tactile feedback.

How to achieve that?

To achieve interaction in mid-air, we use a development kit called Ultraleap. It combines a hand tracking sensor (Leap Motion) with an array of ultrasonic transducers, and it supports mid-air interaction in augmented reality, virtual reality and mixed reality.

Hand tracking

The Leap Motion controller captures the movement of users' hands with high accuracy, so the interaction between players and digital content can feel as natural as physical interaction. Haptic feedback is activated at the same moment users gesture to operate digital objects. It uses ultrasound to create tactile sensations in mid-air, replicating the certainty of choice that users feel with physical touch. The tactile device is made up of an array of 256 ultrasonic transducers. Each element emits ultrasonic vibration with adjustable magnitude and phase, so the array can combine its waves into focal points of ultrasound in mid-air strong enough to create customised haptic sensations.
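
To give a sense of how those focal points are formed, here is a minimal sketch of the underlying idea in Python. It is not the Ultraleap SDK, just the basic phased-array arithmetic: each transducer is driven with a phase offset proportional to its distance from the target point so that all the waves arrive there in phase, and the focal point's intensity is modulated at a low frequency (around 200 Hz) that the skin can actually feel. The array geometry and constants below are illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0    # m/s in air at room temperature (assumption)
CARRIER_FREQ = 40_000.0   # Hz, a typical ultrasonic transducer frequency

def focusing_phases(transducer_positions, focal_point):
    """Phase offset (radians) for each transducer so that every wave
    arrives at the focal point in phase, creating a pressure maximum there."""
    distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
    # Driving each element with phase -2*pi*f*d/c cancels its travel delay.
    return (-2.0 * np.pi * CARRIER_FREQ * distances / SPEED_OF_SOUND) % (2.0 * np.pi)

def modulation_envelope(t, mod_freq=200.0):
    """Skin cannot feel a constant 40 kHz tone, so the focal point's intensity
    is modulated at a low frequency that mechanoreceptors can detect."""
    return 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_freq * t))

# Example: a 16 x 16 array (256 elements) on the z = 0 plane with 10 mm pitch,
# focused 20 cm above the centre of the array.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
positions = np.column_stack([xs.ravel() - 0.075, ys.ravel() - 0.075, np.zeros(256)])
phases = focusing_phases(positions, np.array([0.0, 0.0, 0.2]))
```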

Haptics UX

Tactile sensations vary across different operations in real life, such as pressing a button, petting an animal, or grabbing a soft or hard object. Hence, when designing the haptic user experience, we cannot indiscriminately apply the same sensation to every manipulation a user performs.

Here are some examples of how we apply different haptic sensations to individual interactions.

Interface manipulation (Tap)

In mixed reality experiences, users often need to interact with interface elements: pressing buttons, moving sliders, scrolling through a browser. With mid-air interaction, users complete these operations with their bare hands. But seeing a virtual hand interpret their gestures on the display is not enough; the touch experience cannot be convincingly reproduced when haptics is missing.

Contactless interaction
Click Sensation

In this example, to create a seamless connection between the virtual and real worlds, the interaction space was set up around the Leap Motion controller's specific tracking area. Because the interactable space was limited, we restricted the interface to four answer choices. The interface uses transparent round buttons, on which the options are displayed, to mark out the hand tracking area and optimise the hand interaction. A transparent digital hand mimics the user's hand movements on the screen, closing the gap between the two worlds. The transducer array was also split into four parts to align with the interactable space, and a part is activated only when the Leap Motion controller recognises the relevant hand gesture, such as the press and release of a button. Experimenting with the haptic feedback points, we found that a single point vibrating vertically was ideal for simulating a hovering button, and a group of points, rotating as they separate, worked well to simulate a click as the user presses down. A feedback loop makes the selection happen virtually when the button is pressed and then released.
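
The press-and-release loop can be sketched as a small state machine. The snippet below is purely illustrative: the fingertip input, the press depth and the haptic-rendering call are assumptions standing in for whatever the SDK actually provides, not real Ultraleap API names.

```python
PRESS_DEPTH = 0.02  # metres of downward travel that counts as a press (assumed)

def play_sensation(name, at):
    """Placeholder for the haptic-rendering call of whatever SDK is used."""
    print(f"haptics: {name} at {at}")

class MidAirButton:
    def __init__(self, surface_z):
        self.surface_z = surface_z  # height of the virtual button face
        self.pressed = False

    def update(self, fingertip_pos):
        """fingertip_pos is an (x, y, z) tuple from the hand tracker."""
        depth = self.surface_z - fingertip_pos[2]
        if not self.pressed and depth > PRESS_DEPTH:
            self.pressed = True
            play_sensation("click", at=fingertip_pos)    # rotating group of points
        elif self.pressed and depth <= 0.0:
            self.pressed = False
            play_sensation("release", at=fingertip_pos)
            return "selected"                            # selection happens on release
        elif not self.pressed:
            play_sensation("hover", at=fingertip_pos)    # single vertically vibrating point
        return None
```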

Grabbing objects

Grabbing is something we do every day, just like touching. When we want to drink water, we grab a cup; when we want to send a message, we grab a phone. While grabbing, the sense of touch is concentrated in the fingertips.

Grab objects
Grab sensation

In our Clean River mini-game, the task is to pick up rubbish with your hands. To mimic the physical grabbing experience, we designed a haptic sensation fixed to the fingertips. It is activated once the virtual hand successfully grabs a piece of waste; otherwise, there is no tactile feedback.
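
The rule is simple: render fingertip focal points only while a grab gesture and an overlap with a piece of rubbish coincide. The field names, threshold and helper below are assumptions standing in for the real tracking and haptics calls, shown here only to illustrate the logic.

```python
GRAB_THRESHOLD = 0.8  # 0 (open) .. 1 (fist); value treated as a grab (assumed)

def emit_focal_points(points):
    """Placeholder for rendering one focal point per fingertip."""
    print(f"haptics at {points}")

def update_grab_haptics(hand, rubbish_items):
    """hand exposes grab_strength, palm_position and fingertip_positions;
    each rubbish item exposes a contains(point) test (assumed interface)."""
    grabbing = hand.grab_strength > GRAB_THRESHOLD
    holding = any(item.contains(hand.palm_position) for item in rubbish_items)
    if grabbing and holding:
        # Fix the sensation to the fingertips so it moves with the hand.
        emit_focal_points(hand.fingertip_positions)
    # Otherwise render nothing: no successful grab means no tactile feedback.
```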

Petting animals

Apart from interacting with objects, people interact with creatures as well, such as pets. How do we physically communicate with those cute animals? Petting is a very common form of communication between humans and animals. When people see a dog or a cat, they often want to pet it and enjoy the cosy moment. There is science behind that: research shows that oxytocin levels increase in both species during petting, which builds social bonding, relaxation and trust, and eases stress.

Petting Animals
Petting sensation

How do we simulate such a delightful touching experience in Mixed Reality? Two elements were central to our design: an intuitive hand gesture, and the tactile feedback that follows it on the player's palm. First, we designed a transparent digital hand that mimics the user's hand movements in the virtual world, breaking the barrier between the two worlds. The transducer array is activated only when the Leap Motion controller recognises the petting gesture being made. Experimenting with the haptic feedback points, we found that a single point, vibrating vertically and fixed to the palm, was ideal for simulating the sensation of rubbing an animal's soft fur.
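
A rough sketch of that logic, assuming a hand object that exposes grab strength, palm position and palm velocity; the names, thresholds and oscillation frequency are illustrative, not the real SDK.

```python
import numpy as np

def emit_focal_point(point):
    """Placeholder for the SDK call that renders a single focal point."""
    print(f"haptics at {point}")

def is_petting(hand, animal):
    open_hand = hand.grab_strength < 0.2                       # hand roughly flat
    over_animal = animal.contains(hand.palm_position)
    stroking = np.linalg.norm(hand.palm_velocity[:2]) > 0.05   # lateral motion, m/s
    return open_hand and over_animal and stroking

def update_petting_haptics(hand, animal, t):
    if is_petting(hand, animal):
        # Oscillate the point vertically around the palm centre so the
        # sensation reads as soft, brushed fur rather than a constant buzz.
        offset = np.array([0.0, 0.0, 0.005 * np.sin(2.0 * np.pi * 100.0 * t)])
        emit_focal_point(hand.palm_position + offset)
```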

Hand goes through a portal

How about something more magical? How would it feel to put your hand through a magic portal? I imagine it would be like plunging your hand into a soft liquid, such as water. How do we simulate such an experience without wearing any physical devices?

Virtual Xmas stocking
Force Field sensation

In our Christmas gift stocking experience, players can put their hands through a portal to grab a gift. We created a force-field sensation and attached it to the entrance of the virtual portal. When the virtual hand collides with the portal's entrance, the force-field sensation is activated, and players feel their hand passing through an invisible virtual wall.
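
The trigger itself is just a collision test against the portal's entrance plane. The sketch below uses assumed geometry values and an assumed rendering helper to show the idea; none of the names are real SDK calls.

```python
import numpy as np

PORTAL_CENTRE = np.array([0.0, 0.0, 0.25])  # illustrative placement, metres
PORTAL_NORMAL = np.array([0.0, 0.0, 1.0])   # entrance plane faces the user
PORTAL_RADIUS = 0.10
THICKNESS = 0.01                            # distance to the plane counted as contact

def play_forcefield(point):
    """Placeholder for rendering the force-field pattern at a contact point."""
    print(f"force field at {point}")

def update_portal_haptics(palm_position):
    to_palm = np.asarray(palm_position) - PORTAL_CENTRE
    along_normal = np.dot(to_palm, PORTAL_NORMAL)
    radial = np.linalg.norm(to_palm - along_normal * PORTAL_NORMAL)
    if abs(along_normal) < THICKNESS and radial < PORTAL_RADIUS:
        # Project the palm onto the entrance plane so the "invisible wall"
        # is felt exactly where the hand passes through the portal.
        contact = PORTAL_CENTRE + (to_palm - along_normal * PORTAL_NORMAL)
        play_forcefield(contact)
```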

It just feels like Magic!

The experience varies significantly with and without tactile feedback. According to feedback collected from users of our applications, haptics makes every experience more realistic and enjoyable compared with hand tracking alone. They comment that "It just feels like magic!". With haptics, participants are highly engaged in the games; without it, they feel they are not physically interacting with the virtual objects at all.

Haptic interaction is still in its infancy, and working out how to design sensations and match them to hand interactions will be a long process. We will continue exploring the world of haptics UX, and we would be happy to hear your insights on this.

Thanks for reading.

Reference

https://www.miamiherald.com/living/health-fitness/article102848352.html

Shaoyan Huang
HCI Researcher and UX/UI Designer at JIX Limited (NZ)