Wearable Empathy: Feeling How Others’ Bodies Interact Can Change Your Perspective

Jun Nishida · Published in ACM UIST · Oct 23, 2020 · 5 min read

Hello, I’m Jun Nishida, a postdoctoral fellow at the Human Computer Integration Lab at the University of Chicago. At ACM UIST 2020, I presented HandMorph, a mechanical exoskeleton that shrinks your grasp to that of a smaller person. We think HandMorph is especially interesting for enabling people, such as product designers, to perform a more embodied usability evaluation of their products. For instance, imagine a toy maker testing a new toy design intended for children aged 3–6; we think situations like these, where the user’s body differs radically from the designer’s body, are especially interesting cases for HandMorph.

While I presented the details of this project in the UIST paper session “Session 9A: Haptics for Hands and Feet” (7:30 am CDT, Thursday, October 22, 2020), here I would like to present the research questions and goals behind HandMorph, as well as behind all of my research projects.

From Sympathy to Empathy

While today’s tools allow us to communicate effectively with others via video and text, they leave out other critical communication channels, such as non-verbal cues and body language. These cues are important not only for face-to-face communication but also for communicating forces, feelings, and emotions. Imagine a physical-therapy patient trying to explain to their doctor how their muscles feel “clicky” and “painful” when they bend their knee joint “like this”… this is hardly something you can put into words, yet when you do visit your doctor, that is all you can do. The same holds for the product designer trying to imagine how a 5-year-old might grab a toy with their hands: are the handles on this toy too large? These communication challenges happen because: (1) people have radically different abilities and bodies; and (2) our tools are geared toward sharing/transferring audiovisual signals, but not toward sharing/transferring these bodily abilities across users!

Changing Bodies to Change Our Perspective

To tackle this challenge, we have been exploring the larger concept of changing our bodies into those of another person using wearable devices, as shown in the figure above. Our goal is to provide a more authentic, embodied, and empathic understanding of another person’s perspective in daily life.

The concept of changing our perception was initially explored in psychology: transferring one’s sense of body ownership into a small doll using a technique based on the Rubber Hand Illusion [1], or even changing the feeling of our own height by embodying a taller or smaller virtual avatar. These studies revealed that users’ size and distance perception, and even their attitudes, changed with the change in their “virtual” bodies [2]. While these results are exciting because they show that changing one’s body changes one’s perspective [3], they are typically achieved only in virtual reality or in a passive manner, i.e., not in social contexts with other “real” users, not in one’s workplace, etc. In other words, these approaches exist only in the simulated environments that VR can create, but not in a user’s real surroundings.

Thus, our approach for the last five years has been to engineer wearable devices that bring these findings back into the real world and change users’ perspectives/bodies while they inhabit it; in other words, our wearables allow you to experience these extreme body transformations (e.g., your hand becoming smaller, feeling the way somebody else moves, etc.) while preserving your existing physical and social context.

Morphing Bodies to Change Perspectives

Over the last few years, we have developed and evaluated three interactive systems, all taking the form of custom-made wearable devices:

1) at CHI’19, we demonstrated a wearable augmented reality device that allows the wearer to see from the vantage point of a child, i.e., it modifies the wearer’s height [4]. We confirmed that users’ perceived height shortened and that their personal space enlarged, due to an oppressive feeling towards an approaching person;

2) at UIST’20, we presented HandMorph, a passive hand exoskeleton that changes the wearer’s hand ability to match that of someone with a smaller grasp (e.g., a child) [5]. Wearing this exoskeleton affected users’ size perception, and users gained confidence when designing a toy for a smaller person;

3) at CHI’17, we demonstrated bioSync, a pair of wearable devices that couples the muscle activity of two users (i.e., if I move, you move at the same time). We used it to share a patient’s kinesthetic perspective with their doctors and designers, e.g., to enable a designer to feel how a person with a Parkinson’s tremor might feel and move [6]. By combining electrical muscle stimulation (EMS) with biosignal measurement, the devices transfer rhythmic motion from one person to another with no visual or auditory cues; a minimal sketch of this coupling loop follows the list.
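To make that coupling concrete, here is a minimal Python sketch of the kind of EMG-to-EMS loop that muscle sharing implies: measure the sender’s muscle activity, extract an envelope, and map it to a safety-clamped stimulation intensity for the receiver. The sampling rate, calibration values, and synthetic signal below are illustrative assumptions, not the actual bioSync implementation (which, among other things, handles simultaneous measurement and stimulation through shared electrodes).

```python
# Illustrative EMG-to-EMS coupling loop; all constants and the synthetic
# signal are assumptions for this sketch, not values from bioSync.
import numpy as np

FS = 1000            # assumed sampling rate (Hz)
WINDOW = 100         # envelope window (samples)
MAX_INTENSITY = 0.6  # safety cap, as a fraction of the calibrated maximum

def emg_envelope(signal: np.ndarray) -> float:
    """Rectify and average the most recent window of raw EMG."""
    return float(np.mean(np.abs(signal[-WINDOW:])))

def to_stimulation(envelope: float, rest: float, mvc: float) -> float:
    """Map the sender's muscle activity to a clamped EMS intensity."""
    level = (envelope - rest) / (mvc - rest)   # normalize against calibration
    return float(np.clip(level, 0.0, MAX_INTENSITY))

# Stand-in for live EMG: a 4 Hz rhythmic, tremor-like burst.
t = np.arange(0, 1, 1 / FS)
raw_emg = np.abs(np.sin(2 * np.pi * 4 * t)) * np.random.rand(t.size)

intensity = to_stimulation(emg_envelope(raw_emg), rest=0.02, mvc=0.5)
print(f"EMS intensity for receiver: {intensity:.2f}")  # would drive the stimulator
```

In the real devices, a loop like this runs continuously on both wearers at once, which is what makes a rhythm such as a tremor transferable without any visual or auditory cues.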

Through perceptual and behavioral studies in the lab, and observational studies at an exhibition for each project, we have found that changing bodies also changes users’ perception, action, and even interaction.

I define these embodied and subjective perspectives achieved by changing bodies as HYPERSPECTIVE (hyper + perspective). I would like to continue exploring wearable technologies that achieve this, and to apply them to foster mutual understanding and cooperation among people in the fields of rehabilitation, education, and design.

[1] van der Hoort B, Guterstam A, Ehrsson HH (2011) Being Barbie: The Size of One’s Own Body Determines the Perceived Size of the World. PLOS ONE 6(5): e20195. https://doi.org/10.1371/journal.pone.0020195

[2] Banakou, D., Groten, R., & Slater, M. (2013). Illusory ownership of a virtual child body causes overestimation of object sizes and implicit attitude changes. Proceedings of the National Academy of Sciences, 110, 12846–12851.

[3] Maister, L., Slater, M., Sanchez-Vives, M. V., & Tsakiris, M. (2015). Changing bodies changes minds: owning another body affects social cognition. Trends in cognitive sciences, 19(1), 6–12. https://doi.org/10.1016/j.tics.2014.11.001

[4] Jun Nishida, Soichiro Matsuda, Mika Oki, Hikaru Takatori, Kosuke Sato, and Kenji Suzuki. 2019. Egocentric Smaller-person Experience through a Change in Visual Perspective. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ‘19). Association for Computing Machinery, New York, NY, USA, Paper 696, 1–12. DOI:https://doi.org/10.1145/3290605.3300926 (Best Paper Honourable Mention Award)

[5] Jun Nishida, Soichiro Matsuda, Hiroshi Matsui, Shan-Yuan Teng, Ziwei Liu, Kenji Suzuki, and Pedro Lopes. 2020. HandMorph: a Passive Exoskeleton that Miniaturizes Grasp. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST ’20). Association for Computing Machinery, New York, NY, USA, 565–578. DOI:https://doi.org/10.1145/3379337.3415875 (Best Paper Award)

[6] Jun Nishida and Kenji Suzuki. 2017. bioSync: A Paired Wearable Device for Blending Kinesthetic Experience. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ‘17). Association for Computing Machinery, New York, NY, USA, 3316–3327. DOI:https://doi.org/10.1145/3025453.3025829

About the Authors:

Jun Nishida

Soichiro Matsuda

Hiroshi Matsui

Shan-Yuan Teng

Ziwei Liu

Kenji Suzuki

Pedro Lopes

