I Felt It Too: Emotion Regulation through Augmented Interoception
Overview
The roots of human emotional experience have long been under investigation. Recent convergence in the neuroscience community points towards embodied theories of emotion: emotions originate in bodily responses generated through the autonomic nervous system, and these responses are then sensed by the brain to create a conscious emotional experience. Building on these insights from neuroscience and therapeutic practice, we sought to investigate whether emotional interoceptive feedback from a virtual body, designed to elicit a body ownership illusion in VR, can be used for emotion regulation.
My Role
Designer, Developer, UX Researcher
As part of this project, I designed and developed the entire experience. I was mentored by Prof. Aman Parnami (Head of Weave Lab, IIIT Delhi), with Jatin Arora as my team lead.
Duration
2.5 months (May’19-Aug’19).
Tools
HTC Vive (VR headset) | Unity | Shimmer GSR
Process
Research
I studied a large number of research articles to gain insight into the intersection of HCI and neuroscience. I also read blogs, websites, and news articles throughout my internship to stay up to date with the latest developments in the field.
Design
About Interoception
Interoception refers to the human body’s ability to perceive sensations arising from within itself. Much of this perception remains unconscious; what becomes conscious, i.e., interoceptive awareness, involves processing inner sensations so that they become available to conscious awareness. For instance, we can become aware of our breathing when mindful. This interoceptive awareness plays a critical role in emotion regulation: the ability to become aware of internal sensations related to our affective states allows us to self-regulate our emotions, while a lack of this awareness is associated with emotional disorders. Researchers have developed therapeutic approaches to increase interoceptive awareness; patients are taught to become mindful of the sensations felt in different parts of their body through techniques such as body scanning.
Routine
The user stood in a virtual reality environment and saw their virtual body from a third-person perspective. The virtual body followed the user’s body movements, as tracked by a Microsoft Kinect, to establish perceived embodiment. Next, the user watched a short 8-minute horror movie played on a two-dimensional screen within VR; horror has been called a “body genre” because of the bodily sensations it elicits, and hence fits well within the scope of our exploration. While the user watched the movie, the virtual body displayed visualizations corresponding to the bodily sensations experienced by the user; we expected this display to make the user conscious of the fear sensations and hence help regulate them. The experience was tested with 8 participants (4 male, 4 female). The experiments were conducted during the daytime after obtaining informed consent from all participants.
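As a rough illustration of the perceived-embodiment step, the avatar mirrors the tracked pose each frame, with a little smoothing to hide tracking jitter. This is a minimal sketch, not the project’s actual Unity/Kinect code; the joint names and the smoothing factor are assumptions.

```python
# Sketch: drive a third-person avatar from Kinect-style joint positions.
# Exponential smoothing reduces tracking jitter; ALPHA is an illustrative constant.

ALPHA = 0.6  # 1.0 = follow raw tracking exactly; lower = smoother, laggier

def smooth_joints(previous, tracked, alpha=ALPHA):
    """Blend last frame's avatar pose toward the newly tracked pose."""
    return {
        joint: tuple(alpha * t + (1 - alpha) * p
                     for p, t in zip(previous[joint], tracked[joint]))
        for joint in tracked
    }

# One frame of fabricated data: the avatar pose moves toward the tracked pose.
avatar_pose = {"hand_left": (0.0, 1.0, 0.0)}
kinect_pose = {"hand_left": (0.1, 1.0, 0.2)}
avatar_pose = smooth_joints(avatar_pose, kinect_pose)
```

In Unity this blending would run per frame in `Update()` on the avatar rig; the dictionary form here just keeps the idea self-contained.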
Challenges
Designing the display of the Interoceptive emotions
The virtual avatar displayed visualizations of the fear that users felt within themselves as they watched the movie. The placement of the visualizations on the body was decided by referring to the bodily maps of emotions presented by Nummenmaa et al. in their paper. The researchers used extensive self-reports from 701 participants to map where different emotions are felt; fear was reported primarily in the chest and abdomen regions. Next, we tuned the intensity of the visualization with the participant’s arousal, as observed from a galvanic skin response (GSR) sensor, and with the dynamics of the horror movie itself.
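The placement idea can be thought of as a static per-region weight map, derived from the body maps, scaled by a single arousal value. This is a hedged sketch: the region names and weights below are illustrative stand-ins, not Nummenmaa et al.’s published values or the project’s actual parameters.

```python
# Sketch: per-region intensity for the fear visualization.
# Fear is reported mainly in the chest and abdomen, so those regions
# get the highest (illustrative) weights; the limbs get little or none.
FEAR_MAP = {
    "chest": 1.0,
    "abdomen": 0.8,
    "head": 0.3,
    "arms": 0.1,
    "legs": 0.0,
}

def region_intensities(arousal, body_map=FEAR_MAP):
    """Scale the static emotion map by the current arousal level (0..1)."""
    arousal = max(0.0, min(1.0, arousal))  # clamp to a valid range
    return {region: weight * arousal for region, weight in body_map.items()}
```

Keeping the map static and modulating only a scalar arousal value means the spatial pattern always matches the crowdsourced fear map, while the sensor drives only how strongly it glows.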
Galvanic skin response (skin conductance) as an approach to arousal measurement offers poor temporal resolution: the response typically arrives 3–5 seconds after the presented stimulus. Hence, it cannot by itself capture conditioned or startle responses quickly enough for the interoceptive display. The intensity of the bodily visualizations for these responses was therefore manually tuned by the researchers according to the trajectory of the movie, taking into account sudden startle moments, the playback of scary music, and so on. The resulting peaks in intensity were then moderated with the GSR drop, used as an index of the psychological fear induced in the participant. The amount of psychological fear can depend on a variety of participant-specific factors, such as socio-cultural context, what does and does not scare them, motivation, and attention; some participants feel more scared than others during the movie. Moderating with the GSR drop accounted for these factors.
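The tuning described above can be sketched as a two-stage pipeline: a hand-authored intensity envelope keyed to the movie’s events, moderated per participant by a GSR-derived factor. The structure is an assumption on my part and the keyframe times, intensities, and clamping are illustrative, not the values used in the study.

```python
import bisect

# Sketch: hand-authored intensity keyframes over the movie timeline,
# e.g. a startle moment at 95 s. Times (seconds) and values are made up.
KEYFRAMES = [(0, 0.0), (60, 0.2), (95, 1.0), (100, 0.3), (480, 0.1)]

def envelope(t):
    """Linearly interpolate the authored intensity at time t (seconds)."""
    times = [k[0] for k in KEYFRAMES]
    i = bisect.bisect_right(times, t) - 1
    if i < 0:
        return KEYFRAMES[0][1]
    if i >= len(KEYFRAMES) - 1:
        return KEYFRAMES[-1][1]
    (t0, v0), (t1, v1) = KEYFRAMES[i], KEYFRAMES[i + 1]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

def displayed_intensity(t, gsr_factor):
    """Moderate the authored envelope by a per-participant GSR factor (0..1),
    so less-aroused participants see a weaker version of the same peaks."""
    return envelope(t) * max(0.0, min(1.0, gsr_factor))
```

The event envelope supplies the timing that GSR is too slow to provide, while the GSR factor supplies the participant-specific magnitude that hand-authoring cannot.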
User study observations
We conducted qualitative interviews with the participants to gather their perceptions of the experience. Participants were given an initial brief: they would watch a horror movie, and the virtual body would display the affective sensations they felt, as detected by the GSR sensor worn on their hand. They were not informed about the event-based tuning done by the researchers. 7 out of 8 participants felt that the visualizations in the virtual body were synchronized with the sensations they had felt. They used phrases like “I was able to see my response” and “my feelings were shown in the avatar” to describe the experience. One participant wondered whether the visualizations were actually connected to the music in the video rather than to his emotions.
At the beginning of the video, as the story was building up, participants reported consciously thinking about their emotions (as displayed on the virtual avatar): for example, what they were thinking that was making them scared. Three participants reported trying to reduce the fear by not thinking too much about the story or through deep breathing; two of them felt they were able to do so. As the story picked up tempo, this conscious reflection stopped. Participants reported looking at the virtual body after moments of startle or when they became too scared.
Future work
Apart from the conscious regulation that some participants reported, we seek to investigate whether merely looking at the visualizations in the virtual body (for example, after moments of startle) would reduce fear through unconscious processing; this would need to be checked through objective measures such as GSR or heart rate. Strong body ownership of the virtual avatar would be necessary to achieve this. In the current design, ownership was elicited through visual-kinesthetic mapping, wherein the motion of the avatar was mapped to the user’s movements. This approach, however, is more effective when the user moves their body continuously, which did not necessarily happen while watching the video. Hence, this aspect remains to be addressed in future design iterations.
Designing a “universal” interoceptive body mirror would require sophisticated body-imaging machinery that probably has not been invented yet. The technique used in this project, however, worked well in the specific scenario of watching a horror movie: the design of visualizations based on the crowdsourced interoception maps, with user-specific tuning through GSR, was perceived by participants as similar to their actual bodily experience.
References
[2] How Do You Feel? An Interoceptive Moment with Your Neurobiological Self
[4] Fear: A Psychophysiological Study of Horror Film Viewing