Electrical Muscle Stimulation in HCI: 10 years later... what about the question of agency?

Pedro Lopes
Published in ACM CHI
May 4, 2019

This article summarizes a paper authored by Shunichi Kasahara, Jun Nishida, and Pedro Lopes. This paper will be presented at CHI 2019, a conference on Human-Computer Interaction, on Monday, May 6th, 2019 at 14:00 in the Direct Bodily Interaction session.

More than ten years have passed since researchers started using electrical muscle stimulation (EMS) in interactive systems. Kruijff et al. (VRST'06) explored how these tiny medical devices could change desktop gaming by making the user's muscles contract in response to game events. This fueled researchers such as Emi Tamaki & Jun Rekimoto to open the doors of EMS to the CHI community (CHI'11).

Ten years later… I'm astonished at all the creative uses the HCI community has found for EMS: Max Pfeiffer & Michael Rohs explored how to steer participants, Jun Nishida & Kenji Suzuki enabled communicating gestures from person to person, and Patrick Baudisch and I explored how to turn the user's body into input and output devices (just to cite a few; for a more detailed timeline of the many contributors to EMS in HCI, see here).

But, ten years later, it's about time we talked about agency. EMS systems offer a compact and wearable form factor (compared to their mechanical haptic counterparts, such as exoskeletons), but being moved by an external force feels weird. If you have ever been moved by a haptic actuation device (be it EMS, an exoskeleton, or a robotic arm), you probably felt how strange it is to see and feel your body being moved by an external cause.

So for this year's CHI, my group at the University of Chicago (with the help of our collaborators) decided to take some first steps toward understanding the loss of agency in haptic actuation. We asked ourselves: what is going on in the user's brain when they are moved by an external force?

To answer this, we turned to two core questions: (1) can haptic systems actuate us to provide significantly faster reaction times without entirely compromising our agency? (2) How does our brain integrate haptic feedback when we are moved by an external force such as EMS?

In our first #CHI2019 paper, together with Shunichi Kasahara (Sony CSL) and Jun Nishida (University of Chicago), we explore how delaying the onset of the haptic actuation dramatically improves the sense of agency! Despite being only a first step toward understanding the relationship between agency and preemptive actuation, we think these results are really exciting; they allow us to build a model to choose how much agency to sacrifice to gain reaction-time speed-ups, and vice versa!

We demonstrated that by delaying the onset of haptic actuation (such as EMS) we can accelerate human reaction time without compromising the user's agency! (CHI2019, video here and paper here)
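To make the idea of a delayed, preemptive actuation onset more concrete, here is a minimal sketch of how such a trigger could be scheduled. Everything in it (the delay value, the reaction-time constant, and the EMS driver call) is an illustrative assumption, not the actual model, parameters, or hardware interface from our paper.

```python
# Minimal sketch (hypothetical): schedule EMS onset with a configurable delay
# after a target event is detected, trading reaction-time gain against agency.
# Delay value, reaction-time constant, and device call are illustrative only.

import time

AVG_NATURAL_REACTION_S = 0.25   # assumed average visuomotor reaction time
EMS_ONSET_DELAY_S = 0.10        # illustrative delay; larger -> more agency, smaller speed-up

def trigger_ems(channel: int) -> None:
    """Placeholder for a call into an EMS device driver (hypothetical)."""
    print(f"EMS pulse on channel {channel} at t={time.monotonic():.3f}s")

def on_target_event_detected(channel: int = 0) -> None:
    """When the system detects the event (e.g., the object to hit), wait a bit
    before actuating, so the actuation overlaps with the user's own intention
    while still arriving earlier than their natural reaction."""
    time.sleep(EMS_ONSET_DELAY_S)
    trigger_ems(channel)

speed_up = AVG_NATURAL_REACTION_S - EMS_ONSET_DELAY_S
print(f"Expected reaction-time gain vs. unaided reaction: ~{speed_up * 1000:.0f} ms")
on_target_event_detected()
```

Shifting the single delay constant is the knob: set it to zero and the system reacts as fast as it can but feels entirely machine-driven; push it toward the natural reaction time and agency is preserved but the speed-up shrinks.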

The next steps will involve understanding how this might scale to more complex tasks, since, so far, we have only explored simple scenarios such as high-speed photography or hitting a moving object! If you want to try this, consider coming to our demonstration at SIGGRAPH'19.

Now, going deeper, one might ask: but how does our brain integrate and process these haptic signals? This is a question we are constantly debating and digging deeper into. One possible angle to explore this came from our second #chi2019 paper, which was a collaboration with Klaus Gramann's group at TU Berlin and C.T. Lin's group at UTS. We uncovered another small piece of the haptic agency puzzle while trying to understand how to detect mismatches in virtual reality (VR) without having to ask users about their subjective experience; i.e., can we evaluate the coherence of a visuo-haptic VR experience without having to show you a presence questionnaire?

Here, instead of asking the user, we measured their brain's responses, using EEG, as they interacted with VR objects that also provided haptic feedback (vibration and/or EMS). We found that when there is a mismatch between visuals and haptics (e.g., they are out of sync), our brain's event-related potential (ERP) looks very different from when things feel right. In fact, there is a pronounced negative deflection in the user's ERP when things deviate from our expectations.

We found that we can use EEG to detect mismatches between visuals and haptics while a user is interacting with a virtual environment. This is a very different way to understand realism in VR, one which does not rely on asking users questions. (paper here)
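For readers who want a feel for what this kind of ERP analysis looks like, here is a minimal sketch that epoch-averages synthetic single-channel EEG and compares matched vs. mismatched visuo-haptic trials. The sampling rate, epoch window, and simulated signals are assumptions for illustration; the actual study used a full EEG montage and proper preprocessing.

```python
# Minimal sketch (synthetic data): compare event-related potentials (ERPs)
# between matched and mismatched visuo-haptic trials by epoch-averaging EEG.
# All numbers and signals below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
fs = 250                                # sampling rate in Hz (assumed)
epoch = np.arange(-0.2, 0.8, 1 / fs)    # epoch window around stimulus onset (s)

def simulate_trials(n_trials: int, mismatch: bool) -> np.ndarray:
    """Synthetic single-channel epochs; mismatched trials carry an extra
    negative deflection, standing in for the mismatch-related response."""
    noise = rng.normal(0, 2.0, size=(n_trials, epoch.size))
    erp = np.exp(-((epoch - 0.3) ** 2) / 0.005) * 3.0            # generic positive component
    if mismatch:
        erp = erp - np.exp(-((epoch - 0.4) ** 2) / 0.003) * 5.0  # added negativity
    return noise + erp

match_avg = simulate_trials(60, mismatch=False).mean(axis=0)
mismatch_avg = simulate_trials(60, mismatch=True).mean(axis=0)

# A pronounced negative difference at some post-stimulus latency flags a mismatch.
difference = mismatch_avg - match_avg
peak_idx = difference.argmin()
print(f"Most negative difference: {difference[peak_idx]:.2f} µV "
      f"at {epoch[peak_idx] * 1000:.0f} ms post-stimulus")
```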

It is precisely this mismatch in expectations, which we observed even when users did not consciously realize things were off, that might help us understand the puzzle of agency and EMS. These unrealistic situations might be very similar to those in which we are moved by an inexplicable external force, such as EMS.

Going even deeper into the neural processes, with the assistance of functional magnetic resonance imaging (fMRI) we can examine our brain at work when we are moved by external forces like EMS. This is precisely what I did with my collaborators Jakub Limanowski (main author, at UCL), Janis Keck (FU Berlin), Patrick Baudisch (HPI), Karl Friston, and Felix Blankenburg (FU Berlin). In our upcoming Cerebral Cortex paper, we used fMRI to examine how agency (i.e., whether you move yourself consciously or EMS moves you) impacts how our brain interprets and integrates sensory information such as touch sensations!

Using fMRI, we examined how agency, i.e., whether you move yourself consciously or an external force such as EMS moves you, impacts the way our brain processes sensory information such as tactile input. Paper in Cerebral Cortex (to appear; ask the authors for a pre-print).

These three projects have just scratched the surface of all the questions around agency and haptics. We hope to foster more discussion on agency next week at #CHI2019. Come talk to us! Lastly, I am very grateful to work with all these brilliant minds, with a special and humble thanks to Shun, Jun, Sezen, Lukas, Jas and Klaus!


Pedro heads the Human Computer Integration Lab at the University of Chicago: lab.plopes.org. Previously, PhD with Patrick Baudisch at HPI. #HCI researcher.