HCI & Design at UW

Why heart rate and other biosignals are the future of social interaction and how to use them

The work described in this post was done in collaboration with Andrés Monroy-Hernández, Laura Dabbish, Geoff Kaufman, Chunjong Park, Yu Jiang Tham, Jack Tsai, Sven Kratz, Mario Esparza, and Maria Pavlovskaia.

Why biosignals?

There’s no question that digital technology has infiltrated our everyday communication — even more so during the COVID-19 pandemic. But over text and video calls, it’s hard to tell how people are feeling without a clear view of their facial expressions or body language. This can have real consequences: miscommunication, distrust, and even the Zoom fatigue that many of us may be feeling.

The difficulty in conveying our natural nonverbal cues over technology suggests the need for new types of cues that technology can afford. We propose one that may already be available to you, if you own a smartwatch: your biosignals. It’s well-known that biosignals change based on emotional responses. You can feel your heart rate increasing when you’re excited, or your palms getting sweaty if you’re nervous. Displaying our biosignals to others could become a new kind of cue, a new way to express our emotions from our body itself.

Expressive biosignals: the use of sensed physiological data, such as heart rate, skin conductance, and brain activity, as social cues.

You might be questioning how or why anyone would want to show their heart rate to someone else, but this kind of sharing is already available and being used on popular consumer wearables. You can send your heartbeat through the Digital Touch feature on the Apple Watch, or share it post-exercise on social fitness apps like Fitbit. With these kinds of devices on the rise, one can imagine a future where everyone can access and share their biosignals with others.

Biosignals are, of course, our personal and private data, and we don’t typically discuss them with people out of the blue. Moreover, any AI system that claims to detect our emotions, such as by using biosignal data, needs to be carefully considered, as it could shape the way we communicate in unintended ways. As new means to sense and share these data emerge, we must understand the implications of revealing them to others. How can technology designers sensibly integrate biosignals into communication, and what value and consequences will they have?

What does the research say about biosignals?

To understand the communicative potential of expressive biosignals, we created two smartwatch apps for sharing messages suggested based on biosignals, and evaluated them in people’s natural settings. I’ve summarized them here, but check out the two research papers we have written for the full details! Also, come to our presentation at the CHI 2021 conference on May 9 (7–9pm ET) or May 10 (11am–1pm ET) in the Affection and Support in a Digital World A and C sessions!

Animo: Imagine mood ring meets smartwatch meets biosignals. Animo was an experimental prototype we built for the Fitbit Versa that displayed an “animo” on the watch face: a shape that changed colors and animated according to the wearer’s mood, based on their heart rate. People could tap on their animo to send it to one other person, who could send their animo back. We recruited 17 people to use Animo for 2 weeks with a partner of their choice (significant other, friend, co-worker, etc.) to explore how people would use Animo together. We found that the animos:

  • helped people keep in touch with each other and start new conversations about each other’s emotions
  • were sometimes ambiguous, with some people feeling the animos didn’t reflect their mood and sending them just to say “hi”
Left: sending your animo; Right: receiving your partner’s animo

“The thing on my wrist was [him]. It, like, reminded me that, like, there was a prompt to communicate.” — Animo participant paired with their significant other
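To make the idea concrete, here is a hypothetical Python sketch of how heart rate might be mapped to a mood color like Animo’s. The thresholds, colors, and `mood_color` function are illustrative assumptions, not the app’s actual logic:

```python
# Hypothetical mood-color mapping in the spirit of Animo.
# Thresholds and colors are illustrative assumptions, not the real implementation.

def mood_color(current_bpm: float, resting_bpm: float) -> str:
    """Map heart rate relative to the wearer's resting rate to a display color."""
    delta = current_bpm - resting_bpm
    if delta > 30:
        return "red"      # strongly elevated: excited or stressed
    if delta > 10:
        return "orange"   # mildly elevated
    if delta < -5:
        return "blue"     # below resting: calm or drowsy
    return "green"        # near resting: relaxed

print(mood_color(100, 65))
```

A real system would also smooth the signal over time, since a single noisy reading shouldn’t flip the displayed mood.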

Significant Otter: We took what we learned from Animo and built an experimental Apple Watch app targeting romantic couples, who were the most excited about sharing their data with each other in the Animo study. Much of the design was similar: couples could tap to send animations to each other, but we changed how biosignals were represented. We showed a set of cute animated otters based on heart rate in order to 1) reduce potential disagreement with a single system-interpreted state, by letting the user select the option that most accurately reflects them from a narrowed list; and 2) reduce ambiguity around what’s being displayed, with a more expressive and relatable avatar. This time, we wanted to understand the effects of biosignals specifically, by comparing people’s perceptions of the app with and without biosignals. We recruited 20 couples for a 1-month study in which they used an experimental version of Significant Otter with sensing OFF (without biosignals) for 2 weeks before switching sensing ON. We found that biosignals:

  • Clarified the meaning of the otters, which partners viewed as ambiguous emoji (aligned with prior work) when sent without biosignals
  • Enhanced authenticity: people felt more open and connected with each other because their otter personally represented them and was backed by data
  • Elicited questions around accuracy and agency: some people overly trusted the system to know how they were feeling, while others were skeptical based on their own understanding of their emotions

“I’m more open to be like, honest, I guess, like totally 100% honest compared to 95% honest…the 5% can sometimes make a big difference…. I would, you know, send the [stressed otter] instead of being like, ‘Oh, I don’t want to look weak right now by showing that I am stressed.’” — Significant Otter study participant
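The narrowed-list idea can be sketched as follows. This is a hypothetical Python illustration, not Significant Otter’s actual implementation; the otter categories, thresholds, and function names are all assumptions:

```python
# Hypothetical sketch of narrowing animation options by heart-rate-derived arousal,
# in the spirit of Significant Otter. Categories and thresholds are assumptions.

OTTERS = {
    "high":   ["excited", "stressed", "playful"],
    "medium": ["happy", "focused", "curious"],
    "low":    ["sleepy", "relaxed", "cuddly"],
}

def arousal_level(current_bpm: float, resting_bpm: float) -> str:
    """Bucket heart rate relative to the resting baseline into a coarse arousal level."""
    delta = current_bpm - resting_bpm
    if delta > 25:
        return "high"
    if delta > 8:
        return "medium"
    return "low"

def candidate_otters(current_bpm: float, resting_bpm: float) -> list[str]:
    """Return a short list the user picks from, rather than one system-chosen state."""
    return OTTERS[arousal_level(current_bpm, resting_bpm)]
```

The key design move is that the system narrows the options but the user makes the final choice, which keeps interpretive authority with the person rather than the sensor.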

How can designers use biosignals to build new social technologies?

Biosignals have the potential to improve the way we connect with each other as an authentic cue to express our emotions, but face challenges as a form of AI-mediated communication (where the AI supplies a message based on sensed emotions). Here are some design recommendations for navigating these tensions:

  • Consider how to provide context — A high heart rate could mean you’re really stressed or it could mean you’re walking up some stairs. Designers should include ways for people to provide context for their biosignals, or focus on sharing within relationships in which people already have context about each other. For example, we found that couples knew enough about each other and each other’s schedules that they were able to figure out what each other’s biosignals meant based on just an animation. In other words, the depth of relationships can help overcome the “shallowness” of the simple content.
  • Choose an appropriate representation — It’s important to consider how you want to display biosignals, and therefore how the system interprets them. Too little interpretation (e.g., raw data) puts a lot of work on people to come up with their own interpretation and explain it to someone else. Too much interpretation (e.g., the system stating “you are x emotion”) can clash with people’s subjective understanding of their emotions, or with their lay understanding of how emotions relate to biosignals. AI systems that interpret emotions can also have an unsettling influence on the way people feel. Designers should explore ways for the system to collaborate with the user, such as by providing options that support different lay theories of emotions, involving people more in system recommendations, or encouraging more emotional reflection. Whatever the method, the system should clearly state how it uses and interprets biosignal data.
  • Nudge sharing — Features like Digital Touch require that people think to share their biosignals in the first place. But since biosignals are a new kind of cue, and people aren’t always aware of them, they may not think to share them on their own. We found that notifications and haptic nudges in our apps encouraged sharing and raised people’s awareness of their emotions. We suggest that designers incorporate similar nudges to prompt people to interact with their biosignals, and potentially use the biosignals themselves to find the best moments to nudge (e.g., during moments of high or low emotional arousal).
  • Preserve privacy — In our studies, we found that people view biosignals as intimate information that they may not feel comfortable sharing with just anyone. We recommend ensuring that people have control over with whom, when, and how to share their own data. Designers may also consider focusing on close relationships, as we found people were most willing and interested in sharing/viewing each other’s biosignals when it was with a significant other or close friend/family member. Additional options for preserving privacy, which we used in both of our apps, could include ephemeral sharing, such that people’s data are not permanently stored, as well as processing the data on the client rather than on the server, and allowing for pseudonyms to avoid collecting any identifiable information.
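The nudging recommendation, using biosignals themselves to pick good moments to prompt sharing, could be sketched as follows. The `NudgeDetector` class, window size, and threshold are illustrative assumptions, not how our apps actually decided when to notify:

```python
# Hypothetical sketch of a biosignal-based nudge: prompt sharing when heart rate
# stays well above (or below) the wearer's baseline for several consecutive readings.
# Window size and threshold are illustrative assumptions.

from collections import deque

class NudgeDetector:
    def __init__(self, resting_bpm: float, window: int = 5, threshold: float = 20.0):
        self.resting = resting_bpm
        self.threshold = threshold
        self.readings = deque(maxlen=window)

    def update(self, bpm: float) -> bool:
        """Record a reading; return True once the whole window deviates from baseline."""
        self.readings.append(bpm)
        if len(self.readings) < self.readings.maxlen:
            return False
        return all(abs(b - self.resting) > self.threshold for b in self.readings)

det = NudgeDetector(resting_bpm=65)
nudged = [det.update(b) for b in [70, 90, 92, 95, 96, 97]]
```

Requiring a sustained deviation rather than a single spike avoids nudging on momentary noise, like standing up quickly.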

Biosignals are cool and all, but what about smartwatches?

If you’re interested in smartwatches, with or without biosignals, we have some insights for you too! Even without biosignals, the smartwatch can support a lightweight way to communicate and feel as if your partner is there with you, since it’s physically on the wrist. With the right design, smartwatches could become a great communication platform. Here are some suggestions:

  • Take advantage of the watch face, if possible — Animo was available as a watch face, while Significant Otter could be added as a “complication” on the watch face. We found that the former was better at prompting communication: the message to send (biosignals) was more noticeable, easily glanceable, and easier to send and receive in the moment, versus having to tap on a complication or notification to open the app first.
  • Design for limited screen space — Since watch screens are small, it’s important to use minimal messages and interactions. We designed short expressive animations as messages since they could be easily understood and quickly sent with just a tap. Text, on the other hand, can be clunky to write out and take time to read (try out some existing messaging apps on the Apple Watch and you’ll see what I mean). We also used biosignals to narrow down the list of available animations, to keep people from scrolling for a long time through all the options. Designers might also consider non-visual means to communicate that aren’t tied to screen space, such as audio or haptic feedback.
  • Embrace new interaction modalities — The smartwatch is a new form factor for interaction. While it can support gestures that we’re used to on our phones (tapping, swiping, etc.), it also opens up new possibilities for interactions that are uniquely available or more seamless on the watch. For example, the Digital Crown on the Apple Watch can be used in place of dragging on the screen to scroll. In Significant Otter, we saw that scrolling by Crown helped to reduce accidental otter sending compared to scrolling by drag. Designers could also take advantage of the sensed data that’s uniquely available on the watch (other than biosignals), such as accelerometer and gyroscope data, which could be used to detect hand and arm gestures as a new kind of input. Finally, smartwatches are heavily used for notifications. Notifications are now fairly sophisticated with customizable buttons and layouts, while still being glanceable and quick to interact with. Designers could thus consider creating new applications that focus more on notification-based interactions.
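As a toy illustration of accelerometer data as gesture input, here is a hypothetical Python shake detector. The threshold, sample format, and `is_shake` function are assumptions; a real watch app would read samples from the platform’s motion APIs:

```python
# Hypothetical sketch of detecting a "shake" gesture from accelerometer samples:
# count how many (x, y, z) readings exceed a magnitude threshold (in g).
# Values are illustrative, not from a real device.

import math

def is_shake(samples: list[tuple[float, float, float]],
             threshold_g: float = 2.0, min_peaks: int = 3) -> bool:
    """Return True when enough samples exceed the acceleration threshold."""
    peaks = sum(
        1 for (x, y, z) in samples
        if math.sqrt(x * x + y * y + z * z) > threshold_g
    )
    return peaks >= min_peaks

still = [(0.0, 0.0, 1.0)] * 10                     # watch at rest (~1 g gravity)
shaking = [(2.5, 0.5, 1.0), (0.1, 0.0, 1.0)] * 5   # bursts of high acceleration
```

A production detector would be more careful (e.g., subtracting gravity and requiring alternating directions), but the same idea of thresholding motion energy underlies many wrist-gesture inputs.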

Download Significant Otter on the App Store to experience expressive biosignals with a partner!

For more research about expressive biosignals, including a proposed causal model and design space, I wrote a whole PhD thesis. :)

If you’re interested in learning more or collaborating in this space, feel free to reach out to our Human-Computer Interaction Research team at Snap!



Fannie Liu


Research Scientist at Snap Inc., designing and studying new forms of social interactions over technology. PhD from CMU HCII.