SwarmHaptics: Haptic Display with Swarm Robots

Lawrence Kim
Published in ACM CHI · May 6, 2019

This article summarizes a paper by Lawrence Kim and Sean Follmer. The paper will be presented at CHI 2019, the ACM Conference on Human Factors in Computing Systems, on Wednesday, May 8, 2019, at 9:00 in the session “On the Edge of HCI” in room Lomond.

Takeaway: We explored how a swarm of small wheeled robots could provide different touch patterns to users. We then ran two studies: one to capture how people perceive these touches, and another to explore how users would generate touch patterns of their own to express various types of social touch.

Examples of different touch scenarios with a swarm of robots on users

Touch Interaction with Robots

Touch is an integral part of our daily lives. It not only allows us to discriminate shapes and textures but is also embedded in our social interactions (e.g., handshakes and huddles) and affective communication (e.g., hugs and holding hands).

Meanwhile, robots of various forms and sizes are increasingly appearing in our daily lives. While they are getting better at object manipulation and sensing, little work has been done on enabling robots to use touch to interact with people. This is due to factors such as user safety and a lack of understanding of both how to design these touch (haptic) interactions and how people would perceive such touches.

Design of SwarmHaptics

To investigate this, we began with one of the simplest types of robot: a wheeled robot without a face or limbs. Leveraging its motion, we explored what types of touch a single robot can provide. We then expanded to what a swarm of robots could do by coordinating timing, spatial arrangement, and force across the robots, as shown below.

Examples of different touches possible with a single robot and a group of robots
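To make this design space concrete, here is a minimal sketch in Python, using hypothetical names and units (RobotTouch, SwarmTouchPattern, forces in newtons), of how a swarm touch pattern could be parameterized so that the spatial, temporal, and force coordination described above become explicit parameters. This is an illustration only, not the software used in the paper.

```python
# Illustrative only: a hypothetical parameterization of a swarm touch pattern.
from dataclasses import dataclass
from typing import List


@dataclass
class RobotTouch:
    arm_location_cm: float  # contact point along the forearm (spatial coordination)
    force_n: float          # normal force pressed into the skin (force coordination)
    start_s: float          # offset within the pattern (temporal coordination)
    duration_s: float       # how long the contact is held
    motion: str             # e.g. "poke", "stroke", "vibrate"


@dataclass
class SwarmTouchPattern:
    touches: List[RobotTouch]

    def num_robots(self) -> int:
        return len(self.touches)


# Example: three robots tapping in sequence along the arm, approximating a
# stroke through temporal and spatial coordination rather than a single contact.
sequential_stroke = SwarmTouchPattern(touches=[
    RobotTouch(arm_location_cm=2.0 * i, force_n=0.5, start_s=0.3 * i,
               duration_s=0.2, motion="poke")
    for i in range(3)
])
print(sequential_stroke.num_robots())  # -> 3
```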

Study 1: Perception Study

To first understand how people would react to these touches, we ran a perception study in which the robots delivered different touches to users while we recorded how the users felt, as shown below. For more study details, please refer to our paper.

Up to 7 robots provided haptic stimuli to the participants, who wore noise-cancelling headphones to mask any audio cues.

Study 2: Social Touch Elicitation Study

To better capture the expressivity of the robots, we employed a participatory design approach in which users generated the interaction, or touch patterns, they felt appropriate for conveying different types of social touch. As shown below, participants controlled up to four robots simultaneously through a multi-touch screen to convey prompted messages such as “happy” or “move over”. This study allowed us to learn both how people convey specific social touches and, more broadly, how they use different features of the robots to convey information such as context and emotion.

Participants controlled up to 4 robots with a multi-touch screen and felt the touch patterns on their own arm.
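For a rough sense of what such a control interface involves, the sketch below, with assumed names, screen resolution, and workspace dimensions, maps each finger on the multi-touch screen to a target position for one of up to four robots. The actual interface used in the study may differ.

```python
# Illustrative only: mapping multi-touch input to robot target positions.
from typing import Dict, Tuple

SCREEN_W, SCREEN_H = 1920, 1080      # assumed touch-screen resolution (px)
WORK_W_CM, WORK_H_CM = 30.0, 20.0    # assumed robot workspace near the arm (cm)


def screen_to_workspace(x_px: float, y_px: float) -> Tuple[float, float]:
    """Scale a touch point from screen pixels into workspace coordinates (cm)."""
    return (x_px / SCREEN_W * WORK_W_CM, y_px / SCREEN_H * WORK_H_CM)


def update_targets(active_touches: Dict[int, Tuple[float, float]]
                   ) -> Dict[int, Tuple[float, float]]:
    """Map each active finger (id -> pixel position) to a target position
    for one robot, capped at four robots."""
    targets = {}
    for finger_id, (x_px, y_px) in list(active_touches.items())[:4]:
        targets[finger_id] = screen_to_workspace(x_px, y_px)
    return targets


# Example frame: two fingers down -> two robots receive new target positions.
print(update_targets({0: (960, 540), 1: (400, 300)}))
```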

Actionable Conclusions

1. Our study revealed a trade-off when increasing the number of robots used for touch interactions: while more robots increase perceived arousal and urgency, this comes at the cost of perceived likeability and safety. Thus, depending on the application, you may want only a small number of robots to provide the haptic feedback even if a larger number of robots is available (see the sketch after this list).

2. The robots' visual motion can provide context, especially for abstract social touches. For instance, participants used whether the robots moved toward or away from them to convey abstract emotions such as surprise and fear. Thus, it is important to consider how the robots move even when designing haptic (touch) interactions with people.
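As a hedged illustration of how the first conclusion might be applied in practice (not code from the paper; the cap of two robots is an arbitrary choice), the sketch below selects how many of the available robots should deliver a haptic cue depending on whether the application prioritizes urgency or likeability and safety.

```python
# Illustrative only: applying the robot-count trade-off from conclusion 1.
def choose_robot_count(available: int, prioritize_urgency: bool) -> int:
    """Return how many of the available robots should deliver the touch.

    More robots raise perceived arousal and urgency; fewer robots preserve
    perceived likeability and safety (the trade-off observed in Study 1).
    """
    if prioritize_urgency:
        return available          # e.g. an alarm or alert cue
    return min(available, 2)      # e.g. a calming or affective cue (arbitrary cap)


print(choose_robot_count(available=7, prioritize_urgency=False))  # -> 2
```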

Full Citation:

Lawrence H. Kim and Sean Follmer. 2019. SwarmHaptics: Haptic Display with Swarm Robots. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland, UK. ACM, New York, NY, USA, 13 pages. https://doi.org/10.1145/3290605.3300918
