Our paper, which explores whether people can perceive swarm robots as if they were part of their own body, has been accepted to CHI 2024

Shigeo Yoshida
OMRON SINIC X
Apr 5, 2024

This is Shigeo Yoshida at OMRON SINIC X (OSX).

Our paper, which explores whether people can feel and manipulate swarm robots as if they were part of their own body, has been accepted for CHI 2024, one of the top venues in Human-Computer Interaction (HCI) (acceptance rate: 26.3%).

This work was carried out by our intern Sosuke Ichihashi, a Ph.D. student at Georgia Tech, together with So Kuroki, a project researcher.

Sosuke Ichihashi, So Kuroki, Mai Nishimura, Kazumi Kasaura, Takefumi Hiraki, Kazutoshi Tanaka, and Shigeo Yoshida. 2024. Swarm Body: Embodied Swarm Robots. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 19 pages. https://doi.org/10.1145/3613904.3642870

In this work, we created Swarm Body, a system that allows users to control swarm robots — robots that operate collectively, similar to a swarm of ants, to accomplish a single goal — as if they were their own hands. For more information, please refer to our paper and video.

In the current article, Sosuke, the first author of the paper, introduces our contributions.

Embodiment

“Where is the boundary between our body and the external world?”

The answer to this question seems easy at first glance. Yet, upon reflection, our daily experiences often reveal moments where the boundary between our body and the external world becomes indistinct. Examples include eating food with cutlery, engaging in sports with equipment, controlling a video game character, or driving a car. These instances exemplify the concept of embodiment, where individuals gain the ability to manipulate objects as seamlessly as if they were extensions of their bodies, facilitated by appropriate conditions or design of interactions.

Embodiment is exploited in various interactions, including virtual reality (VR) avatar control and the operation of construction machinery; for example, a construction machine can pick up materials by mirroring the operator's grasping motion. Embodying a construction machine augments the human body, enabling movements that are precise and intuitive at a scale and power beyond our own. In this way, embodiment grants the human body new characteristics and abilities.

This led us to ask what kind of body we would want. A computer or smartphone screen depicts all kinds of objects with nothing but pixels of light. If, instead of pixels, the body were made of many small robots, its shape and movement could be changed freely to suit the task and situation. That is the idea behind this study.

In this study, we developed Swarm Body, a system for controlling swarm robots as if they were our own hands, both in VR and in the real world. We then investigated experimentally whether Swarm Body can actually be embodied as the user's own hand.

A system to control swarm robots as if they were your own hand.

We developed both a VR and a physical version of a swarm robot system, designed to mimic the user’s hand movements and shapes.

First, we implemented a framework for controlling swarm robots based on hand shapes and movements. In this framework, at each timestep, (1) the robots' subgoal positions are determined from the hand's shape and position, (2) each robot is assigned to one of the subgoals, and (3) the robots' paths are planned so that they reach their subgoals without collisions.

Framework for controlling swarm robots based on hand shape and movement
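A single timestep of this loop can be sketched as follows. This is an illustrative stand-in, not the paper's implementation: the assignment uses a brute-force minimum-distance matching, the step size and all variable names are hypothetical, and collision-free path planning (step 3) is omitted entirely.

```python
import math
from itertools import permutations

def assign_robots(robots, subgoals):
    """Step (2): match each robot to a subgoal, minimizing total travel
    distance. Brute force is fine for a toy example; a real system would
    use an efficient assignment algorithm (e.g. Hungarian method)."""
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(subgoals))):
        cost = sum(math.dist(robots[i], subgoals[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best  # best[i] = index of the subgoal assigned to robot i

def step_towards(p, goal, max_step=0.05):
    """Move one robot a bounded distance toward its subgoal
    (collision avoidance omitted)."""
    d = math.dist(p, goal)
    if d <= max_step:
        return goal
    t = max_step / d
    return (p[0] + t * (goal[0] - p[0]), p[1] + t * (goal[1] - p[1]))

# One control timestep for a toy 3-robot swarm in 2D:
robots = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
subgoals = [(1.0, 1.0), (0.1, 0.0), (0.0, 0.9)]   # from step (1)
assignment = assign_robots(robots, subgoals)        # step (2)
robots = [step_towards(robots[i], subgoals[assignment[i]])
          for i in range(len(robots))]              # step (3), simplified
```

In practice the subgoals in step (1) come from the tracked hand, and the assignment is recomputed every timestep so robots smoothly re-target as the hand moves.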

We implemented two algorithms for step (1): one based on the hand skeleton (bone) and the other based on the outline of the hand (silhouette).

Top: Bone algorithm, Bottom: Silhouette algorithm
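To give a flavor of the bone algorithm, here is a minimal sketch of how subgoals might be distributed along a hand skeleton. The 2D segments, robot count, and function names are all hypothetical; the real system works from tracked 3D hand landmarks, and the silhouette algorithm would instead sample points along the hand's outline.

```python
import math

# Hypothetical 2D skeleton: each bone is a (start, end) point pair.
bones = [((0.0, 0.0), (0.0, 1.0)),   # e.g. palm to finger base
         ((0.0, 1.0), (0.5, 1.5))]   # e.g. finger base to fingertip

def bone_subgoals(bones, n_robots):
    """Spread robot subgoals evenly along the skeleton by arc length,
    so each bone receives robots in proportion to its length."""
    lengths = [math.dist(a, b) for a, b in bones]
    total = sum(lengths)
    goals = []
    for k in range(n_robots):
        s = (k + 0.5) / n_robots * total   # arc-length position
        for (a, b), length in zip(bones, lengths):
            if s <= length:
                t = s / length
                goals.append((a[0] + t * (b[0] - a[0]),
                              a[1] + t * (b[1] - a[1])))
                break
            s -= length
    return goals

goals = bone_subgoals(bones, n_robots=4)
```

Because the subgoals are parameterized by the skeleton, denser swarms simply sample the same bones more finely, which is one reason the approach scales across robot counts.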

Based on the results of the VR experiment, the physical robot was designed in collaboration with Karakuri Products Inc. and Cluster Metaverse Lab. to achieve quick movements while maintaining a compact form.

VR and Physical Experiments

We investigated how the level of embodiment changes with robot size, density (number of robots in a swarm), and control algorithms (bones, silhouettes, etc.) in a VR setting. Participants engaged in tasks that required them to move swarm robots to replicate various hand shapes, after which they provided feedback through a questionnaire and an interview about their sense of embodiment.

The results showed that, regardless of robot size, Swarm Body achieved a higher than moderate level of embodiment. We also found that Swarm Body felt more like the user's own hand when the robots' subgoal positions were derived from the hand skeleton (the bone algorithm).

Based on these results, we further investigated how the robot’s density and control algorithm would change the level of embodiment in a real-world setting. The results suggest that the bone algorithm leads to a high level of embodiment in the real-world setting as well.

Experiments in a VR setting
Experiments in a real-world setting

Applications of Swarm Body

We envision remote communication and collaboration as the primary application of Swarm Body, as they showcase the new possibilities opened up by swarm characteristics. Swarm Body allows a person to physically interact with people or objects in a remote location.

Left: Gesture, Right: Haptic communication

Furthermore, unlike conventional robotic arms or humanoid robots, Swarm Body can split from one hand into two and change hand size dynamically. In this way, the flexibility and scalability of swarm robots can be conferred on the human body.

Left: Multipliable body, Right: Transformable body

Conclusion

In this study, we developed a framework and designed a robot to realize an embodied system built from a swarm of many individuals, and implemented it as Swarm Body. Our investigation focused on how robot size, density, and control algorithm influence the level of embodiment. We believe this research is a first step toward a future in which humans can dynamically alter the shape and movement of their bodies.

Call for Interns

At OSX, we will continue fundamental research on natural language processing, computer vision, machine learning, robotics, and HCI. If you are interested in working with us as an intern, please visit our call for internship page.
