Free (Robot) Hugs! An Embracing Multimodal Dataset

Synced | Published in SyncedReview | 4 min read | Sep 28, 2019


It only takes a second for humans to form a bond with a hug, the default embrace for conveying warmth and sympathy. A hug begins with the motivation to embrace, usually stirred emotions; is followed by an approach that can range from awkward to cautious to eager; and then proceeds through arm-opening, choices of position and pressure, the vulnerability of eye contact, questions of timing, and so on.

The simple hug actually presents a highly complex series of actions and emotions replete with rich interactive details for artificial intelligence researchers to study.

As a new generation grows up with omnipresent artificial intelligence, scientists believe robots will be expected to assume more nuanced roles in human-robot collaborations. Datasets are one of the critical resources for building a theoretical foundation for such future collaborative tasks, yet high-quality human-robot interaction datasets are currently scarce.

In an attempt to unlock the science behind hugs, a team of researchers from Arizona State University slapped wearable sensors on 33 humans and collected data on more than 350 of their hugs with the humanoid remote-controlled robot “Baxter.” The researchers believe the resulting dataset can help autonomous humanoids behave more like people.

Researchers controlled Baxter remotely. Ohmite FSR01CE force-sensitive resistors (FSRs) were used to classify the humans’ physical contact as no-touch, soft, or hard. To aid in data collection, participants wore Myo armbands, pressure-sensing shoes, and a hat with motion-capture markers.
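As a rough illustration of the contact labels, classifying an FSR reading into no-touch, soft, or hard can be as simple as thresholding the sensor value. The sketch below is hypothetical: the normalized reading range and the threshold constants are assumptions for illustration, not values taken from the paper.

```python
# Hypothetical sketch: map a normalized FSR reading (0.0-1.0) to a contact label.
# The thresholds are illustrative only and do not come from the paper.

NO_TOUCH_MAX = 0.05   # readings below this are treated as no contact
SOFT_MAX = 0.40       # readings up to this are treated as a soft touch

def classify_contact(fsr_reading: float) -> str:
    """Return "no-touch", "soft", or "hard" for a normalized FSR reading."""
    if fsr_reading < NO_TOUCH_MAX:
        return "no-touch"
    if fsr_reading < SOFT_MAX:
        return "soft"
    return "hard"

print(classify_contact(0.02))  # -> no-touch
print(classify_contact(0.25))  # -> soft
print(classify_contact(0.80))  # -> hard
```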

Participants were asked to approach the robot for a hug and then return to their starting point. The armbands measured the acceleration and orientation of participants’ arms, and researchers recorded, for example, whether a participant contacted the robot at the upper forearm, lower forearm, or wrist. The pressure-sensing shoes recorded foot placement; sensors also identified specific walking states during the important hug approach and withdrawal stages; and the motion-capture markers on the hats tracked participants’ position and orientation in the test environment.
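To give a sense of what one multimodal recording might look like, here is a hypothetical sketch of a single hug trial bundled into a Python data class. The field names and structure are assumptions for illustration only; the dataset’s actual schema is described in the paper.

```python
# Hypothetical sketch of one hug trial's multimodal record.
# Field names and structure are illustrative, not the dataset's actual schema.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HugTrial:
    participant_id: int
    arm_acceleration: List[Tuple[float, float, float]]        # Myo armband acceleration samples (x, y, z)
    arm_orientation: List[Tuple[float, float, float, float]]  # Myo armband orientation quaternions
    contact_location: str                                      # e.g. "upper forearm", "lower forearm", "wrist"
    contact_pressure: List[str]                                # per-frame labels: "no-touch" / "soft" / "hard"
    foot_pressure: List[Tuple[float, float]]                   # left/right shoe pressure over time
    walking_state: List[str]                                   # e.g. "approach", "hug", "withdraw"
    head_pose: List[Tuple[float, float, float]]                # motion-capture position of the hat markers
```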

During the experiments, some participants called out to the unseen human operator with questions such as “Does the robot hug tightly?” and “What if the robot does not hug me?” Researchers suggest the participants might have forgotten the robot was not autonomous because its acquired hugging skills were humanlike and the experience “immersive.”

Although today’s robots are incredibly smart and complex, they’re still largely made of chunks of metal. Learning a tender technique like hugging will be important for the robots of the future. “The most pure form of physical contact and interaction in humans is hugging,” the researchers argue, as “(hugging) allows us to learn to pick up on even the smallest social cues and adapt our movements to the person being hugged.”

This new human-robot hugging interaction dataset could, for example, help train companion robots to perform greetings, and the researchers suggest it could also have applications in assembly, therapy, and entertainment robotics.

The paper Multimodal Dataset of Human-Robot Hugging Interaction is on arXiv.

Journalist: Fangyu Cai | Editor: Michael Sarazen
