REEI / Robotic Emotions Experience Interface
AcrossRCA project
In January 2018, Annette Zimmermann, vice president of research at Gartner, proclaimed that by 2022 personal devices would know more about one's emotional state than one's own family does. But what about the other way around? Assuming that future artificial intelligence will develop emotions too, will humans be able to understand these robotic emotions?
In this speculative project we create an immersive, multi-sensory interface to expand the interaction between robotic technologies and human beings. Our prototype consists of an ‘empathy-skin’ that conveys olfactory, visual, acoustic and haptic sensations. The aim is for the interface to enrich the emotive relationship between robot and human.
We believe that robots should be able to express the four main human emotions: happiness, sadness, fear and anger. For the purpose of our project, each emotion is translated into a particular smell, a particular haptic experience, and a visual and acoustic expression. A multi-sensory experience like our prototype could be used in the near future to support and improve the interfaces of artificial intelligences.
This speculative project was developed during the AcrossRCA week at the Royal College of Art in 2018. The team was part of “Sense-Ability — Multisensory Accessory Design”, led by RCA IDE Visiting Lecturer Wan-Ting Tseng, RCA IDE alumnus Jack O’Leary McNeice, and CSM graduate and future storyteller Anna Nolda Nagele.
Team Members: Moritz Dittrich, Janina Frye, Guorong Luo, Sushila Pun and Yiling Zhang