Building a Positive Psychology Robot

Shreyas Agnihotri
6 min read · Nov 18, 2019


Programming a humanoid robot to serve as a student assistant and mood-boosting companion: a case study

Background and Goals

For the final group project in my Human-Computer Interaction class at HKUST, my team was tasked with programming a robot to fulfill a useful purpose. We had studied the applications of robotics and intelligent technologies in class, and were challenged to create a working and relevant human-robot interaction.

Our tool was Pepper: an autonomous, programmable, semi-humanoid robot developed by SoftBank Robotics. The robot carries a complete multimedia system (each piece accessible programmatically, as sketched after the list below), including:

  • four microphones (for voice recognition and sound localization)
  • two HD cameras (for computer vision, including facial and shape recognition)
  • a 3-D depth sensor (behind the eyes)
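
As a rough illustration (not our actual project code), here is a minimal Python sketch of reading those perception modules, assuming the NAOqi Python SDK that ships with the robot; the IP address and subscriber names are placeholders:

```python
# Minimal sketch: accessing Pepper's perception modules via the NAOqi Python SDK.
# The IP address and subscriber names below are placeholders, not project values.
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"   # hypothetical address of the robot on the local network
PORT = 9559                  # default NAOqi port

# Proxies to the perception services backed by the cameras and microphones above
face_detection = ALProxy("ALFaceDetection", PEPPER_IP, PORT)
sound_localization = ALProxy("ALSoundLocalization", PEPPER_IP, PORT)
memory = ALProxy("ALMemory", PEPPER_IP, PORT)

# Start the extractors: face detection uses the HD cameras,
# sound localization uses the four-microphone array
face_detection.subscribe("demo_faces")
sound_localization.subscribe("demo_sounds")

# Detection results are published to ALMemory, which we can poll
faces = memory.getData("FaceDetected")
if faces:
    print("Pepper sees at least one face nearby")
```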

After learning the intricacies of programming the robot to interact with a human user, we set out to design an optimal use case for the technology.

Initial Observations & Brainstorming

Before diving into robotic solutions, we decided to talk to users about their baseline perception of robot assistants. These conversations with other students carried a key theme: our solution had to subtly integrate the robot into everyday life, rather than aiming to replace types of human interaction entirely.

People aren’t that comfortable interacting with a human-like robot unless it’s just doing something really simple, like showing me directions. Otherwise it’s kind of creepy.

My group observed that robots serve best as a supplement to human interaction: cases where there isn’t enough manpower or the task is simple and repetitive.

We decided to limit our solutions to applications within educational contexts, since these were the users and situations we had best access to for testing.

A mind-map of potential educational human-robot interactions

Coming out of a mind-mapping session, we homed in on a specific application: robots as a tool for positive psychology. Our target users in this case would be people in unique mental states, whether children, individuals with autism, or people coping with depression or frustration. Most current solutions for this user group suffer from a lack of constant monitoring or feedback: consider a person with autism who wants to practice social interactions but doesn’t have a consistent companion to practice with. Robots can bridge the gap between the user and the traditional means of helping them.

Ideation: Storyboards & Speed Dating

My group sketched a number of applications of robot technology in the context of positive psychology:

Autism Aid Robot

  • User: Children with autism
  • Need/Desire: Improved social skills
  • Our Insight: A companion robot can help develop their social skills and improve their EQ

The focus of this robot was on children with autism, who have been shown to experience more intense and frequent loneliness than their non-autistic peers. Our robot would serve as a companion for such a child, reinforcing beneficial social skills to help them better integrate with the rest of the world.

When we took this concept to users, feedback was lukewarm:

There’s a big benefit to helping kids with autism socialize, but I don’t think a robot will teach them real social skills. It will also be hard to represent facial expressions.

The main issue here was that autistic children need genuine social interaction most of all, and a robot’s shortcomings as a social companion might hinder the development of authentic skills.

Children’s Challenges Robot

  • User: Primary school children
  • Need/Desire: Relieve boredom, engage mind
  • Our Insight: Use a robot to play strategic games like maze-solving that involve both interaction and entertainment

Sketch of an example use case/interaction with the Children’s Challenges robot.

This robot is designed to be used with young children, allowing for one-on-one interaction that the traditional student/teacher model doesn’t allow. Students can play games with the robot that challenge their minds and keep them occupied simultaneously.

Though the concept is useful, user feedback questioned the application of robotics in this context:

It’s a fun idea, but kids might not be engaged without rewards. Also, there’s no need for a human-style robot when they can just play games on a phone or with a toy.

A robot was perhaps overkill for the task of occupying children.

Dancing Companion Robot

  • User: University students
  • Need/Desire: Relieve stress/tension
  • Our Insight: A robot can lighten students’ moods using proven mood-boosting techniques such as dancing

Overwhelmingly, the students around us pointed to issues not with teaching or school, but with the stress these institutions caused. Our solution was to build a robot that students could interact with as a study break or mood-boosting device. The goal would be for the active nature and silliness of the product to improve students’ moods.

Feedback was extremely positive for this concept. Though simple, a dancing robot would integrate well with existing infrastructure; students imagined interacting with these robots in the library.

I think this would be a fun way to relieve stress during exams, kind of like the robots in airports that make the experience more interesting.

We decided to stick with this last idea for the prototype stage.

Building a Prototype

Pepper’s programming environment, letting you link together behaviors in a drag-and-drop interface

The development environment for Pepper is surprisingly intuitive. We used the Choregraphe environment to connect a local machine with HKUST’s robot prototype and program interactions. The goal behavior was as follows (a rough code sketch follows the list):

  • Pepper is pre-programmed to integrate with a student’s calendar and the school’s exam schedule.
  • Pepper pings the student during times of high stress (before and during exams) and invites them to come play. Students can also approach and interact with the robot directly.
  • Once the student initiates, Pepper has a conversation with them and performs a choreographed dance.
  • Pepper can simultaneously play music and converse or interact with the student using its depth and audio sensors.
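
Our actual implementation was built as a Choregraphe behavior graph rather than a script, but the same flow can be expressed in Python against the NAOqi SDK. The sketch below is illustrative only: the behavior package name, music path, and direct function call are hypothetical, and the calendar/exam-schedule integration is not shown.

```python
# Rough sketch of the dance-break flow using the NAOqi Python SDK.
# Names, paths, and the trigger below are hypothetical stand-ins.
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"  # placeholder IP for the robot on the local network
PORT = 9559                 # default NAOqi port

speech = ALProxy("ALAnimatedSpeech", PEPPER_IP, PORT)      # speech with gestures
behaviors = ALProxy("ALBehaviorManager", PEPPER_IP, PORT)  # runs installed behaviors
audio = ALProxy("ALAudioPlayer", PEPPER_IP, PORT)          # plays music files

def stress_break(student_name, exam_name):
    """Invite the student to a short dance break before an exam."""
    speech.say("Hi {}! Your {} exam is coming up. "
               "Want to take a quick dance break?".format(student_name, exam_name))

    # Start background music copied onto the robot (path is made up for illustration)
    audio.post.playFile("/home/nao/music/study_break.mp3")

    # Run a dance behavior authored in Choregraphe and installed on the robot;
    # "dance-break/behavior_1" is a hypothetical package name
    if behaviors.isBehaviorInstalled("dance-break/behavior_1"):
        behaviors.runBehavior("dance-break/behavior_1")

    speech.say("Good luck on the exam!")

# In the real prototype this would be triggered by the exam schedule;
# here we simply call it directly.
stress_break("Alex", "HCI")
```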

Demo & Testing

Watch the video below to see the robot in action and a user’s reaction:

Our video prototype and feedback from a sample user

User Feedback: Takeaways

Our demo with a sample user left us with a few key insights about the nature and future of this product:

  1. Add more interaction and real-time sensing of the user. This is partially a limitation of the robot and the coding environment, but our user reported that the robot should feel more interactive. Students looking for a mood-boost should feel like they’re actually playing with something, not just watching it.
  2. Pepper is fun to play with no matter the dance. The stress relief comes in the novelty of the interaction method and the silliness of the dance, not necessarily how complicated it is. Students reported that it felt fun to play with a humanoid robot and this alone was a distraction.

Contribution & Reflections

My group worked together through all stages of this project, but my main contribution came in ideation and the strategy behind the final product. I led my groupmates in each contributing a storyboard and potential use case for a robot, and spoke directly with users to assess the feasibility of each. My group was especially torn on the key domain of the project: whether we should focus on positive psychology or other areas, such as athletics. I helped steer the team toward the category that would benefit most from the innovation, and proposed another round of user interviews to validate our decision.

In the process, I also learned a significant amount about robot programming.

  1. Existing humanoid robots have extremely low barriers to development. Despite having no prior robotics experience, our group was able to quickly grasp Pepper’s programming interface and develop interactions on the physical prototype. This is partly a credit to SoftBank’s technology and Choregraphe’s simplicity, and it makes me optimistic about how accessible this space will be and how much innovation will come out of it.
  2. Pre-programmed behaviors are insufficient for genuine human-robot interaction. Our robot was programmed to interact with users in very concrete ways: responding to certain lines and performing certain movements. This was novel at first, but became less engaging over repeated use. I expect that any serious adoption of humanoid robots will require them to adapt their behavior through repeated interactions with users, using the same core machine learning technology behind digital assistants like Siri.

This project taught me a lot about the challenges behind developing authentic human-robot interactions, and the current state of robotic technology. I’d love to return to this project from the lens of a machine learning-focused application that makes the robot more dynamic and practical.
