Robots have feelings too.

Joe Paganucci · Published in Frontiers · 3 min read · Jun 14, 2016

A few weeks ago, a co-worker was showing her parents around the office and keeping an eye out for Needybot so they could meet. They found each other on the fifth floor by the elevators and immediately began to introduce themselves. I was nearby, so I went over in case they had any questions.

Within a few moments of pushing Needy’s eye, her dad got down to Needy’s level, only about two feet off the ground, to say hi and let Needy take his picture so it could say hi back. He forgot all of his assumptions and found the most comfortable way to say hi: on all fours. He said hello to Needybot in a way I had never thought of. Here was an older man being a kid again, curious and enjoying it.

When we began to design Needybot, my mental model of robots came from movies like 2001: A Space Odyssey, where everything is clean, sterile, and modern. Their eyes were more of a reaction, a color blinking or glowing like the sleep indicator on a computer. Their personality was isolated to the voice, while the eyes showed very little emotion.

This is where my assumptions about robots were wrong; we had begun to design with all the wrong ideas about what Needy really needed to communicate. We focused our research on historic forms of communication such as Morse code and the hexagrams of the I Ching, two forms of communication that don’t have much visual personality.

We designed a clean, modern visual approach to showcase Needy’s emotions through color and animation, creating “playful” positive or negative reactions based on Needybot’s interactions with people.

It was all looking great.

A few emotional states: jovial, anxious, and depressed

Once we moved into motion, however, everything fell apart. The motion was clunky and disconnected, and it didn’t really tell people what Needybot was feeling; we had forgotten to include personality in Needy’s eye.

Our first attempt was a misfire.

We went back to our problem: how do we design a robot eye that tells us it needs our help?

Our eureka moment came when Keith, a team member, told me how his daughter showed her emotions in fun ways: the faces she made really revealed what she was feeling. That quickly became our inspiration for Needybot’s eye.

We took a step back and began researching how people show emotions through their eyes. There are a surprising number of macro-view eye videos on YouTube if you ever need them, and Ren & Stimpy and Looney Tunes became excellent sources of inspiration.

To simplify the interactions, we narrowed the range of emotions Needy could display to three: happy, sad, and meh. From there, we sketched explorations for each emotional state, as well as success and failure animations. We tested the designs in Needy’s shell to see if the eye fit his personality.
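
To make that three-state model concrete, here is a minimal sketch of how a small set of emotional states and success/failure reactions might be organized in code. The names, parameters, and logic below are illustrative assumptions for the sake of the example, not Needybot’s actual implementation.

```python
# A minimal sketch of a three-state emotion model for a robot eye.
# All names and parameters are illustrative assumptions, not
# Needybot's actual implementation.
from dataclasses import dataclass
from enum import Enum, auto


class Emotion(Enum):
    HAPPY = auto()
    SAD = auto()
    MEH = auto()


@dataclass
class EyeAnimation:
    """Parameters a renderer could use to draw the eye."""
    eyelid_droop: float   # 0.0 = wide open, 1.0 = fully closed
    pupil_scale: float    # relative pupil size
    blink_rate_hz: float  # how often the eye blinks


# One baseline animation per emotional state.
EYE_ANIMATIONS = {
    Emotion.HAPPY: EyeAnimation(eyelid_droop=0.1, pupil_scale=1.2, blink_rate_hz=0.3),
    Emotion.SAD:   EyeAnimation(eyelid_droop=0.6, pupil_scale=0.8, blink_rate_hz=0.1),
    Emotion.MEH:   EyeAnimation(eyelid_droop=0.35, pupil_scale=1.0, blink_rate_hz=0.2),
}


def react(current: Emotion, interaction_succeeded: bool) -> Emotion:
    """Pick the next emotional state after a success or failure animation plays."""
    if interaction_succeeded:
        return Emotion.HAPPY
    # A failed interaction nudges the eye toward sadness rather than anger,
    # keeping a "needy" personality rather than a hostile one.
    return Emotion.SAD if current is not Emotion.HAPPY else Emotion.MEH
```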

Details were added to help remind friends of Needy’s emotional states. Some worked, like eyelids to personify sadness. Others, like bushy eyebrows, looked good on paper but were just plain bad (and, thinking back on it, kind of funny).

Finally, we added motion and, with that, Needy’s eye began to speak to us.

Watching my co-worker’s dad introduce himself in a different way reminded me that what is comfortable may not always be the right solution, and that when designing to appeal to human emotions, robots could all use a little more humanity.
