
Reading Your Face: How To Humanize Technology With Emotion AI

By Heidi Hackford

Computer History Museum · Published in Core+ · 5 min read · Jun 4, 2020


“When it comes to the digital world, our computers have trained us to behave as if we lived in a world where none of us can read one another’s emotional cues.”

- Rana el Kaliouby, Girl Decoded


There can be no true connection between human beings without the sharing of emotions. Emotion is universal and necessary for daily functioning, and just about every thought, action, and interaction involves it. But our technology cannot read emotional cues. To be truly effective tools, computers must learn how to read and adapt to human emotions. In other words, artificial intelligence (AI) must include emotional intelligence. During a CHM Live virtual event on May 18, 2020, Dr. Rana el Kaliouby, a pioneer in the field of “emotion AI” and cofounder of Affectiva, talked with NPR’s Aarti Shahani about her new book, Girl Decoded: A Scientist’s Quest to Reclaim Our Humanity By Bringing Emotional Intelligence to Technology.


El Kaliouby believes that the science behind artificial intelligence and machine learning must be made more accessible so that people can become more informed consumers of the technology that uses their data. Like emotion AI itself, her book is an attempt to bridge the divide between hard science and humanistic perspectives: it intertwines her personal journey with her professional career in AI research and as a tech entrepreneur.

About Faces

Born into an Egyptian Muslim family, el Kaliouby was not allowed to date in high school or college, but both parents supported her educational goals. She attended college in Egypt, where reading MIT professor Rosalind Picard’s book Affective Computing, which argued that computers need emotional as well as cognitive intelligence, inspired her to invent an algorithm that would allow computers to recognize a human face. After receiving her master’s degree, el Kaliouby traveled to England to continue her research at Cambridge University. There, she kept in touch with her family, and her new husband, through texting. Their inability to pick up on her homesickness led to an aha moment: what possibilities could be unlocked if machines could be taught not only to recognize a face but to read facial expressions, to convey not just the illusion of connection but a real emotional connection?

Rana el Kaliouby describes the moment when she realized she wanted to teach machines to read human emotions.

The Business of Emotion

“The face is a fascinating canvas for communicating expressions,” says el Kaliouby. Although there are other ways to read emotions, like hand gestures and voice intonations, she focused her research on teaching computers to read facial expressions. Like her personal life, her research had its ups and downs, but in 2004 she met Rosalind Picard in person during a visit to Cambridge. They immediately hit it off, recognizing the complementary nature of their research, and when el Kaliouby returned to Egypt after obtaining her PhD, they applied for a National Science Foundation grant that would allow them to work together at MIT. When the proposal was initially rejected as “too ambitious,” Picard convinced el Kaliouby to build the system anyway to prove it could be done and then reapply. So, el Kaliouby moved to the US.

The two scientists succeeded, creating an emotion AI system to help autistic people read others’ emotions. Essentially, emotion AI applies algorithms that analyze pixels in regions of the face, for example, the corners of the mouth, to classify expressions. Combinations of these facial expressions are then mapped to emotions. El Kaliouby’s plan to pursue an academic career at MIT seemed set. But then an unexpected opportunity arose. Interest in the emotion AI technology spread as leaders in different industries imagined how it could be used in their own businesses. The director of the MIT lab encouraged Picard and el Kaliouby to start a company and develop commercial applications for emotion AI. Affectiva was founded in 2010. Next came the entrepreneurs’ first challenge: to find funding to support their expensive research. But could tech investors in Silicon Valley understand what “emotion AI” was all about and be willing to take a bet on it?
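To make that two-step idea concrete, here is a minimal sketch in Python: measure facial “action units” (such as a lip-corner pull) from regions of the face, then map combinations of those expressions to coarse emotion labels. This is not Affectiva’s actual implementation; the class, thresholds, and labels below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ActionUnits:
    """Hypothetical per-frame measurements of facial movements, each scored 0.0-1.0."""
    lip_corner_pull: float  # smile-like movement at the corners of the mouth
    brow_lower: float       # furrowed brow
    eye_closure: float      # drooping or closing eyelids

def classify_emotion(au: ActionUnits) -> str:
    """Map a combination of facial action units to a coarse emotion label."""
    if au.lip_corner_pull > 0.6 and au.brow_lower < 0.3:
        return "joy"
    if au.brow_lower > 0.6:
        return "concentration or anger"
    if au.eye_closure > 0.7:
        return "fatigue"
    return "neutral"

# A frame with a strong lip-corner pull and a relaxed brow reads as joy.
print(classify_emotion(ActionUnits(lip_corner_pull=0.8, brow_lower=0.1, eye_closure=0.2)))
```

In a real system the action-unit scores would come from a trained computer-vision model rather than hand-set thresholds; the point here is only the structure of classifying expressions and then mapping combinations of them to emotions.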

Rana el Kaliouby remembers pitching emotion AI to skeptical investors.

Investors signed on, attracted by the company’s potential for market research and digital advertising. Commercial applications, not autism research, attracted Procter & Gamble as a significant early partner that launched Affectiva into 19 countries and generated revenue. In turn, increasing usage meant the company could collect more data and improve the accuracy of its algorithms.

Beyond advertising, other applications for Affectiva include automotive technologies that flag unsafe driving. For example, movements in facial expressions that map to emotional states like fatigue or distraction might trigger the car to take real-time actions, such as taking over steering or signaling the driver to pay attention. Applications for mental health might involve tracking facial and body markers for stress, anxiety, or depression and alerting family members or medical providers when those markers deviate from an individual’s baseline.
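As a hypothetical illustration of that driver-monitoring logic, and not Affectiva’s automotive API, a per-frame estimate of the driver’s state might be mapped to a graded in-cabin response along these lines:

```python
def respond_to_driver_state(state: str, confidence: float) -> str:
    """Choose a vehicle response for an estimated driver state (illustrative thresholds)."""
    if state == "fatigue" and confidence > 0.9:
        return "engage steering assist"   # the car actively helps keep the lane
    if state in ("fatigue", "distraction") and confidence > 0.6:
        return "alert driver"             # chime or dashboard warning
    return "no action"

print(respond_to_driver_state("distraction", 0.75))  # -> alert driver
```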

Facing the Future

Picard and el Kaliouby split in 2013, with Picard continuing her work on hardware-focused emotion AI and el Kaliouby, as Affectiva CEO, continuing to develop software. In that role, she is adamant that consent and opt-in from all people subjected to Affectiva’s machine learning algorithms are non-negotiable. She has found that in the last few years consumer awareness of data collection and its downsides has grown, and she advocates for a better balance in the power asymmetry between people and the tech companies that profit from their personal data. El Kaliouby also believes that most of the tech industry takes an oversimplified approach to emotions, and to avoid perpetuating bias she insists that the software development teams building these algorithms represent the full scope of human diversity. Emotion AI must be inclusive in order to create real connection, she says. During the current COVID-19 pandemic, tech tools have become essential, and we must do better on this front in the future.

Rana el Kaliouby imagines integrating emotion AI into communication platforms.

Communicating long-distance and through face masks during the current pandemic in some sense mimics el Kaliouby’s experience over a decade ago trying to maintain her personal relationships through glitchy, underdeveloped technology. We are all missing, to some degree, the natural feedback systems that help us decode each other’s emotions. Perhaps, like so many other outcomes we cannot yet foresee, this tragic time will advance our technology in ways that enable us to communicate through it with the depth and unmatched power of true human connection.

Watch the Full Conversation

“Girl Decoded: Disrupting Industries and Humanizing Technology with Emotion AI,” Rana el Kaliouby in Conversation with NPR Contributor Aarti Shahani, May 18, 2020.


Originally published at https://computerhistory.org on June 4, 2020.

