How Film Theories Inform Conversational AI

Film theories tell us body language is the ultimate natural language. iGenius Conversational Designer Anna Do Amaral explores how this translates to Conversational AI.

Anna Do Amaral
Ideas @ iGenius
5 min read · Jan 28, 2021


In the book Film Theory, by Thomas Elsaesser and Malte Hagener, there’s a statement that makes a compelling point about communication:

The language of gestures is the true mother tongue of mankind.

Our Conversational AI team is always searching for new solutions to better understand the user and to establish meaningful communication between them and our virtual advisor.

Film theories could offer a fresh point of view on our research. Let’s see how.

The Language of Gestures Makes Empathy Happen

The statement above, by the Hungarian film critic Béla Balázs, suggests that cinema expresses a wide range of latent meanings through the gestures and body language of its characters.

When you see someone making gestures or acting upon an object, this triggers a simulation of the action in your brain. Through your mirror neurons, you also feel the same emotional responses that the other person shows. That’s how you get fully involved in a movie.

Can AI Show Empathy With Gestures?

The movie “Blade Runner” plunges you into a world where dangerous androids look and act just like humans, except that they can’t show empathetic responses to emotional triggers. In a scenario where AIs aren’t dangerous, this inability would be a missed opportunity.

AI studies are increasingly focusing on empathy. One interesting approach is to mimic human body language directly. That’s what Hanson Robotics did with Sophia, a social humanoid robot: an attempt to create an Artificial General Intelligence (AGI), a machine able to understand or learn any human intellectual task.

Can AI and empathy go hand-in-hand?

Build a Connection Through Differences

Is mimicking the human body the best way to close the gap between humans and machines? Objects that look almost human but are still recognized as nonhuman tend to make people uneasy. This phenomenon is called the uncanny valley effect, as mentioned in this article.

Christian Metz, a pioneer of film semiotics, argues that the viewer empathizes with a movie by seeing themselves in the act of watching it. Everything that happens on the screen is acceptable as long as you can distinguish yourself from the character.

It follows that maybe users can truly empathize with a machine only when they can clearly distinguish themselves from it.

Nonverbal Communication… in Conversational AI?

Film theories focus on the actors’ gestures, but what if we give “gestures” a broader meaning?

Unlike Sophia, most Conversational AI products have no visual components that strictly relate to the human body, and therefore no “gestures” in the literal sense.

Instead, it’s all about verbal and nonverbal communication: the subtle web of relationships we weave between the human user and the AI.

Don’t Design Humans, Design Empathy!

Bringing a human approach to Conversational AI is one of our main goals, as explained in this article about empathy in Conversational Design.

No matter how hard you try, though, a machine will never be a “real” human, just as a movie character will never be one, even if it looks the part.

Empathy is not necessarily about trying to act like a real human. It’s about trying to connect with one. On this front, Conversational AI has a huge advantage over cinema: it can actually interact dynamically with the user.

Empathy is the game-changer in AI design

Be Human-Like by Establishing a Real Connection

An emotionally intelligent AI should have the twofold capability of deeply understanding humans and displaying empathy.

This leads us to artificial empathy: the ability of an AI to detect human emotions and respond in a way the user recognizes as empathic.

Considering all the theories above, the right path for empathetic AI is probably to present itself as different from a human while still deeply understanding the meanings and intentions scattered through, and even hidden behind, the user’s words. In short: establishing a real connection.

How do we Design Artificial Empathy?

It all starts with closer attention to natural language classification: besides user requests that are strictly in scope for our business, we also pay close attention to the emotional side of what users say.
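To make this concrete, here’s a minimal sketch in Python. The labels, keywords, and rules are invented for illustration; a real system would rely on a trained NLU model rather than keyword matching:

```python
from dataclasses import dataclass

@dataclass
class Classification:
    intent: str   # what the user wants done (the in-scope business request)
    emotion: str  # how the user seems to feel while asking

def classify(utterance: str) -> Classification:
    """Toy rule-based stand-in for a trained classifier.

    The point: emotion cues ride alongside the business intent
    instead of being discarded as out-of-scope noise.
    """
    text = utterance.lower()
    frustration_cues = ("again", "still", "why", "wrong")
    emotion = "frustration" if any(cue in text for cue in frustration_cues) else "neutral"
    intent = "get_report" if "report" in text else "out_of_scope"
    return Classification(intent, emotion)

print(classify("Why is the report still empty?"))
# Classification(intent='get_report', emotion='frustration')
```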

We make sure users know we have understood their emotion and, more importantly, that we address it, so they feel genuinely heard.

For example, if we detect an error, we can expect the user to feel lost and frustrated. Here it’s crucial to acknowledge their displeasure and help them find a solution, just like a human would, as mentioned in the article Error Messages for Advanced Conversational AI.
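A hedged sketch of what that can look like, with illustrative wording that isn’t our actual copy:

```python
def error_reply(emotion: str) -> str:
    """Pair an acknowledgement of the user's feeling with a concrete next step."""
    if emotion == "frustration":
        opener = "I'm sorry, I know this is frustrating."
    else:
        opener = "Something went wrong on my side."
    # An apology alone isn't empathy; always offer a way forward.
    return opener + " Let's try again: could you rephrase your request?"

print(error_reply("frustration"))
# I'm sorry, I know this is frustrating. Let's try again: could you rephrase your request?
```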

But as technology progresses, there will be more. Emotion recognition is a path for future exploration!

Let Your Conversational AI Use Its Own “Gestures”

The other side of the coin is displaying empathy. Get ready to break new ground with natural language generation, voice design, and interface design.

Displaying empathy with components befitting a Conversational AI is possible. When colors, sounds, shapes, and motion work in tandem with words, they can become the gestures an AI displays empathy with — its very own nonverbal communication.
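For instance, a detected emotion could drive the interface cues that accompany the AI’s words. The cue names and values below are assumptions made up for illustration, not a real design system:

```python
# Map a detected emotion to nonverbal "gestures": color, sound, and motion.
NONVERBAL_GESTURES = {
    "frustration":  {"accent_color": "#4A7BD0", "sound": "soft_chime",   "motion": "slow_fade"},
    "satisfaction": {"accent_color": "#3FA66B", "sound": "bright_chime", "motion": "quick_pop"},
    "neutral":      {"accent_color": "#777777", "sound": None,           "motion": "none"},
}

def gestures_for(emotion: str) -> dict:
    """Fall back to neutral cues when the detected emotion is unknown."""
    return NONVERBAL_GESTURES.get(emotion, NONVERBAL_GESTURES["neutral"])

print(gestures_for("frustration"))
# {'accent_color': '#4A7BD0', 'sound': 'soft_chime', 'motion': 'slow_fade'}
```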

Could the eye of a lens mimic the eye of a human?

In Conclusion — From Film Theories, We Can See…

  • users’ meanings and intentions are to be found even beyond their words
  • an AI can use its own language of gestures
  • empathy can fully involve the user in the AI experience — just like cinema does!

I can’t wait to see what we’ll create for our products in the future to make Conversational AI experiences even more unique!

…And clap your hands if you want to give an empathetic response and avoid being mistaken for one of those dangerous Blade Runner androids. :-)
