“You lookin’ at me, Alexa?”

Eyes will give the next AI assistants the power to make emotional connections with us

You know the scene. Robert De Niro’s Travis Bickle stares intensely at himself in the mirror: “You talkin’ to me?” A gun clicks into action from under his sleeve. His arm shoots up. To anyone watching, it’s instinct: Travis is capable of murder.

It’s an iconic image in cinematic history. That violent look that says it all. Those dilated pupils. That menacing stare. Our ability to grasp Travis’ core from a single look is perhaps partly due to De Niro’s acting chops, but it also emanates from something far more powerful. In an instant in 1976, on the set of Martin Scorsese’s Taxi Driver, Robert De Niro demonstrated to the world the sheer power of the human gaze. Something so powerful that, today, it’s a key evolutionary piece of the next generation of AI assistants.

In order to evolve toward common-sense understanding, AI assistants of the future must have eyes. The power of sight is obvious: with eyes, your assistant will be able to actually see you and physically help you, inferring your needs from your displayed emotions, facial expressions, and physical surroundings. Less obvious, though, is the emotional connection you’ll forge with your seeing AI assistant. Eyes, you see, don’t just power vision; they allow us to understand and connect with each other on an intuitive level. This “connection” powered by gaze is something we often don’t realize we hunger for until it’s gone.

Did you know that three seconds is the length of direct gaze humans find ideal? At nine seconds, you’re considered a potential psychopath. It’s a fine line that we humans tread when making crucial, everyday judgments based on the seemingly intangible metric we call gaze. It has an air of art rather than science. For example, we rate people’s personalities and even make hiring decisions based on what we perceive when fellow humans stare back at us. Is it unfair not to hire someone because they didn’t make eye contact? Did you get a “bad feeling” when looking into someone’s eyes? Put it this way: if Travis Bickle stepped into your office for a job interview, would you look at that man and say, “yeah, I want him talking to customers”? No! You’d nod politely and thank him for his time. It may seem impulsive, an art if you will, but there are actual chemical changes that occur when we gaze at fellow humans, and they affect how we feel about them. In these gazing moments, the human brain literally grapples with the fact that it is both judging and being judged by the same being. Some studies even suggest that pupil dilation in another’s gaze affects trust!
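To make the numbers above concrete, here is a toy sketch of how a gaze-aware assistant might bucket the duration of direct eye contact. This is purely illustrative, not a validated model; the three- and nine-second thresholds are simply the figures cited above.

```python
def gaze_comfort(seconds_held: float) -> str:
    """Toy classifier for how a sustained direct gaze is likely to land.

    Thresholds follow the figures in the text: up to ~3 s reads as
    comfortable, while ~9 s or more starts to feel threatening.
    """
    if seconds_held <= 0:
        return "no eye contact"
    if seconds_held <= 3:
        return "comfortable"
    if seconds_held < 9:
        return "intense"
    return "unsettling"

print(gaze_comfort(3))   # comfortable
print(gaze_comfort(10))  # unsettling
```

In a real system, the duration would come from an eye-tracking pipeline rather than a hand-typed number, but the judgment layer can be this simple.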

Here’s the catch: while visual connection and contact seem so natural to us, we often don’t notice them in the moment, unless, of course, we’ve been practicing our mindfulness meditation techniques! Barring that, however, we use this connection to make judgments about people without even knowing it. Remarkably, it’s been shown that people who make eye contact experience a process called self-other merging, in which they perceive less of a difference between themselves and the person they’re looking at. When self-other merging occurs, your empathy for the other person literally grows as you begin to see their needs as your own. In short, eye contact crucially helps build trust and create lasting relationships.

Gaze is such a natural, if invisible, aspect of human biology that we both crave it and forget about it simultaneously. It makes sense, then, that product designers overlooked this quintessentially human trait when they designed the first AI assistants. If gaze is so important to how we navigate our relationships with our fellow humans, why should our relationship with tech be any different? It’s about time that the tech world caught up.

With our ability to quantify gaze, tech firms are finally harnessing the power of the eye to make AI avatars and assistants more humanlike. Imagine interacting with an assistant that relates to your problems, gives you sage advice, and acts as a sounding board and mentor. Imagine a machine that not only understands that you’re sad, but knows the cause of your pain and uses its engineered empathy to soothe you and help solve your problem.
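One hedged sketch of what “quantifying gaze” can mean in practice: given the eye corners and pupil position from any face-landmark detector (the coordinates below are made-up placeholders, not output from a real detector), a crude eye-contact score is just how centered the pupil sits between the corners.

```python
def eye_contact_score(left_corner, right_corner, pupil):
    """Return a 0-1 score for how horizontally centered the pupil is.

    1.0 means the pupil sits exactly midway between the eye corners
    (roughly: looking straight at the camera); 0.0 means it sits at a
    corner. Inputs are (x, y) pixel coordinates from a landmark detector.
    """
    eye_width = right_corner[0] - left_corner[0]
    if eye_width <= 0:
        raise ValueError("corners must be ordered left-to-right")
    # Normalized pupil position in [0, 1] across the eye.
    t = (pupil[0] - left_corner[0]) / eye_width
    # Distance from center, rescaled so center -> 1.0 and corner -> 0.0.
    return max(0.0, 1.0 - 2.0 * abs(t - 0.5))

# Hypothetical landmark coordinates, for illustration only.
print(eye_contact_score((100, 50), (140, 50), (120, 50)))  # 1.0
print(eye_contact_score((100, 50), (140, 50), (110, 50)))  # 0.5
```

Production gaze estimation is far richer (head pose, both eyes, vertical offset, 3D geometry), but even a heuristic this simple lets software begin to answer the question in this article’s title.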

What does this mean from a business perspective? As we increasingly rely on tech to receive and communicate information, the AI assistant market has enormous room for growth. It also shows that humans place huge, often unseen value on the ability to emotionally connect with a product, and AI assistants can seize on that. The tech implications are just as big:

In the very near future, we will have task-specific AIs that can pass the Turing Test. Will they be able to talk to you about any subject that enters your mind? No. But you’ll be able to interact with them seamlessly on specific topics. For example, one of the skills of TwentyBN’s Millie avatar is trendy style advice. She can proactively engage customers, invite them into delightful interactions, analyze their style, and recommend cool outfits they could buy in-store, just as a best friend would. That’s just one of the countless things AI assistants will be able to do when empowered with eyes.

Plus, with eyes, they won’t have to ask if “you talkin’ to me?” They’ll know.