Emotion AI & Biometric Data

Behavioral Signals Team
Behavioral Signals - Emotion AI
Sep 11, 2019

VoiceSignals #3: Musings on voice tech news

The use of biometric identifiers is widespread, and much of the time we don’t give it much thought. We use our fingerprint to unlock our phone, our face to pass through airport check-ins, and our voice to activate our mobile voice assistant. As facial emotion recognition is now being examined as a possible part of face geometry data, privacy and ethical concerns have arisen. People reading about emotion recognition are asking where the boundary lies between using our biometric identifiers to open doors or sign into apps and sliding into an Orwellian dystopia of surveillance and control. There are also doubts about how accurate facial recognition really is: London’s Metropolitan Police trialed a facial recognition technology to spot suspects, and it failed 80% of the time, possibly rendering it ‘illegal for use’ according to researchers.

Emotion recognition should not be used as a unique identifier between humans. Emotions are universal: happiness looks much the same regardless of race, age, or gender, and is easily recognizable in the face, the voice, and the body. At the same time, no company has ever claimed 100% accuracy in detecting emotions, and some emotions, like fear and surprise, are often confused with one another. Using it in law enforcement may therefore be neither reliable nor ethical. Emotion recognition can serve a much higher purpose: improving our human-to-human and human-to-machine conversations.

Rana Gujral, CEO at Behavioral Signals, argues that while hiding your face to avoid scanning is hard (“we wear our faces everywhere”), our voice is much easier to consciously control: we decide with whom and when to share it. He recognizes that revelations about contractors having access to voice assistant recordings (Alexa, Siri, etc.) have alarmed people about their privacy, but he emphasizes that voice can offer a wealth of emotional insights for a more personalized experience at work and at home. It can be used to control our appliances, our car, and our daily schedule, but also to predict issues with our health, our needs, and our desires.

In the end, it’s up to companies to be more transparent about how they use the data they collect, up to authorities to enforce limits, and up to regulators to design laws that protect people’s privacy and their right to data transparency. “Artificial Intelligence could be among our most powerful tools to build a better, fairer society. Or it could exacerbate inequalities and constrain freedoms.” -RSA

Emotion-aware Movie Characterization

So how can knowing the ‘emotional degree’ of a movie beforehand influence how we search for films to watch? According to the research, the emotions expressed by film actors can play an important role in building a high-level content representation that makes recommendation and search richer. In other words, the negative, positive, strong, and weak emotions that appear in a movie can influence our movie preferences. Thodoris Giannakopoulos, Director of ML at Behavioral Signals, is presenting the research paper (Emotion-aware movie characterization with Oliver API, Sep. 2019) at CBMI2019 in Dublin and, for the occasion, wrote a summary post on the modeling behind recognizing and categorizing emotions in films. The team narrowed the analysis to films by six specific directors, ranging from Aronofsky to Tarantino and Scorsese. According to their findings, the Coen brothers and Roman Polanski emerge as the most “compact” directors along both dimensions (emotions in what is being said versus how it is being said), while Aronofsky and Scorsese are most often “outliers”. Also, Woody Allen can be distinguished from Tarantino, the Coen brothers, and Aronofsky using only the emotional representation, i.e. emotions from how something is being said, with accuracy that reaches almost 100%.
Read the full paper on Medium >
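For readers curious what such an “emotional representation” might look like in code, here is a minimal, hypothetical sketch. It does not use the Oliver API or reproduce the paper’s pipeline; the emotion label set, the mean/std pooling, and the SVM classifier are illustrative assumptions only. The idea is simply that per-utterance emotion scores for a film can be pooled into a fixed-length film-level vector, which a standard classifier can then use to separate directors.

```python
# Hypothetical sketch: build a film-level "emotional representation" from
# per-utterance emotion scores, then train a simple director classifier.
# This is NOT the paper's pipeline or the Oliver API; labels, pooling, and
# model choice are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

EMOTIONS = ["angry", "happy", "sad", "neutral"]  # assumed label set

def film_features(utterance_scores: np.ndarray) -> np.ndarray:
    """Pool per-utterance emotion probabilities (n_utterances x n_emotions)
    into a fixed-length film vector: per-emotion mean and std."""
    return np.concatenate([utterance_scores.mean(axis=0),
                           utterance_scores.std(axis=0)])

# Toy stand-in data: each film is a matrix of per-utterance probabilities.
rng = np.random.default_rng(0)
films = [rng.dirichlet(np.ones(len(EMOTIONS)), size=rng.integers(200, 400))
         for _ in range(40)]
directors = np.array([i % 2 for i in range(40)])  # two hypothetical directors

X = np.stack([film_features(f) for f in films])
scores = cross_val_score(SVC(kernel="rbf"), X, directors, cv=5)
# On random toy data this hovers around chance; on real speech-emotion
# features the paper reports near-perfect separation for some director pairs.
print("cross-validated accuracy:", scores.mean())
```

On real data the per-utterance scores would come from a speech emotion recognition model run on the film’s audio; the Dirichlet samples above only stand in for those outputs to keep the sketch self-contained.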

The Power & Insights of Voice

Minter Dial is a professional speaker and consultant on branding and digital strategy. He also hosts his own podcast, interviewing entrepreneurs about strategy, business growth, and Heartificial Empathy, the title of his newest book. His long and successful international career at L’Oréal gives him extra insight into the digitally enhanced marketplace. He interviewed Rana Gujral, CEO at Behavioral Signals, and they discussed the power of voice and the kinds of insights it can offer businesses.
Listen to the full interview >

Women Leading The AI Industry

“Look around for companies that are working in the AI domain. Even if you are not an engineer there is always some way you can contribute and you could be useful,” advises Vicki Kolovou, Head of Marketing at Behavioral Signals. She was interviewed by Tyler Gallagher for Authority Magazine, and they discussed technology, the projects she’s working on now, and what excites her about artificial intelligence.
Read full interview >

LoL now has a Voice Assistant

Voice-controlled games are on the rise. League of Legends, the hugely popular multiplayer online battle arena game developed by Riot Games, with over 80 million players, now has its own voice assistant, Fridai. You can ask Fridai about champs, counters, and strengths, and according to hallid_ai, the skill’s developer, more actions will be added. The company is adding the skill to more and more games, such as Hunt: Showdown and Overwatch. It’s only a matter of time before other companies join the voice-assistants-for-games trend and start producing their own versions. It will be interesting to watch how gamers use them.
Read more >

Written by Vicki Kolovou for Behavioral Signals

#voicefirst
