Building an Emotional Artificial Brain — Apple Inc.

Carlos Argueta
Jan 8, 2016


Apple just bought Emotient, a startup that uses Artificial Intelligence to read people’s emotions by analyzing their facial expressions. In October 2015 it also bought VocalIQ, a startup that uses speech technology to teach machines to understand the way people speak. As usual, Apple did not disclose the reasons behind the acquisitions.

Why is Apple interested in such technologies? To me, the reasons are clear. If you have interacted with Siri, Apple’s virtual assistant for mobile devices, you have probably discovered how limited it is. My guess is that the company is trying to inject Siri with Artificial Emotional Intelligence (or some sort of Artificial Emotional Brain) to make interactions with the system feel much more natural. The missing piece of the puzzle? Technology that understands not just our faces and tones of voice, but the implicit emotions hidden in the meaning of our words. If you haven’t yet, please have a look at my demo of Emotion Detection from Text.
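To give a rough flavor of what detecting emotion in text involves, here is a minimal, purely illustrative sketch: a lexicon-based scorer that counts emotion-bearing words and picks the most frequent emotion. The tiny lexicon and labels are made up for the example, and this is not the method behind my demo or behind Emotient’s and VocalIQ’s products; real systems model meaning, negation, and context rather than counting keywords.

```python
from collections import Counter

# Tiny hand-made lexicon, purely illustrative; real systems rely on large
# curated lexicons or trained models that capture meaning and context.
EMOTION_LEXICON = {
    "happy": "joy", "great": "joy", "love": "joy",
    "sad": "sadness", "miss": "sadness", "lonely": "sadness",
    "angry": "anger", "hate": "anger", "furious": "anger",
    "afraid": "fear", "scared": "fear", "worried": "fear",
}

def detect_emotion(text: str) -> str:
    """Return the emotion whose lexicon words appear most often in the text."""
    tokens = (t.strip(".,!?") for t in text.lower().split())
    counts = Counter(EMOTION_LEXICON[t] for t in tokens if t in EMOTION_LEXICON)
    if not counts:
        return "neutral"  # no emotion-bearing words found
    return counts.most_common(1)[0][0]

print(detect_emotion("I love this, it makes me so happy!"))   # joy
print(detect_emotion("I was scared and worried all night."))  # fear
```

Even this toy version makes the hard part obvious: a sentence like “I’m not happy at all” defeats keyword counting, which is exactly why implicit emotion in text remains an open problem.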

This article is the third in a series documenting my attempt to create an Artificial Emotional Brain.

You can read more about Apple’s acquisitions here.

Note: I am neither a native English speaker nor a professional writer. I use English as a universal language to tell and document this journey I am about to embark on. I apologize if in any way I’m killing the beauty of this language with my grammar and spelling errors.


Carlos Argueta

Working on Autonomy for Mobile Robots with an emphasis on State Estimation and the Perception Stack. I occasionally also work on Natural Language Processing.