Building an Emotional Artificial Brain — The Beginning.

Carlos Argueta
2 min read · Jan 4, 2016

--

In October last year I finally got my long-awaited PhD degree. My research topic was Sentiment Analysis (SA), a sub-field of Artificial Intelligence and Natural Language Processing that seeks to identify the polarity of any given text. Put simply, given any subjective text, SA seeks to tell whether the text is positive, negative, or neutral. The result of my long research is a set of short and poorly optimized algorithms that, when combined in a predefined order, can yield a very simple emotion classifier. Yes, emotions, not sentiment, which means this classifier can guess (very often wrongly) which of hundreds of emotions a subjective text expresses.
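
To make that a bit more concrete, the snippet below shows roughly what the simplest form of the idea looks like: a toy, lexicon-based polarity classifier in Python. The word lists and scoring rule are invented for illustration only; this is not the set of algorithms from my thesis, just a sketch of the general concept.

```python
# Toy lexicon-based polarity classifier. Purely illustrative: the lexicons and
# the scoring rule are made up, and real systems are far more involved.

POSITIVE = {"good", "great", "love", "happy", "awesome", "excellent"}
NEGATIVE = {"bad", "terrible", "hate", "sad", "awful", "horrible"}

def polarity(text: str) -> str:
    """Return 'positive', 'negative', or 'neutral' for a piece of text."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(polarity("I love this awesome phone"))  # positive
print(polarity("What a terrible, sad day"))   # negative
```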

After I set up a working demo of the emotion detection system, my excitement grew quickly. The next obvious step for me was to build a company and create consumer apps that use the technology. Every single person who heard my plan and tried the demo was excited too. Nonetheless, my plan failed. The reason is simple: I realized that a system that can merely guess an emotion from your text is a very crude representation of what an ideal empathetic system should be. So I moved on and started another, unrelated company, with the hope of one day reviving my previous dream.

Being a tech entrepreneur, I spend much of my time reading the latest tech news. Some of the most discussed trends in recent days are Virtual Reality, Smart everything (homes, cars, devices, etc.), and the rise (and fear) of AI. With every new article I read came a new rush of hope, but the fundamental question remained: how to apply my algorithms and knowledge to these trending areas? I think I have finally found an answer, and that is what this article, and the ones that will follow, are about.

During the last days of 2015 I decided to build an Emotional Brain, or an Artificial Amygdala to be more specific. This is half a personal project, half an attempt to predict what will be one of the main components of any future tech humans interact with. Concretely, I will attempt to build a series of algorithms that together can listen, read, and see us, understand our feelings, and reply or act accordingly. A true empathic system. I don't know if it will be a full-fledged conversational agent or just a simple component of a whole, like an operating system module. Whatever it turns out to be, I will be telling the story in a series of short articles with varying formats. Some might feel like short research papers, others like a story, and others maybe like a random dump of my mind.
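
To give a sense of the shape I have in mind, here is a rough Python sketch of that perceive → infer emotion → respond loop. Every name in it is a placeholder and the logic is stubbed out; it is a thought experiment, not a design I am committing to.

```python
# A very rough placeholder for how the pieces might fit together. Every name
# here is hypothetical, and the real components do not exist yet.

from dataclasses import dataclass

@dataclass
class Percept:
    text: str = ""      # what was read, or heard after speech-to-text
    image: bytes = b""  # what was seen (ignored in this sketch)

def infer_emotion(percept: Percept) -> str:
    """Guess the user's emotional state from a percept (stubbed)."""
    return "neutral"  # a real emotion model would go here

def respond(emotion: str) -> str:
    """Choose a reply or action that fits the detected emotion (stubbed)."""
    replies = {"neutral": "I see.",
               "sad": "I'm sorry to hear that.",
               "happy": "That's great!"}
    return replies.get(emotion, "Tell me more.")

print(respond(infer_emotion(Percept(text="I had a rough day"))))
```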

Note: I am not a native English speaker nor a professional writer. I use English as a universal language to tell and document this journey I am about to embark on. I apologize if in any way I'm killing the beauty of this language with my grammar and spelling errors.

--

Carlos Argueta

Working on Autonomy for Mobile Robots with an emphasis on State Estimation and the Perception Stack. I occasionally also work on Natural Language Processing.