Can Software Monitor Your Emotional State Better Than You?

Brain9D

Ever wondered how Facebook knows it’s you in that blurry beach snap you just uploaded? How Snapchat manages to match that dog’s nose and tongue directly up to yours?

I must admit the first answer that popped into my head was not ‘facial recognition’ software. I was thinking more along the lines of these big tech giants employing teams of monkeys to be on hand 24–7, poised and ready to point at where my face was on the screen.

I had my doubts about the ethics of this method, so I did some digging. It turns out the technology behind all this is way more powerful and sophisticated than I initially thought.

Recognising someone’s emotions by looking at their face is something humans have been doing since the dawn of time. It has been integral to our development as social beings, and until now it was an ability unique to mammals. The ability of a machine to read someone’s face, and thus their emotions, is becoming more and more advanced.

A quick look on the internet for facial recognition software leads to some fairly disturbing questions.

The Atlantic asks, ‘Who owns your face?’, after the controversy surrounding the FBI’s database of facial-recognition data on over 52 million people. Questions of job loss and human-machine separation also surface.

A few months after this digging, I was commissioned by a new company that was using facial recognition software in what seemed like a positive way. They claimed to be using it to detect signs of stress and early-onset mental illness.

I took on the assignment with some skepticism. Thanks to my earlier research, my automatic thought was that this company wanted to sell data to the government under the guise of helping me solve my stress problems.

It turns out we have only six basic emotions, from which all others stem, and all our faces display them in a similar way.

[Image: the six basic emotions: anger, disgust, fear, happiness, sadness, surprise]

Humans are able to quickly detect these signals and changes in another being’s face. It is a key part of our interpersonal relationships and communication. You have probably heard the claim that a whopping 93% of human communication is nonverbal; of the total, 55% is attributed to body language and facial expression, making the face a key part of how we read each other.

Machines can use these facial cues to understand our emotions and predict when we may be moving into an unhealthier mental state. By tracking your face regularly, software can notice changes or issues as they arise, monitor your mood and stress levels over time, and prompt preventative measures as early as possible.
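To make that idea concrete, here is a minimal sketch of how regular mood tracking might flag a change. It is not any real product’s code: the daily stress scores, the rolling-baseline window, and the alert threshold are all invented for illustration. The sketch assumes some classifier has already turned each day’s face scan into a single stress score between 0.0 (calm) and 1.0 (highly stressed), and simply flags days that drift sharply above the recent baseline.

```python
from statistics import mean, stdev

def flag_stress_drift(daily_stress_scores, window=7, threshold=2.0):
    """Flag days whose stress score deviates sharply from the rolling
    baseline of the previous `window` days.

    Scores are assumed to run from 0.0 (calm) to 1.0 (highly stressed),
    e.g. produced once per day by a facial-expression classifier.
    """
    alerts = []
    for i in range(window, len(daily_stress_scores)):
        baseline = daily_stress_scores[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        today = daily_stress_scores[i]
        # Alert when today's score sits more than `threshold` standard
        # deviations above the recent baseline (guarding against sigma=0).
        if sigma > 0 and (today - mu) / sigma > threshold:
            alerts.append(i)
    return alerts

# A fortnight of fabricated scores: a calm stretch, then a sharp spike.
scores = [0.2, 0.25, 0.22, 0.18, 0.21, 0.24, 0.2, 0.23, 0.21, 0.9]
print(flag_stress_drift(scores))  # → [9]: the spike on day 9 is flagged
```

A real system would of course work from video frames rather than ready-made scores, but the same principle applies: the value is less in any single reading than in spotting a deviation from your own baseline.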

If you are anything like me, you are probably starting to get a little worried that Facebook and Snapchat are secretly using this side of facial recognition software. That they are moving towards world domination by emotional mind control, and that it is not all as innocent as creating better dog tongues and pre-tagging the back of my head in photographs…

(I thought I would add a disclaimer here in case one of my readers is President Trump scrolling the internet for fake news: that last paragraph is pure, unfounded speculation which popped out of my worried little head. However, if you are in any way interested in reading and learning more about it, check it out.)
