M vs Microsoft: Episode 7

Jul 3

In this episode, let’s compare M’s and Microsoft Azure’s performance on human emotion recognition using a more technically challenging human face image.

Here’s the context of this image: the husband is leaving on an extremely dangerous mission. Knowing that he is unlikely to return from this deadly mission, the wife is frightened and sad, and is crying desperately.

So how would M and Microsoft Azure quantify this?

As in the last episode, Microsoft fails to detect any face to evaluate, while M successfully recognizes the person’s face even though only half of it is captured. M’s analysis: 50.9% sadness, 37.6% fear, and 5.8% anger.
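A score vector like the one M reports can be represented as a simple mapping from emotion labels to probabilities. Here is a minimal sketch of picking the dominant emotion from such a vector; the labels, numbers, and helper function are illustrative only, modeled on the figures in this article, and are not M’s or Azure’s actual API:

```python
# Hypothetical emotion scores modeled on the article's figures
# (not the real output format of M or Microsoft Azure).
scores = {"sadness": 0.509, "fear": 0.376, "anger": 0.058}

def dominant_emotion(scores):
    """Return the (label, probability) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

label, prob = dominant_emotion(scores)
print(f"{label}: {prob:.1%}")  # -> sadness: 50.9%
```

Taking the arg-max of the scores is only one way to summarize such output; the relative weights (here, sadness over fear over anger) are often more informative than the single top label.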

Follow Project M on:

Company website: http://www.ipmdinc.com

Medium: ProjectM

Facebook: Project M

Instagram: mprojectai

Twitter: @mprojectai

*As of June 2019, the Project M team has devoted 62,000 hours to the AI platform, M. The sample input is pure test data that is completely new to M (and, we assume, to Microsoft) and has never been used to train M, so this comparison is a fair trial between M and Microsoft. We thank Microsoft, the industry leader in the emotional AI sector, for allowing the general public to use its testing site for identifying human emotions from facial expressions.


Written by


M, the artificial intelligence platform that recognizes human emotions based on hidden and micro facial expressions.
