Azure Cognitive Services Emotion API

Achindra Bhatnagar
Nov 8, 2016

I watched a documentary on Inside Out, a movie where five basic emotions come to life and influence a little girl's actions via a console in her mind's Headquarters. The documentary said psychological research identifies seven basic emotions; the movie, however, used only five!

And then I came across the Azure Cognitive Services Emotion API. This service can identify all seven emotions on a given face in an image or a video! So I started exploring.

The Emotion API takes an image with a recognizable face and returns, for each detected face, a score for each of the seven emotions plus neutral. The result looks like this:

[
  {
    "faceRectangle": {
      "left": 68,
      "top": 97,
      "width": 64,
      "height": 97
    },
    "scores": {
      "anger": 0.00300731952,
      "contempt": 5.14648448E-08,
      "disgust": 9.180124E-06,
      "fear": 0.0001912825,
      "happiness": 0.9875571,
      "neutral": 0.0009861537,
      "sadness": 1.889955E-05,
      "surprise": 0.008229999
    }
  }
]
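
The scores behave like probabilities (in the sample above they add up to roughly 1), so picking a face's dominant emotion is a one-liner. Here is a minimal sketch, assuming the scores object has been deserialized into a plain dictionary; DominantEmotion is my own helper name:

using System.Collections.Generic;
using System.Linq;

// Returns the name of the highest-scoring emotion, e.g. "happiness" for
// the sample response above. The dictionary would come from something like
// JsonConvert.DeserializeObject<Dictionary<string, double>>(scoresJson).
static string DominantEmotion(Dictionary<string, double> scores)
{
    return scores.OrderByDescending(kv => kv.Value).First().Key;
}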

The API needs a subscription key and the image to process.

We can pass either the image URL in a JSON request body or the binary image data as a stream in the request body.

JSON/URL

string uri = "https://api.projectoxford.ai/emotion/v1.0/recognize";
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);

// Send the image URL as a JSON payload
HttpContent content = new StringContent("{\"url\":\"" + blobUrl + "\"}", Encoding.UTF8, "application/json");
HttpResponseMessage response = await client.PostAsync(uri, content);
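
For completeness, the response can then be read and deserialized exactly as in the binary-stream listing below; this continuation assumes the FaceObject/Scores classes defined there:

if (response.IsSuccessStatusCode)
{
    string result = await response.Content.ReadAsStringAsync();
    // One FaceObject per face detected in the image
    List<FaceObject> faces = JsonConvert.DeserializeObject<List<FaceObject>>(result);
    foreach (FaceObject face in faces)
        Console.WriteLine(face.scores.happiness);
}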

Binary Stream

using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Newtonsoft.Json;

namespace AzureFunctionGetEmotions
{
    public class FaceRectangle
    {
        public int left { get; set; }
        public int top { get; set; }
        public int width { get; set; }
        public int height { get; set; }
    }

    public class Scores
    {
        public double anger { get; set; }
        public double contempt { get; set; }
        public double disgust { get; set; }
        public double fear { get; set; }
        public double happiness { get; set; }
        public double neutral { get; set; }
        public double sadness { get; set; }
        public double surprise { get; set; }
    }

    public class FaceObject
    {
        public FaceRectangle faceRectangle { get; set; }
        public Scores scores { get; set; }
    }

    class Program
    {
        static void Main(string[] args)
        {
            string SubscriptionKey = "";
            Task t = MakeRequest(SubscriptionKey);
            t.Wait();
        }

        static async Task MakeRequest(string SubscriptionKey)
        {
            List<FaceObject> faceObjects;
            string uri = "https://api.projectoxford.ai/emotion/v1.0/recognize";
            HttpClient httpClient = new HttpClient();
            httpClient.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);

            // Post the raw image bytes as an octet-stream
            byte[] imageData = File.ReadAllBytes(@"");
            var content = new ByteArrayContent(imageData);
            content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/octet-stream");
            HttpResponseMessage response = await httpClient.PostAsync(uri, content);

            if (response.IsSuccessStatusCode)
            {
                string result = await response.Content.ReadAsStringAsync();
                faceObjects = JsonConvert.DeserializeObject<List<FaceObject>>(result);
                foreach (FaceObject faceObject in faceObjects)
                    Console.WriteLine(faceObject.scores.happiness);
            }
        }
    }
}
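
The only external dependency in this listing is the Newtonsoft.Json NuGet package; HttpClient and everything else come with the .NET Framework.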

….

I have an Azure Function App that monitors images uploaded to a blob store. The function scores the emotions for each recognized face and stores the results in a Table store.
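
I won't walk through the whole function here, but a minimal sketch could look like the following. The container name, table name, and EmotionEntity type are placeholders of my own; it assumes the WebJobs-style [BlobTrigger] and [Table] bindings and reuses the FaceObject classes from the listing above:

// Hypothetical blob-triggered function: runs when an image lands in the
// "images" container, scores it, and writes one row per face to a table.
public static async Task Run(
    [BlobTrigger("images/{name}")] byte[] imageData,
    [Table("FaceEmotions")] IAsyncCollector<EmotionEntity> table,
    string name)
{
    HttpClient client = new HttpClient();
    // SubscriptionKey as in the earlier listings
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);

    var content = new ByteArrayContent(imageData);
    content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/octet-stream");
    HttpResponseMessage response = await client.PostAsync(
        "https://api.projectoxford.ai/emotion/v1.0/recognize", content);
    if (!response.IsSuccessStatusCode) return;

    var faces = JsonConvert.DeserializeObject<List<FaceObject>>(
        await response.Content.ReadAsStringAsync());

    int i = 0;
    foreach (FaceObject face in faces)
    {
        // Partitioning rows by image name is an arbitrary choice here
        await table.AddAsync(new EmotionEntity
        {
            PartitionKey = name,
            RowKey = (i++).ToString(),
            Happiness = face.scores.happiness,
            Anger = face.scores.anger
            // ...remaining scores omitted for brevity
        });
    }
}

// Hypothetical table entity holding a subset of the scores
public class EmotionEntity : TableEntity
{
    public double Happiness { get; set; }
    public double Anger { get; set; }
}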

What's it for? Wait for it :-)

