Face and Emotion Detection in Android with Azure Cognitive Services (Face API)

Rishab Aggarwal · Published in AndroidDevelopers · 5 min read · Jun 18, 2020

In this article, we are going to learn how to build an Android application that detects faces and emotions in a photo.

First things first: you need some basic experience with Android app development in Android Studio. If not, go through the following articles:

  1. How to get started with Android App Development.
  2. Coming Soon…

TIPS FOR ABSOLUTE BEGINNERS:

  1. It’s Normal!
  2. Try to avoid Android Emulator… OR

So, now that you have some experience with Android development, let’s get started.

What the heck is Azure Cognitive Services?

Simply put, Azure Cognitive Services is a collection of APIs that let you do the following AI things without knowing anything about AI:

  1. Face (we are interested in this one)
  2. Content Moderator
  3. Personalizer
  4. Language Understanding
  5. QnA Maker
  6. Text Analytics
  7. Translator
  8. Speech to Text
  9. Text to Speech
  10. Speech Translation
  11. Computer Vision
  12. Custom Vision

Now you know what Azure Cognitive Services is. We just need to do two more things before we start coding:

  1. Make an Azure account
  2. Create a Cognitive Services resource in Azure and get the API key and endpoint

Note: The API endpoint will look something like this:

https://<API Name>.api.cognitive.microsoft.com

We are using this API for face and emotion detection, so don’t forget to add /face/v1.0/ at the end, e.g.

https://centralindia.api.cognitive.microsoft.com/face/v1.0/

The API key will be a 32-character alphanumeric string.
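For the rest of the article, the endpoint and key are used through two constants in MainActivity (this matches the faceServiceClient initialization shown later; the values below are placeholders you must replace with your own):

// Placeholders: replace with the endpoint and key from your own Azure resource.
private static final String API_ENDPOINT = "https://<API Name>.api.cognitive.microsoft.com/face/v1.0/";
private static final String API_KEY = "<your 32-character key>";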

Let’s Start Coding….

First, add the following permissions in the Android manifest.

<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.CAMERA"/>

Now add the following dependency to the app module’s build.gradle (under Gradle Scripts):

dependencies {
    ...
    implementation 'com.microsoft.projectoxford:face:1.4.3'
    ...
}

Before we can detect faces and emotions, we have to capture a photo with the phone’s camera (we can also use a static photo from res/drawable; see the sketch after the list below).

We have the following two ways of doing that:

  1. We can use an intent to capture a photo with the phone’s built-in camera application.
  2. We can use a TextureView if we don’t want to use the built-in camera application. It is a bit more complicated, but if you are interested you can check the following article.

Article Coming Soon….
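If you would rather test with a static photo from res/drawable instead of the camera, a minimal sketch looks like this (sample_face is a hypothetical drawable name):

// Load a static photo from res/drawable (sample_face is a placeholder name).
Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.sample_face);
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputStream);
ByteArrayInputStream inputStream = new ByteArrayInputStream(outputStream.toByteArray());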

Now we need to ask the user for camera permission, since we want to use the phone’s camera.

if (ContextCompat.checkSelfPermission(getApplicationContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(MainActivity.this, new String[]{Manifest.permission.CAMERA}, 110);
}
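The result of this request comes back in onRequestPermissionsResult. A minimal sketch of handling it (110 is the same request code used above; openCamera() is a hypothetical helper that fires the camera intent shown in the next step):

@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    // 110 is the request code we passed to requestPermissions above.
    if (requestCode == 110 && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        openCamera();  // hypothetical helper that starts the camera intent shown below
    } else {
        Toast.makeText(this, "Camera permission is required", Toast.LENGTH_SHORT).show();
    }
}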

After the user grants permission, we will use an intent to open the camera application.

Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(cameraIntent, 0);

After a photo is captured, we will use the onActivityResult method to retrieve the bitmap from the camera application. The bitmap will then be converted into an InputStream, which is what we will send to the Face API.

@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == 0 && resultCode == RESULT_OK) {
        // The camera app returns a thumbnail bitmap in the "data" extra.
        Bitmap bitmap = (Bitmap) data.getExtras().get("data");
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        bitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputStream);
        ByteArrayInputStream inputStream =
                new ByteArrayInputStream(outputStream.toByteArray());
    }
}

Now we need to initialize the FaceServiceClient.

private FaceServiceClient faceServiceClient = new FaceServiceRestClient(API_ENDPOINT, API_KEY);

Our Face API is ready to use.

We will use the detect function of faceServiceClient, which returns an array of the faces detected by the API. We can also ask for specific information about the faces; in our case, we need the emotion data.

Face[] result = faceServiceClient.detect(
        inputStream,
        true,  // returnFaceId
        true,  // returnFaceLandmarks
        // returnFaceAttributes:
        new FaceServiceClient.FaceAttributeType[] {
                FaceServiceClient.FaceAttributeType.Emotion }
);

We have to wrap the faceServiceClient call in an AsyncTask, as network operations are not allowed on the main thread.

AsyncTask<InputStream, String, Face[]> detectTask = new AsyncTask<InputStream, String, Face[]>() {
    @Override
    protected Face[] doInBackground(InputStream... params) {
        try {
            publishProgress("Detecting...");
            Face[] result = faceServiceClient.detect(
                    params[0],
                    true,  // returnFaceId
                    true,  // returnFaceLandmarks
                    // returnFaceAttributes:
                    new FaceServiceClient.FaceAttributeType[] {
                            FaceServiceClient.FaceAttributeType.Emotion }
            );
            return result;
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }

    @Override
    protected void onPreExecute() {
    }

    @Override
    protected void onPostExecute(Face[] result) {
    }
};
detectTask.execute(inputStream);

If you want to convert the data in Face[] into JSON, you can use the following code.

// Note: JSONObject.put throws JSONException, so keep this inside a try/catch
// (for example, inside the try block of doInBackground).
JSONObject jsonObject1 = new JSONObject();  // one entry per detected face
for (int i = 0; i < result.length; i++) {
    JSONObject jsonObject = new JSONObject();
    jsonObject.put("happiness", result[i].faceAttributes.emotion.happiness);
    jsonObject.put("sadness", result[i].faceAttributes.emotion.sadness);
    jsonObject.put("surprise", result[i].faceAttributes.emotion.surprise);
    jsonObject.put("neutral", result[i].faceAttributes.emotion.neutral);
    jsonObject.put("anger", result[i].faceAttributes.emotion.anger);
    jsonObject.put("contempt", result[i].faceAttributes.emotion.contempt);
    jsonObject.put("disgust", result[i].faceAttributes.emotion.disgust);
    jsonObject.put("fear", result[i].faceAttributes.emotion.fear);
    Log.e(TAG, "doInBackground: " + jsonObject.toString());
    jsonObject1.put(String.valueOf(i), jsonObject);
}
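If you only need the single strongest emotion per face rather than the full JSON, you can compare the scores yourself. A small sketch (dominantEmotion is a hypothetical helper, not part of the SDK; it only uses the emotion fields shown above):

// Hypothetical helper: returns the name of the highest-scoring emotion for one face.
private static String dominantEmotion(Face face) {
    String[] names = {"happiness", "sadness", "surprise", "neutral", "anger", "contempt", "disgust", "fear"};
    double[] scores = {
            face.faceAttributes.emotion.happiness,
            face.faceAttributes.emotion.sadness,
            face.faceAttributes.emotion.surprise,
            face.faceAttributes.emotion.neutral,
            face.faceAttributes.emotion.anger,
            face.faceAttributes.emotion.contempt,
            face.faceAttributes.emotion.disgust,
            face.faceAttributes.emotion.fear
    };
    int best = 0;
    for (int i = 1; i < scores.length; i++) {
        if (scores[i] > scores[best]) {
            best = i;
        }
    }
    return names[best];
}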

If you want to touch the UI from inside the AsyncTask, you can use the following code, but it is not recommended (prefer onPostExecute, which already runs on the UI thread).

runOnUiThread(new Runnable() {
    @Override
    public void run() {
        Toast.makeText(MainActivity.this, "Something", Toast.LENGTH_LONG).show();
    }
});

If you want to draw a rectangle around each face detected by the Face API, you can use the method shown below.

private static Bitmap drawFaceRectanglesOnBitmap(
        Bitmap originalBitmap, Face[] faces) {
    Bitmap bitmap = originalBitmap.copy(Bitmap.Config.ARGB_8888, true);
    Canvas canvas = new Canvas(bitmap);
    Paint paint = new Paint();
    paint.setAntiAlias(true);
    paint.setStyle(Paint.Style.STROKE);
    paint.setColor(Color.RED);
    paint.setStrokeWidth(1);
    if (faces != null) {
        for (Face face : faces) {
            FaceRectangle faceRectangle = face.faceRectangle;
            canvas.drawRect(
                    faceRectangle.left,
                    faceRectangle.top,
                    faceRectangle.left + faceRectangle.width,
                    faceRectangle.top + faceRectangle.height,
                    paint);
        }
    }
    return bitmap;
}
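A natural place to call this is onPostExecute, which already runs on the UI thread. A minimal sketch, assuming an ImageView named imageView and the captured bitmap stored in a field named imageBitmap (both hypothetical names):

@Override
protected void onPostExecute(Face[] result) {
    if (result == null) return;
    // imageView and imageBitmap are assumed fields in MainActivity.
    Bitmap annotated = drawFaceRectanglesOnBitmap(imageBitmap, result);
    imageView.setImageBitmap(annotated);
}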

GitHub Repo Link: https://github.com/rishab247/Face_and_Emotion_Detection

If you have any doubts, feel free to ask in the comments.
