How HUAWEI ML Kit’s Face Detection and Hand Keypoint Detection Capabilities Helped with Creating the Game Crazy Rockets

Irene
Published in Huawei Developers
Nov 2, 2020

Introduction

There are so many online games these days that are addictive, easy to play, and suitable for a wide age range. I've long dreamed of creating a hit game of my own, but doing so is harder than it seems. While researching online, I was fortunate to come across HUAWEI ML Kit's face detection and hand keypoint detection capabilities, which can make games much more engaging.

Application Scenarios

ML Kit's face detection capability detects up to 855 keypoints of the face and returns the coordinates of the face contours, eyebrows, eyes, nose, mouth, and ears, as well as the face's rotation angles. By integrating this capability, you can easily create a beauty app, or let users add special effects to facial images to make them more intriguing.

The hand keypoint detection capability can be applied across a wide range of scenarios. For example, after integrating this capability, a short video app can offer a diverse range of hand-triggered special effects for users to apply to their videos, providing new sources of fun and whimsy.

Crazy Rockets is a game that integrates both capabilities. It offers two playing modes, letting players steer their rockets with either face or hand movements; in both modes, the detected motions translate smoothly into in-game controls. Let's take a look at what the game looks like in practice.

Pretty exhilarating, wouldn't you say? Now, I'll show you how to create a game like Crazy Rockets by using ML Kit's face detection and hand keypoint detection capabilities.

Development Practice

Preparations

To find detailed information about the preparations you need to make, please refer to Development Process.

Here, we’ll just take a look at the most important procedures.

1. Face Detection

1.1 Configure the Maven Repository

Go to allprojects > repositories and configure the Maven repository address for the HMS Core SDK.

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Go to buildscript > repositories and configure the Maven repository address for the HMS Core SDK.

buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Go to buildscript > dependencies and add AppGallery Connect plug-in configurations.

dependencies {
    ...
    classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}

1.2 Integrate the SDK

implementation 'com.huawei.hms:ml-computer-vision-face:2.0.1.300'

1.3 Create a Face Analyzer

MLFaceAnalyzer analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer();
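The default analyzer works out of the box. If you need finer control, for example to make sure keypoints and face contours are returned, or to track the same face across frames, you can pass an MLFaceAnalyzerSetting when creating the analyzer. The option names below follow the ML Kit face detection API, but treat the exact combination as an illustrative sketch rather than the game's actual configuration.

// Illustrative custom configuration for the face analyzer.
MLFaceAnalyzerSetting setting = new MLFaceAnalyzerSetting.Factory()
        .setKeyPointType(MLFaceAnalyzerSetting.TYPE_KEYPOINTS)   // return facial keypoints
        .setShapeType(MLFaceAnalyzerSetting.TYPE_SHAPES)         // return face contours
        .setTracingAllowed(true)                                 // track the same face across frames
        .create();
MLFaceAnalyzer analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer(setting);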

1.4 Create a Processing Class

public class FaceAnalyzerTransactor implements MLAnalyzer.MLTransactor<MLFace> {
    @Override
    public void transactResult(MLAnalyzer.Result<MLFace> results) {
        SparseArray<MLFace> items = results.getAnalyseList();
        // Process detection results as required. Note that only the detection results are processed.
        // Other detection-related APIs provided by ML Kit cannot be called.
    }

    @Override
    public void destroy() {
        // Callback method used to release resources when the detection ends.
    }
}
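Just as step 2.5 does for the hand keypoint analyzer, bind this processing class to the face analyzer with setTransactor(); otherwise transactResult() will never be called. Inside transactResult() you can then turn the detected faces into game input. The snippet below is only a sketch: moveRocket() is a hypothetical game method, and the head's yaw angle (getRotationAngleY()) is used as the steering signal.

// Bind the processing class to the analyzer (mirrors step 2.5 for the hand analyzer).
analyzer.setTransactor(new FaceAnalyzerTransactor());

// Inside transactResult(), a hypothetical mapping from face data to game input:
SparseArray<MLFace> items = results.getAnalyseList();
for (int i = 0; i < items.size(); i++) {
    MLFace face = items.valueAt(i);
    // The head turning left or right (yaw angle) steers the rocket;
    // moveRocket() is a placeholder for the game's own logic.
    moveRocket(face.getRotationAngleY());
}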

1.5 Create a LensEngine to Capture Dynamic Camera Streams, and Pass Them to the Analyzer

LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
        .setLensType(LensEngine.BACK_LENS)
        .applyDisplayDimension(1440, 1080)
        .applyFps(30.0f)
        .enableAutomaticFocus(true)
        .create();
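One practical note: the snippet above opens the rear camera (BACK_LENS). For a game controlled by the player's face, you would typically use the front camera instead; LensEngine provides a front-lens constant for this. Treat the parameters below as an illustrative configuration.

// Illustrative: use the front camera so the player's face stays in view.
LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
        .setLensType(LensEngine.FRONT_LENS)
        .applyDisplayDimension(1440, 1080)
        .applyFps(30.0f)
        .enableAutomaticFocus(true)
        .create();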

1.6 Call the run Method to Start the Camera, and Read Camera Streams for Detection

// Implement other logic of the SurfaceView control by yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}
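The LensEngine needs camera access, so declare the CAMERA permission in AndroidManifest.xml and make sure it has been granted at runtime before calling run(). A minimal check using the standard Android permission APIs could look like this (the request code 1 is arbitrary):

// Request the camera permission at runtime before starting the LensEngine.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(
            this, new String[]{Manifest.permission.CAMERA}, 1);
} else {
    try {
        lensEngine.run(mSurfaceView.getHolder());
    } catch (IOException e) {
        // Exception handling logic.
    }
}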

1.7 Release Detection Resources

if (analyzer != null) {
    try {
        analyzer.stop();
    } catch (IOException e) {
        // Exception handling.
    }
}
if (lensEngine != null) {
    lensEngine.release();
}
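A natural place for this cleanup is the activity's onDestroy() callback, so the analyzer and the camera are released when the game screen closes. A sketch, assuming analyzer and lensEngine are fields of your activity:

@Override
protected void onDestroy() {
    super.onDestroy();
    // Stop the analyzer and release the camera when the activity is destroyed.
    if (analyzer != null) {
        try {
            analyzer.stop();
        } catch (IOException e) {
            // Exception handling.
        }
    }
    if (lensEngine != null) {
        lensEngine.release();
    }
}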

2. Hand Keypoint Detection

2.1 Configure the Maven Repository

Go to allprojects > repositories and configure the Maven repository address for the HMS Core SDK.

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Go to buildscript > repositories and configure the Maven repository address for the HMS Core SDK.

buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Go to buildscript > dependencies and add AppGallery Connect plug-in configurations.

dependencies {
    ...
    classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}

2.2 Integrate the SDK

// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.4.300'
// Import the hand keypoint detection model package.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.4.300'

2.3 Create a Default Hand Keypoint Analyzer

MLHandKeypointAnalyzer analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer();
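If the defaults don't fit your scenario, the analyzer can also be created from an MLHandKeypointAnalyzerSetting; for a single-player game, for instance, you might limit detection to one hand. The option names below follow the ML Kit hand keypoint API, but treat the exact values as an illustrative sketch.

// Illustrative custom configuration: return keypoints and hand box, at most one hand.
MLHandKeypointAnalyzerSetting setting = new MLHandKeypointAnalyzerSetting.Factory()
        .setSceneType(MLHandKeypointAnalyzerSetting.TYPE_ALL)
        .setMaxHandResults(1)
        .create();
MLHandKeypointAnalyzer analyzer =
        MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer(setting);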

2.4 Create a Processing Class

public class HandKeypointTransactor implements MLAnalyzer.MLTransactor<List<MLHandKeypoints>> {
    @Override
    public void transactResult(MLAnalyzer.Result<List<MLHandKeypoints>> results) {
        SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
        // Process detection results as required. Note that only the detection results are processed.
        // Other detection-related APIs provided by ML Kit cannot be called.
    }

    @Override
    public void destroy() {
        // Callback method used to release resources when the detection ends.
    }
}
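Inside transactResult() you can convert the detected keypoints into game input, for example by tracking the index fingertip. The snippet below is only a sketch: steerRocket() is a hypothetical game method, and the getter and constant names used here (getHandKeypoints(), getPointX()/getPointY(), TYPE_FOREFINGER_FOURTH) should be double-checked against the ML Kit hand keypoint API reference.

// Inside transactResult(), a hypothetical mapping from hand keypoints to game input:
SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
for (int i = 0; i < analyseList.size(); i++) {
    for (MLHandKeypoints hand : analyseList.valueAt(i)) {
        for (MLHandKeypoint keypoint : hand.getHandKeypoints()) {
            // Track the tip of the index finger and steer the rocket with it;
            // steerRocket() is a placeholder for the game's own logic.
            if (keypoint.getType() == MLHandKeypoint.TYPE_FOREFINGER_FOURTH) {
                steerRocket(keypoint.getPointX(), keypoint.getPointY());
            }
        }
    }
}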

2.5 Set the Processing Class

analyzer.setTransactor(new HandKeypointTransactor());

2.6 Create a LensEngine

LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
        .setLensType(LensEngine.BACK_LENS)
        .applyDisplayDimension(1280, 720)
        .applyFps(20.0f)
        .enableAutomaticFocus(true)
        .create();

2.7 Call the run Method to Start the Camera, and Read Camera Streams for Detection

// Implement other logic of the SurfaceView control by yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}

2.8 Release Detection Resources

if (analyzer != null) {
    analyzer.stop();
}
if (lensEngine != null) {
    lensEngine.release();
}

Learn More

For more information, please visit HUAWEI Developers.

For detailed instructions, please visit Development Guide.

You can join the HMS Core developer discussion on Reddit.

You can download the demo and sample code from GitHub.

To solve integration problems, please go to Stack Overflow.
