👨🏼‍💻LibGDX Demo Game Application With ML Kit Hand Gesture Detection and Ashley System Library Part 1

Erdal Kaymak
Published in Huawei Developers
6 min read · Mar 8, 2022

Introduction

In this demo application, we will build a LibGDX demo game application with ML Kit Hand Gesture Detection and the Ashley Entity System library. If you don't know anything about LibGDX, you can read my first article about it at this link. First, I will explain the LibGDX Ashley Entity System library and how to implement it in LibGDX. After that, I will explain ML Kit Hand Gesture Detection and how to implement it. Finally, we will create a custom camera view so hand gestures can be detected while playing the game.

Integrating Applications to HMS Core

To start developing an app with Huawei Mobile Services, you need to integrate your application with HMS Core. Check the link below to integrate your application, and don't forget to enable ML Kit in AppGallery Connect.

Ashley Entity System Library

Ashley is an entity system library that is managed under the LibGDX organization and is well suited for game development. It depends on LibGDX utility classes. Entity systems provide a different way to manage data and functionality for large sets of objects, without making the object classes rich with inheritance. Ashley can be a helpful approach for those looking for an object-modeling style like the one Unity provides, but with the scope of a framework instead of a full game engine.

The Ashley library is formed by the combination of the Entity, Component, System, and Engine classes.

Entity: An entity is a game object that exists in the game world; it is essentially a container for a list of components.

Component: Components are game data; they are attached to entities and processed by systems.

System: Systems are game logic; they use a Family to select the specific entities they operate on. Ashley provides three base system types: IntervalSystem, EntitySystem, and IteratingSystem.

Family: A family is a group of component types; it defines which components an entity must have for a specific system to process it. Systems only work with matching entities.

Engine: The Engine class is the core of the Ashley library. Systems and entities are added to the engine, which drives their updates.
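How these pieces fit together can be illustrated with a small self-contained sketch. Note this is a simplified stand-in for the pattern, not the real Ashley API; in actual code you would use com.badlogic.ashley.core.Engine, Entity, Component, Family, and IteratingSystem:

```java
import java.util.ArrayList;
import java.util.List;

public class MiniEcs {
    // Components: plain data attached to an entity, no behavior.
    static class Position { float x, y; Position(float x, float y) { this.x = x; this.y = y; } }
    static class Velocity { float dx, dy; Velocity(float dx, float dy) { this.dx = dx; this.dy = dy; } }

    // Entity: a bag of components.
    static class Entity {
        Position position;
        Velocity velocity;
    }

    // System: logic that iterates over the "family" of entities
    // owning both a Position and a Velocity component.
    static class MovementSystem {
        void update(List<Entity> entities, float deltaTime) {
            for (Entity e : entities) {
                if (e.position != null && e.velocity != null) {
                    e.position.x += e.velocity.dx * deltaTime;
                    e.position.y += e.velocity.dy * deltaTime;
                }
            }
        }
    }

    // Engine: owns entities and systems and drives the update loop.
    static class Engine {
        final List<Entity> entities = new ArrayList<>();
        final MovementSystem movement = new MovementSystem();
        void addEntity(Entity e) { entities.add(e); }
        void update(float deltaTime) { movement.update(entities, deltaTime); }
    }

    public static void main(String[] args) {
        Engine engine = new Engine();
        Entity player = new Entity();
        player.position = new Position(0f, 0f);
        player.velocity = new Velocity(10f, 0f);
        engine.addEntity(player);
        engine.update(1f); // advance one second of game time
        System.out.println(player.position.x); // 10.0
    }
}
```

In real Ashley the family filtering is declared once (e.g. Family.all(Position.class, Velocity.class).get()) instead of null checks, which is what makes the approach scale to many component types.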

ML Kit Hand Gesture Detection

This service provides two capabilities: hand keypoint detection and hand gesture recognition. The hand keypoint detection capability can detect 21 hand keypoints (including fingertips, knuckles, and wrists) and return their positions. The hand gesture recognition capability can detect and return the rectangular areas of hands in images and videos, along with the type and confidence of each gesture. It can recognize 14 gestures, including thumbs-up/down, the OK sign, fist, finger heart, and number gestures from 1 to 9. Both capabilities support detection from static images and real-time camera streams. In this project, I use hand gesture recognition and choose the number-one sign to move the player.

Assigning Permissions in the Manifest File

The ML Kit hand gesture service requires several permissions. We should declare them in the AndroidManifest.xml file as follows:
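A minimal declaration might look like the following. The exact set depends on which ML Kit features you use; CAMERA is required for the live camera stream, and INTERNET for the cloud-assisted services:

```xml
<!-- AndroidManifest.xml: permissions assumed for live hand gesture detection -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-feature android:name="android.hardware.camera" />
```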

Preparations for the Code

After adding the permissions to AndroidManifest.xml, we need to add the ML Kit hand gesture service dependency to the Project/build.gradle file.

For a LibGDX application, we should define our Android-specific services in Project/build.gradle, under the Android project configuration.

We can define the ML Kit hand gesture service dependency in the project(":android") dependencies block.
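A sketch of that block is shown below. The artifact names follow Huawei's ML Kit documentation; the version number here is an assumption, so check the current one in the HMS documentation:

```groovy
project(":android") {
    // ...existing LibGDX Android configuration...
    dependencies {
        // HMS ML Kit hand gesture recognition (version is illustrative)
        implementation 'com.huawei.hms:ml-computer-vision-gesture:3.1.0.300'
        implementation 'com.huawei.hms:ml-computer-vision-gesture-model:3.1.0.300'
    }
}
```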

Lens Engine Preview Class

We use this preview class to control the LensEngine and GraphicOverlay. Its start and stop functions start or stop the LensEngine.

LensEngine: LensEngine is a class that encapsulates camera initialization, frame obtaining, and logic control functions.
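The core of such a preview class is a small state machine: the engine may only start once the surface is available and a start has been requested. Stripped of the Android classes, the control logic looks roughly like this self-contained sketch (the real class wraps a SurfaceView and an HMS LensEngine):

```java
// Sketch of the start/stop control logic used by a LensEnginePreview-style class.
public class PreviewController {
    private boolean surfaceAvailable = false; // set by the surface callback
    private boolean startRequested = false;   // set when start() is called
    private boolean running = false;          // true while the engine streams frames

    public void start() {            // caller asks for the preview
        startRequested = true;
        startIfReady();
    }

    public void onSurfaceCreated() { // surface callback fires
        surfaceAvailable = true;
        startIfReady();
    }

    public void stop() {             // release the camera stream
        startRequested = false;
        running = false;
    }

    private void startIfReady() {
        // Only start the engine when both preconditions hold.
        if (startRequested && surfaceAvailable && !running) {
            running = true; // the real class would call lensEngine.run(surfaceHolder) here
        }
    }

    public boolean isRunning() { return running; }
}
```

The two entry points (start request and surface callback) can arrive in either order, which is why both funnel into a single startIfReady check.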

Graphic Overlay Class

I use this class to render a series of custom graphics overlaid on top of an associated preview (i.e., the camera preview). The creator can add graphics objects, update them, and remove them, triggering the appropriate drawing and invalidation within the view. It supports scaling and mirroring of the graphics relative to the camera's preview properties. The idea is that detection items are expressed in terms of a preview size but need to be scaled up to the full view size, and also mirrored in the case of the front-facing camera.

Associated [Graphic] items should use the following methods to convert to view coordinates for the graphics that are drawn: [Graphic.scaleX] and [Graphic.scaleY] adjust the size of the supplied value from the preview scale to the view scale, and [Graphic.translateX] and [Graphic.translateY] adjust the coordinate from the preview's coordinate system to the view's coordinate system.
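The scaling and mirroring described above is plain arithmetic. Here is a self-contained sketch of the two conversions, where the scale factors are view size divided by preview size and the x-coordinate is flipped for the front-facing camera:

```java
// Sketch of GraphicOverlay's preview-to-view coordinate conversion.
public class OverlayMath {
    final float widthScaleFactor;  // viewWidth / previewWidth
    final float heightScaleFactor; // viewHeight / previewHeight
    final float viewWidth;
    final boolean frontFacing;     // front-camera previews are mirrored

    OverlayMath(float viewWidth, float viewHeight,
                float previewWidth, float previewHeight, boolean frontFacing) {
        this.widthScaleFactor = viewWidth / previewWidth;
        this.heightScaleFactor = viewHeight / previewHeight;
        this.viewWidth = viewWidth;
        this.frontFacing = frontFacing;
    }

    float scaleX(float x) { return x * widthScaleFactor; }
    float scaleY(float y) { return y * heightScaleFactor; }

    // Mirror horizontally for the front camera so drawn graphics
    // line up with what the user sees on screen.
    float translateX(float x) {
        return frontFacing ? viewWidth - scaleX(x) : scaleX(x);
    }

    float translateY(float y) { return scaleY(y); }
}
```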

Custom Camera View Class

I create this class to show the device camera on the game screen, so the app can recognize hand gestures and draw their movement rectangles while the game is running.

I create two dynamic RelativeLayouts to show the camera on the game screen. To create the dynamic layouts, we should define layout params and rules. I use layout params to define the width and height of the layouts, and rules to define the position of the relative layouts on the screen.
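A hedged sketch of how such a layout can be built in code follows. These are standard Android APIs, but the sizes and rules here are illustrative placeholders, not the article's exact values:

```java
// Illustrative only: build a small preview container anchored to a screen corner.
RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(
        480,   // width in pixels (illustrative)
        360);  // height in pixels (illustrative)
params.addRule(RelativeLayout.ALIGN_PARENT_TOP);
params.addRule(RelativeLayout.ALIGN_PARENT_END);

RelativeLayout previewContainer = new RelativeLayout(context);
previewContainer.setLayoutParams(params);
// The LensEnginePreview and GraphicOverlay views are then added to
// previewContainer, and previewContainer to the game's root layout.
```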

Hand Analyzer Transactor

I create the HandAnalyzerTransactor class for processing recognition results. This class implements the MLTransactor&lt;MLGesture&gt; API and overrides its transactResult method to obtain the recognition results and implement specific services.

I call the HandGestureGraphic class with the mGraphicOverlay parameter and the result list to get the real coordinates of the result points.
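Under stated assumptions about the HMS API (MLTransactor&lt;MLGesture&gt; declares transactResult and destroy, and MLAnalyzer.Result exposes the detections via getAnalyseList; see Huawei's ML Kit documentation), the class could look roughly like this sketch:

```java
// Sketch of a result transactor; names are based on the HMS ML Kit gesture API.
public class HandAnalyzerTransactor implements MLTransactor<MLGesture> {
    private final GraphicOverlay mGraphicOverlay;

    public HandAnalyzerTransactor(GraphicOverlay overlay) {
        this.mGraphicOverlay = overlay;
    }

    @Override
    public void transactResult(MLAnalyzer.Result<MLGesture> result) {
        mGraphicOverlay.clear();
        SparseArray<MLGesture> gestures = result.getAnalyseList();
        List<MLGesture> list = new ArrayList<>();
        for (int i = 0; i < gestures.size(); i++) {
            list.add(gestures.valueAt(i));
        }
        // Convert the detected rectangles to view coordinates and draw them.
        mGraphicOverlay.add(new HandGestureGraphic(mGraphicOverlay, list));
    }

    @Override
    public void destroy() {
        mGraphicOverlay.clear();
    }
}
```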

Hand Gesture Graphic

I use this class to read the result points and adjust coordinates from the preview's coordinate system to the view's coordinate system.

I take each member of the result list and translate its rectangle to real view coordinates with the help of the translateRect method. I then read the mlGesture category, use the rect.centerX() method to get the hand's horizontal coordinate, and pass that coordinate to my constant value to move the player.
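The coordinate work in that class is again simple arithmetic. A self-contained sketch of translating a detection rectangle and taking its horizontal center (scale factors as in the overlay section; front-camera mirroring omitted for brevity):

```java
// Sketch: translate a preview-space rectangle into view space and find its center.
public class RectMath {
    final float sx; // widthScaleFactor  = viewWidth / previewWidth
    final float sy; // heightScaleFactor = viewHeight / previewHeight

    RectMath(float sx, float sy) { this.sx = sx; this.sy = sy; }

    // left, top, right, bottom in preview coordinates -> view coordinates
    float[] translateRect(float l, float t, float r, float b) {
        return new float[] { l * sx, t * sy, r * sx, b * sy };
    }

    // Horizontal center of the translated rectangle, analogous to rect.centerX().
    float centerX(float[] rect) {
        return (rect[0] + rect[2]) / 2f;
    }
}
```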

ML Kit Class

We should create this class under the android module.

I create a hand gesture recognition analyzer using a gesture analyzer setting, and set the transactor with the help of the analyzer's setTransactor method. I then initialize the custom camera view, the LensEnginePreview, and the GraphicOverlay, and create the LensEngine with the help of the LensEngine creator. If the LensEngine is not null, I start the LensEnginePreview with the LensEngine and GraphicOverlay parameters. Finally, I destroy the LensEngine and the ML gesture analyzer when they are no longer needed.
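The analyzer and LensEngine setup described above could be sketched as follows. The class and method names come from Huawei's published ML Kit samples; parameters such as the display dimensions, fps, and lens type are assumptions for illustration:

```java
// Sketch of creating the gesture analyzer and LensEngine (HMS ML Kit API;
// the parameter values are illustrative).
MLGestureAnalyzerSetting setting = new MLGestureAnalyzerSetting.Factory().create();
MLGestureAnalyzer analyzer =
        MLGestureAnalyzerFactory.getInstance().getGestureAnalyzer(setting);
analyzer.setTransactor(new HandAnalyzerTransactor(graphicOverlay));

LensEngine lensEngine = new LensEngine.Creator(context, analyzer)
        .setLensType(LensEngine.FRONT_LENS)  // front camera faces the player
        .applyDisplayDimension(640, 480)     // illustrative preview size
        .applyFps(25.0f)
        .enableAutomaticFocus(true)
        .create();

// Later, when tearing down:
// lensEnginePreview.stop();
// lensEngine.release();
// analyzer.stop();
```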

Kit Module Object

We create this module object so that Dagger Hilt can inject our MLKit class as a dependency.
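A minimal Hilt module providing the MLKit helper might look like this sketch (the annotations are standard Dagger Hilt; the KitModule and MLKit names and the constructor are this project's own, reconstructed here as assumptions):

```java
// Sketch of a Hilt module that provides the project's MLKit helper class.
@Module
@InstallIn(SingletonComponent.class)
public class KitModule {

    @Provides
    @Singleton
    public MLKit provideMlKit(@ApplicationContext Context context) {
        // MLKit is this project's wrapper around the gesture analyzer
        // and LensEngine; the constructor signature is an assumption.
        return new MLKit(context);
    }
}
```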

Android Launcher Class

I use this class to check the camera permission and trigger the ML Kit functions. It is the main Android class for playing the game on an Android device.

I inject the MLKit class with the @Inject annotation. I use the MLKit class's initPreviewAndOverlay method to initialize the LensEnginePreview and GraphicOverlay. First I check the camera permission, and then I create and start the LensEngine with the help of the MLKit class. I create the game view using the initializeForView method, and then use the MLKit class to initialize the custom view with the game view as a parameter. I stop the LensEnginePreview in the onPause method, and in the onDestroy method I release the LensEngine and stop the analyzer.
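A rough outline of the launcher follows. LibGDX's AndroidApplication and initializeForView are real APIs, and initPreviewAndOverlay is named in the text above; the other MLKit helper methods and the game class name are hypothetical reconstructions:

```java
// Sketch of the Android entry point; MLKit is this project's helper class.
@AndroidEntryPoint
public class AndroidLauncher extends AndroidApplication {

    @Inject MLKit mlKit; // provided by Dagger Hilt

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mlKit.initPreviewAndOverlay();          // LensEnginePreview + GraphicOverlay
        if (hasCameraPermission()) {            // hypothetical permission helper
            mlKit.createAndStartLensEngine();   // hypothetical method name
        }
        View gameView = initializeForView(new MyGdxGame()); // LibGDX API
        mlKit.initCustomView(gameView);         // hypothetical method name
    }

    @Override
    protected void onPause() {
        super.onPause();
        mlKit.stopPreview();                    // hypothetical method name
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        mlKit.release();                        // hypothetical method name
    }
}
```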

Conclusion

In this article, we learned how to implement and use the ML Kit hand gesture service in a LibGDX application, and how to create a custom camera view for a LibGDX game. If you want to learn more about LibGDX and its services, you can check this link. In the next part of this article, we will continue the LibGDX demo application by creating our game screen and exploring the Ashley Entity System library: we will create our player object with an entity factory class, and learn to use components, systems, and the engine in the demo game.

Take care until next time…
