How to Use the Huawei AR Engine in Unity

YusufAltun
May 22, 2020 · 5 min read

Hi everyone! In this article I will share how to use the Huawei AR Engine on the Unity platform.

Here is an example of hand gestures recognition:

Hand Gestures with 3D objects

Let's Get Started

The augmented reality market is growing every day. There were projected to be 1 billion AR users by 2020, and by 2025 the worldwide user base of AR and VR games is expected to grow to 216 million users¹.

The biggest mobile phone makers are investing in AR technology. Apple was first, releasing ARKit, a set of tools that helps developers create augmented reality applications for iOS devices. Google followed with ARCore, a development platform for building augmented reality applications on Android devices, released in 2018. One year later, in 2019, Huawei entered the game and released the HUAWEI AR Engine.

Introduction to the HUAWEI AR Engine

HUAWEI AR Engine is a platform for building augmented reality apps on Android smartphones. Through its integrated AR core algorithm and HiSilicon chip, HUAWEI AR Engine currently provides motion tracking, plane detection, light estimation, hit testing, hand gesture recognition and hand skeleton tracking, human body skeleton tracking, human body mask, image tracking, scene mesh, and facial expression tracking with face mesh.

The HUAWEI AR Engine Unity SDK implements this functionality by wrapping the NDK interfaces of the HUAWEI AR Engine. Applications can integrate the AR functionality by importing the unitypackage.

Hand Gesture Tracking

In this article, I give a brief introduction to hand gesture tracking development. To that end, I created a Unity application template.

Create project

Development Steps

After importing the SDK, create a file structure like this:

File Structure

Add a hand configuration from the Huawei AR SDK. Right-click your project folder, then follow the steps below (HuaweiARConfig):

Add Hand Config

Configure a scene:

  • Double-click the scene and delete the default camera.
  • Create an empty object, then put “HuaweiARUnitySDK -> Prefabs -> PreviewCamera” inside it. PreviewCamera renders the device camera feed as the background of the attached Unity camera.
  • Click the hand object and add the scripts “SessionComponent.cs” and “HandController.cs”, which we created in the first step, as components.
Hand Object Configuration

Session Management:

The SessionComponent script invokes the HuaweiARUnitySDK.ARSession functions to create, stop, resume, pause, and update the session.

Before we start the session, we should check whether the device supports the AR Engine.

void Init()
{
    AREnginesAvaliblity ability = AREnginesSelector.Instance.CheckDeviceExecuteAbility();
    if ((AREnginesAvaliblity.HUAWEI_AR_ENGINE & ability) != 0)
    {
        AREnginesSelector.Instance.SetAREngine(AREnginesType.HUAWEI_AR_ENGINE);
    }
}

The RequestInstall method checks the AR availability on the device and the compatibility of the engine.

installRequested = false;
switch (AREnginesApk.Instance.RequestInstall(!installRequested))
{
    case ARInstallStatus.INSTALL_REQUESTED:
        installRequested = true;
        return;
    case ARInstallStatus.INSTALLED:
        break;
}

Internally, RequestInstall checks whether the HUAWEI AR Engine APK is installed on the device and, if it is not, prompts the user to install it.

After that, check the camera permission.

const string ANDROID_CAMERA_PERMISSION_NAME = "android.permission.CAMERA";

if (AndroidPermissionsRequest.IsPermissionGranted(ANDROID_CAMERA_PERMISSION_NAME))
{
    _ConnectToService();
    return;
}

If all permissions are granted, we can connect to the AR service by invoking the functions in “HuaweiARUnitySDK -> Scripts -> ARSession.cs”.

private void _ConnectToService()
{
    ARSession.CreateSession();            // create the session
    isSessionCreated = true;              // flag to indicate the session is created
    ARSession.Config(Config);             // configure with HandARTrackingConfig
    ARSession.Resume();                   // resume the session
    ARSession.SetCameraTextureNameAuto(); // set the external texture to receive the camera feed automatically
    ARSession.SetDisplayGeometry(Screen.width, Screen.height); // set display width and height
    ...
}

Hand Controller

The HUAWEI AR Engine can detect gestures and the hand skeleton in real time. When HuaweiARUnitySDK.ARHandTrackingConfig is applied to the ARSession, it returns the recognized HuaweiARUnitySDK.ARHand objects in the camera preview. Currently, only single-hand detection is supported.

ARFrame.GetTrackables<ARHand>(newHands, ARTrackableQueryFilter.NEW); // get the hands newly detected in this frame
for (int i = 0; i < newHands.Count; i++)
{
    GameObject handObject = Instantiate(handPrefabs, Vector3.zero, Quaternion.identity, transform);
    handObject.GetComponent<HandVisualizer>().Initialize(newHands[i]);
}

When we create the hand prefab, we should attach a script to it to add some features.

The HandVisualizer script helps us show game objects and the hand box when a hand is detected.

public void Initialize(ARHand hand)
{
    m_hand = hand;
    m_handCamera = Camera.main;
    m_spider = GameObject.Find("spider3"); // find my 3D object
    m_spider.SetActive(false);
    .....
}

public void Update()
{
    if (null == m_hand)
    {
        return;
    }
    _DonotShowHandBox(); // hide objects
    if (m_hand.GetTrackingState() == ARTrackable.TrackingState.STOPPED)
    {
        Destroy(gameObject);
    }
    else if (m_hand.GetTrackingState() == ARTrackable.TrackingState.TRACKING)
    {
        _UpdateHandBox();
    }
}

void _UpdateHandBox()
{
    var handBox = m_hand.GetHandBox();        // get the bounding box of the detected hand
    Vector3 glLeftTopCorner = handBox[0];     // top-left corner of the bounding box
    Vector3 glRightBottomCorner = handBox[1]; // bottom-right corner of the bounding box
    Vector3 glLeftBottomCorner = new Vector3(glLeftTopCorner.x, glRightBottomCorner.y);
    Vector3 glRightTopCorner = new Vector3(glRightBottomCorner.x, glLeftTopCorner.y);
    ....
    int type = m_hand.GetGestureType();       // get the gesture type
}
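GetGestureType returns an integer identifying the recognized gesture. As a rough, hypothetical sketch of reacting to it (the meaning of the integer values is an assumption here; consult the HUAWEI AR Engine documentation for the actual gesture IDs), you might toggle the 3D object like this:

// Hypothetical sketch: show or hide the 3D object based on the gesture.
// Treating negative values as "no gesture recognized" is an assumption.
void _HandleGesture(int type)
{
    if (type < 0)
    {
        m_spider.SetActive(false); // no gesture recognized: hide the object
    }
    else
    {
        m_spider.SetActive(true);  // a gesture was recognized: show the object
    }
}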

Calculate the center coordinates of the detected hand:

float glCenterX = (glLeftTopCorner.x + glRightTopCorner.x) / 2;
float glCenterY = (glLeftTopCorner.y + glLeftBottomCorner.y) / 2;
Vector3 glCenter = new Vector3(glCenterX, glCenterY);

We can place the game object at the center of the hand like this:

m_spider.transform.position = (_TransferGLCoord2UnityWoldCoordWithDepth(glCenter));
....
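The _TransferGLCoord2UnityWoldCoordWithDepth helper is not shown in the article. As a hedged illustration only (an assumption, not the project's actual implementation), converting OpenGL normalized device coordinates in [-1, 1] to a Unity world position at a fixed depth could look like this:

// Hypothetical sketch: map OpenGL NDC coordinates ([-1, 1] on each axis)
// to a Unity world position at an assumed depth in front of the camera.
private Vector3 _TransferGLCoord2UnityWoldCoordWithDepth(Vector3 glCoord)
{
    const float depth = 2.0f; // assumed distance from the camera, in meters
    // Convert from [-1, 1] NDC to [0, 1] viewport coordinates, then to world space.
    Vector3 viewport = new Vector3((glCoord.x + 1f) / 2f, (glCoord.y + 1f) / 2f, depth);
    return m_handCamera.ViewportToWorldPoint(viewport);
}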

Now your project is ready to use.

Player Settings

Change the Resolution and Presentation settings accordingly:

● Set Allowed Orientations to Portrait

Change the Other Settings accordingly:

● Disable multithreaded rendering

● Set the package name and version to your own

● Set the minimum API level to 28

Now the project can be built and run on a device.

Thanks for reading. I hope this gives you a starting point for the HUAWEI AR Engine. Feel free to check out the full source code on GitHub.

Huawei Developers
