HUAWEI Scene Kit

Kadirtas · Published in Huawei Developers · 14 min read · Oct 14, 2020

Easy Integration of Complicated Features

Hi everyone,

In this article, I will talk about HUAWEI Scene Kit. HUAWEI Scene Kit is a lightweight rendering engine that features high performance and low power consumption. It provides advanced descriptive APIs that let us edit, operate on, and render 3D materials. Scene Kit adopts physically based rendering (PBR) pipelines to achieve realistic rendering effects. With this kit, we only need to call a few APIs to easily load and display complicated 3D objects on Android phones.

Scene Kit was initially announced with only the SceneView feature. With version 5.0.2.300 of the SDK, however, Huawei introduced two new features: FaceView and ARView. These additions make it much easier to integrate plane detection and face tracking.

At this point, a question may come to mind: “Since there are already ML Kit and AR Engine, why would we use Scene Kit?” Let’s answer this question with an example.

Differences Between Scene Kit and AR Engine or ML Kit

For example, suppose we have a shopping application, and its glasses purchasing section has a feature that lets the user try on glasses with AR to see how they look in real life. Here, we do not need to track facial gestures using the facial expression tracking feature provided by AR Engine. All we have to do is render a 3D object on the user’s face; face tracking is enough for this. If we used AR Engine directly, we would have to deal with graphics libraries like OpenGL. But by using Scene Kit’s FaceView, we can easily add this feature to our application without touching any graphics library, because this is a basic capability that Scene Kit provides out of the box.

So what distinguishes AR Engine and ML Kit from Scene Kit is that they provide more fine-grained control, whereas Scene Kit provides only the basic features (I’ll talk about these features later). For this reason, Scene Kit’s integration is much simpler.

Let’s examine what these features provide us.

SceneView

With SceneView, we are able to load and render 3D materials in common scenes.

It allows us to:

  • Load and render 3D materials.
  • Load the cubemap texture of a skybox to make the scene look larger and more impressive than it actually is.
  • Load lighting maps to mimic real-world lighting conditions through PBR pipelines.
  • Swipe on the screen to view rendered materials from different angles.

ARView

ARView uses the plane detection capability of AR Engine, together with the graphics rendering capability of Scene Kit, to provide us with the capability of loading and rendering 3D materials in common AR scenes.

With ARView, we can:

  • Load and render 3D materials in AR scenes.
  • Set whether to display the lattice plane (consisting of white lattice points) to help select a plane in a real-world view.
  • Tap an object placed onto the lattice plane to select it. Once selected, the object will change to red. Then we can move, resize, or rotate it.

FaceView

FaceView can use the face detection capability provided by ML Kit or AR Engine to dynamically detect faces. Combined with the graphics rendering capability of Scene Kit, FaceView provides us with superb AR scene rendering dedicated to faces.

With FaceView we can:

  • Dynamically detect faces and apply 3D materials to the detected faces.

As I mentioned above, ARView uses the plane detection capability of AR Engine, and FaceView uses the face detection capability provided by either ML Kit or AR Engine. When using the FaceView feature, we can choose which SDK to use by specifying it in the layout.

Here, we should consider the devices to be supported when choosing the SDK. For the list of supported devices, you can visit this page. (In addition to the table on that page, Scene Kit’s SceneView feature also supports P40 Lite devices.)

Also, I think it is useful to mention some important working principles of Scene Kit:

Scene Kit

  • Provides a Full-SDK, which we can integrate into our app to access 3D graphics rendering capabilities even on phones without HMS Core.
  • Uses the Entity Component System (ECS) to reduce coupling and implement multi-threaded parallel rendering.
  • Adopts real-time PBR pipelines so that rendered images look realistic.
  • Supports the general-purpose GPU Turbo to significantly reduce power consumption.

Demo App

Let’s learn in more detail by integrating these 3 features of the Scene Kit with a demo application that we will develop in this section.

To configure the Maven repository address for the HMS Core SDK, add the following line to the project-level build.gradle, in both of these blocks:

project level build.gradle > buildscript > repositories

project level build.gradle > allprojects > repositories

maven { url 'https://developer.huawei.com/repo/' }
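With both sections updated, the repositories blocks of the project-level build.gradle would look roughly like this (a sketch; the other repositories shown are typical defaults from that era and may differ in your project):

buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}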

After that, go to

module level build.gradle > dependencies

and add the build dependency for the Full-SDK of Scene Kit in the dependencies block.

implementation 'com.huawei.scenekit:full-sdk:5.0.2.302'

Note: When adding build dependencies, replace the version here “full-sdk: 5.0.2.302” with the latest Full-SDK version. You can find all the SDK and Full-SDK version numbers in Version Change History.

Then click Sync Now to sync the project.

After the build completes successfully, add the following line to the AndroidManifest.xml file for the camera permission.

<uses-permission android:name="android.permission.CAMERA" />

Now our project is ready for development, and we can use all the functionalities of Scene Kit.

Let’s say this demo app is a shopping app, and I want to use Scene Kit features in it. We’ll use Scene Kit’s ARView feature in the “office” section of our application to test how a plant and an aquarium look on our desk.

And in the sunglasses section, we’ll use the FaceView feature to test how sunglasses look on our face.

Finally, we will use the SceneView feature in the shoes section of our application. We’ll test a shoe to see how it looks.

We will need materials to test these features, so let’s get them first. I will use 3D models that you can download from the links below. You can use the same or different materials if you want.

Capability: ARView, Used Models: Plant, Aquarium

Capability: FaceView, Used Model: Sunglasses

Capability: SceneView, Used Model: Shoe

Note: I used 3D models in “.glb” format as assets for the ARView and FaceView features. However, the links above contain 3D models in “.gltf” format, so I converted them to “.glb”. You can obtain a “.glb” model by uploading all the files of a downloaded model (the textures, scene.bin, and scene.gltf) to an online converter website; any online conversion website will do.

All materials must be stored in the assets directory, so we place them under app > src > main > assets in our project.

After adding the materials, we will start with the ARView feature. Since we assume that there are office supplies in the activity where we will use ARView, let’s create an activity named OfficeActivity and first develop its layout.

Note: These activities must extend the Activity class directly. Update any activities that extend AppCompatActivity to extend Activity instead.

Example: It should be “OfficeActivity extends Activity”.

ARView

In order to use the ARView feature of Scene Kit, we add the following ARView element to the layout (the activity_office.xml file).

Overview of the activity_office.xml file:
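A minimal sketch of such a layout; the view ids (ar_view, btn_flower, btn_aquarium), string resources, and onClick handler names are my assumptions:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.huawei.hms.scene.sdk.ARView
        android:id="@+id/ar_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <Button
        android:id="@+id/btn_flower"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:text="@string/btn_text_load"
        android:onClick="onBtnFlowerToggleClicked" />

    <Button
        android:id="@+id/btn_aquarium"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_alignParentEnd="true"
        android:text="@string/btn_text_load"
        android:onClick="onBtnAquariumToggleClicked" />

</RelativeLayout>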

We specified two buttons: one for loading the aquarium and the other for loading the plant. Now, let’s do the initializations in OfficeActivity and activate the ARView feature in our application. First, let’s override the onCreate() function to obtain the ARView and the buttons that will trigger the object-loading code.
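A sketch of this step, assuming the ids from the layout sketch above:

import android.app.Activity;
import android.os.Bundle;
import android.widget.Button;

import com.huawei.hms.scene.sdk.ARView;

public class OfficeActivity extends Activity {

    private ARView mARView;
    private Button mButtonFlower;
    private Button mButtonAquarium;
    private boolean isLoadFlowerResource = false;
    private boolean isLoadAquariumResource = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_office);
        // Obtain the ARView and the two loading buttons from the layout.
        mARView = findViewById(R.id.ar_view);
        mButtonFlower = findViewById(R.id.btn_flower);
        mButtonAquarium = findViewById(R.id.btn_aquarium);
    }
}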

Then we add the method that will be triggered when the buttons are clicked. Here we check the loading status of the object and either load or clear it accordingly.

For the plant button:
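A sketch reconstructed from the snippets explained below; the method name matches the onClick attribute assumed in the layout sketch (add android.view.View to the imports):

public void onBtnFlowerToggleClicked(View view) {
    mARView.enablePlaneDisplay(true);
    if (!isLoadFlowerResource) {
        // Load the plant model and set its initial scale and rotation.
        mARView.loadAsset("ARView/flower.glb");
        float[] scale = new float[] { 0.15f, 0.15f, 0.15f };
        float[] rotation = new float[] { 0.707f, 0.0f, -0.500f, 0.0f };
        mARView.setInitialPose(scale, rotation);
        isLoadFlowerResource = true;
        mButtonFlower.setText(R.string.btn_text_clear_resource);
    } else {
        // Remove the plant from the scene.
        mARView.clearResource();
        mARView.loadAsset("");
        isLoadFlowerResource = false;
        mButtonFlower.setText(R.string.btn_text_load);
    }
}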

For the aquarium button:
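The aquarium handler follows the exact same pattern; only the asset path, state flag, button, and pose values differ (the scale here is an assumed placeholder; tune it for the model):

public void onBtnAquariumToggleClicked(View view) {
    mARView.enablePlaneDisplay(true);
    if (!isLoadAquariumResource) {
        mARView.loadAsset("ARView/aquarium.glb");
        mARView.setInitialPose(new float[] { 0.015f, 0.015f, 0.015f },
                new float[] { 0.707f, 0.0f, -0.500f, 0.0f });
        isLoadAquariumResource = true;
        mButtonAquarium.setText(R.string.btn_text_clear_resource);
    } else {
        mARView.clearResource();
        mARView.loadAsset("");
        isLoadAquariumResource = false;
        mButtonAquarium.setText(R.string.btn_text_load);
    }
}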

Now let’s talk about what this code does, line by line. First, we call the ARView.enablePlaneDisplay() function with true so that, when a plane is detected in the real world, the program displays a lattice plane there.

mARView.enablePlaneDisplay(true);

Then we check whether the object has already been loaded. If it is not loaded, we specify the path to the 3D model we selected with the mARView.loadAsset() function and load it. (assets > ARView > flower.glb)

mARView.loadAsset("ARView/flower.glb");

Then we create and initialize the scale and rotation arrays for the starting pose. For now, we are hardcoding these values; in a future version, we could let the user set a starting position, for example by holding down on the screen.

Note: The Scene Kit ARView feature already allows us to move, adjust the size and change the direction of the object we have created on the screen. For this, we should select the object we created and move our finger on the screen to change the position, size or direction of the object.

Here we can adjust the direction or size of the object by adjusting the rotation and scale values. (These values will be used as parameters of the setInitialPose() function.)

Note: These values depend on the model used. To find appropriate values, you should experiment yourself. For details, see the documentation of the ARView setInitialPose() function.

float[] scale = new float[] { 0.15f, 0.15f, 0.15f };
float[] rotation = new float[] { 0.707f, 0.0f, -0.500f, 0.0f };

Then we set the scale and rotation values we created as the starting position.

mARView.setInitialPose(scale, rotation);

After this process, we set the boolean flag to indicate that the object has been loaded, and we update the text of the button.

isLoadResource = true;
mButton.setText(R.string.btn_text_clear_resource);

If the object is already loaded, we clear the resource and load an empty asset so that the object is removed from the screen.

mARView.clearResource();
mARView.loadAsset("");

Then we reset the boolean flag and finish by updating the button text.

isLoadResource = false;
mButton.setText(R.string.btn_text_load);

Finally, we should not forget to override the following lifecycle methods to keep the ARView synchronized with the activity.
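A sketch of those overrides, forwarding the activity lifecycle to the ARView as the Scene Kit samples do:

@Override
protected void onPause() {
    super.onPause();
    mARView.onPause();
}

@Override
protected void onResume() {
    super.onResume();
    mARView.onResume();
}

@Override
protected void onDestroy() {
    super.onDestroy();
    mARView.destroy();
}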

With that, OfficeActivity is complete; you can see its full version in the code linked at the end of the article.

In this way, we added the ARView feature of Scene Kit to our application. We can now use the ARView feature. Now let’s test the ARView part on a device that supports the Scene Kit ARView feature.

Let’s place plants and aquariums on our table as below and see how it looks.

In order for ARView to recognize the ground, you first need to move the camera slowly until the plane points appear on the screen. After the plane points appear on the ground, we specify that we will add a plant by clicking the Load Flower button. Then we can add the plant by tapping the point on the screen where we want it to appear. When we do the same with the aquarium button, we can add an aquarium.

I placed an aquarium and plants on my table. You can test how it looks by placing plants or aquariums on your table or anywhere. You can see how it looks in the photo below.

Note: “Clear Flower” and “Clear Aquarium” buttons will remove the objects we have placed on the screen.

After creating the objects, we can select the object we want and move it, resize it, or rotate it, as you can see in the picture below. Under normal conditions, the color of the selected object turns red. (The color of some models doesn’t change; for example, when the aquarium model is selected, it doesn’t turn red.)

If we want to change the size of the object after selecting it, we can zoom in and out using two fingers. In the picture above you can see that I changed the plants’ sizes. We can also move the selected object by dragging it, and to change its direction, we can move two fingers in a circular motion.

FaceView

In this part of the article, we will add the FaceView feature to our application. Since we will use FaceView in the sunglasses test section, we will create an activity called SunglassesActivity. Again, we start by editing the layout.

We specify which SDK FaceView will use when creating the layout:

The overview of activity_sunglasses layout file:
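A minimal sketch of the layout; the face_view id is an assumption, while the sdk_type attribute follows the Scene Kit documentation:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.huawei.hms.scene.sdk.FaceView
        android:id="@+id/face_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:sdk_type="AR_ENGINE" />

</RelativeLayout>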

Here I state that I will use the AR Engine face tracking SDK by setting the sdk type to “AR_ENGINE”. Now, let’s override the onCreate() function in SunglassesActivity, obtain the FaceView that we added to the layout, and initialize the listener by calling the init() function.
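A sketch, assuming the face_view id from the layout above:

import android.app.Activity;
import android.os.Bundle;

import com.huawei.hms.scene.sdk.FaceView;

public class SunglassesActivity extends Activity {

    private FaceView mFaceView;
    private boolean isLoaded = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sunglasses);
        mFaceView = findViewById(R.id.face_view);
        init();
    }
}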

Now we’re adding the init() function, which I will explain line by line:
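A sketch of init(); the asset path and pose values are assumptions, and you should check the exact loadAsset()/setInitialPose() signatures against the Scene Kit docs for your SDK version (it also needs android.view.View, android.widget.Toast, and the Scene Kit LandmarkType imports):

private void init() {
    // Initial pose for the glasses; tune these values for your model.
    final float[] position = new float[] { 0.0f, 0.0f, 0.0f };
    final float[] rotation = new float[] { 0.0f, 0.0f, 0.0f, 0.0f };
    final float[] scale = new float[] { 1.0f, 1.0f, 1.0f };

    mFaceView.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            if (!isLoaded) {
                // Anchor the model at the tip of the user's nose.
                int index = mFaceView.loadAsset("FaceView/sunglasses.glb",
                        LandmarkType.TIP_OF_NOSE);
                if (index < 0) {
                    // A negative return value means loading failed.
                    Toast.makeText(SunglassesActivity.this,
                            "Load asset failed", Toast.LENGTH_SHORT).show();
                    return;
                }
                mFaceView.setInitialPose(index, position, rotation, scale);
                isLoaded = true;
            } else {
                // Remove the glasses by clearing the resource
                // and loading an empty asset.
                mFaceView.clearResource();
                mFaceView.loadAsset("", LandmarkType.TIP_OF_NOSE);
                isLoaded = false;
            }
        }
    });
}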

In this function, we first create the position, rotation, and scale values that we will use for the initial pose. (These values will be used as parameters of the setInitialPose() function.)

Note: These values depend on the model used. To find appropriate values, you should experiment yourself. For details, see the documentation of the FaceView setInitialPose() function.

Then we set a click listener on the FaceView layout, because we want to trigger the code that shows the sunglasses on the user’s face when the user taps the screen.

In the onClick function, we first check whether the sunglasses have already been created. If not, we load the material to be rendered by passing its path to the FaceView.loadAsset() function (here, the path of the sunglasses we added under assets > FaceView) and set the marker position. For example, here we set the marker position to LandmarkType.TIP_OF_NOSE, so FaceView will use the user’s nose as the center when loading the model.

This function returns an integer. If the value is negative, the load has failed; if it is non-negative, it is the index of the loaded material. So we check the return value in case of an error: if loading failed, we show a Toast message and return.

If there is no error, we set the initial pose of the model and set the boolean flag to indicate that we loaded the model successfully.

If the sunglasses are already loaded when the user taps, we clear the resource with clearResource(), then load an empty asset to remove the sunglasses.

Finally, we override the following lifecycle functions to keep the FaceView synchronized with the activity:
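The same pattern as in OfficeActivity, this time forwarding the lifecycle to the FaceView:

@Override
protected void onPause() {
    super.onPause();
    mFaceView.onPause();
}

@Override
protected void onResume() {
    super.onResume();
    mFaceView.onResume();
}

@Override
protected void onDestroy() {
    super.onDestroy();
    mFaceView.destroy();
}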

That completes SunglassesActivity; you can find its final version in the full code linked at the end of the article.

With that, we have added FaceView to our application. We can now start the sunglasses test using the FaceView feature. Let’s compile and run this part on a device that supports Scene Kit’s FaceView feature.

The glasses will be created when you touch the screen after the camera has turned on.

SceneView

In this part of the article, we will implement the SceneView feature of Scene Kit, which we will use in the shoe purchasing section of our application.

Since we will use the SceneView feature in the shoe purchasing scenario, we create an activity named ShoesActivity. In this activity’s layout, we will use a custom view that extends SceneView. For this, let’s first create our CustomSceneView class and the constructors that allow the activity and the layout inflater to initialize it.
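A sketch of the class and its two constructors:

import android.content.Context;
import android.util.AttributeSet;

import com.huawei.hms.scene.sdk.SceneView;

public class CustomSceneView extends SceneView {

    // Used when creating the view programmatically.
    public CustomSceneView(Context context) {
        super(context);
    }

    // Used when inflating the view from an XML layout.
    public CustomSceneView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }
}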

After adding the constructors, we need to override the surfaceCreated() method and call the APIs of SceneView to load and initialize the materials.

Note: We should add both constructors, since the view can be created either programmatically or from an XML layout.

We are overriding the surfaceCreated() function belonging to SceneView.

The super method contains the initialization logic, so when overriding the surfaceCreated() method, we should call the super method in the first line.

Then we load the shoe model with the loadScene() function. We can add a background with the loadSkyBox() function, load the reflection effect with the loadSpecularEnvTexture() function, and finally load the diffuse map by calling the loadDiffuseEnvTexture() function.
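A sketch of the override; the file names under assets/SceneView/ are assumptions based on the materials placed there (add android.view.SurfaceHolder to the imports):

@Override
public void surfaceCreated(SurfaceHolder holder) {
    super.surfaceCreated(holder);
    loadScene("SceneView/shoe.gltf");                            // the 3D model
    loadSkyBox("SceneView/skyboxTexture.dds");                   // background
    loadSpecularEnvTexture("SceneView/specularEnvTexture.dds");  // reflections
    loadDiffuseEnvTexture("SceneView/diffuseEnvTexture.dds");    // diffuse map
}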

Also, if we want extra touch control on this view, we can override the onTouchEvent() function.
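For example, a pass-through override that leaves room for custom gestures (add android.view.MotionEvent to the imports):

@Override
public boolean onTouchEvent(MotionEvent motionEvent) {
    // Custom gesture handling could be added here before delegating
    // to SceneView's built-in swipe-to-rotate behavior.
    return super.onTouchEvent(motionEvent);
}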

Now let’s add CustomSceneView, the custom view we created, to the layout of ShoesActivity.
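For example, activity_shoes.xml could consist of just the custom view (replace com.example.scenekitdemo with your app’s actual package name; it is a placeholder):

<?xml version="1.0" encoding="utf-8"?>
<com.example.scenekitdemo.CustomSceneView
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />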

Now all we have to do is set this layout in the activity by overriding the onCreate() function of ShoesActivity.
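A sketch of the whole activity, which only needs to set the layout:

import android.app.Activity;
import android.os.Bundle;

public class ShoesActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_shoes);
    }
}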

That’s it!

Now that we have added the SceneView feature for the shoe purchasing section, it is time to call all of these activities from MainActivity.

Now let’s edit the layout of MainActivity, where we will manage the navigation, and design a perfectly bad UI as below :)

Now, let’s do the necessary initializations in MainActivity. First, we set the layout by overriding the onCreate() method.

Then we add code to the MainActivity class to handle the button clicks, as shown in the sketch after the next paragraph. Of course, we should not forget that the ARView and FaceView features use the camera, so we must check the camera permission in these click handlers.

After checking the camera permission, we override the onRequestPermissionsResult() function, where the flow continues, and launch the clicked activity according to the request codes we pass in the button click functions. For this, we add code like the following to MainActivity.
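A sketch of the whole flow using the framework permission APIs (API 23+); the handler names, request codes, and layout id are assumptions:

import android.Manifest;
import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.view.View;

public class MainActivity extends Activity {

    private static final int REQUEST_CODE_AR_VIEW = 1;
    private static final int REQUEST_CODE_FACE_VIEW = 2;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }

    public void onBtnOfficeClicked(View view) {
        if (checkSelfPermission(Manifest.permission.CAMERA)
                != PackageManager.PERMISSION_GRANTED) {
            requestPermissions(new String[] { Manifest.permission.CAMERA },
                    REQUEST_CODE_AR_VIEW);
        } else {
            startActivity(new Intent(this, OfficeActivity.class));
        }
    }

    public void onBtnSunglassesClicked(View view) {
        if (checkSelfPermission(Manifest.permission.CAMERA)
                != PackageManager.PERMISSION_GRANTED) {
            requestPermissions(new String[] { Manifest.permission.CAMERA },
                    REQUEST_CODE_FACE_VIEW);
        } else {
            startActivity(new Intent(this, SunglassesActivity.class));
        }
    }

    public void onBtnShoesClicked(View view) {
        // SceneView only renders a local model; no camera permission needed.
        startActivity(new Intent(this, ShoesActivity.class));
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions,
                                           int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (grantResults.length > 0
                && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            if (requestCode == REQUEST_CODE_AR_VIEW) {
                startActivity(new Intent(this, OfficeActivity.class));
            } else if (requestCode == REQUEST_CODE_FACE_VIEW) {
                startActivity(new Intent(this, SunglassesActivity.class));
            }
        }
    }
}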

Now that we have finished the coding part, we can add some notes.

NOTE: To achieve the expected ARView and FaceView experiences, our app should not support screen orientation changes or split-screen mode; for a better display effect, add the following configuration to the AndroidManifest.xml file inside the related activity tags:

android:configChanges="screenSize|orientation|uiMode|density"
android:screenOrientation="portrait"
android:resizeableActivity="false"

Note: We can also enable full-screen display for the activities implementing SceneView, ARView, or FaceView to get better display effects:

android:theme="@android:style/Theme.NoTitleBar.Fullscreen"

After making these configurations, you can check the final state of the AndroidManifest.xml file in the full code linked at the end of the article.

And done :) Let’s test our app on a device that supports these features.

SceneView:

MainActivity:

Summary

In this article, I tried to show, through a shopping-app scenario, how Scene Kit lets us easily add features that would otherwise be very difficult to implement, without dealing with any graphics library. I hope this article has helped you. Thank you for reading.

See you in my next articles …

Full code:

Sources:

3D Models:
