ARCore with Unity — Beginner on Android

Face to face with Diablo through ARCore

With the improvements in computer vision and advances in hardware, AR and VR are bound to become part of our daily lives. Google has also released ARCore 1.2.0. With all of this, I had to try it out and get some hands-on experience.


For this tutorial, we need some basic knowledge of Unity. With no formal training in Unity, my first task was to find some online tutorials to introduce me to it; once familiar with it, I could look for more advanced sessions. As for ARCore, the content seems very scarce. There are courses on various training websites, but I am not sure how good they are. Basic knowledge of C# is also required.

Also note, the information provided below is based on my own personal understanding, knowledge and experience. I too am learning, and I would be glad if you share your thoughts and knowledge too. You will also need an ARCore-supported device; I am currently using a Pixel 2.


ARCore has three main capabilities used for rendering virtual content in the real world through a display medium (like the camera):

  • Understanding the environment — the ability to recognise horizontal surfaces in the real world. These surfaces will be used to anchor the virtual objects.
  • Motion Tracking — used to track the device's position and orientation in the real world, which keeps virtual objects anchored in their places as you move.
  • Lighting Estimation — takes the lighting information from the real world and applies it to the virtual objects.
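To make lighting estimation more concrete, here is a minimal sketch of reading the current light estimate with the ARCore Unity SDK (the `Frame.LightEstimate` API from the `GoogleARCore` namespace, as shipped in SDK 1.2 — verify the member names against your installed version). The Environmental Light prefab we add later does this for us automatically.

```csharp
using GoogleARCore;
using UnityEngine;

public class LightingLogger : MonoBehaviour
{
    void Update()
    {
        // Frame.LightEstimate exposes ARCore's estimate for the current camera frame.
        LightEstimate estimate = Frame.LightEstimate;
        if (estimate.State == LightEstimateState.Valid)
        {
            // PixelIntensity is a normalised brightness value for the frame,
            // which can be used to dim or brighten virtual objects to match.
            Debug.Log("Estimated scene brightness: " + estimate.PixelIntensity);
        }
    }
}
```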

Let us start by developing our first application using Unity and ARCore for Android. We are going to display a virtual object in the real world.


Here is a checklist before we continue. You can follow along even if you do not meet all the requirements, but you will not be able to build and deploy.

  • Unity 2018.1.0f2
  • Android SDK with API level 24 (Android 7.0) or higher
  • Google ARCore SDK for Unity
  • Supported hardware for deployment (I am using Pixel 2)

Create a new project in Unity and name it ARCoreWorld.

Once done, we need to import the Unity package we downloaded. To import it, go to Assets → Import Package → Custom Package, select the file and click Import to start the process.

ARCore Unity SDK (At the time of writing this 1.2.0 was latest)

It should then display the items that are being imported. For this case, we are going to import everything.

Once imported, delete the Main Camera and Directional Light from the scene. Then add the prefabs (1) ARCore Device and (2) Environmental Light.

All Prefabs can be found under Favourites → All Prefabs on Project container (bottom left)

In this state, the application is buildable, but to build it for Android we need to switch the platform. You can access the build settings from File → Build Settings.

Select Android and then click on Switch Platform

Once it is done, click on Player Settings and in here we are going to make the following changes:

Here we are going to provide a valid package name. Set the Minimum API Level to Android API 24 and the Target API Level to the highest installed. Also make sure to disable Multithreaded Rendering. In the XR Settings, we are going to enable ARCore Supported.

At this point, the skeleton code is ready but running this application now would just display the real view through the camera.

Nothing surprising in here

Point Cloud

A point cloud is a container holding a set of 3D data points about the real world. These points are used to deduce information about the scene and the composition of the frame so that it can be augmented with virtual objects.
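As a rough sketch of what the SDK exposes, the current frame's point cloud can be read through `Frame.PointCloud` in the ARCore Unity SDK (member names as in SDK 1.2 — later releases renamed some of these, so check your version):

```csharp
using GoogleARCore;
using UnityEngine;

public class PointCloudLogger : MonoBehaviour
{
    void Update()
    {
        // Only iterate when ARCore has produced a new point cloud this frame.
        if (Frame.PointCloud.IsUpdatedThisFrame)
        {
            for (int i = 0; i < Frame.PointCloud.PointCount; i++)
            {
                // Each point is a world-space position ARCore has identified
                // as a stable visual feature in the camera image.
                Vector3 point = Frame.PointCloud.GetPoint(i);
                Debug.Log("Feature point: " + point);
            }
        }
    }
}
```

The Pointcloud Visualizer component we attach below does essentially this, rendering the points instead of logging them.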

For this we need to create a material. Materials are objects which allow us to apply texture or colour information to a 3D model.

Create a material named PointMaterial and assign the shader type as ARCore → PointCloud. I changed the size to 15 and the color to red.

Once the material is created, we need an object on which to use it. For this purpose, we will create a cube and apply the material to it. Finally, we need to add a component called Pointcloud Visualizer.

Building the application at the current state would look something like this.

Point Cloud detection on the keyboard

We now have the groundwork ready. Next, we are going to place our virtual object, Diablo, into the real world. You can get the model from here. It was created by another developer; I am using it purely for the purpose of learning.

Let us import the model into Unity. Create a folder named Diablo and drag and drop the .OBJ file into it. Once the model has been imported, you will notice that it is just the wireframe of the model with no texture. Not to worry: the developer of the model also included the required texture, but in DDS format. We need to convert it from DDS to TGA so that we can import it into Unity. You can use any online tool to perform the conversion. Once done, drag and drop the TGA file into Unity.

To use the TGA file as a texture, we need to create a material out of it. Create a new material, set the shader to Unlit/Texture, and assign the TGA file as its texture.

Now drag the new material onto the Diablo model and watch it change from the greyscale model to the one we want to see.

Diablo model once the material is applied

I reduced the scale of the model as it was otherwise very big. You can do so by selecting the model and reducing the Scale Factor from 1 to 0.15, or as you desire. Once done, create a prefab from the Diablo model, which we will use in the renderer later.

Now we need to visualise the plane on which the virtual object will be drawn. For this we can use the Plane Generator and Plane Visualiser components.

Plane generator (interconnected triangles) with point cloud.

Going forward we will reduce the size of the point cloud cube to 5 units.


Now let's get into some coding. We have the plane and the model with the applied texture. Now we need to place the virtual Diablo object in the real world.

Create an empty object and call it DiabloRenderer, then attach a new C# script to it. The Update method should look something like the fragments below.

First, we need to expose three public variables which we then link in the Inspector:

  • FirstPersonCamera — the camera found on the ARCore Device prefab. It allows us to view the virtual object.
  • DetectedPlanePrefab — used to visualise the detected plane on which the virtual object is to be rendered.
  • DiabloGameObject — the prefab of the Diablo model.
if (Input.touchCount < 1 || (touch = Input.GetTouch(0)).phase != TouchPhase.Began)

This line checks whether we have touched the screen. If not, there is nothing to be drawn.

if (Frame.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))

Once we touch the screen, the line above takes the screen coordinates of the touch and raycasts them to the position where the virtual object needs to be placed in the real world.

var diabloGameObj = Instantiate(DiabloGameObject, hit.Pose.position, hit.Pose.rotation);
diabloGameObj.transform.Rotate(0, k_ModelRotation, 0, Space.Self);
var anchor = hit.Trackable.CreateAnchor(hit.Pose);
diabloGameObj.transform.parent = anchor.transform;

If the hit is good and favourable for rendering, an instance of the game object (in this case the Diablo model) is first created. In the real world it will be placed at the location defined by hit.Pose.position. Next, to keep the model fixed in place in the real world even as we move around it, we need to ensure that the virtual object is anchored to the pose in which it was created. Hence, we create an Anchor and assign it as the parent of the game object.
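Putting the fragments above together, the whole script might look like the following sketch, adapted from Google's HelloAR example for ARCore Unity SDK 1.2. The `raycastFilter` value and the `k_ModelRotation` constant of 180 degrees come from that example; verify both against your SDK version.

```csharp
using GoogleARCore;
using UnityEngine;

public class DiabloRenderer : MonoBehaviour
{
    // Linked in the Inspector, as described above.
    public Camera FirstPersonCamera;
    public GameObject DetectedPlanePrefab;
    public GameObject DiabloGameObject;

    // Rotate the model so it faces the camera when placed.
    private const float k_ModelRotation = 180.0f;

    void Update()
    {
        // Only act on the first frame of a new touch.
        Touch touch;
        if (Input.touchCount < 1 || (touch = Input.GetTouch(0)).phase != TouchPhase.Began)
        {
            return;
        }

        // Raycast against planes and feature points tracked by ARCore.
        TrackableHit hit;
        TrackableHitFlags raycastFilter = TrackableHitFlags.PlaneWithinPolygon |
            TrackableHitFlags.FeaturePointWithSurfaceNormal;

        if (Frame.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))
        {
            // Instantiate the model at the hit pose.
            var diabloGameObj = Instantiate(DiabloGameObject, hit.Pose.position, hit.Pose.rotation);
            diabloGameObj.transform.Rotate(0, k_ModelRotation, 0, Space.Self);

            // Anchor the model so it stays fixed in the real world as tracking updates.
            var anchor = hit.Trackable.CreateAnchor(hit.Pose);
            diabloGameObj.transform.parent = anchor.transform;
        }
    }
}
```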


Alright, we have made many changes. Let us now see how our application looks.

Reduced the scale to 0.05 for the purpose of the screenshot

The virtual object can be viewed from all 360 degrees. The video below shows all the details.

Diablo in my room rendering through ARCore

We have used the Google ARCore example as the basis for our understanding. I will continue to work on this application, make further improvements, and try to write a next part that takes it to the next level.

I hope I have been able to provide useful information and walk through some of the key points. Do let me know if you have any comments, or any pointers I can look into to learn more.
