Introduction to Motion Tracking in ARCore…

Shivang Chopra · Published in Coding Blocks · Jul 23, 2018 · 8 min read

Before digging into the concept of Motion Tracking in ARCore and its implementation, it is important to learn about the various pieces of phone hardware that ARCore uses and the purpose each serves in creating a better augmented experience for the user.

The mobile hardware can be broadly categorised into three categories based on functionality:

  1. Hardware that enables motion tracking
  2. Hardware that enables location-based AR
  3. Hardware that enables a view of the real world with AR

Hardware that enables Motion Tracking

Accelerometer: Measures acceleration, the rate at which velocity changes over time. Simply put, it's the measure of change in velocity. Acceleration forces can be static and continuous, like gravity, or dynamic, such as movement or vibrations.

Gyroscope: Measures and/or maintains orientation and angular velocity. When you change the rotation of your phone while using an AR experience, the gyroscope measures that rotation and ARCore ensures that the digital assets respond correctly.

Phone Camera: With mobile AR, your phone camera supplies a live feed of the surrounding real world upon which AR content is overlaid. In addition to the camera itself, ARCore-capable phones like the Google Pixel rely on complementary technologies like machine learning, complex image processing, and computer vision to produce high-quality images and spatial maps for mobile AR.
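Before ARCore abstracts these sensors away, it helps to see their raw output. Below is a minimal Unity sketch (the MotionSensorReader name is ours) that reads the accelerometer and gyroscope described above:

```csharp
using UnityEngine;

// A minimal sketch for inspecting the motion-tracking sensors in Unity.
// ARCore consumes this data internally; this script just makes it visible.
public class MotionSensorReader : MonoBehaviour
{
    void Start()
    {
        // The gyroscope is disabled by default and must be switched on.
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // Acceleration along the device's axes, in units of g (includes gravity).
        Vector3 acceleration = Input.acceleration;

        // Angular velocity around each axis, in radians per second.
        Vector3 rotationRate = Input.gyro.rotationRate;

        Debug.Log("Accel: " + acceleration + "  Gyro: " + rotationRate);
    }
}
```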

Hardware that enables location-based AR

Magnetometer: Gives smartphones a simple orientation related to the Earth’s magnetic field. Because of the magnetometer, your phone always knows which direction is North, allowing it to auto-rotate digital maps depending on your physical orientation. This device is key to location-based AR apps.

GPS: A global navigation satellite system that provides geolocation and time information to a GPS receiver, like in your smartphone. For ARCore-capable smartphones, this device helps enable location-based AR apps.
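Similarly, here is a minimal Unity sketch (the LocationSensorReader name is ours) that reads the magnetometer heading and the GPS fix that location-based AR apps build on:

```csharp
using System.Collections;
using UnityEngine;

// A minimal sketch for inspecting the location-related sensors in Unity.
public class LocationSensorReader : MonoBehaviour
{
    IEnumerator Start()
    {
        // The compass (magnetometer) must be enabled before it reports headings.
        Input.compass.enabled = true;

        // Location services need to be started and given time to initialise.
        Input.location.Start();
        while (Input.location.status == LocationServiceStatus.Initializing)
        {
            yield return new WaitForSeconds(1f);
        }
    }

    void Update()
    {
        if (Input.location.status == LocationServiceStatus.Running)
        {
            LocationInfo loc = Input.location.lastData;
            // trueHeading gives the direction of geographic north, in degrees.
            Debug.Log("Heading: " + Input.compass.trueHeading +
                      "  Lat/Lon: " + loc.latitude + ", " + loc.longitude);
        }
    }
}
```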

Hardware that enables a view of the real world with AR

Display: The display on your smartphone is important for crisp imagery and for displaying 3D rendered assets. For instance, the Google Pixel XL's display specification is a 5.5" AMOLED QHD (2560 x 1440) 534ppi display, which means the phone can display 534 pixels per inch, making for rich, vivid images.
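That ppi figure follows directly from the resolution and diagonal size: the diagonal pixel count is √(2560² + 1440²) ≈ 2937 pixels, and 2937 pixels / 5.5 inches ≈ 534 pixels per inch.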

Tracking in AR

AR relies on computer vision to see the world and recognise the objects in it. The first step in the computer vision process is getting the visual information, the environment around the hardware, to the brain inside the device. In immersive technologies, this process of scanning, recognising, segmenting, and analysing environmental information is called tracking. For AR, tracking happens in two ways: inside-out tracking and outside-in tracking.

Outside-In Tracking

With outside-in tracking, cameras or sensors aren't housed within the AR device itself. Instead, they're mounted elsewhere in the space, typically on walls or on stands, so they have an unobstructed view of the AR device. They then feed information to the AR device directly or through a computer. Outside-in tracking overcomes some of the space and power issues that can occur with AR devices: the external cameras or sensors can, at least theoretically, be as large as you want, since nobody has to wear them on their face or carry them in their pocket. But what you gain in function, you lose in portability. If your headset loses connection to the outside sensors for even a moment, it can lose tracking, and the visuals will suffer, breaking immersion.

Inside-Out Tracking

With inside-out tracking, cameras and sensors are built right into the body of the device. Smartphones are the most obvious example of this type of tracking: they have cameras for seeing and processors for thinking in one wireless, battery-powered, portable device. On the AR headset side, Microsoft's HoloLens is another device that uses inside-out tracking. But all that hardware takes up space, consumes power, and generates heat. The true power of standalone AR devices will emerge when they become as ubiquitous and as useful as smartphones.

Motion Tracking

Whether it’s happening on a smartphone or inside a standalone headset, every AR app is intended to show convincing virtual objects. One of the most important things that systems like ARCore do is motion tracking. AR platforms need to know when you move. The general technology behind this is called Simultaneous Localisation and Mapping or SLAM. This is the process by which technologies like robots and smartphones analyse, understand, and orient themselves to the physical world. SLAM processes require data collecting hardware like cameras, depth sensors, light sensors, gyroscopes, and accelerometers. ARCore uses all of these to create an understanding of your environment and uses that information to correctly render augmented experiences by detecting planes and feature points to set appropriate anchors. In particular, ARCore uses a process called Concurrent Odometry and Mapping or COM. That might sound complex, but basically, COM tells a smartphone where it’s located in space in relationship to the world around it. It does this by capturing visually distinct features in your environment. These are called feature points. These feature points can be the edge of a chair, a light switch on a wall, the corner of a rug, or anything else that is likely to stay visible and consistently placed in your environment. Any high-contrast visual conserve as a feature point. This means that vases, plates, cups, wood textures, wallpaper design, statues, and other common elements could all work as potential feature points. ARCore combined, it’s new awareness of feature points with the inertial data, all the information about your movement, from your smartphone. Many smartphones in existence today have gyroscopes for measuring the phones angle and accelerometers for measuring the phones speed. Together, feature points in inertial data work together to help ARCore determine your phones pose. Pose means any object’s position and orientation to the world around it. Now that ARCore knows the pose of your phone, it knows where it needs to place the digital assets to seem logical in your environment. Remember, virtual objects need to have a place and be at the right scale as you walk around them. For example, this lion needs to have its feet on the ground to create the illusion that it is standing there, rather than floating in space.

Playing with Motion Tracking in Unity3D

In the app we will be building, we will not be anchoring the objects to a plane as we are just focussing on the motion tracking component of ARCore and not the complete environmental understanding.

Step 1: Open Unity3D and create a new project named ARCore102. Make sure to select 3D in the template section.

Step 2: After the project is created, head over to Unity->Unity Preferences->External Tools. Add the paths of the Android SDK and the JDK used by Android Studio into the respective fields.

Step 3: Go to File->Build Settings and select Android in the Platforms list. Then click Switch Platform. The Unity logo should appear in front of Android in the platforms list.

Step 4: Download ARCore SDK for Unity from the following link:

Step 5: Right click in the Assets window->Import Package->Custom Package and select the downloaded ARCore SDK.

Step 6: Click on All and then Import in the dialog box that appears.

Step 7: The Asset window should look like this.

Step 8: Go to File-> Build Settings and click on Player Settings.

Step 9: The following window will appear in the Inspector.

Step 10: In the Inspector window, configure the following settings:

Step 11: Go to Asset Store Window and search for SD Martial Arts Girl.

Step 12: On the page that appears, scroll down and click on Download and then Import.

Step 13: The Project window should now look like this:

Step 14: Go to SD_Character->Character->4Hero->Prefabs and add the Model to the scene.

Step 15: Go to SD_Character->Character->4Hero->Animations->Chara_4Hero_Controller. The following screen appears.

Step 16: Delete all states except Entry, Any State, and Exit.

Step 17: Add the Attack, Idle, and Dead animations to the Animator window.
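Once those states are in place, you will typically drive them from a script. The sketch below shows one way to do it, assuming you add a trigger parameter named Attack to the controller (that parameter name is our assumption, not part of the asset):

```csharp
using UnityEngine;

// A minimal sketch for driving the character's animations at runtime.
// Assumes the controller has a trigger parameter named "Attack";
// adjust the name to match the parameter you actually create.
public class CharacterAnimationDriver : MonoBehaviour
{
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Fire the attack animation whenever the screen is tapped.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            animator.SetTrigger("Attack");
        }
    }
}
```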

Step 18: Go to the MainCamera Game Object and click Add Component in the Inspector window. Then type in Tracked Pose Driver and press Enter.

Step 19: Set the Pose Source in the Inspector window to Center Eye.

Step 20: Go to the MainCamera Game Object again and click Add Component in the Inspector window. Then type in ARCore Session and press Enter. Also add the ARCore Background Renderer component.

Step 21: Click on the circle beside Session Config and select Default Session Config.

Step 22: Click the circle beside Background Material and select AR Background.
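Steps 18 through 22 can also be expressed in code, which makes the intent of each click clearer. The sketch below is a rough scripted equivalent, not a replacement for the editor setup; it assumes you assign the Default Session Config and AR Background assets to its fields in the Inspector:

```csharp
using GoogleARCore;
using UnityEngine;
using UnityEngine.SpatialTracking;

// A rough scripted equivalent of Steps 18-22. Attach to the camera
// GameObject and assign the two assets in the Inspector.
public class ARCameraSetup : MonoBehaviour
{
    public ARCoreSessionConfig sessionConfig;  // Default Session Config
    public Material backgroundMaterial;        // AR Background

    void Awake()
    {
        // Steps 18-19: drive the camera's transform from the device pose.
        var poseDriver = gameObject.AddComponent<TrackedPoseDriver>();
        poseDriver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice,
                                 TrackedPoseDriver.TrackedPose.Center);

        // Steps 20-21: create the ARCore session with the chosen config.
        var session = gameObject.AddComponent<ARCoreSession>();
        session.SessionConfig = sessionConfig;

        // Steps 20 and 22: render the live camera feed behind the scene.
        var background = gameObject.AddComponent<ARCoreBackgroundRenderer>();
        background.BackgroundMaterial = backgroundMaterial;
    }
}
```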

Step 23: Save the scene and build the app. Test the app on your Android device and play with it.

That’s it! I hope you liked this post. If you did, don’t forget to 👏.

Next Up: Point and Plane Detection in ARCore.
