I recently had the opportunity to play around with ARCore to create an augmented reality experience for an Android app. I had attempted AR with Vuforia before, but there I had to integrate the AR experience into an existing app. This time I wanted to create something simple enough to help me learn about ARCore and its capabilities.
The first thing that came to mind was to render a layout when an image is scanned, then tap the layout to open a web link. Seems simple enough, right?
There is a lot that one can do with ARCore, but in this blog post, we will focus on the use case mentioned above.
Brief Intro To ARCore
ARCore is a platform for building augmented reality experiences. It enables your phone to do the following:
- Sense its environment.
- Understand the world.
- Interact with information.
How does ARCore integrate the real world and virtual content? It uses the following capabilities to do that:
- Motion Tracking: allows the phone to understand and track its position relative to the world. The phone’s camera is used to identify feature points, and ARCore’s motion tracking technology then tracks how those features move over time. ARCore determines the pose (position and orientation) of the phone as it moves through space by combining the movement of those features with readings from the phone’s inertial sensors.
- Environmental Understanding: allows the phone to detect the size and location of all types of surfaces and makes those surfaces available to your app as planes (not to be confused with ✈️). Because ARCore uses feature points to detect planes, flat surfaces without texture, such as a plain white wall, may not be detected correctly.
- Light Estimation: allows the phone to detect information about the environment’s current lighting conditions. This information increases the sense of realism, as it lights your virtual objects under the same conditions as the environment around them.
The ARCore SDK is available for the following platforms:
- Android (Java/Kotlin)
- Android NDK (C)
- Unity for Android
- Unity for iOS
For this blog post, the focus is on the Android platform. We will build an AR app that responds to 2D images in the environment by making use of the Augmented Images API.
Augmented Images API
The Augmented Images API has the following capabilities:
- Tracking and responding to images that are fixed in place. As of ARCore 1.9, ARCore can also track moving images, such as an image on a flat object that the user holds and moves around.
- Providing estimates for pose and physical size. If the physical size of the image is not provided, ARCore estimates the size.
- Continuous tracking of an image’s pose, even when the image temporarily moves out of the camera view after detection.
- Tracking up to 20 images simultaneously.
- Tracking happens on the device, so an internet connection is not necessary.
- Images can be added to an image database at runtime.
To ensure that the API works as intended, the images that will be used should:
- Fill at least 25% of the camera frame.
- Be flat.
- Be in clear view of the camera.
Now that we have the gist of things, let’s get to some code 👩🏾💻
✨ Do check that your device supports ARCore. The list of supported devices can be found here.
To enable ARCore for your Android app, you can follow the guide here.
Setting Up An AugmentedImageDatabase
The database stores the list of images to be identified and tracked by ARCore. There are two ways of creating an AugmentedImageDatabase:
1. Load a saved image database: if you have an existing image database, use AugmentedImageDatabase.deserialize() to load it.
2. Create a new database.
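Both options can be sketched as follows. This is a minimal sketch in Kotlin, assuming a configured ARCore `Session`; the asset file name `images.imgdb` is just an example:

```kotlin
import android.content.Context
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Session

// Option 1: load a previously saved database from the app's assets.
fun loadImageDatabase(session: Session, context: Context): AugmentedImageDatabase =
    context.assets.open("images.imgdb").use { stream ->
        AugmentedImageDatabase.deserialize(session, stream)
    }

// Option 2: start with a new, empty database and add images at runtime.
fun createEmptyImageDatabase(session: Session): AugmentedImageDatabase =
    AugmentedImageDatabase(session)
```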
Adding an image to an existing database
You can add an image to an existing database as follows:
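A minimal sketch, assuming the reference image `qr_sample.jpg` lives in your assets folder and `session` is your ARCore `Session` (the file name, image name, and width are illustrative):

```kotlin
import android.content.Context
import android.graphics.BitmapFactory
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Session

fun addImageToDatabase(
    session: Session,
    context: Context,
    imageDatabase: AugmentedImageDatabase
) {
    val bitmap = context.assets.open("qr_sample.jpg")
        .use { BitmapFactory.decodeStream(it) }

    // The optional third argument is the expected physical width in metres;
    // providing it helps ARCore estimate the image's pose sooner.
    imageDatabase.addImage("qr_sample", bitmap, 0.15f)

    // Attach the database to the session configuration.
    val config = Config(session)
    config.augmentedImageDatabase = imageDatabase
    session.configure(config)
}
```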
Best Practices for Reference Images
- Augmented Images supports PNG and JPEG file formats.
- Augmented Images detects points of high contrast in images; both colour and black-and-white images are detected. The better the contrast, the better the detection.
- Image resolution should be at least 300 x 300 pixels. Using an image with a higher resolution does not improve performance.
- Avoid images with repetitive or scattered features.
- Make use of the arcoreimg tool to get a score for each image; this helps you check the quality of your reference images.
The ARCore session looks for images by matching feature points from the camera frame against those in the image database. To check whether we have any matches, we loop through the images detected in each frame.
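One place to do this is in the scene’s frame-update callback. The sketch below assumes a Sceneform `ArSceneView` called `arSceneView` and a reference image named "qr_sample" (both names are illustrative):

```kotlin
import com.google.ar.core.AugmentedImage
import com.google.ar.core.TrackingState
import com.google.ar.sceneform.FrameTime

// Registered via arSceneView.scene.addOnUpdateListener(::onUpdateFrame)
private fun onUpdateFrame(frameTime: FrameTime) {
    val frame = arSceneView.arFrame ?: return

    // Only trackables whose state changed in this frame are returned.
    val updatedImages = frame.getUpdatedTrackables(AugmentedImage::class.java)
    for (augmentedImage in updatedImages) {
        if (augmentedImage.trackingState == TrackingState.TRACKING &&
            augmentedImage.name == "qr_sample"
        ) {
            setImage(augmentedImage)
        }
    }
}
```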
When we have a match, the layout is inflated into a renderable and added to the ArSceneView.
The following code sample shows how we set the image that is matched:
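A sketch of what that could look like with Sceneform’s `ViewRenderable` (the layout id `R.layout.ar_card` is hypothetical, and `this` is assumed to be the hosting Activity):

```kotlin
import com.google.ar.core.AugmentedImage
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.rendering.ViewRenderable

private fun setImage(image: AugmentedImage) {
    // Anchor a node at the centre of the detected image.
    val anchorNode = AnchorNode(image.createAnchor(image.centerPose))

    // Inflate the layout into a renderable and attach it to the scene.
    ViewRenderable.builder()
        .setView(this, R.layout.ar_card)
        .build()
        .thenAccept { renderable ->
            anchorNode.renderable = renderable
            arSceneView.scene.addChild(anchorNode)
        }
}
```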
Setting the onClick listeners for images:
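For example, once the renderable is built, a click listener can be attached to a view inside the inflated layout. The view id and URL below are placeholders:

```kotlin
import android.view.View
import com.google.ar.sceneform.rendering.ViewRenderable

ViewRenderable.builder()
    .setView(this, R.layout.ar_card)
    .build()
    .thenAccept { renderable ->
        // renderable.view is the inflated layout; wire up its children.
        renderable.view.findViewById<View>(R.id.card_link).setOnClickListener {
            openWebPage("https://developers.google.com/ar")
        }
    }
```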
Method to open the web link:
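Opening the link is plain Android: fire an ACTION_VIEW intent (this sketch assumes the method lives in an Activity, so `packageManager` and `startActivity` are in scope):

```kotlin
import android.content.Intent
import android.net.Uri

fun openWebPage(url: String) {
    val intent = Intent(Intent.ACTION_VIEW, Uri.parse(url))
    // Only launch if a browser (or another handler) is available.
    if (intent.resolveActivity(packageManager) != null) {
        startActivity(intent)
    }
}
```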
Getting the hang of ARCore was quite a challenge for me; I hope this post gives you a good head start.
So, What Next?
There are plenty of apps you can build using ARCore. Fancy a bit of Snapchat 🤳🏽 like I do? You can play around with Augmented Faces to create your own version. Augmented Faces lets you detect different regions of a face and overlay assets on those regions.
What AR app do you want to build?
Get in touch.