AR technology for Android — Part 3: ARCore SDK

Loredana Zdrânc · Zipper Studios · Jun 4, 2019

Like the second part of this series, “AR technology for Android — Part 2: Wikitude SDK”, this part is aimed at Android developers who want to gain a broader understanding of augmented reality technology. This article will guide you step by step in building your first ARCore Android app.

As the successor to Tango, ARCore is a platform powered by Google that bundles the core functionality needed to build augmented reality experiences in Android apps.

Let’s dive into ARCore by understanding its fundamental concepts!

  • Motion tracking: Behind this feature lies a complex process called concurrent odometry and mapping (COM). As your phone moves through the world, ARCore tracks the device’s position relative to the real world and detects feature points in the space. These points are used to determine when the device changes its position. The visual information is combined with the device’s inertial measurements to obtain the position and orientation of the camera in real time. The virtual camera is aligned with the device camera, and the virtual content is overlaid on the real content, so it appears to be part of the real world.
  • Environmental understanding: Before diving into this concept, we should know what ARCore actually does. As the device moves, ARCore constantly builds a model of the space, retaining newly observed areas and new details about them. At the same time, it clusters the detected feature points to establish common horizontal and vertical surfaces and interprets them as planes. Because ARCore’s whole algorithm is based on feature-point detection, flat surfaces without texture, such as white walls, cannot be detected.
  • Light estimation: The goal of this technology is to make virtual content look as realistic as possible. ARCore detects information about the environmental lighting and provides the average intensity and color correction of a given camera image, so virtual objects can be lit to match their surroundings (see the sketch after this list).
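Here is a minimal sketch of what the light-estimation API exposes, assuming an ArFragment field named arFragment like the one added later in Step 6 (addOnUpdateListener, arFrame, and LightEstimate come from the Sceneform/ARCore APIs):

arFragment.arSceneView.scene.addOnUpdateListener {
    // arFrame is null until the first camera frame arrives.
    val frame = arFragment.arSceneView.arFrame ?: return@addOnUpdateListener
    val lightEstimate = frame.lightEstimate
    if (lightEstimate.state == LightEstimate.State.VALID) {
        // Average brightness of the current camera image, roughly 0.0 (dark) to 1.0 (bright).
        Log.d("ARCore", "Pixel intensity: ${lightEstimate.pixelIntensity}")
    }
}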

The great advantage of this AR SDK is Cloud Anchors. They allow you to create multiplayer or collaborative AR experiences that Android and iOS users can share. You can learn about Cloud Anchors in the next part of this series, “AR technology for Android — Part 4: AR Cloud Anchors”.

Enough talk. Let’s code!

I’ll explain step by step how to create your first Android app using the Google ARCore SDK. The code will be written in Kotlin. The app will place a virtual wolf in the real-world space around you.

Step 1. Open Android Studio IDE and create a project.

Step 2. Open AndroidManifest.xml and add the camera and internet permissions, plus the ARCore metadata. Setting the metadata value to "optional" marks the app as AR Optional, so it can still be installed on devices without ARCore; use "required" if the app cannot function without AR.

<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />

<application>
    ...
    <meta-data
        android:name="com.google.ar.core"
        android:value="optional" />
</application>

Step 3. Open activity_main.xml and replace the TextView with a Button. When this button is tapped, camera permission is requested. If the permission is granted, the activity responsible for loading the AR experience is started. We will request camera permission using the easypermissions library.

companion object {
    const val CAMERA_REQUEST = 201
}

private fun onButtonClicked() {
    val perms = arrayOf(Manifest.permission.CAMERA)
    if (!EasyPermissions.hasPermissions(this, *perms)) {
        // Ask for the camera permission; the result comes back
        // through onRequestPermissionsResult() below.
        EasyPermissions.requestPermissions(
            this,
            getString(R.string.camera_permission_rationale),
            CAMERA_REQUEST,
            *perms
        )
        return
    }
    startSimpleArActivity()
}

@AfterPermissionGranted(CAMERA_REQUEST)
private fun startSimpleArActivity() {
    startActivity(Intent(this, ArCoreActivity::class.java))
}

override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<out String>, grantResults: IntArray) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults)
    EasyPermissions.onRequestPermissionsResult(requestCode, permissions, grantResults, this)
}
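If the easypermissions library is not in your project yet, add it to the app-level build.gradle; the version below is illustrative, so check for the latest release:

implementation 'pub.devrel:easypermissions:3.0.0'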

Step 4. Create an activity responsible for rendering the virtual content over the real-world images captured by the device camera. Do not forget to declare the new activity in the Manifest!
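As a rough sketch, the new activity and its manifest entry could look like this (the layout file name activity_ar_core is an assumption; we will fill the class in over the next steps):

class ArCoreActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_ar_core)
    }
}

And in AndroidManifest.xml:

<activity android:name=".ArCoreActivity" />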

Step 5. Add the Sceneform dependency.

// Project-level build.gradle
classpath 'com.google.ar.sceneform:plugin:1.5.0'

// App-level build.gradle
implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.9.0'
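For the sceneform.asset() entry generated in Step 9 to work, the Sceneform Gradle plugin also has to be applied at the top of the app-level build.gradle (and it is a good idea to align the plugin version with the sceneform-ux version):

apply plugin: 'com.google.ar.sceneform.plugin'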

Step 6. Now that the AR views are available, create the layout corresponding to ArCoreActivity and add the ArFragment to it. The snippet below assumes a ConstraintLayout root, which is why it carries the constraint attributes.

<fragment
    android:id="@+id/sceneform"
    android:name="com.google.ar.sceneform.ux.ArFragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent"
    app:layout_constraintTop_toTopOf="parent" />

Step 7. Before building and adding the model to the scene, you need to add the asset to your project. Go to https://free3d.com/3d-model/wolf-rigged-and-game-ready-42808.html and download the free wolf asset. Unzip the archive and copy the .obj and .mtl files from the OBJ folder into sampledata/wolf. If you don’t have a sample data directory, create it: right-click the app directory -> New -> Sample Data Directory.
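After copying, the project should contain something like the following (the .mtl file name is assumed to match the .obj one from the archive):

app/
  sampledata/
    wolf/
      Wolf_One_obj.obj
      Wolf_One_obj.mtl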

Step 8. Inside ArCoreActivity, create a boolean method that returns true if your device is compatible with the ARCore SDK and false otherwise. The ARCore requirements are the following:

  • Android Studio 3.0 or newer
  • A device running Android 7.0 (Nougat, API 24) or newer
  • Support for OpenGL ES 3.0 or newer (checked below)
  • Camera permission granted

Check the SDK version.

if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
    Toast.makeText(this, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show()
    finish()
    return false
}

Then get the OpenGL ES version and check that it is at least 3.0.

val openGlVersion = (getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager)
    .deviceConfigurationInfo
    .glEsVersion
if (openGlVersion.toDouble() < 3.0) {
    Toast.makeText(this, getString(R.string.opengl_version_required), Toast.LENGTH_LONG).show()
    finish()
    return false
}
return true
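Assuming the two checks above are wrapped in a method such as isDeviceSupported() (the name is illustrative, not from the original code), you can bail out early in onCreate() of ArCoreActivity:

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    // isDeviceSupported() already calls finish() on unsupported devices.
    if (!isDeviceSupported()) return
    setContentView(R.layout.activity_ar_core)
}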

Step 9. Import the asset: right-click the .obj file, choose the first option, Import Sceneform Asset, and click Finish.

Note: If you don’t see this option, install the Google Sceneform Tools (Beta) plugin:

  • Windows: File -> Settings -> Plugins -> Browse Repositories
  • Mac OS: Android Studio -> Preferences -> Plugins

The Wolf_One_obj.sfb file is generated in the assets directory, and your module build.gradle file now contains a sceneform.asset() entry (shown below), where:

  • sampledata/wolf/Wolf_One_obj.obj is the source asset path specified during import;
  • 'default' is the material specified during import;
  • sampledata/wolf/Wolf_One_obj.sfa is the .sfa output path specified during import;
  • src/main/assets/Wolf_One_obj is the .sfb output path specified during import.
sceneform.asset('sampledata/wolf/Wolf_One_obj.obj',
'default',
'sampledata/wolf/Wolf_One_obj.sfa',
'src/main/assets/Wolf_One_obj')

Step 10. Build the model using the ModelRenderable class, which works asynchronously. We create the ModelRenderable by setting the source from which the model is loaded. thenAccept() is called if the model loads successfully and receives the built model. As its name says, exceptionally() is called if the model cannot be created.

ModelRenderable.builder()
    .setSource(this, Uri.parse("Wolf_One_obj.sfb"))
    .build()
    .thenAccept { renderable -> modelRenderable = renderable }
    .exceptionally {
        Toast.makeText(this, "Unable to load object.", Toast.LENGTH_LONG).show()
        null
    }
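The snippet stores the result in modelRenderable, so the activity needs a nullable field to hold it, for example:

// Null until the asynchronous build above completes.
private var modelRenderable: ModelRenderable? = null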

Step 11. Now that we have the model, all we have to do is add it to the scene. But before we do that, we need to understand a few concepts:

  • Scene — the space where the 3D object is placed. The ArFragment hosts this scene. An anchor node is attached to the scene, and all the other objects are rendered relative to it.
  • Anchor — describes a fixed location and orientation in the real world.
  • AnchorNode — the first node that gets set when a plane is detected; it stays fixed at its anchor’s real-world pose.
  • HitResult — the point of intersection between an imaginary ray cast from the tap and the real world.
  • TransformableNode — a node that can be interacted with: it can be moved around, scaled, rotated, and more.
val arFragment = supportFragmentManager.findFragmentById(R.id.sceneform) as ArFragment
arFragment.setOnTapArPlaneListener { hitResult: HitResult, _: Plane, _: MotionEvent ->
    if (modelRenderable != null) {
        // Anchor the model at the point where the user tapped the detected plane.
        val anchorNode = AnchorNode(hitResult.createAnchor())
        anchorNode.setParent(arFragment.arSceneView.scene)
        // Wrap the model in a node the user can drag, scale and rotate.
        val arObject = TransformableNode(arFragment.transformationSystem)
        arObject.setParent(anchorNode)
        arObject.renderable = modelRenderable
        arObject.select()
    }
}

Step 12. Build the project and run it on a connected physical device that supports ARCore. Tap the PLACE WOLF button and experience your first ARCore app.

The full code is available on GitHub.

That’s all!

Hopefully, this post showed you how powerful Google’s ARCore platform is and how easy it is to work with. If you would like to learn how multiple devices can share the same augmented reality world, you can read the following article I wrote on Cloud Anchors: AR technology for Android — Part 4: AR Cloud Anchors.

Thanks for reading! Have a nice AR experience!

https://www.zipperstudios.co

Zipper Studios is a group of passionate engineers helping startups and well-established companies build their mobile products. Our clients are leaders in the fields of health and fitness, AI, and Machine Learning. We love to talk to like-minded people who want to innovate in the world of mobile, so drop us a line here.
