ARCore v1.7 — An intense AR update

Sumeet Rukeja
Yudiz Solutions
Apr 15, 2019

Finally! The wait for ARCore to take the form it is expected to be in is coming to an end. Google released the ARCore library back in 2017, and since then it has been progressing relatively slowly compared with its rivals in terms of features and updates. But this latest update has surely helped it take a noticeable leap forward.

What’s new?

Let’s glance through the features it has to offer in this update.

  • The trending face filters are now possible with ARCore, and they can be implemented very easily.
  • ARCore introduces ARCore Elements, a set of guidelines for building good AR experiences. Google has released the app on the Play Store to teach developers better ways of designing AR interactions.
  • Animation is now possible! Animated .fbx files can be used for this. A very basic AR requirement that had been missing for years is finally fulfilled (see the short playback sketch just after this list). :D
  • The new Shared Camera feature lets users switch quickly between AR mode and the normal camera mode, so they can capture or record their AR experience simply and smoothly (a minimal session-setup sketch follows further below).
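As a taste of the animation API, here is a tiny sketch. It assumes animatedRenderable is an already-loaded ModelRenderable built from an .sfb that was converted from an animated .fbx:

val animator = ModelAnimator(animatedRenderable.getAnimationData(0), animatedRenderable)
// Plays the first animation baked into the renderable's .sfb data.
animator.start()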

These are the noteworthy features of this update, apart from tons of bug fixes and improvements.
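And here is a rough illustration of the Shared Camera setup. This is only a sketch; deviceStateCallback and backgroundHandler are placeholders for your own Camera2 plumbing, which follows Google's shared-camera sample:

// Create an ARCore session that shares the camera device with Camera2.
val session = Session(this, EnumSet.of(Session.Feature.SHARED_CAMERA))
val sharedCamera = session.sharedCamera
val cameraId = session.cameraConfig.cameraId

// Open the Camera2 device with callbacks wrapped by SharedCamera, so frames
// can flow to either the AR session or plain capture without reopening the camera.
val cameraManager = getSystemService(Context.CAMERA_SERVICE) as CameraManager
cameraManager.openCamera(
    cameraId,
    sharedCamera.createARDeviceStateCallback(deviceStateCallback, backgroundHandler),
    backgroundHandler
)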

Augmented Face API

Before getting our hands dirty with a demo, I want to talk a bit about the face filter feature, which is the big takeaway for me. Google calls it Augmented Faces. It builds a high-quality, 468-point face mesh without using depth sensors, which forms a strong base for placing 3D models such as animated masks and accessories.
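To give a feel for how this could be wired up with Sceneform, here is a minimal sketch. It assumes an ArFragment configured for the front-facing camera with Config.AugmentedFaceMode.MESH3D and an already-loaded faceRenderable (e.g. a mask model); the map name is my own.

// Track which AugmentedFace already has a node so each face gets only one.
val faceNodeMap = HashMap<AugmentedFace, AugmentedFaceNode>()

arFragment.arSceneView.scene.addOnUpdateListener {
    val session = arFragment.arSceneView.session ?: return@addOnUpdateListener

    // Attach a renderable to every newly detected face.
    for (face in session.getAllTrackables(AugmentedFace::class.java)) {
        if (!faceNodeMap.containsKey(face)) {
            val faceNode = AugmentedFaceNode(face)
            faceNode.setParent(arFragment.arSceneView.scene)
            faceNode.faceRegionsRenderable = faceRenderable
            faceNodeMap[face] = faceNode
        }
    }

    // Clean up nodes whose faces are no longer tracked.
    val iterator = faceNodeMap.entries.iterator()
    while (iterator.hasNext()) {
        val (face, node) = iterator.next()
        if (face.trackingState == TrackingState.STOPPED) {
            node.setParent(null)
            iterator.remove()
        }
    }
}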

Practical

I’ll demonstrate how to place a character from a chroma key video into AR. This was achievable even before this update, but its awesomeness has pushed me to experiment with it and showcase it to wonderful people like you. :D

For those who are not aware of chroma key video: it is basically a video with a green background, which lets designers replace that background with whatever they like. The following GIF will help you understand it better.

I used Pixabay to download this video. Let’s place this character in AR.

Prerequisites:

  1. Basic knowledge of ARCore and Sceneform.
  2. A chroma key video.
  3. A video renderable, which will be placed on the surface and act as the base on which the video plays. It is an .obj file that must be converted into Sceneform’s own .sfb format, since Sceneform doesn’t work directly with raw 3D model files. It can be downloaded from the codebase linked at the end of this post. This is how it looks:

I’ll skip the initialization part and move quickly to the core part that does the actual magic. If you are unfamiliar with ARCore and Sceneform, you can refer to my earlier blog on the topic; it covers all the basics you need to understand the concepts here.

We’ll store the required files in the raw folder.

val texture = ExternalTexture()

// MediaPlayer plays the chroma key video stored in res/raw and renders
// its frames onto the ExternalTexture's surface.
mediaPlayer = MediaPlayer.create(this, R.raw.dance)
mediaPlayer!!.setSurface(texture.surface)
mediaPlayer!!.isLooping = true

// Build the video renderable (the .sfb converted from the .obj) and bind
// the external texture and the chroma key color to its material.
ModelRenderable.builder()
    .setSource(this, R.raw.chroma_key_video)
    .build()
    .thenAccept { renderable ->
        videoRenderable = renderable
        renderable.material.setExternalTexture("videoTexture", texture)
        renderable.material.setFloat4("keyColor", CHROMA_KEY_COLOR)
    }
    .exceptionally { throwable ->
        Log.e(TAG, "Error loading video renderable", throwable)
        null
    }

The above code is what makes this AR feature possible. We have a MediaPlayer object that plays the video stored in the raw folder, and an ExternalTexture object that ‘connects’ the video with the 3D object. The texture’s surface is handed to the media player, and the same texture is bound to the renderable as shown above. CHROMA_KEY_COLOR is the green color used in the video; here, its value is Color(0.1843f, 1.0f, 0.098f).

Note: The keys used for the texture and the color are predefined in the material. Changing them may produce unexpected output.
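For completeness, the constants referenced in these snippets could be declared like this (in the demo activity’s companion object). The chroma key value is the green mentioned above; the VIDEO_HEIGHT_METERS value is my assumption for the real-world height of the rendered video plane:

companion object {
    private const val TAG = "ChromaKeyVideo"
    // Green used as the chroma key color in the sample video.
    private val CHROMA_KEY_COLOR = Color(0.1843f, 1.0f, 0.098f)
    // Assumed height, in meters, of the video plane placed in the scene.
    private const val VIDEO_HEIGHT_METERS = 0.85f
}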

When the user taps on a detected surface, we need to bind this renderable to a node and place that node on an anchor, as shown in the code below. We also need to start the media player at this point.

arFragment!!.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, motionEvent: MotionEvent ->
    if (videoRenderable == null) {
        return@setOnTapArPlaneListener
    }

    // Anchor the node hierarchy at the tapped point on the plane.
    val anchor = hitResult.createAnchor()
    val anchorNode = AnchorNode(anchor)
    anchorNode.setParent(arFragment!!.arSceneView.scene)

    // Scale the video node so it keeps the video's aspect ratio.
    val videoNode = Node()
    videoNode.setParent(anchorNode)
    val videoWidth = mediaPlayer!!.videoWidth.toFloat()
    val videoHeight = mediaPlayer!!.videoHeight.toFloat()
    videoNode.localScale = Vector3(
        VIDEO_HEIGHT_METERS * (videoWidth / videoHeight), VIDEO_HEIGHT_METERS, 1.0f)

    if (!mediaPlayer!!.isPlaying) {
        mediaPlayer!!.start()
        // Attach the renderable only once the first video frame is available,
        // so the node never appears as an empty green quad.
        texture.surfaceTexture
            .setOnFrameAvailableListener { surfaceTexture: SurfaceTexture ->
                videoNode.renderable = videoRenderable
                texture.surfaceTexture.setOnFrameAvailableListener(null)
            }
    } else {
        videoNode.renderable = videoRenderable
    }
}

And boom…! We have a dancing character in AR, filtered out from a video. :D

I’m amazed to see the shadow that ARCore adds to the character!
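One housekeeping note before moving on: since we create the MediaPlayer ourselves, it is worth releasing it when the activity goes away. A minimal sketch:

// Free the video decoder and its surface when the activity is destroyed.
override fun onDestroy() {
    super.onDestroy()
    mediaPlayer?.release()
    mediaPlayer = null
}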

Use cases

Since we can load videos into the app dynamically, this feature can be given a dynamic edge: an API can be used to download videos from a server onto device storage, so the admin has control over the AR content. We can also pack lots of animation into the video itself and load it in AR without building 3D models of the characters.
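For instance, once a clip has been downloaded to local storage, the same pipeline can be fed by pointing the MediaPlayer at that file instead of a bundled raw resource. A rough sketch (the file path and name below are only illustrative):

// Play a downloaded chroma key video instead of a bundled res/raw resource.
// The path is hypothetical; it would come from your own download logic.
val downloadedClip = File(getExternalFilesDir(null), "ar_videos/dance.mp4")

mediaPlayer = MediaPlayer().apply {
    setDataSource(downloadedClip.absolutePath)
    setSurface(texture.surface)   // the same ExternalTexture as before
    isLooping = true
    prepare()                     // or prepareAsync() with a listener
}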

Conclusion

ARCore is already on its way to becoming the G.O.A.T. of AR SDKs, and with Google I/O ’19 around the corner, I’m sure there will be huge announcements for ARCore.

That’s it for now, thank you for going through the blog. Have a great augmented day ahead! :D

Codebase link

https://github.com/yudiz-solutions/arcore_chromakey_video


Sumeet Rukeja
Yudiz Solutions

An Android App Developer with an increasing interest in Kotlin. In love with ARCore and Flutter. A Machine Learning admirer.