ARCore, Sceneform & Augmented Images #2 Android AR
Sceneform & Augmented Images
Hello everyone. This is Part 2 of the ARCore and Sceneform series. We will go through some of the core APIs of Sceneform and the Augmented Images feature of ARCore. If you want to learn about AR and ARCore, check out my first post:
Part 1 — Overview of AR and ARCore
Going forward, we will build our AR app:
Sceneform
Sceneform is a 3D framework introduced by Google at I/O '18.
The APIs provided by Sceneform make it easy to develop ARCore apps quickly, as most of the fundamental work, such as plane detection and light estimation, is handled internally by Sceneform. The best part is that there is no need to learn 3D graphics or OpenGL.
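To use these APIs, the Sceneform UX library has to be added to the app's module-level Gradle file. A minimal sketch, assuming the standard Sceneform setup (the version number shown is an assumption; use the latest release):

```groovy
// app/build.gradle (sketch, version number is an assumption)
android {
    // Sceneform libraries use Java 8 language features
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    // Provides ArFragment, TransformableNode, and the rest of the UX APIs
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
}
```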
Some of the core APIs that you will use in most apps are:
ArFragment
— a fragment that can be added to an Android layout file like any other fragment. Under the hood, it automatically checks whether the required ARCore version is installed on the device and requests the runtime camera permission.
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <fragment
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:id="@+id/ux_fragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>
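In the hosting activity you can then look up the fragment by its id. A minimal sketch, assuming the layout above is `activity_main.xml` and the activity extends AppCompatActivity:

```java
// Sketch: obtaining the ArFragment declared in the layout.
// Assumes the layout file above is res/layout/activity_main.xml.
public class MainActivity extends AppCompatActivity {

    private ArFragment arFragment;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // ArFragment handles the ARCore version and camera permission checks for us
        arFragment = (ArFragment) getSupportFragmentManager()
                .findFragmentById(R.id.ux_fragment);
    }
}
```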
ArSceneView
— renders the camera image and highlights Planes when they are detected by ARCore. You can use ArSceneView directly in your app, but in that case you have to handle the ARCore availability and runtime permission checks yourself.
ModelRenderable
— lets you load a 3D model, generally in sfb format, from a specified path. It returns a Renderable that can be rendered on the screen.
ModelRenderable.builder()
    .setSource(this, <Path of 3D model>)
    .build()
    .thenAccept(renderable -> myRenderable = renderable);
Node
— nodes are the virtual objects that need to be rendered. A node contains all the information Sceneform needs to render the object, which is provided by the Pose API (position and orientation). You can transform, animate, and rotate a node, and you can use the built-in TransformableNode to let the user transform an object with gestures.
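Putting these pieces together, a common pattern is to place a TransformableNode on a plane when the user taps it. A hedged sketch, assuming `arFragment` and `myRenderable` have already been initialised as above:

```java
// Sketch: place a transformable model on a tapped plane.
// Assumes arFragment (ArFragment) and myRenderable (ModelRenderable) exist.
arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
    // Anchor the node to the real-world point the user tapped
    Anchor anchor = hitResult.createAnchor();
    AnchorNode anchorNode = new AnchorNode(anchor);
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    // TransformableNode supports pinch-to-scale, drag, and twist gestures
    TransformableNode node =
            new TransformableNode(arFragment.getTransformationSystem());
    node.setParent(anchorNode);
    node.setRenderable(myRenderable);
    node.select();
});
```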
For more APIs, you can refer to the official site.
Augmented Images
Augmented Images lets you build AR apps that detect 2D images and render 3D objects on top of them. Using these APIs, we are going to develop an app that detects a 2D image of a car and then renders a 3D car model on top of it.
You need to maintain a database of reference images (the AugmentedImageDatabase
API), which ARCore uses to detect images. Once an image is detected, you can access its information through the AugmentedImage
API.
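Setting up the database typically looks like the sketch below. The drawable name `car` and the image name `"car"` are assumptions for illustration:

```java
// Sketch: build an AugmentedImageDatabase with one reference image.
// R.drawable.car and the name "car" are assumed placeholders.
private boolean setupAugmentedImageDatabase(Session session, Config config) {
    Bitmap carBitmap =
            BitmapFactory.decodeResource(getResources(), R.drawable.car);
    if (carBitmap == null) {
        return false; // reference image could not be loaded
    }
    AugmentedImageDatabase database = new AugmentedImageDatabase(session);
    // The name passed here is returned later by AugmentedImage.getName()
    database.addImage("car", carBitmap);
    config.setAugmentedImageDatabase(database);
    session.configure(config);
    return true;
}
```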
Now that we have the required understanding of Sceneform and Augmented Images, let's look at the steps for creating our first AR app.
Steps for creating a Sceneform AR app:
- Import 3D assets
- Configure Sceneform and the Sceneform plugin
- Create sfa and sfb files (sfb files are bundled into the APK; we will talk more about these files in the next part)
- Configure the AndroidManifest.xml file for ARCore apps
- Initialise ArFragment
- Initialise the Augmented Image database and add images to it
- Detect the augmented image
- Create an anchor node at the centre position
- Render the model
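The detection and rendering steps above can be sketched with a frame-update listener. This is a rough outline under the same assumptions as before (`arFragment`, `myRenderable`, and the reference image name `"car"` are placeholders):

```java
// Sketch: detect the "car" augmented image each frame and render the model once.
private final Set<AugmentedImage> detectedImages = new HashSet<>();

private void registerFrameListener() {
    arFragment.getArSceneView().getScene()
            .addOnUpdateListener(this::onUpdateFrame);
}

private void onUpdateFrame(FrameTime frameTime) {
    Frame frame = arFragment.getArSceneView().getArFrame();
    if (frame == null) return;

    for (AugmentedImage image : frame.getUpdatedTrackables(AugmentedImage.class)) {
        if (image.getTrackingState() == TrackingState.TRACKING
                && "car".equals(image.getName())
                && !detectedImages.contains(image)) {
            detectedImages.add(image); // render only once per detected image

            // Anchor node at the centre pose of the detected 2D image
            AnchorNode anchorNode =
                    new AnchorNode(image.createAnchor(image.getCenterPose()));
            anchorNode.setParent(arFragment.getArSceneView().getScene());
            anchorNode.setRenderable(myRenderable);
        }
    }
}
```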
If any of these terms are not clear yet, they will become clear once we start coding.
What’s Next?
We will configure Android Studio to load Sceneform models and begin coding our first ARCore Sceneform app. I know you were waiting to code :-p
Thanks for reading. If you find the post helpful, cheer me with claps. Spread and help others learn.