ARCore, Sceneform & Augmented Images #3 Android AR app
Creating an Android AR app using Sceneform
In this series we have so far covered AR, ARCore, and Sceneform fundamentals. You can follow along here:
- Part 1 — Overview of AR and ARCore
- Part 2 — Overview of Sceneform, Augmented Images and its core APIs
- Part 3 — Building ARCore App using Sceneform
My AR App
We are going to build an AR app that detects a car image and places a 3D model over it, which the user can then transform and rotate.
Let's get our hands dirty with this cool stuff — AR
Step 1: Import 3D assets
You can use Google's Poly site to download the required asset; other sites for downloading 3D objects are also available.
Search for a car object and download the model as an OBJ file.
Step 2 : Configure Sceneform and Sceneform plugin
We will create a new project in Android Studio and set the minimum SDK to version 7.0 (API level 24).
Note: Android Studio version 3.1 or higher is required.
In the project-level build.gradle, add the Sceneform plugin dependency under buildscript dependencies:
classpath 'com.google.ar.sceneform:plugin:1.4.0'
Next we need to add ARCore and Sceneform to our project. Add the following libraries to the Gradle file (app level):
//arcore and sceneform
implementation 'com.google.ar:core:1.4.0'
implementation 'com.google.ar.sceneform:core:1.4.0'
implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.4.0'
Add this to the bottom of the app-level build.gradle to apply the Sceneform plugin:
apply plugin: 'com.google.ar.sceneform.plugin'
Step 3 : Export 3D assets into project — Sceneform tools plugin
Android Studio provides the Google Sceneform Tools plugin, which helps us export assets into our project. You can find it under:
Settings > Plugins > Browse Repositories
Now that we have the downloaded 3D model and the plugin installed, we can use the plugin to create .sfb files and export the 3D model into our project. To do so, follow these steps:
- Place the downloaded 3D model in the sampledata folder provided by Android Studio (right-click app > New > Sample Data Directory). Data stored here is not bundled into the APK; it is used only for development. Copy the downloaded .obj and .mtl files into this folder.
- To export the model, right-click the .obj file > Import Sceneform Asset.
- The above step creates two files: .sfa (Sceneform asset definition) and .sfb (Sceneform binary asset). The .sfa file is a human-readable definition of the .sfb file. It points to the models, material definitions, and textures in your 3D asset. It is in JSON format, and you can change certain values to tweak the look and feel of your asset. For more info, check the official documentation.
- A Gradle build runs during this process, and the necessary build changes are made automatically by the tool.
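As an illustrative sketch (not an authoritative schema), a generated .sfa file looks roughly like this; the exact fields and values vary per model, and the file name car.obj is just an assumption for our example:

```json
{
  "materials": [
    {
      "name": "",
      "parameters": [ ... ]
    }
  ],
  "model": {
    "attributes": [ "Position", "TexCoord", "Orientation" ],
    "file": "sampledata/car.obj",
    "name": "car",
    "scale": 0.25
  }
}
```

Editing a value such as "scale" here and re-syncing regenerates the .sfb with the new look.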
If the Gradle sync succeeds and you can see the .sfb file at the specified path, then yes! We are done importing the 3D model into the project. Next we will look at initialising the AR fragment.
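For reference, the plugin records the import by adding a sceneform.asset() entry to the app-level build.gradle. A sketch of what that entry looks like, assuming the model file is named car.obj:

```groovy
sceneform.asset('sampledata/car.obj',  // source asset path (assumed name)
        'default',                     // material to use
        'sampledata/car.sfa',          // .sfa output file path
        'src/main/assets/car')         // .sfb output file path
```

You normally don't write this by hand; the Import Sceneform Asset action generates it for you.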
Step 4 — Configuring AndroidManifest.xml file specific to ARCore apps
Next we need to configure our project's AndroidManifest.xml to add some metadata required by ARCore apps:
<!-- camera permission -->
<uses-permission android:name="android.permission.CAMERA" />

<!-- this application requires ARCore -->
<uses-feature
android:name="android.hardware.camera.ar"
android:required="true" />

<!-- inside the application tag, add this -->
<meta-data
android:name="com.google.ar.core"
android:value="required" />
These configurations ensure the app runs only on devices that support ARCore, and they specify the required permissions and features.
Step 5 — Initialise ArFragment
Add an ArFragment to your main activity's layout file:
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
>
<fragment android:name="com.google.ar.sceneform.ux.ArFragment"
android:id="@+id/ar_fragment"
android:layout_width="match_parent"
android:layout_height="match_parent" />
</FrameLayout>
Then you can refer to this fragment in your main activity's Java file:
arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ar_fragment);
// hiding the plane discovery
arFragment.getPlaneDiscoveryController().hide();
arFragment.getPlaneDiscoveryController().setInstructionView(null);
We hide plane discovery because it is enabled by default, and for this particular example we do not need to detect planes; instead we will render the 3D model over the detected image. I will explain rendering a model on a plane in another post.
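Putting the activity together, a minimal sketch might look like this (the class name MainActivity and layout name activity_main are assumptions; Sceneform 1.4 uses the Android support library rather than AndroidX):

```java
package com.example.arcar; // assumed package name

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;

import com.google.ar.sceneform.ux.ArFragment;

public class MainActivity extends AppCompatActivity {

    private ArFragment arFragment;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Look up the ArFragment declared in the layout
        arFragment = (ArFragment) getSupportFragmentManager()
                .findFragmentById(R.id.ar_fragment);

        // Hide the plane discovery hint; we will track an image, not planes
        arFragment.getPlaneDiscoveryController().hide();
        arFragment.getPlaneDiscoveryController().setInstructionView(null);
    }
}
```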
Run and Try
Save and run the app on an AR-supported device or emulator.
If you have followed along through these steps, you should see a camera permission popup. (Note that we haven't written any code to handle permissions; it is handled internally by ArFragment.) After granting permission, you should see the camera feed running on your phone.
Summary
- We added the plugin to export 3D models into Android Studio
- We added the necessary libraries and dependencies to the Gradle files
- We made the required configuration changes to AndroidManifest.xml
- We initialised the ArFragment
What’s Next?
Next, we will code along to complete the remaining steps.
- Initialise Augmented Image Database and add image to database
- Detect Augmented Image
- Create anchor node at centre position
- Render the model
Thanks for reading. If you find the post helpful, cheer me with claps. Spread and help others learn.