Android TensorFlow with custom model

In this article I will present a step-by-step guide to creating a new Android app that runs a neural network in the background. For this, I will use TensorFlow’s native library.

I spent a lot of time creating a basic Android app and compiling and running TensorFlow on it, because the official example in the library is too complex to modify and the few tutorials I found didn’t include all the details I needed. So I have decided to summarize my experience here, for myself and for others.

Requirements

I will work on Ubuntu 16.04 (it probably works on other platforms too, since all the components are ported, but I haven’t tried it). I will use TensorFlow 1.2.1. Submodules are also important for TensorFlow, so clone it like this:

$ git clone --recurse-submodules https://github.com/tensorflow/tensorflow.git

We will need Android NDK r12b and the newest Android Studio.

And finally we will use Bazel 0.5.2 to build TensorFlow.

Build TensorFlow for Android

Go to TensorFlow’s root and edit the WORKSPACE file so that the Android part is built as well. We just need to uncomment a few lines and fill in the paths to our Android SDK and NDK.
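The uncommented section ends up looking something like this (the API levels and paths below are placeholders, not a prescription; fill in your own):

android_sdk_repository(
    name = "androidsdk",
    api_level = 23,
    build_tools_version = "25.0.2",
    # replace with the path to the Android SDK on your system
    path = "/home/user/Android/Sdk",
)

android_ndk_repository(
    name = "androidndk",
    # replace with the path to the Android NDK on your system
    path = "/home/user/android-ndk-r12b",
    api_level = 14,
)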

Then we build the shared object file for our target architecture with the following command:

$ bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
 --crosstool_top=//external:android/crosstool \
 --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
 --cpu=arm64-v8a

My target architecture is arm64-v8a; replace it with yours.
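
If you are unsure which ABI your device uses, you can query it with adb:

$ adb shell getprop ro.product.cpu.abi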

Copy the created file to our home folder:

$ cp bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so ~/

Then build the Java counterpart:

$ bazel build //tensorflow/contrib/android:android_tensorflow_inference_java

and copy the created jar also:

$ cp bazel-bin/tensorflow/contrib/android/libandroid_tensorflow_inference_java.jar ~/

Create the Model

Now we need to create a TensorFlow graph, train it, save it, and optimize it for inference so that it can run quickly on the device.

I have created a very small graph for simplicity, but you can easily replace it with your own. The important thing is to name your input and output nodes and make a note of their shapes.
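
A minimal sketch of such a graph follows (the node names I and O are the ones used in the commands below; the single matmul layer is just a stand-in for a real model):

import tensorflow as tf

I = tf.placeholder(tf.float32, shape=[None, 3], name='I')  # input node
W = tf.Variable(tf.zeros([3, 2]), name='W')                # weights
b = tf.Variable(tf.zeros([2]), name='b')                   # biases
O = tf.nn.relu(tf.matmul(I, W) + b, name='O')              # output node

saver = tf.train.Saver()
init_op = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init_op)
    # save the graph definition as a text protobuf
    tf.train.write_graph(sess.graph_def, '.', 'tfdroid.pbtxt')
    # normally training would happen here; a simple assignment stands in for it
    sess.run(tf.assign(W, [[1.0, 2.0], [4.0, 5.0], [7.0, 8.0]]))
    sess.run(tf.assign(b, [1.0, 1.0]))
    # save a checkpoint of the variables
    saver.save(sess, 'tfdroid.ckpt')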

Running the above piece of code produces two files: first it saves the TF computation graph in a GraphDef text file called tfdroid.pbtxt, then it performs a simple assignment (which would normally be done through actual training) and saves a checkpoint of the model variables in tfdroid.ckpt.

Now that we have the model, we need to freeze the graph and optimize it for inference. We can do that like this:

$ python tensorflow/python/tools/freeze_graph.py --input_graph=tfdroid.pbtxt \
 --input_checkpoint=tfdroid.ckpt --output_node_names=O --output_graph=tfdroid.pb

This command freezes the weights in the graph. Please note that we need the name of the output node here!

After that we can optimize our graph with the graph transform tool:

$ bazel build tensorflow/tools/graph_transforms:transform_graph
$ bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
 --in_graph=tfdroid.pb \
 --out_graph=optimized_tfdroid.pb \
 --inputs='I' \
 --outputs='O' \
 --transforms='
 strip_unused_nodes(type=float, shape="1,299,299,3")
 remove_nodes(op=Identity, op=CheckNumerics)
 fold_constants(ignore_errors=true)
 fold_batch_norms
 fold_old_batch_norms'

This will output optimized_tfdroid.pb, which we will use in our Android application. Note that we also have to give the input and output node names to this script, and the shape passed to strip_unused_nodes must match the shape of your input node (for the tiny example graph above that would be "1,3" rather than "1,299,299,3").

Creating the Android Application

Create a new Android project with an empty Activity in Android Studio. Create a libs folder and add the TensorFlow libraries to the application:

$ mkdir TfExampleApp/app/libs/arm64-v8a
$ mv ~/libtensorflow_inference.so TfExampleApp/app/libs/arm64-v8a/
$ mv ~/libandroid_tensorflow_inference_java.jar TfExampleApp/app/libs/

We need to let our build system know where these libraries are located by adding a few lines to app/build.gradle.
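Something along these lines should work, inserted into the existing android and dependencies blocks of the generated file:

android {
    sourceSets {
        main {
            // pick up libtensorflow_inference.so from app/libs/<abi>/
            jniLibs.srcDirs = ['libs']
        }
    }
}

dependencies {
    // the Java inference interface we built with Bazel
    compile files('libs/libandroid_tensorflow_inference_java.jar')
}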

Create an assets folder and place your TensorFlow model file there:

$ mkdir TfExampleApp/app/src/main/assets && cp optimized_tfdroid.pb TfExampleApp/app/src/main/assets/

To access the TensorFlow native library, we need to load it and import TensorFlowInferenceInterface.
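
A minimal sketch, assuming the default MainActivity:

import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class MainActivity extends AppCompatActivity {

    static {
        // load the native library we built with Bazel
        System.loadLibrary("tensorflow_inference");
    }

    // ...
}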

We will need an instance of TensorFlowInferenceInterface to run the inference. In its constructor we need to provide the AssetManager and the path of our model file within the assets.

After that we need to call feed with the name of the input node, the input values, and the dimensions of the input to initialize the input node. The run function needs the names of the output nodes. When the inference is done, we can call the fetch function with the name of the output node and an output array of the right size; after the call, the array will be filled with the output values.
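
Here is a sketch of how this looks, assuming the tiny 3-input, 2-output example graph from above (adapt the node names, sizes, and input values to your own model):

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    TensorFlowInferenceInterface inferenceInterface =
            new TensorFlowInferenceInterface(getAssets(), "file:///android_asset/optimized_tfdroid.pb");

    float[] inputFloats = {1.0f, 2.0f, 3.0f};  // one input row of shape [1, 3]
    float[] result = new float[2];             // room for the [1, 2] output

    inferenceInterface.feed("I", inputFloats, 1, 3);  // initialize the input node
    inferenceInterface.run(new String[] {"O"});       // run the inference
    inferenceInterface.fetch("O", result);            // read back the output values
}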

That’s all, I hope I could help. If you need any more info or get stuck somewhere, feel free to message me or comment!

Thanks to Amit Shekhar for this example and to Omid Alemi for this tutorial. I would not have been able to make this guide without their two articles.