Using a Pre-Trained TensorFlow Model on Android — Part 1
It used to be really hard to embed a pre-trained TensorFlow model on Android, and now it is much easier!
Tell Me More!
After giving my “Machine Learning for Android Developers” talk at AnDevCon DC (screencast, slides), I was moaning about the complexity of embedding TensorFlow models into Android apps. Fortunately, some of the Google IoT developer advocates were there. In addition to giving great talks and running an excellent hackathon, they were available to answer lots of questions. :-) Wayne Piekarski showed me this sample Android Things project:
sample-tensorflow-imageclassifier on GitHub
This project demonstrates a simple way to embed a pre-trained model: add a dependency to your build.gradle, then use the TensorFlowInferenceInterface class (docs are in the TensorFlow contrib README).
How Did This Work Before?
An Android app calls from Java into TensorFlow by using the Java Native Interface (JNI) to bridge into the NDK.
The TensorFlowAndroidMNIST project gives an example of one way to get this all working. The author has written JNI code and built it into a libtensorflow_mnist.so native library, which is then called from the Java code.
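The shape of that JNI bridge can be sketched roughly as follows. This is an illustrative sketch, not the project’s exact API: the class name, method signature, and library name are assumptions modeled on how such projects wire Java to native code.

```java
// Illustrative sketch of a Java class that bridges to a hand-built
// native TensorFlow library via JNI. Names are hypothetical.
public class DigitDetector {
    static {
        // Loads libtensorflow_mnist.so packaged inside the APK.
        System.loadLibrary("tensorflow_mnist");
    }

    // Implemented in hand-written C++ JNI code; runs the MNIST model
    // on a 28x28 grayscale image and returns the predicted digit.
    public native int detectDigit(int[] pixels);
}
```

Every native method like this needs a matching C++ function compiled for each target ABI, which is exactly the maintenance burden described below.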
This code is complex and hard to maintain, e.g. the JNI code needs to be built differently from normal Android Studio/Gradle builds.
In the above example, only the armeabi-v7a architecture is supported, so I expect this project will crash on other architectures (e.g. an x86 emulator).
You’ve Got to Admit, It’s Getting Better!
To make this easier, in late 2016 Google added the TensorFlowInferenceInterface class (GitHub commits). This helped standardize how to interface with TensorFlow models from Java. It provides these prebuilt libraries:
- libandroid_tensorflow_inference_java.jar: the Java interface layer.
- libtensorflow_inference.so: the JNI code that talks to the TensorFlow model.
The AndroidTensorFlowMNISTExample project gives an example of this approach:
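With those prebuilt libraries, inference followed a fill/run/read pattern. The sketch below assumes the early (late-2016) TensorFlowInferenceInterface API; the model file and node names are illustrative, not taken from the project.

```java
// Sketch of the prebuilt-library approach (early API, hypothetical names).
TensorFlowInferenceInterface inferenceInterface = new TensorFlowInferenceInterface();
inferenceInterface.initializeTensorFlow(getAssets(), "file:///android_asset/mnist_model_graph.pb");

// Feed a normalized 28x28 grayscale image into the graph's input node.
float[] pixels = new float[28 * 28];
inferenceInterface.fillNodeFloat("input", new int[]{1, 28 * 28}, pixels);

// Run the graph and read back one score per digit.
inferenceInterface.runInference(new String[]{"output"});
float[] outputs = new float[10];
inferenceInterface.readNodeFloat("output", outputs);
```

This was already much nicer than hand-rolled JNI, but you still had to copy the .jar and .so files into your project yourself.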
The Future is Now
In Feb 2017, an Android-specific contrib was added to TensorFlow which allows all the native binaries and Java code to be built into a single library (packaged as an AAR file). In May 2017 that dependency was published on JCenter.
Now, to embed an existing TensorFlow model, all we need to do is:
- Include the compile 'org.tensorflow:tensorflow-android:+' dependency in your build.gradle.
- Use the TensorFlowInferenceInterface to interface with your model.
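With the AAR dependency in place, a minimal inference call looks something like this. This is a sketch using the feed/run/fetch methods of TensorFlowInferenceInterface as of TensorFlow 1.2; the model filename and node names are assumptions for an MNIST-style model.

```java
import android.content.res.AssetManager;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

// ...

// Load the frozen graph from the APK's assets folder.
// "mnist_model_graph.pb", "input", and "output" are hypothetical names.
AssetManager assets = context.getAssets();
TensorFlowInferenceInterface inference =
        new TensorFlowInferenceInterface(assets, "mnist_model_graph.pb");

// Feed a normalized 28x28 grayscale image into the graph's input node.
float[] pixels = new float[28 * 28];
inference.feed("input", pixels, 1, 28 * 28);

// Run the graph up to the output node, then fetch the scores.
inference.run(new String[]{"output"});
float[] scores = new float[10];
inference.fetch("output", scores);
```

No JNI code, no copied .so files: the AAR bundles the native binaries for each supported ABI and Gradle handles the rest.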
I tried this out for myself by updating the AndroidTensorFlowMNISTExample project to use the new Gradle dependency. Here’s my pull request:
The org.tensorflow:tensorflow-android:1.2.0 dependency downloads an AAR file which includes native binaries for all… (github.com)
Read Part 2 to find out more about how the TensorFlowInferenceInterface and the org.tensorflow:tensorflow-android dependency work!
DISCLOSURE STATEMENT: These opinions are those of the author. Unless noted otherwise in this post, Capital One is not affiliated with, nor is it endorsed by, any of the companies mentioned. All trademarks and other intellectual property used or displayed are the ownership of their respective owners. This article is © 2017 Capital One.
For more on APIs, open source, community events, and developer culture at Capital One, visit DevExchange, our one-stop developer portal: https://developer.capitalone.com/