How to build custom TensorFlow binary for Android and iOS

Vladimir Valouch
5 min read · May 6, 2019



This post describes why and how to build a custom binary version of TensorFlow for running a trained model in your Android or iOS app. The post describes both the happy path and how to handle some of the common exceptions.

The motivation for this post was to help others avoid the ultra time-consuming path that I had to go through when I tried to build a custom version of TF binaries.

Mobile and Machine Learning

Do your model's operations fit into the list of operations supported by TensorFlow Lite (TF Lite)?

If yes, then you are a lucky person. Your setup consists of adding the TF Lite dependency and calling it. You can have a look at the example described in "From Keras to Android with TF Lite" if you are not sure where to start.
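For context, the happy-path dependency is typically a single Gradle line. The coordinates below match the TF 1.13 era; treat the exact version as an assumption and check the current release:

```groovy
dependencies {
    // TF Lite interpreter; pin a concrete version in real projects
    implementation 'org.tensorflow:tensorflow-lite:1.13.1'
}
```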

If no, well, it will be harder but still doable. First, we need to decide whether to rebuild the model, rebuild the binary that runs the model, or wait and hope that the operations will be added to TF Lite. None of these decisions is simple and each has its own flaws. I would personally encourage you to rewrite the model: a rewrite makes sure that you can easily update the TF binaries in the future.

On the other hand, even a model rewrite might not always work for you. I actually had to rebuild the TF library just because the C++ macro definitions and string operations in the TF source differ between the server-side build and the mobile one. You can see one of these differences in the TensorFlow code:

// example of a difference, from tensorflow/core/framework/register_types.h
// server-side TF macro definition
#define TF_CALL_bool(m) m(bool)
// Android-side TF macro definition
#define TF_CALL_bool(m)

Because the Android-side macro expands to nothing, the kernels for that type are simply never compiled in, and a model that needs them fails with a missing-kernel error. Thus, to run the same model that you trained for the server, you really need to build TF from scratch.

Tensorflow Setup

The installation steps are described pretty nicely on the TensorFlow (TF) website. First, scroll down to find the correct version of Bazel, then install it. An incompatible combination of Bazel and TensorFlow causes quite unpredictable trouble, so please TRIPLE CHECK that your versions of Bazel and TensorFlow are compatible. The next step is to simply clone TensorFlow. After the clone, you can have a look at the available branches and select one.
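Given how much grief an incompatible pairing causes, a quick up-front check can help. A minimal sketch, treating 0.19.2 as the minimum for r1.13 (that exact bound is an assumption; check configure.py on your branch):

```shell
# succeeds when $1 >= $2 in dotted-version order
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

MIN_BAZEL="0.19.2"
# empty when bazel is not installed, which also fails the check below
INSTALLED="$(bazel version 2>/dev/null | awk '/Build label/ {print $3}')"
if version_ge "$INSTALLED" "$MIN_BAZEL"; then
  echo "Bazel $INSTALLED looks compatible"
else
  echo "Bazel missing or older than $MIN_BAZEL - fix this before building"
fi
```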

git branch -r 
git checkout -f r1.13

The commands above should give you the TensorFlow (TF) 1.13 source code ready for building. While in the TF directory, run ./configure to finish the setup.
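The configure run can also be pre-seeded through environment variables so it does not prompt. A minimal sketch, assuming the variable names that TF 1.x's configure.py reads (answer the prompts manually if in doubt):

```shell
export PYTHON_BIN_PATH="$(command -v python)"
export TF_NEED_CUDA=0              # CPU-only is fine for mobile builds
export TF_SET_ANDROID_WORKSPACE=1  # wire the Android SDK/NDK into the build
if [ -x ./configure ]; then
  ./configure
else
  echo "run this from inside the tensorflow checkout"
fi
```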

Android build steps

The Android build has an Android NDK to Bazel compatibility challenge. This should hopefully go away with TF 2+ and newer versions of Bazel; until then, I still recommend using Android NDK r14b. It is also very convenient to keep the system variables in your ".bashrc" or ".bash_profile".

# macOS/Linux system variables
export ANDROID_NDK_HOME="/Users/vvalouch/android-ndk"
export NDK_ROOT=$ANDROID_NDK_HOME
export PATH=$PATH:"$ANDROID_HOME/tools/bin/"

Once the system variables are set up, we can start the build. The first run can take up to 60 minutes, so please be patient. My personal experience is that the first 3–5 attempts to build TF will most probably fail; the usual causes were missing libraries, wrong library versions, or the need for minor tweaks to the TF sources.

bazel build tensorflow/python/tools/print_selective_registration_header

This step prepares the tool for our next step, the creation of the header definition file. The reason you should care about selective header registration is the size of the TF binary.

bazel-bin/tensorflow/python/tools/print_selective_registration_header --graphs=/path/to/frozen_graph.pb,/path/to/frozen_graph2.pb > tensorflow/core/framework/ops_to_register.h

This way you can register selective headers for multiple models at once. This step takes about 10 seconds.

The next step uses the selective headers file to create an inference library and place it in your project directory. Please keep in mind that the build assumes an Android project set up for library development. You need to run this step once per target architecture. The good news is that only the first run takes ages; the bad news is that this step is super error-prone. The last step is the JNI wrapper generation.
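For reference, on r1.13 the per-architecture build looks roughly like the sketch below; the contrib target names and crosstool flags are assumptions for other branches:

```shell
# native inference library: one .so per ABI; JNI/Java wrapper: built once
ABIS="armeabi-v7a arm64-v8a x86 x86_64"
if command -v bazel >/dev/null; then
  for abi in $ABIS; do
    bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
        --crosstool_top=//external:android/crosstool \
        --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
        --cpu="$abi"
  done
  # architecture-independent Java side
  bazel build //tensorflow/contrib/android:android_tensorflow_inference_java
else
  echo "bazel not on PATH - set it up first"
fi
```

The outputs typically land under bazel-bin/tensorflow/contrib/android/, from where they can be copied into the app's jniLibs and libs folders.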

Once you have created both the *.so and the *.jar, you are pretty much done: you can switch to Android Studio and start using TF.

Android TF compilation errors

A few compilation errors that I bumped into and their solution/workaround can be found in the Gist below.

iOS build steps

The custom build of TF for iOS is simpler to compile, but a little more challenging to configure in the project afterwards, compared to Android.

git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
tensorflow/contrib/makefile/download_dependencies.sh

If you have not already installed them, install the following:

xcode-select --install
brew install automake
brew install libtool

The happy path

If you are a lucky one, this is your last build step. The command below will generate the needed libraries for you.

tensorflow/contrib/makefile/build_all_ios.sh [-a arch] \
    -g /path/to/frozen_graph.pb,/path/to/frozen_graph2.pb
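One project-configuration pitfall worth flagging: the makefile build produces static libraries, and TF ops register themselves through static initializers, so plain -l linking strips them out. The usual fix is to pass -force_load for the core library in Xcode's Other Linker Flags (the path below assumes the checkout lives next to your project; adjust it to your layout):

```
-force_load $(SRCROOT)/tensorflow/tensorflow/contrib/makefile/gen/lib/libtensorflow-core.a
```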

The challenging path with “tf.estimator package not installed”

I would strongly recommend using separate checkouts (or at least branches) of TF if you make any OS-specific modifications. I tried to use just one for both platforms, but it was very error-prone.
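A minimal sketch of that layout, with arbitrary directory names:

```shell
# one checkout per platform so OS-specific tweaks never collide
REPO="https://github.com/tensorflow/tensorflow.git"
for dir in tensorflow-android tensorflow-ios; do
  [ -d "$dir" ] || git clone "$REPO" "$dir" 2>/dev/null || echo "clone $dir manually"
  (cd "$dir" 2>/dev/null && git checkout -f r1.13) || true
done
```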

Additional iOS TF compilation errors

A few compilation errors that I bumped into and their solution/workaround can be found in the Gist below.

The first part of the post covered how to create a custom TF binary for Android the "happy path" way, followed by tips on handling some of the exceptions that I encountered. The third part walked through the happy-path build for iOS, and the last part showed how tricky it can be if you are not so lucky and hit some build exceptions.

I personally like TensorFlow and I use it as much as I can. Nevertheless, make sure that you always stay on the "well-lit" path from Google; otherwise, prepare for a "The Lord of the Rings"-style adventure to reach your goal.

And what is your general experience with TensorFlow?
