Creating an image classifier on Android using TensorFlow (part 3)

This post assumes you’ve already read part 1 and part 2 of the series!

We’ve already set up our Docker container for building TensorFlow and the Android demo app. Now we’re going one step further: we’re actually going to retrain our TensorFlow model to recognize something new.

This article is heavily based on the excellent TensorFlow For Poets code lab. You should be able to follow my instructions without referring to it, but if you get stuck then go take a look at it.

What’s missing from the original demo app?

Let’s launch the original Android “TF Classify” demo app and show it a picture of a rose (it’s blurry because I’m viewing a picture that I found in Google image search).

The app is showing a 53.9% confidence that this is “velvet” … which is obviously incorrect. TensorFlow For Poets tells us that this is because:

ImageNet was not trained on any of these flower species, originally.

Download and prepare images to retrain with

Let’s download some flower images to retrain the base Inception v3 model. In the interests of time, we’re going to delete a few of the sets of images, and only keep the daisies and roses.

cd $HOME
mkdir tf_files
cd tf_files
curl -O http://download.tensorflow.org/example_images/flower_photos.tgz
tar xzf flower_photos.tgz
rm -rf flower_photos/dandelion flower_photos/sunflowers flower_photos/tulips

To show that we can customize the classification labels, let’s rename our training folders to have “ retrained” on the end of their original names.

mv flower_photos/daisy "flower_photos/daisy retrained"
mv flower_photos/roses "flower_photos/roses retrained"
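Why does renaming the folders matter? The retraining script derives one label per subfolder of the image directory, so the folder names become the labels the app will later display. Here’s an illustrative Python sketch of that behavior (this is not the actual retrain.py code):

```python
# Illustrative sketch: retrain.py treats each subfolder of the image
# directory as one category, using the folder name as the label.
import os
import tempfile

image_dir = tempfile.mkdtemp()
for folder in ["daisy retrained", "roses retrained"]:
    os.mkdir(os.path.join(image_dir, folder))

# One label per subfolder, derived from the folder name.
labels = sorted(os.listdir(image_dir))
print(labels)  # ['daisy retrained', 'roses retrained']
```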

Retraining the model

We need our training images to be accessible to our Docker container, so we use the -v option to launch it with the $HOME/tf_files folder on our local machine mounted to /tf_files in the container:

docker run -it -v $HOME/tf_files:/tf_files danjarvis/tensorflow-android:1.0.0

Make sure you use your $HOME folder, as only certain folders are allowed to be mounted to Docker containers. If your local machine is running Windows, you might need to change the forward slashes to back slashes, e.g. $HOME\tf_files.

We can now retrain the Inception v3 model using just our small training set.

# cd /tensorflow
# python tensorflow/examples/image_retraining/retrain.py \
--bottleneck_dir=/tf_files/bottlenecks \
--how_many_training_steps=500 \
--model_dir=/tf_files/inception \
--output_graph=/tf_files/retrained_graph.pb \
--output_labels=/tf_files/retrained_labels.txt \
--image_dir=/tf_files/flower_photos

This step took 18 minutes on my MacBook Pro (3.1 GHz, 16GB RAM).

This script loads the pre-trained Inception v3 model, removes the old final layer, and trains a new one on the flower photos you’ve downloaded.

The final lines output by the script are:

...
2017-02-27 02:39:58.461678: Step 499: Train accuracy = 100.0%
2017-02-27 02:39:58.461841: Step 499: Cross entropy = 0.036179
2017-02-27 02:39:59.081111: Step 499: Validation accuracy = 99.0% (N=100)
Final test accuracy = 97.2% (N=145)
Converted 2 variables to const ops.
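The “Cross entropy” figure is the training loss: for a softmax classifier it is the negative log of the probability the model assigns to the correct class, so a value around 0.036 means the model put roughly 96% probability on the right label. A minimal sketch of the computation (illustrative, not the script’s actual code):

```python
# Cross entropy for one example = -log(probability of the true class).
import math

def cross_entropy(predicted_prob_of_true_class):
    return -math.log(predicted_prob_of_true_class)

# A loss of ~0.036 corresponds to ~96.4% probability on the correct label.
print(round(cross_entropy(0.9645), 3))
```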

This is a good time to commit your changes to your Docker image, otherwise you’ll lose them all if you close Docker (see this post, or part 2, for more info on this). I recommend bumping your tag version from 1.0.0 to 1.0.1, so you keep a checkpoint at each step.

docker commit <CONTAINER ID> danjarvis/tensorflow-android:1.0.1

We now have an 87MB retrained_graph.pb in /tf_files, and a file with our labels.

# cat /tf_files/retrained_labels.txt
roses retrained
daisy retrained

As we can see, there are only two labels, so this model will only be able to distinguish between these two classifications.
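When the app runs the model, the final output tensor yields one score per line of this labels file, and the app reports the highest-scoring label. A rough sketch of that mapping, with made-up scores for illustration:

```python
# Illustrative sketch (not the demo app's actual code): pair each line of
# the labels file with its output score, then report the best match.
labels = ["roses retrained", "daisy retrained"]
scores = [0.93, 0.07]  # hypothetical softmax output from the model

# Pick the (score, label) pair with the highest score.
best = max(zip(scores, labels))
print(f"{best[1]}: {best[0]:.1%}")  # roses retrained: 93.0%
```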

Using the model in our Android demo app

Before we can use our retrained model, we have to remove the “unsupported JPEG decoding layers” from it so that we can use it in the demo app. To do that, we need to build the strip_unused tool.

Just like we saw in part 2, we need to pass some special options to ensure our Docker container doesn’t run out of memory.

# cd /tensorflow
# bazel build --local_resources 4096,4.0,1.0 -j 1 tensorflow/python/tools:strip_unused

Assuming you are using the danjarvis/tensorflow-android:1.0.0 Docker image that I created, this step should take less than 30 seconds (since I pre-built this tool for you).

Now we can prepare our retrained model (this step is super quick).

# bazel-bin/tensorflow/python/tools/strip_unused \
--input_graph=/tf_files/retrained_graph.pb \
--output_graph=/tf_files/stripped_retrained_graph.pb \
--input_node_names="Mul" \
--output_node_names="final_result" \
--input_binary=true
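The “Mul” input node is where the stripped graph now expects its image data. Per the TensorFlow For Poets material, Inception v3 takes a 299x299 image with each channel value normalized as (pixel − mean) / std, with mean and std both 128 (the exact constants used by the Android demo may differ, so treat these numbers as an assumption). A sketch of that normalization:

```python
# Normalization assumed for the Inception v3 "Mul" input node:
# (pixel - mean) / std with mean = std = 128, mapping 0..255 to ~[-1, 1].
IMAGE_SIZE = 299
IMAGE_MEAN = 128.0
IMAGE_STD = 128.0

def normalize(pixel_value):
    return (pixel_value - IMAGE_MEAN) / IMAGE_STD

print(normalize(0), normalize(128), normalize(255))  # -1.0 0.0 0.9921875
```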

We want to rebuild the Android demo app with our new model. However, we have to build it one time before we can copy our new files into the build (instructions here come from part 2).

# cd /tensorflow
# bazel build -c opt --local_resources 4096,4.0,1.0 -j 1 //tensorflow/examples/android:tensorflow_demo

Let’s copy the stripped, retrained model file, and our new labels file into our Android demo app build:

# cp /tf_files/stripped_retrained_graph.pb bazel-bin/tensorflow/examples/android/assets/stripped_output_graph.pb
# cp /tf_files/retrained_labels.txt bazel-bin/tensorflow/examples/android/assets/imagenet_comp_graph_label_strings.txt

Now let’s rebuild again.

# cd /tensorflow
# bazel build -c opt --local_resources 4096,4.0,1.0 -j 1 //tensorflow/examples/android:tensorflow_demo

We need to copy the APK to our shared mount so we can access it from outside the Docker container.

# cp /tensorflow/bazel-bin/tensorflow/examples/android/tensorflow_demo.apk /tf_files

When we launched this Docker container earlier, we specified /tf_files as the mounted location — anything we put in this folder will be available on the local machine.

Testing the updated Android demo app

Now we can install the updated app from our local machine. Make sure you open a new Terminal/Command Prompt window on your local machine to run this command in!

$ adb install -r $HOME/tf_files/tensorflow_demo.apk

Let’s see if it actually worked by viewing that rose picture from earlier… YEEESS!

It also works for daisies.

Before we get too excited, we need to remember that our retrained model can only recognize two categories, so everything is going to appear to be a “daisy retrained” or a “roses retrained”.

I still think this is awesome though. :-)