An unsuccessful attempt to port open_nsfw to Android with tf-lite
The Tensorflow platform has grown up to become a very stable and reliable production-level resource. Along the way, it released TensorflowJS and Tensorflow for Mobile, both of which are under constant improvement. While browsing through GitHub repositories, I stumbled upon a project called tensorflow-open_nsfw, a port of the original Yahoo implementation named open_nsfw, which does an absolutely fantastic job of producing an NSFW score.
You must be wondering what NSFW is. In simple terms, NSFW means not suitable/safe for work. Detecting such content, including offensive and adult images (pornographic content included), is an important problem which many companies have been trying to solve. One can read more about this on this blog post by Yahoo.

As can be seen from the above image, the model assigns each image an NSFW score. The higher the score, the more one can expect the image to contain explicit content. The idea of porting this model to mobile struck me while I was reading the README of the git repo.
Let’s begin with the drill. The git repository provides the weights of the model, which are present in the data folder. In order to export this model into a format which can be used by a smartphone (we will only consider Android for the time being), we have to follow some predefined steps.
- The first stage of the process is to get hold of the trained model. We have its architecture defined as a graph in a file called model.py.
- Secondly, we have to convert this trained model into a frozen graph, which is the format used by Tensorflow for serving.
- Thirdly, we transform this frozen graph into a file of type .tflite. This is done using the toco utility provided with Tensorflow.
- Finally, this file is packaged in an Android application as an asset and loaded at inference time for scoring.

Step 1. Getting Checkpoint/Model file from weights
There is a file called export_graph.py in the tools folder of the repository. Use this file with a command like:
python tools/export_graph.py -target {path-to-parent}/tensorflow-open_nsfw/model -m {path-to-parent-folder}/tensorflow-open_nsfw/data/open_nsfw-weights.npy
The -m option points to the weights of the graph, and -target is the name of the folder which will be created afresh for storing the checkpoint files. After executing the command you should see a folder called model.
Step 2. Getting Frozen Graph
You don’t have to worry about freezing the graph, as this step is already taken care of by export_graph.py, which we ran in the previous step.
Step 3. Getting the tflite file
We are just two steps away from our destination. We have to run this command to obtain our lite file.
toco --graph_def_file={path-to-parent-folder}/tensorflow-open_nsfw/model/frozen_open_nsfw.pb --output_file={path-to-parent-folder}/tensorflow-open_nsfw/tflite/output.tflite --output_format=TFLITE --input_array=input --output_array=predictions --input_shapes="None,"
Let’s take a step back and review this command.
- --graph_def_file: location of the graph def file.
- --output_file: location of the tflite file.
- --input_array: name of the node that takes in the input. There can be more than one.
- --output_array: name of the node that will emit the predictions. There can be more than one.
- --input_shapes: shape of the tensor passed to the input node.
We don’t really know yet what the image size should be. We could try removing the input_shapes parameter and running the conversion, but it will complain at runtime.
Let’s check the code in the repository for the image size so that we can fix our input_shapes. The model takes in an image irrespective of its dimensions and reshapes it to 224x224x3. But the thing to notice is the very specific sequence of steps it performs on the image before it is passed to the input node. One can refer to the script image_utils.py for the exact steps. If an image is not processed with that function, the model might not produce the output described in the paper.
So, we fix input_shapes="1,224,224,3" in the toco command and run it. As a result we obtain the tflite file. We then use the examples from the Tensorflow Android repositories to get our mobile app up and working.
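For repeatability, the same conversion can also be driven from a small Python script instead of the shell. This is only a sketch: the paths below are placeholders for the repository layout described above, and should be adjusted to your checkout.

```python
# Sketch: building the toco invocation with the fixed input shape.
# Paths are placeholders; adjust them to your own checkout location.
import subprocess

cmd = [
    "toco",
    "--graph_def_file=tensorflow-open_nsfw/model/frozen_open_nsfw.pb",
    "--output_file=tensorflow-open_nsfw/tflite/output.tflite",
    "--output_format=TFLITE",
    "--input_array=input",
    "--output_array=predictions",
    "--input_shapes=1,224,224,3",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the conversion
```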
At first I assumed that I could skip the preprocessing and pass in the tensor directly, but I was proved wrong: the model expected a 224x224x3 input while the Android camera image was 128x128x3. So, in order to reshape the image, I had to follow the steps given in the script.
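To illustrate the shape mismatch, here is a minimal, hypothetical nearest-neighbour resize in NumPy. The actual pipeline uses bilinear resizing (tf.image.resize_images), so this is only an illustration of getting a 128x128 camera frame up to the pipeline's working size, not the real preprocessing.

```python
import numpy as np

def resize_nearest(image, new_h, new_w):
    # Nearest-neighbour index mapping; a stand-in for the pipeline's
    # bilinear resize, just to illustrate the shape change.
    h, w = image.shape[:2]
    rows = np.arange(new_h) * h // new_h
    cols = np.arange(new_w) * w // new_w
    return image[rows][:, cols]

frame = np.zeros((128, 128, 3), dtype=np.uint8)  # a mock camera frame
print(resize_nearest(frame, 256, 256).shape)  # (256, 256, 3)
```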
# The whole jpeg encode/decode dance is necessary to generate a result
# that matches the original model's (caffe) preprocessing
image = tf.image.decode_jpeg(data, channels=3,
                             fancy_upscaling=True,
                             dct_method="INTEGER_FAST")
image = tf.image.convert_image_dtype(image, tf.float32, saturate=True)
image = tf.image.resize_images(image, (256, 256),
                               method=tf.image.ResizeMethod.BILINEAR,
                               align_corners=True)
image = tf.image.convert_image_dtype(image, tf.uint8, saturate=True)
image = tf.image.encode_jpeg(image, format='', quality=75,
                             progressive=False, optimize_size=False,
                             chroma_downsampling=True,
                             density_unit=None,
                             x_density=None, y_density=None,
                             xmp_metadata=None)
image = tf.image.decode_jpeg(image, channels=3,
                             fancy_upscaling=False,
                             dct_method="INTEGER_ACCURATE")
image = tf.cast(image, dtype=tf.float32)
image = tf.image.crop_to_bounding_box(image, 16, 16, 224, 224)
image = tf.reverse(image, axis=[2])
image -= VGG_MEAN
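The last three operations of that pipeline (crop, channel reversal, mean subtraction) are plain array manipulations, so they are easy to reproduce outside the TensorFlow graph, for example to verify intermediate values while porting. A minimal NumPy sketch, assuming VGG_MEAN matches the BGR channel means used by Yahoo's Caffe model (as in the repo's image_utils.py):

```python
import numpy as np

# Assumption: these are the BGR channel means from the repository's image_utils.py.
VGG_MEAN = np.array([104, 117, 123], dtype=np.float32)

def finish_preprocessing(image):
    # image: decoded 256x256x3 RGB uint8 array from the jpeg dance above
    image = image.astype(np.float32)
    image = image[16:16 + 224, 16:16 + 224, :]  # crop_to_bounding_box(16, 16, 224, 224)
    image = image[:, :, ::-1]                   # tf.reverse(axis=[2]): RGB -> BGR
    return image - VGG_MEAN                     # subtract per-channel mean

out = finish_preprocessing(np.zeros((256, 256, 3), dtype=np.uint8))
print(out.shape)  # (224, 224, 3)
```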
But these operations are not supported by the Tensorflow Java API, and I cannot run the toco command with inputs=None, as the model needs a fixed-shape input.
This is a known issue with tf-lite: input_shapes must be fixed. This makes any model that relies on specific preprocessing steps unsuitable for mobile export, as those ops are supported only by Python and not Java. I have no doubt that this model will work without any issue with Tf-serving.
Updates:
- Found this issue, where someone also suggested passing an arbitrary input and then resizing.
In the next post
- I hope to check the output of the preprocessing pipeline and come up with equivalent Java operations on the mobile end to get the required shape.
- Try Tensorflow JS and see if the code can be packaged as a hybrid app.
If you have suggestions or solutions to the problem above, feel free to give your feedback.
