Deploying a TensorFlow Lite Model on Android

Navneet Jasrotia
Published in Bobble Engineering
3 min read · Dec 17, 2020

This blog explains how to load a .tflite model into an Android app and run predictions with it.

What is TensorFlow Lite?

TensorFlow Lite (.tflite) is a lighter version of Google’s open-source machine learning framework, TensorFlow. Models in the .tflite format are designed to run efficiently on mobile and embedded devices.

However, TensorFlow Lite can only deploy pre-existing models; it does not train them. To train a model, one first builds a regular TensorFlow model and then converts it to the .tflite format.

How can we find the input and output parameters of a .tflite model?

We first have to find the input and output parameters of the .tflite model. For that we can use Netron by uploading the model there.

The input and outputs for the model look like the following:

As a result, the model requires a 224x224 RGB image as input, and it returns a 2-D array containing a single row with 149 values.
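From those shapes, the sizes of the buffers we will need can be worked out directly. The following is a sketch assuming a float32 model (a quantized model would use one byte per channel instead of four):

```java
// Sketch: buffer sizes for a float32 model with a 224x224 RGB input
// and a [1, 149] output, as read off the Netron inspection above.
public class BufferSizes {
    static final int IMG_SIZE = 224;      // input height and width
    static final int CHANNELS = 3;        // RGB
    static final int BYTES_PER_FLOAT = 4; // float32
    static final int NUM_CLASSES = 149;   // output row length

    // Bytes needed for the input ByteBuffer: 224 * 224 * 3 * 4
    static int inputBytes() {
        return IMG_SIZE * IMG_SIZE * CHANNELS * BYTES_PER_FLOAT;
    }

    // Shape of the output array that Interpreter.run() will fill
    static float[][] outputArray() {
        return new float[1][NUM_CLASSES];
    }

    public static void main(String[] args) {
        System.out.println("input bytes: " + inputBytes());            // 602112
        System.out.println("output columns: " + outputArray()[0].length); // 149
    }
}
```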

How can we add a .tflite model to our Android app?

To add the model, open Android Studio and select “File -> New -> Folder -> Assets Folder”. This creates an assets folder in your app; now move the .tflite model into the assets folder.

After adding the model to the assets folder, open your module’s build.gradle file and add the TensorFlow Lite dependency there.

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:+'
}
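One step that is easy to miss: by default the Android build compresses files in assets, which breaks memory-mapping the model. The TensorFlow Lite docs recommend disabling compression for .tflite files in the same build.gradle (sketched here for the Groovy DSL):

```
android {
    aaptOptions {
        noCompress "tflite"
    }
}
```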

To use it, we have to load the model from the assets folder into a MappedByteBuffer using the following method:

private MappedByteBuffer loadModelFile(Activity activity) throws IOException {
    // getModelPath() returns the model's file name in assets, e.g. "model.tflite"
    AssetFileDescriptor fileDescriptor = activity.getAssets().openFd(getModelPath());
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    // The asset sits inside the APK, so map only the model's own byte range
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}
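The same memory-mapping mechanism can be tried outside Android with an ordinary file, which makes it easier to see what the method above returns. This is a sketch using java.nio directly; the file here is a made-up stand-in for a real .tflite model:

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class MapDemo {
    // Map a whole file read-only, just as loadModelFile does for the asset
    static MappedByteBuffer mapReadOnly(Path path) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path.toFile(), "r");
             FileChannel channel = raf.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for a real model file: a temp file with known contents
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[] {'T', 'F', 'L', '3'});
        MappedByteBuffer buf = mapReadOnly(tmp);
        System.out.println((char) buf.get(0)); // prints T
        Files.delete(tmp);
    }
}
```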

The method above reads our .tflite model and returns a MappedByteBuffer object. We can then use this object to create our Interpreter:

protected Interpreter tflite;
tflite = new Interpreter(loadModelFile(activity));

Then, to classify an image, all you need to do is call the run method on the Interpreter, passing it the image data and an output array that will receive the class probabilities:


tflite.run(imgData, labelProbArray);
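After run() returns, labelProbArray holds one probability per class, and the predicted class is simply the index of the largest value. A minimal sketch in plain Java (the array contents below are made-up illustration values, not real model output):

```java
public class ArgMax {
    // Index of the largest value in a row of class probabilities
    static int argMax(float[] probs) {
        int best = 0;
        for (int i = 1; i < probs.length; i++) {
            if (probs[i] > probs[best]) best = i;
        }
        return best;
    }

    public static void main(String[] args) {
        // Pretend output row: three of the model's 149 classes, for brevity
        float[] row = {0.05f, 0.90f, 0.05f};
        System.out.println("predicted class: " + argMax(row)); // prints 1
    }
}
```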

In this way we get the output of the .tflite model, and we can use that output however our app needs for predictions.
