Text Classification using TensorFlow Lite Plugin for Flutter

Amish Garg
5 min read · May 22, 2020


If you have ever wished for an easy, efficient, and flexible way to integrate TensorFlow-trained models with your Flutter apps, I am glad to announce the release of a new plugin, tflite_flutter.

Key features of tflite_flutter:

  • It provides a Dart API similar to the TFLite Java and Swift APIs, so there is no compromise on the flexibility offered by those platforms.
  • Directly binds to the TensorFlow Lite C API using dart:ffi, making it more efficient than platform integration approaches.
  • No need to write any platform-specific code.
  • Offers acceleration support using NNAPI and GPU delegates on Android, and the Metal delegate on iOS.

In this article, I will walk you through building a text classification Flutter app using tflite_flutter. Let’s get started by creating a new Flutter project, text_classification_app.

(Important) Initial setup

  • Linux and Mac users

Copy the install.sh file into the root folder of your app and execute the command sh install.sh there (text_classification_app/ in our case).

  • Windows users

Copy the install.bat file into the root folder of your app and execute the command install.bat there (text_classification_app/ in our case).

This will automatically download the latest binaries from release assets and place them in appropriate folders for you.

Refer to the readme for more info on the initial setup.

Getting the plugin

In pubspec.yaml include tflite_flutter: ^<latest_version> (details here).

Downloading the model

To use any TensorFlow-trained model on mobile, we need to obtain it in .tflite format. For more information on how to convert a TensorFlow-trained model to .tflite format, refer to this official guide.

We are going to use the pre-trained Text Classification Model available on the TensorFlow website. Click here to download.

This pretrained model predicts whether a paragraph’s sentiment is positive or negative. It was trained on the Large Movie Review Dataset v1.0 from Maas et al., which consists of IMDB movie reviews labeled as either positive or negative. Find more info here.

Place text_classification.tflite and text_classification_vocab.txt in the text_classification_app/assets/ directory.

Include assets/ in pubspec.yaml.

assets:
  - assets/

Now we are all set to begin coding. 🚀

Coding the classifier

Pre-processing

As mentioned on the text_classification model’s page,

Here are the steps to classify a paragraph with the model:

  1. Tokenize the paragraph and convert it to a list of word ids using a predefined vocabulary.
  2. Feed the list to the TensorFlow Lite model.
  3. Get the probability of the paragraph being positive or negative from the model outputs.

We will first write a method to tokenize the raw string using text_classification_vocab.txt as vocabulary.

Create a new file classifier.dart under the lib/ folder.

Let’s first write code to load text_classification_vocab.txt into a dictionary.

Loading Dictionary
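
The embedded snippet isn’t reproduced here, but a minimal sketch of what could live in classifier.dart looks like the following, assuming text_classification_vocab.txt contains one "word index" pair per line (field and method names are illustrative):

import 'package:flutter/services.dart';

// Maps each word in the vocabulary to its integer id.
Map<String, int> _dict = {};

Future<void> _loadDictionary() async {
  // Each line of text_classification_vocab.txt is a "<word> <index>" pair.
  final vocab = await rootBundle
      .loadString('assets/text_classification_vocab.txt');
  final dict = <String, int>{};
  for (final line in vocab.split('\n')) {
    final entry = line.trim().split(' ');
    if (entry.length == 2) {
      dict[entry[0]] = int.parse(entry[1]);
    }
  }
  _dict = dict;
}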

Now, we will write a function to tokenize the raw string.

Tokenization
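
Again as a hedged sketch: assuming the vocabulary defines the special tokens <PAD>, <START>, and <UNKNOWN> (as in the TensorFlow example) and the model expects a fixed-length sequence of 256 word ids, tokenization could look roughly like this:

// Sentence length expected by the model (input tensor shape is [1, 256]).
const int _sentenceLen = 256;

// Special tokens assumed to exist in the vocabulary.
const String _pad = '<PAD>';
const String _start = '<START>';
const String _unk = '<UNKNOWN>';

List<List<double>> tokenizeInputText(String text) {
  // Split the raw string into lowercase words.
  final tokens = text.toLowerCase().split(' ');

  // Initialise a fixed-length vector filled with the <PAD> id,
  // then write the <START> id followed by the word ids.
  final vec = List<double>.filled(_sentenceLen, _dict[_pad]!.toDouble());
  var index = 0;
  vec[index++] = _dict[_start]!.toDouble();

  for (final word in tokens) {
    if (index >= _sentenceLen) break;
    // Fall back to <UNKNOWN> for out-of-vocabulary words.
    vec[index++] = (_dict[word] ?? _dict[_unk]!).toDouble();
  }

  // The model's input tensor has shape [1, 256].
  return [vec];
}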

Inference using tflite_flutter

This is the main section of this blog, as here we are going to discuss the usage of the tflite_flutter plugin.

The term inference refers to the process of executing a TensorFlow Lite model on-device in order to make predictions based on input data. To perform an inference with a TensorFlow Lite model, you must run it through an interpreter. Learn more.

Creating the interpreter, loading the model

tflite_flutter provides a method to create the interpreter directly from assets.

static Future<Interpreter> fromAsset(String assetName,    {InterpreterOptions options})

As our model is in the assets/ directory, we will just use the above method to create the interpreter. For info on InterpreterOptions, refer to this.

Code to create Interpreter
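
The embedded snippet isn’t shown here; a minimal sketch, assuming the model file is named text_classification.tflite and lives under assets/:

import 'package:tflite_flutter/tflite_flutter.dart';

// Model file bundled under assets/.
const String _modelFile = 'text_classification.tflite';

late Interpreter _interpreter;

Future<void> _loadModel() async {
  // Create the interpreter from the model shipped with the app's assets.
  _interpreter = await Interpreter.fromAsset(_modelFile);
  print('Interpreter loaded successfully');
}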

If you don’t want to put your model in the assets/ directory, tflite_flutter also provides other factory constructors to create an interpreter; refer to the readme.

Let’s perform Inference!

We are going to use this method for inference,

void run(Object input, Object output);

Notice that this method is the same as the one provided by the Java API.

The Object input and Object output must be multi-dimensional lists with the same shapes as the input tensor and the output tensor, respectively.

To view the shapes and sizes of the input and output tensors, you can do:

_interpreter.allocateTensors();

// Print list of input tensors
print(_interpreter.getInputTensors());
// Print list of output tensors
print(_interpreter.getOutputTensors());

In the case of our text_classification model,

InputTensorList:
[Tensor{_tensor: Pointer<TfLiteTensor>: address=0xbffcf280, name: embedding_input, type: TfLiteType.float32, shape: [1, 256], data: 1024}]
OutputTensorList:
[Tensor{_tensor: Pointer<TfLiteTensor>: address=0xbffcf140, name: dense_1/Softmax, type: TfLiteType.float32, shape: [1, 2], data: 8}]

Now, let’s write the classify method, which returns 1 for positive and 0 for negative.

Code for Inference
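
As a sketch of what that method could look like, reusing the tokenizeInputText helper and _interpreter field from the earlier snippets, and assuming the output tensor holds [negative score, positive score]:

int classify(String rawText) {
  // Tokenize the raw string into the [1, 256] shape the model expects.
  final input = tokenizeInputText(rawText);

  // Output buffer matching the output tensor shape [1, 2].
  final output = List<double>.filled(2, 0).reshape([1, 2]);

  // Run inference.
  _interpreter.run(input, output);

  // Index 1 is assumed to be the positive score, index 0 the negative one.
  return (output[0][1] > output[0][0]) ? 1 : 0;
}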

There are some useful extensions defined under extension ListShape on List in tflite_flutter,

// Reshapes a list to the given shape, provided the total number of
// elements remains the same
// Usage: List(400).reshape([2,10,20])
// returns List<dynamic>
List reshape(List<int> shape)

// Returns the shape of a list
List<int> get shape

// Returns the total number of elements in a list of any shape
int get computeNumElements

The final classifier.dart should look like this,
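
Since the gist itself isn’t reproduced here, a rough skeleton of how the pieces above could fit together (names are illustrative):

import 'package:flutter/services.dart';
import 'package:tflite_flutter/tflite_flutter.dart';

class Classifier {
  late Interpreter _interpreter;
  Map<String, int> _dict = {};

  Classifier() {
    // Load the model and the vocabulary when the classifier is created.
    _loadModel();
    _loadDictionary();
  }

  // _loadModel(), _loadDictionary(), tokenizeInputText() and classify()
  // as sketched in the sections above.
}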

Now, it’s up to you to code the desired UI for this; the usage of the classifier is simple:

// Create the Classifier object
Classifier _classifier = Classifier();

// Call the classify method with a sentence as the parameter
_classifier.classify("I liked the movie");
// returns 1 (POSITIVE)

_classifier.classify("I didn't like the movie");
// returns 0 (NEGATIVE)

Check out the complete Text Classification Example app with UI.

Text Classification Example App

Visit the repository am15h/tflite_flutter_plugin on GitHub to learn more about the tflite_flutter plugin.

FAQs

Q. How is this plugin tflite_flutter different from tflite v1.0.5?

While tflite v1.0.5 focuses on offering high-level features to build apps for specific use cases like image classification, object detection, etc., the new tflite_flutter offers the same flexibility and features as the Java API and can be used with any tflite model. It also offers support for delegates.

tflite_flutter is fast (has low latency) as it uses dart:ffi (dart ↔ (ffi) ↔ C), while tflite uses platform integration (dart ↔ platform-channel ↔ (Java/Swift) ↔ JNI ↔ C).

Q. How do I create an image classification app using tflite_flutter? Is there any package similar to the TensorFlow Lite Android Support Library?

Update (07/01/2020): The TFLite Flutter Helper library has been released.

The TensorFlow Lite Flutter Helper Library provides a simple architecture for processing and manipulating the input and output of TFLite models.

Its API design and documentation are identical to the TensorFlow Lite Android Support Library. More info here.

That’s all for this blog. I would love to hear your feedback on the tflite_flutter plugin. Feel free to file an issue to report bugs or request features.

Thanks for reading. ✌️


Amish Garg

Student Developer at TensorFlow (GSoC’21 & 20) | CSE @ IIT Roorkee