ML in Android Part 2: Build a Hello World TensorFlow Model

Learn about ways to save a model. Write your first hello world model with TensorFlow and run predictions on user input. Learn how to get information about a TFLite model.

Ayush Raj
The STEM
3 min read · Nov 22, 2021


ML in Android Part-2 (Made on online editor)

In the last article of this series, we talked about various concepts of TensorFlow Lite. In this article, we’ll cover how to save a model, convert it, and then optimise it for smartphones. We’ll use TensorFlow to train a model and then store it as a SavedModel. We’ll optimise this model and use the TFLite converter to generate a TFLite model from it. In a nutshell, we’ll follow the flowchart given below.

Steps to follow (drawn on diagram software)

Ways to save a model & the best way

  • TensorFlow models are built using the Keras APIs or the low-level APIs, and then stored as (i) a Keras model, (ii) a SavedModel, or (iii) a collection of concrete in-memory functions.
  • The converter converts these to the .tflite format, a FlatBuffer that can be used on mobile devices.
  • To convert using Python, we use tf.lite.TFLiteConverter on the developer’s workstation.
  • Depending on how the model is represented, the converter can be created from the SavedModel, the Keras model, or the concrete functions.
  • The best way is to use the SavedModel format.

Converter APIs

  • tf.lite.TFLiteConverter.from_saved_model(): Converts a SavedModel directory.
  • tf.lite.TFLiteConverter.from_keras_model(): Converts an in-memory Keras model.
  • tf.lite.TFLiteConverter.from_concrete_functions(): Converts concrete functions.
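The three entry points look like this in code. This is a minimal sketch: only the Keras-model path is exercised, the other two calls are shown commented out, and the model, path, and function names are illustrative, not from the article.

```python
import tensorflow as tf

# An in-memory Keras model to demonstrate with (untrained, purely illustrative)
model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Dense(1)])

# From a Keras model:
converter = tf.lite.TFLiteConverter.from_keras_model(model)
# From a SavedModel directory (path illustrative):
#   converter = tf.lite.TFLiteConverter.from_saved_model('saved_model')
# From concrete functions (fn illustrative):
#   converter = tf.lite.TFLiteConverter.from_concrete_functions([fn], model)

# convert() returns the TFLite FlatBuffer as bytes
tflite_bytes = converter.convert()
```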

SavedModel Format

  • A universally recoverable, language-neutral way of serialising a TensorFlow model.
  • Using a single abstraction, it simplifies the production, transformation, and consumption of TensorFlow models for higher-level systems.
  • No need to worry about the original code that built the model. The model can easily be shared and used via TensorFlow Lite, TensorFlow.js, TensorFlow Serving, and TensorFlow Hub.
  • When a SavedModel is exported, a MetaGraph is produced. A MetaGraph contains the model variables, the list of labels or classes, and the signatures for prediction.

Hello World of Deep Learning Model

We’ll develop a simple model that figures out the relationship between x and y (y = x + 1) without being explicitly programmed. Because the loss becomes so small, we’ll terminate training after 190 epochs.

Fig: Loss is very low as we reach epoch 190
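A minimal sketch of such a model, assuming a single Dense neuron and a small hand-made dataset (the `xs`/`ys` values here are illustrative, not from the article):

```python
import numpy as np
import tensorflow as tf

# Hypothetical training data for y = x + 1
xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=np.float32)
ys = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0], dtype=np.float32)

# A single Dense neuron (one weight, one bias) can learn a linear relationship
model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Dense(1)])
model.compile(optimizer='sgd', loss='mean_squared_error')
history = model.fit(xs, ys, epochs=190, verbose=0)

print(history.history['loss'][-1])  # loss is very low by epoch 190
```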

The model will then be saved as a SavedModel in a directory called saved_model. The model must now be converted to TensorFlow Lite.
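Exporting might look like this. A fresh stand-in model is built here so the snippet runs on its own; `model.export()` is the Keras SavedModel export (on older TF/Keras versions, `tf.saved_model.save(model, 'saved_model')` is the equivalent classic API):

```python
import os
import tensorflow as tf

# Stand-in for the trained model from the previous step
model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Dense(1)])

# Write the SavedModel into a directory called `saved_model`
model.export('saved_model')

print(os.listdir('saved_model'))  # the directory now holds saved_model.pb and variables/
```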

The TFLite converter is used for this, and the resulting TFLite model is then saved. Writing it out shows that only 928 bytes were written; it’s a tiny model.

Fig: Code for converting to TensorFlow Lite
Fig: Shows 928 bytes written
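The conversion step can be sketched as below. A stand-in export is included so the snippet runs on its own; in the article the `saved_model` directory comes from the training step, and the exact byte count will depend on your TF version:

```python
import tensorflow as tf

# Stand-in export so this snippet is self-contained; normally `saved_model/`
# already exists from the previous step.
model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Dense(1)])
model.export('saved_model')

# Convert the SavedModel to a TFLite FlatBuffer
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model')
tflite_model = converter.convert()

# Write the FlatBuffer to disk and count the bytes written
with open('model.tflite', 'wb') as f:
    num_bytes = f.write(tflite_model)
print(num_bytes, 'bytes written')  # a single-neuron model is tiny
```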

We can get information about a TFLite model by loading it and inspecting its input and output tensors. This tells us the input and output shapes and data types of the model.

Fig: Tflite Model’s Output & Input details
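Inspecting a model with the TFLite interpreter looks roughly like this. A tiny stand-in model is built and converted inline so the snippet is self-contained; normally you would load the file written earlier with `tf.lite.Interpreter(model_path='model.tflite')`:

```python
import numpy as np
import tensorflow as tf

# Stand-in model, converted in memory (purely illustrative)
model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Dense(1)])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

# Each entry describes one tensor: name, shape, dtype, quantization, index
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print('input :', input_details[0]['shape'], input_details[0]['dtype'])
print('output:', output_details[0]['shape'], output_details[0]['dtype'])
```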

The input information tells us that the input data type must be float32, so we convert the user input from float64 (the default) to float32. We then set the input tensor using the input detail’s index, invoke the interpreter, and read the output tensor from the interpreter.

Fig: Sample Input taken from user and Output (nearly input +1)
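Putting it all together, running inference might look like the sketch below. The model is trained and converted inline so the snippet is self-contained, and a fixed `user_value` stands in for a value read with `input()`:

```python
import numpy as np
import tensorflow as tf

# Train the tiny y = x + 1 model inline (data is illustrative)
xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=np.float32)
ys = xs + 1.0
model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Dense(1)])
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(xs, ys, epochs=190, verbose=0)

# Convert and load into the interpreter
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

user_value = 10.0  # stands in for float(input())
input_data = np.array([[user_value]], dtype=np.float32)  # cast float64 -> float32
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]['index'])
print(output)  # approximately user_value + 1
```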

Note that we used only one layer in training (usually there are many layers, which increases accuracy to an extent), so the output is only approximately input + 1.

What next ?

In the next part, we’ll build an Android app on top of the model made in this tutorial. It will truly exemplify ‘build in Python, deploy in Java/Kotlin’.
