Firebase ML Kit Custom Models for iOS developers — Part 1: Understanding TensorFlow Lite

naluinui
Published in Firebase Thailand
5 min read · Jan 4, 2020

First of all, I’d like to say that I am not an AI expert. I have zero knowledge about machine learning! But as a mobile developer, there is a good chance that your organisation will be using ML in its mobile apps in the future. I found myself in this situation, and that is what brings me here to share what I have learnt so far.

Firebase has a product called ML Kit which can take some of the pain out of machine learning on mobile (it supports both iOS and Android). Firebase ML Kit has several built-in models that require only a few lines of code to do image classification, object detection, and OCR, either on-device or in the cloud.

However, the fun starts when you are given a specific custom model created by a data scientist in your organisation. They might also provide you with some input and output examples, and then you need to figure out how to implement it in your mobile app. Luckily you can still use Firebase ML Kit, with a feature they call “Custom Models”. The main benefit of ML Kit Custom Models is that any TensorFlow Lite model can be served to both your Android and iOS apps: you can bundle the model on device but also keep it updated automatically from the server.

Most of the custom model examples I have seen are complicated, such as image classification, and I found them quite difficult to get my head around to start with. In this article and the next, I’m going to walk through the iOS implementation of 2 simple custom models that the data science geeks in the Rainforest Connection team kindly created for me. I think having these models would have helped me understand custom models better when I was getting started.

The “Hello world” of custom models

Step 1: Inspect the model

Before we start using a Firebase ML Kit custom model, I found it is really helpful to understand how the model works first — especially its input and output.

I’m going to start off with the easiest model that I could think of: a model that takes 3 numbers (floats) as its input and outputs their sum. It is certainly not a task that you need ML for, but it will help us see how a TensorFlow Lite model works.
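Conceptually, that is all the model does. Here is a plain-Swift sketch of the function the trained graph computes (my own illustration, not code from the model itself):

```swift
// What sum.tflite computes, written as plain Swift:
// input shape [3] -> output shape [1]
func sum(_ input: [Float32]) -> [Float32] {
    precondition(input.count == 3, "the model expects exactly 3 floats")
    return [input.reduce(0, +)]
}

print(sum([1.5, 2.0, 3.5]))  // [7.0]
```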

Download the sum.tflite model.

If you open it in Netron (a model visualiser) then you will see the inputs and outputs:

  • Input type: float32[3] (a 1D array of length 3)
  • Output type: float32[1] (a 1D array of length 1)

The sum.tflite model visualised in Netron.

At its simplest, a TensorFlow Lite model is a function that takes inputs and produces outputs, where the input and output have fixed dimensions and lengths (known as the shape). Mostly you hear about image classification problems where, say, the input is a greyscale 200 by 200 pixel image (a 2D array of floats with shape [200,200]) and the output is the probability that it is a cat (a 1D array of floats with shape [1]). In the sum model, the input is shape [3] and the output is shape [1] — our job as iOS developers is to pass data into the model and read the results following these shapes.
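To make the shapes concrete: a float32[3] tensor is just 3 contiguous 4-byte floats. The helpers below (my own, for illustration) show how a Swift array maps to and from the raw bytes such a tensor holds:

```swift
import Foundation

// Pack a Swift array of Float32 into the contiguous bytes a
// TensorFlow Lite tensor expects (4 bytes per float32 value).
func data(from floats: [Float32]) -> Data {
    return floats.withUnsafeBufferPointer { Data(buffer: $0) }
}

// Unpack tensor bytes back into a Swift array of Float32.
func floats(from data: Data) -> [Float32] {
    return data.withUnsafeBytes { Array($0.bindMemory(to: Float32.self)) }
}

let input: [Float32] = [1.5, 2.0, 3.5]
let bytes = data(from: input)
print(bytes.count)          // 12 bytes: shape [3] times 4 bytes each
print(floats(from: bytes))  // [1.5, 2.0, 3.5]
```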

Before we implement the Swift code, if you are interested in how the sum model was constructed in TensorFlow, you can explore the Python code in Colab. It consists of the definition of the function and saving it in tflite format, or you can see a short version of the code here:

Step 2: Integrate the model in your iOS project

Next we create an iOS project and include the dependencies for Firebase custom models (for CocoaPods, use pod 'Firebase/MLModelInterpreter'). Then, to keep our view controller clean, we put the interpreter logic in a separate class, so we will work with 2 files:

  • SumViewController has a button that generates 3 random numbers that are used as input to the model, and two labels to show the input and output.
  • SumInterpreter, a class that handles loading the model and performing inference.

1. Make the model available. You can either bundle your model with the app binary, host it in the cloud via the Firebase console, or both. By hosting a model on Firebase, you can update the model without releasing a new app version, and you can use Remote Config and A/B Testing to dynamically serve different models to different sets of users. By bundling your model within your app, you can ensure the ML features still work when the Firebase-hosted model isn’t available (e.g. when offline). For this example, I will only bundle the model on device by placing the sum.tflite model in the Xcode project.

Place a TensorFlow Lite model in the iOS project

2. Prepare an interpreter. First, we need to load the model by specifying where it lives: in the cloud, on device, or both. As our sum model is only served on device, all we need to do is specify the filename of the TensorFlow Lite model in a CustomLocalModel object. Then, create a ModelInterpreter object by simply passing the CustomLocalModel object in.

3. Specify the model’s input and output. As I mentioned earlier, this step is really important — try getting your input/output right! The model’s input and output use one or more multidimensional arrays which contain byte, int, long, or float values. Specify your model’s input and output by using a ModelInputOutputOptions object, which requires you to define the number and dimensions (shape) of the arrays your model uses. In the sum model, the input is shape [3] and the output is shape [1].

4. Perform inference on input data. Finally, to perform inference using the model, get your input data ready and pass it to your model interpreter’s run method along with the input/output options. For this example, I am just going to generate 3 numbers in the view controller, pass them to the run method of the interpreter class, and, once we get the output, pass it back via a completion handler to show in the view controller.
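Putting steps 2–4 together, here is a minimal sketch of what the SumInterpreter class could look like. This assumes the Firebase/MLModelInterpreter pod from above and the bundled sum.tflite; the class name, run method, and completion signature are my own choices, so check the official docs for the current API:

```swift
import Firebase

class SumInterpreter {
    private let interpreter: ModelInterpreter
    private let ioOptions = ModelInputOutputOptions()

    init() throws {
        // Step 2: point a CustomLocalModel at the bundled .tflite file
        // and create an interpreter for it.
        guard let path = Bundle.main.path(forResource: "sum", ofType: "tflite") else {
            fatalError("sum.tflite is missing from the app bundle")
        }
        let localModel = CustomLocalModel(modelPath: path)
        interpreter = ModelInterpreter.modelInterpreter(localModel: localModel)

        // Step 3: declare the shapes: float32 input [3], float32 output [1].
        try ioOptions.setInputFormat(index: 0, type: .float32, dimensions: [3])
        try ioOptions.setOutputFormat(index: 0, type: .float32, dimensions: [1])
    }

    // Step 4: run inference and hand the result back on a completion handler.
    func run(numbers: [Float32], completion: @escaping (Float32?) -> Void) {
        let inputs = ModelInputs()
        let inputData = numbers.withUnsafeBufferPointer { Data(buffer: $0) }
        do {
            try inputs.addInput(inputData)
        } catch {
            completion(nil)
            return
        }
        interpreter.run(inputs: inputs, options: ioOptions) { outputs, error in
            guard error == nil, let outputs = outputs,
                  let result = try? outputs.output(index: 0) as? [[NSNumber]] else {
                completion(nil)
                return
            }
            // Output shape [1] comes back as a nested array with one value.
            completion(result.first?.first?.floatValue)
        }
    }
}
```

SumViewController can then create a SumInterpreter, call run(numbers:completion:) with its 3 random floats, and update its labels in the completion handler.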

This example is pretty easy, right? In Part 2 we are going to look at a more realistic example of playing Tic Tac Toe. Stay tuned!

Firebase ML Kit is still in beta and therefore might change — keep an eye on the official documentation for Custom Models on iOS for the latest examples.

Get the full project code for these examples at github.com/naluinui.
