iOS — Use the TensorFlow Lite model in the SwiftUI Application

Jakir Hossain
4 min read · Oct 22, 2023

TensorFlow Lite provides an interface for deploying machine learning models to mobile devices, microcontrollers, and other edge devices. In this article, we will learn how to deploy a simple TensorFlow model in iOS using Swift. This article contains two main parts: generating the TFLite model and deploying the model in an iOS project.

Generate a TFLite Model from a TensorFlow Model

To deploy the TensorFlow model in iOS, we have to convert it into a TFLite model. You can convert a TF model into a TFLite model with this simple code:

# convert TF model into TFLite model
converter = tf.lite.TFLiteConverter.from_keras_model(model=model)

tfmodel = converter.convert()

# Save the TFLite model to a .tflite file
open("model.tflite", "wb").write(tfmodel)

Here is a simple machine-learning model you can use for this project.

import tensorflow as tf
import numpy as np
import keras
from keras import layers

# training data
celsius = np.array([-40, -10, 0, 8, 15, 22, 38], dtype=float)
fahrenheit = np.array([-40, 14, 32, 46.4, 59, 71.6, 100.4], dtype=float)


# model creation
model = keras.Sequential([
    layers.Dense(units=1, input_shape=[1])
])

# model compilation
model.compile(loss='mean_squared_error',
              optimizer=tf.keras.optimizers.Adam(0.1))

# model training
model.fit(celsius, fahrenheit, epochs=500, verbose=0)
print("Finished training the model")

# predict (recent Keras versions expect an array, not a plain list)
print(model.predict(np.array([100.0])))

# convert TF model into TFLite model
converter = tf.lite.TFLiteConverter.from_keras_model(model=model)

tfmodel = converter.convert()

# Save the TFLite model to a .tflite file
open("model.tflite", "wb").write(tfmodel)

Colab link of this project. Here, we have trained our model on simple sequential data, using Celsius values as input and Fahrenheit values as output. After training, if we provide a Celsius value, the model will predict the corresponding Fahrenheit value; for 100, it predicts 211.56012. We then convert the TF model into a TFLite model and save it as a model.tflite file. We will use this model.tflite file in our iOS project.
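Before trusting the on-device output, it helps to know what the model should produce. The training pairs follow the exact linear relation F = 1.8·C + 32, so we can check the data and the expected answer for 100 °C directly (a quick sanity check, not part of the training script):

```python
# The training pairs follow the exact linear relation F = 1.8 * C + 32,
# which a single Dense(1) unit can represent exactly.
celsius = [-40, -10, 0, 8, 15, 22, 38]
fahrenheit = [-40, 14, 32, 46.4, 59, 71.6, 100.4]

for c, f in zip(celsius, fahrenheit):
    assert abs((1.8 * c + 32) - f) < 1e-9

# The exact answer for 100 degrees Celsius is 212; the trained model's
# 211.56 is close, since gradient descent only approximates the line.
print(1.8 * 100 + 32)  # 212.0
```

If the iOS app later returns a value far from 212 for an input of 100, the problem is in the byte handling, not the model.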

Deploy the TFLite model in iOS

To load a TensorFlow Lite (TFLite) model in a Swift application, we have to install and import the TensorFlow Lite Swift library in our project. We can do that using CocoaPods. If you have not installed CocoaPods yet, you can install it by running this command in Terminal:

$ sudo gem install cocoapods

Now create an iOS project in Xcode. Navigate to the project directory in Terminal and type:

$ pod init  

It will generate a Podfile for us. Open Podfile in any text editor and include:

  pod 'TensorFlowLiteSwift'

Here is the full Podfile for this simple app:

# Uncomment the next line to define a global platform for your project
platform :ios, '17.0'

target 'TFLiteExample' do
  # Comment the next line if you don't want to use dynamic frameworks
  use_frameworks!

  # Pods for TFLiteExample
  pod 'TensorFlowLiteSwift'
end

Now in the terminal, enter:

$ pod install

It will install TensorFlowLiteSwift and create an Xcode workspace for us, such as AppName.xcworkspace. Open this .xcworkspace in Xcode. Now we are ready to import TensorFlowLite into our project and use any TensorFlow Lite model. Here is the official guide on how to load a model and run inference with TFLite.

Copy the generated TFLite model into the project bundle. Here is the code we can use to predict our simple Celsius to Fahrenheit model:

import SwiftUI
import TensorFlowLite

struct ContentView: View {

    @State var celsius = ""
    @State var fahrenheit: Float = 0.0

    var body: some View {
        VStack {
            TextField("Enter celsius value", text: $celsius)
            Button("Infer") {
                predictFahrenheit()
            }
            Text("\(fahrenheit) Fahrenheit")
        }
        .padding()
    }

    func predictFahrenheit() {
        // Load the TFLite model from the app bundle
        guard let modelPath = Bundle.main.path(forResource: "model", ofType: "tflite") else {
            fatalError("Model not found")
        }

        do {
            // Initialize an interpreter with the model.
            let interpreter = try Interpreter(modelPath: modelPath)

            // Allocate memory for the model's input `Tensor`s.
            try interpreter.allocateTensors()

            // Pack the input Float into its raw bytes and copy it to input 0.
            let inputData = withUnsafeBytes(of: Float(celsius) ?? 0.0) { Data($0) }
            try interpreter.copy(inputData, toInputAt: 0)

            // Run inference by invoking the `Interpreter`.
            try interpreter.invoke()

            // Read the output `Tensor` and decode its bytes as a Float.
            let outputTensor = try interpreter.output(at: 0)
            fahrenheit = outputTensor.data.withUnsafeBytes { $0.load(as: Float.self) }

        } catch let error {
            print("Error: \(error)")
        }
    }
}

#Preview {
    ContentView()
}

Keep in mind that the interpreter exchanges raw bytes. Our model takes a Float as input, so we have to convert the input value into bytes, and the model also returns its output as bytes, so we convert that data back into a Float before using it. That’s it. You can get the project from GitHub; I have included both the TFLite model and the notebook of the project there. If you wish to learn more, you can follow the official quick start guides. Good wishes ❤
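As an aside, the byte-level round trip the Swift code performs can be reproduced with Python's standard struct module (this snippet is only an illustration, not part of the app):

```python
import struct

# Swift's Float is a 32-bit IEEE-754 value; withUnsafeBytes(of:) hands the
# interpreter its 4 raw little-endian bytes. The same packing in Python:
celsius = 100.0
input_bytes = struct.pack("<f", celsius)   # Float -> 4 raw bytes
assert len(input_bytes) == 4

# Decoding the output tensor's data reverses the process, just as the
# Swift code does with `load(as: Float.self)`.
decoded = struct.unpack("<f", input_bytes)[0]
assert decoded == 100.0
```

If the decoded output ever looks like garbage, the usual culprit is reading the bytes as the wrong type or size, not the model itself.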
