iOS — Use TensorFlow Lite model in SwiftUI Application

Jakir Hossain
4 min read · Oct 22, 2023

TensorFlow Lite provides an interface to deploy machine learning models to mobile devices, microcontrollers, and other edge devices. In this article, we will learn how to deploy a simple TensorFlow model in iOS using Swift. The article contains two main parts: generating a TFLite model and deploying that model in an iOS project.

Generate a TFLite Model from a TensorFlow Model

To deploy a TensorFlow model in iOS, we have to convert it into a TFLite model. You can convert a TF model into a TFLite model using this simple code:

# convert TF model into TFLite model
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tfmodel = converter.convert()

# save the TFLite model into a .tflite file
open("model.tflite", "wb").write(tfmodel)

Here is a simple machine learning model you can use for this project.

import tensorflow as tf
import numpy as np
import keras
from keras import layers

# training data: Celsius inputs, Fahrenheit targets
celsius = np.array([-40, -10, 0, 8, 15, 22, 38], dtype=float)
fahrenheit = np.array([-40, 14, 32, 46.4, 59, 71.6, 100.4], dtype=float)

# model creation: a single dense unit learns f = w * c + b
model = keras.Sequential([
    layers.Dense(units=1, input_shape=[1])
])

# model compilation
model.compile(loss='mean_squared_error',
              optimizer=tf.keras.optimizers.Adam(0.1))

# model training
model.fit(celsius, fahrenheit, epochs=500, verbose=False)
print("Finished training the model")

# predict: 100 °C should come out close to 212 °F
print(model.predict(np.array([100.0])))

# convert TF model into TFLite model
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tfmodel = converter.convert()

# save the TFLite model into a .tflite file
open("model.tflite", "wb").write(tfmodel)

Colab link of this project. Here, we have trained a simple Sequential model, using Celsius values as input and Fahrenheit values as output. After training, if we provide a Celsius value, the model will predict the Fahrenheit value for us; for 100, it predicts about 211.56. We then convert the TF model into a TFLite model and save it as a model.tflite file. We will use this model.tflite file in our iOS project.
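As a quick sanity check (not part of the original notebook), the exact conversion formula is f = c × 1.8 + 32, so the model's prediction for 100 should land close to the true value of 212:

```python
# exact Celsius-to-Fahrenheit conversion, for comparison with the model
def c_to_f(celsius: float) -> float:
    return celsius * 1.8 + 32

print(c_to_f(100.0))  # 212.0 -- the trained model predicts ~211.56
print(c_to_f(-40.0))  # -40.0 -- the one point where both scales agree
```

Since the model is a single dense unit, its learned weight and bias should converge near 1.8 and 32.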

Deploy the TFLite Model in iOS

To load a TensorFlow Lite (TFLite) model in a Swift application, we have to install and import the TensorFlow Lite Swift library in our project. We can do that using CocoaPods. If you have not installed CocoaPods yet, you can install it by running this command in Terminal:

$ sudo gem install cocoapods

Now create an iOS project in Xcode. Navigate to the project directory in Terminal and type:

$ pod init  

It will generate a Podfile for us. Open the Podfile in any text editor and add:

  pod 'TensorFlowLiteSwift'

Here is the full Podfile for this simple app:

# Uncomment the next line to define a global platform for your project
platform :ios, '17.0'

target 'TFLiteExample' do
  # Comment the next line if you don't want to use dynamic frameworks
  use_frameworks!

  # Pods for TFLiteExample
  pod 'TensorFlowLiteSwift'
end

Now, in Terminal, enter:

$ pod install

It will install TensorFlowLiteSwift for us and create a new Xcode workspace named AppName.xcworkspace. Open this .xcworkspace in Xcode. Now we are ready to import TensorFlowLite into our project and use any TensorFlow Lite model. Here is the official guide on how to load a model and run inference with TFLite.

Copy the generated TFLite model into the project bundle. Here is the code we can use to run predictions with our simple Celsius-to-Fahrenheit model:

import SwiftUI
import TensorFlowLite
import Foundation

struct ContentView: View {

    @State var celsius = ""
    @State var fahrenheit: Float = 0.0

    var body: some View {
        VStack {
            TextField("Enter Celsius value", text: $celsius)
            Button("Infer") {
                predictFahrenheit()
            }
            Text("\(fahrenheit) Fahrenheit")
        }
        .padding()
    }

    func predictFahrenheit() {
        // Load the TFLite model from the app bundle
        guard let modelPath = Bundle.main.path(forResource: "model", ofType: "tflite") else {
            fatalError("Model not found")
        }

        do {
            // Initialize an interpreter with the model.
            let interpreter = try Interpreter(modelPath: modelPath)

            // Allocate memory for the model's input `Tensor`s.
            try interpreter.allocateTensors()

            // Convert the input value into raw Float32 bytes
            let inputData = withUnsafeBytes(of: Float(celsius) ?? 0.0) { Data($0) }
            try interpreter.copy(inputData, toInputAt: 0)

            // Run inference by invoking the `Interpreter`.
            try interpreter.invoke()

            // Get the output `Tensor`
            let outputTensor = try interpreter.output(at: 0)
            let outputData = outputTensor.data

            // Convert the raw output bytes back into a Float
            fahrenheit = outputData.withUnsafeBytes { $0.load(as: Float.self) }

        } catch {
            print("Error: \(error)")
        }
    }
}

#Preview {
    ContentView()
}

Keep in mind that the model takes its input as raw bytes. Our model expects a Float as input, so we have to convert the input value into bytes. The model also returns its output as raw bytes, so we convert that data back into a Float before using the output value. That’s it. You can get the project from GitHub; I have included both the TFLite model and the notebook of the project there. If you wish to learn more, you can follow the official quick start guides. Good wishes ❤
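The byte round-trip the Swift code performs can be sketched in Python with the standard struct module. This is an illustration of the same idea, assuming a little-endian Float32, which is the layout on iOS devices:

```python
import struct

# pack a Float32 into 4 raw bytes, as the Swift `withUnsafeBytes` call does
celsius = 100.0
input_bytes = struct.pack("<f", celsius)
print(len(input_bytes))  # 4

# unpack 4 raw output bytes back into a Float,
# as the Swift `$0.load(as: Float.self)` does
(value,) = struct.unpack("<f", input_bytes)
print(value)  # 100.0
```

A Float32 always occupies exactly 4 bytes, which is why the model's single-float input and output tensors are each 4 bytes long.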
