ML with iOS
In this article, we shall see how to train a model and integrate it into an iOS app. If you already have an ML model and want to see how to integrate it into your app, you can scroll down to the Implementation on iOS section.
The objective of this article is to demonstrate how to train an ML model and use it in iOS; you won’t need any machine learning knowledge to follow along.
An ML model is simply a trained (or yet-to-be-trained) model that is expected to perform some intelligent task. In our case, we are training it to identify specific objects.
Training a model
We are using Google’s Teachable Machine to train a model. It is a fantastic tool that allows us to train a model without requiring any knowledge of machine learning. Currently, it enables us to train models to recognize objects in images, a particular sound, or a pose. For our project we are using images to recognize objects.
- Go to the Teachable Machine website here
Now, for our model to recognize particular objects, we provide multiple images of each object. We can use a webcam or upload a set of images. The more images we provide, the more accurate the results will be. Make sure to choose pictures from different positions, angles, and environments.
- Provide pictures and edit the class name with the name of the object
I have added two classes for recognizing two different cars as Car 1 and Car 2.
- Once done click on train model
Once the model is trained, we get a live preview. Our model can now differentiate between the two objects when they are placed in front of the webcam. The only drawback is that it always returns one of the class values, so if none of the class objects (cars in this case) is placed in front of the webcam, it will still show the value of the first class of our model (in this case Car 1).
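One way to soften this drawback in our app is to apply a confidence threshold to the model's output before trusting a label. The following is a minimal sketch (not part of Teachable Machine itself); the function name, the labels, and the threshold value are assumptions for illustration:

```swift
// Sketch: return the best label only if its probability clears a
// threshold; otherwise report "Unknown". Names and values are
// hypothetical, not from the trained model itself.
func bestLabel(probabilities: [Float], labels: [String],
               threshold: Float = 0.8) -> String {
    // Find the index of the highest-probability class.
    guard let maxIndex = probabilities.indices.max(by: { probabilities[$0] < probabilities[$1] }),
          probabilities[maxIndex] >= threshold else {
        return "Unknown"
    }
    return labels[maxIndex]
}

// Example: a confident "Car 2" prediction vs. an uncertain one.
print(bestLabel(probabilities: [0.1, 0.9], labels: ["Car 1", "Car 2"]))   // Car 2
print(bestLabel(probabilities: [0.55, 0.45], labels: ["Car 1", "Car 2"])) // Unknown
```

This way the app can say "nothing recognized" instead of always claiming to see Car 1.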
- Click on export model (next to Preview)
- In the dialog box, select Tensorflow Lite → Floating point and click Download my model
- Extracting the downloaded model gives us a `.tflite` file and a `.txt` label file, which we will use in iOS
Implementation on iOS
There are two ways of integrating our model in our app:
- Using the Tensorflow Lite library
- Using Firebase ML Kit
For our project we will be using Firebase ML Kit because:
- It is easy to set up
- Models can be hosted on Firebase and also bundled with the app
- We can update our model without updating the application
Let’s get started
- Create a new project on Xcode
Integrate Firebase
If you have an existing Firebase project integrated in your app, you can skip this part and go to the next step.
- Go to the Firebase console here and create a new project
- In the Firebase project overview page click on iOS to start the setup
- Provide the bundle identifier to register the app
- Add the `GoogleService-Info.plist` file to your project
Now that we have Firebase integrated in our app let’s
Add Firebase ML Kit
- Create a Podfile, if you don’t already have one, by running `pod init` in your terminal from the project’s root folder
- Add this line to your Podfile:
pod 'Firebase/MLModelInterpreter', '6.25.0'
- Run `pod install` in the terminal to install the dependency
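After these steps, the Podfile should look roughly like this. This is a sketch: the platform version and the target name `MLDemo` are placeholders, not values from the article.

```ruby
# Podfile (sketch) — platform version and target name are placeholders
platform :ios, '11.0'

target 'MLDemo' do
  use_frameworks!
  # Firebase ML Kit custom-model interpreter, pinned as in the article
  pod 'Firebase/MLModelInterpreter', '6.25.0'
end
```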
- Once installed, initialize Firebase in your `AppDelegate` class as shown in the setup instructions
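The initialization step can be sketched as follows, assuming the Firebase pods are installed and the `GoogleService-Info.plist` file has been added to the project:

```swift
// AppDelegate.swift (sketch) — assumes the Firebase pods are installed
// and GoogleService-Info.plist has been added to the project.
import UIKit
import Firebase

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions:
                         [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Configure Firebase once, before any Firebase API is used.
        FirebaseApp.configure()
        return true
    }
}
```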
And with that we have successfully set up Firebase for our app.
Now, in order to use it, add our model (the `.tflite` file) and the label file we created earlier to the project.
For the UI, we will just use a text label placed at the centre of the screen.
- Add an image for one of the objects (in our case Car 1) to the app’s asset catalog
- Add this code to the `ViewController` class
Here we are using a static image from the assets to keep the code minimal and avoid distraction. The recommended approach would be to let the user provide an image using the camera or an image picker.
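Since the embedded code is not reproduced here, the step above can be sketched as follows using the Firebase ML Kit custom-model interpreter (`CustomLocalModel`, `ModelInterpreter`). This is a sketch under assumptions: a 224×224 float-input model with two classes, files named `model.tflite` and `labels.txt`, a `UILabel` outlet named `resultLabel`, and a hypothetical `normalizedData` image-preprocessing helper are all illustrative names, not confirmed by the article.

```swift
// ViewController.swift (sketch) — assumes a 224x224 float-input model
// with two classes; file names and the resultLabel outlet are hypothetical.
import UIKit
import FirebaseMLModelInterpreter

class ViewController: UIViewController {
    @IBOutlet weak var resultLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        classify(image: UIImage(named: "Car 1"))
    }

    func classify(image: UIImage?) {
        guard let image = image,
              let modelPath = Bundle.main.path(forResource: "model",
                                               ofType: "tflite") else { return }

        // Point the interpreter at the bundled .tflite file.
        let localModel = CustomLocalModel(modelPath: modelPath)
        let interpreter = ModelInterpreter.modelInterpreter(localModel: localModel)

        // Describe the tensor shapes the exported model expects.
        let ioOptions = ModelInputOutputOptions()
        do {
            try ioOptions.setInputFormat(index: 0, type: .float32,
                                         dimensions: [1, 224, 224, 3])
            try ioOptions.setOutputFormat(index: 0, type: .float32,
                                          dimensions: [1, 2])
        } catch { return }

        // Convert the image to normalized Float32 RGB bytes.
        // `normalizedData` is a hypothetical helper; its implementation
        // (resize + pixel normalization) is omitted for brevity.
        guard let inputData = image.normalizedData(
                  size: CGSize(width: 224, height: 224)) else { return }

        let inputs = ModelInputs()
        do { try inputs.addInput(inputData) } catch { return }

        interpreter.run(inputs: inputs, options: ioOptions) { [weak self] outputs, error in
            guard error == nil, let outputs = outputs,
                  let probabilities = try? outputs.output(index: 0) as? [[NSNumber]],
                  let scores = probabilities.first else { return }
            // Show the class with the highest probability.
            let labels = ["Car 1", "Car 2"]
            let best = scores.indices.max { scores[$0].floatValue < scores[$1].floatValue }
            self?.resultLabel.text = best.map { labels[$0] }
        }
    }
}
```

The design choice here is to keep all interpretation on-device with a bundled model; the same interpreter can also load a Firebase-hosted `CustomRemoteModel`, which is what enables updating the model without shipping a new app version.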
- Run the app :)
With this we have implemented the local ML integration part. You can find all the code on GitHub here
Lastly, let’s not forget to thank the developers of Teachable Machine and Firebase for their amazing products; they wrote thousands of lines of code that enable us to train and use our models in just a few lines of our own.
Thanks for reading! Feel free to say hi or share your thoughts on Twitter @that_kushal_guy or in the responses below!
You can check out the Android variant of this article here.