Recognize Flowers using Transfer Learning

Ajinkya Jawale
3 min read · Jul 8, 2019

Retraining a classifier pre-trained on the ImageNet dataset using TensorFlow 2.0 to detect flower species (Part 2): fine-tuning and converting the model to TFLite

Please go through Part 1 for Feature Extraction and Model Training

Fine tuning

In the feature extraction experiment (Part 1), you only trained a few layers on top of a MobileNet V2 base model. The weights of the pre-trained network were not updated during training.

One way to increase performance even further is to train (or “fine-tune”) the weights of the top layers of the pre-trained model alongside the training of the classifier you added. The training process will force the weights to be tuned from generic feature maps to features associated specifically with our dataset.

Un-freeze the top layers of the model

All you need to do is unfreeze the base_model and set the bottom layers to be non-trainable. Then recompile the model (necessary for these changes to take effect) and resume training.

base_model.trainable = True

output: Number of layers in the base model: 155
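The layer count above comes from inspecting the unfrozen base model. Here is a minimal sketch of counting the layers and then re-freezing everything below a cut-off, so that only the top layers are fine-tuned (the value fine_tune_at = 100 is an illustrative assumption, not from the original post):

# Inspect how many layers the base model has
print("Number of layers in the base model:", len(base_model.layers))

# Freeze all layers below the cut-off so only the top layers are fine-tuned
fine_tune_at = 100  # illustrative cut-off, an assumption; tune per model
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False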

Compile the model

Compile the model using a much lower learning rate.
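For example, something like the following — a sketch that assumes the categorical cross-entropy loss and accuracy metric from Part 1 (the exact optimizer and learning-rate value here are assumptions):

# Recompile so the unfreezing takes effect; a much lower learning rate
# avoids destroying the pre-trained weights
model.compile(loss='categorical_crossentropy',
              optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              metrics=['accuracy'])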

model.summary()

Model summary output

Continue training the model

history_fine = model.fit(train_generator,
                         epochs=5,
                         validation_data=val_generator)

Training output of model.fit (per-epoch accuracy and loss)

Wow! Now you can see that the model’s accuracy has increased to val_accuracy: 0.8112 ❤

Convert to TFLite

Save the model using tf.saved_model.save and then convert the saved model to a TFLite-compatible format.
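A minimal sketch of both steps (the directory and file names are placeholder assumptions):

import tensorflow as tf

# Export the fine-tuned Keras model as a SavedModel
saved_model_dir = 'save/fine_tuning'  # placeholder path
tf.saved_model.save(model, saved_model_dir)

# Convert the SavedModel to the TFLite format
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)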

Let’s take a look at the learning curves of the training and validation accuracy/loss when fine-tuning the last few layers of the MobileNet V2 base model and training the classifier on top of it. The validation loss is much higher than the training loss, so you may get some overfitting; this is unsurprising, as the new training set is relatively small and similar to the data MobileNet V2 was originally trained on.

Plot: training and validation accuracy/loss curves
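A sketch of how these curves can be plotted from history_fine, assuming the model was compiled with the 'accuracy' metric (so Keras records accuracy/val_accuracy and loss/val_loss):

import matplotlib.pyplot as plt

acc = history_fine.history['accuracy']
val_acc = history_fine.history['val_accuracy']
loss = history_fine.history['loss']
val_loss = history_fine.history['val_loss']

plt.figure(figsize=(8, 8))

# Top panel: accuracy curves
plt.subplot(2, 1, 1)
plt.plot(acc, label='Training Accuracy')
plt.plot(val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')

# Bottom panel: loss curves
plt.subplot(2, 1, 2)
plt.plot(loss, label='Training Loss')
plt.plot(val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')

plt.show()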

Summary:

  • Using a pre-trained model for feature extraction: When working with a small dataset, it is common to take advantage of features learned by a model trained on a larger dataset in the same domain. This is done by instantiating the pre-trained model and adding a fully-connected classifier on top. The pre-trained model is “frozen” and only the weights of the classifier get updated during training. In this case, the convolutional base extracted all the features associated with each image and you just trained a classifier that determines the image class given that set of extracted features.
  • Fine-tuning a pre-trained model: To further improve performance, one might want to repurpose the top-level layers of the pre-trained model for the new dataset via fine-tuning. In this case, you tuned your weights such that the model learned high-level features specific to the dataset. This technique is usually recommended when the training dataset is large and very similar to the original dataset that the pre-trained model was trained on.

So, what then? The TFLite model is, so to speak, a compressed version of the original trained model (the file size was just 9.1 MB!). These TFLite models can be integrated with Android, iOS, or various IoT devices such as the Raspberry Pi; web application integration can be done using TensorFlow.js.

Integration: https://www.tensorflow.org/lite/guide/get_started#3_use_the_tensorflow_lite_model_for_inference_in_a_mobile_app
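Before wiring the model into an app, you can sanity-check the converted file from Python with the TFLite interpreter — a minimal sketch, assuming the converted file was saved as model.tflite as above:

import numpy as np
import tensorflow as tf

# Load the converted model and allocate its tensors
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run inference on one random image-shaped tensor (replace with a real image)
input_shape = input_details[0]['shape']  # e.g. [1, 224, 224, 3]
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()

predictions = interpreter.get_tensor(output_details[0]['index'])
print(predictions)  # class scores for the flower species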

References:

Find more knowledgeable resources related to #ai #machinelearning #deeplearning #python here: https://twitter.com/Ajinkya_Tweets

Ajinkya Jawale, https://www.linkedin.com/in/ajinkya-jawale-b3421a12a/

https://angel.co/ajinkya-jawale

Reach me here: ajinkyajawale14499@gmail.com

GitHub code: https://github.com/ajinkyajawale14/Flower_tflite

Thanks!


#Python #MachineLearning #DeepLearning #Datascience #Algorithms #FullStackDevelopment