Recognize Flowers using Transfer Learning

Ajinkya Jawale
4 min read · Jul 8, 2019

Retraining a classifier pre-trained on the ImageNet dataset, using TensorFlow 2.0, to detect flower species (Part 1)

What is Transfer Learning?

Transfer learning is a technique that shortcuts much of the cost of training a model from scratch by taking a piece of a model that has already been trained on a related task and reusing it in a new model.

Part 1: Feature Extraction

Part 2: Fine Tuning and Converting model to tensorflow lite(tflite)

We're using Google Colab here, which gives us RAM and GPU acceleration right in the browser.

Step 1: Installation

This will install TensorFlow 2.0 in your Google Colab! To check the version of TensorFlow:
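At the time of writing, TF 2.0 only shipped as a nightly preview package; the install cell would have looked roughly like this (on a current setup, a plain `pip install tensorflow` already gives you 2.x):

```shell
# TensorFlow 2.0 was only available as a nightly preview in mid-2019;
# on a modern setup, `pip install tensorflow` installs 2.x directly.
pip install tf-nightly-gpu-2.0-preview
```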

> tf.__version__

‘2.0.0-dev20190707’

Step 2: Setup Input Pipeline

Downloading the flower dataset…
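A sketch of the download step, assuming the standard TensorFlow example-images archive (the variable names are my own):

```python
import os
import tensorflow as tf

# Download and extract the flowers archive (~218 MB) into the Keras cache.
zip_file = tf.keras.utils.get_file(
    'flower_photos.tgz',
    'https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz',
    extract=True)

# The extracted folder holds one sub-directory per flower class.
base_dir = os.path.join(os.path.dirname(zip_file), 'flower_photos')
```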

Now, a convolutional neural network requires all input images to have the same dimensions, but the images in the dataset come in various sizes, so we need to resize and rescale them.

  1. Use ImageDataGenerator to rescale the images.
  2. Create the train generator, specifying the training dataset directory, image size, and batch size.
  3. Create the validation generator with the same approach as the train generator, using the flow_from_directory() method.
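The three steps above can be sketched like this; the 20% validation split and 64-image batch size are inferred from the outputs quoted in this article:

```python
import os
import tensorflow as tf

IMAGE_SIZE = 224
BATCH_SIZE = 64

# Fetch the dataset (repeated here so the snippet runs standalone).
zip_file = tf.keras.utils.get_file(
    'flower_photos.tgz',
    'https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz',
    extract=True)
base_dir = os.path.join(os.path.dirname(zip_file), 'flower_photos')

# 1. Rescale pixel values from [0, 255] to [0, 1]; hold out 20% for validation.
datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1. / 255, validation_split=0.2)

# 2. Train generator: reads images straight from the class sub-directories.
train_generator = datagen.flow_from_directory(
    base_dir, target_size=(IMAGE_SIZE, IMAGE_SIZE),
    batch_size=BATCH_SIZE, subset='training')

# 3. Validation generator, built the same way.
val_generator = datagen.flow_from_directory(
    base_dir, target_size=(IMAGE_SIZE, IMAGE_SIZE),
    batch_size=BATCH_SIZE, subset='validation')
```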

Found 2939 images belonging to 5 classes. (train generator)
Found 731 images belonging to 5 classes. (validation generator)

Now pull a batch of images from the generator and check its shape:

Output: ((64, 224, 224, 3), (64, 5)), i.e. 64 images of 224x224x3 and 64 one-hot labels over the 5 classes.

The generator's labels correspond to the five flower classes, taken from the sub-directory names.
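A single batch and the class mapping can be inspected like this (the input-pipeline setup is repeated so the snippet runs standalone):

```python
import os
import tensorflow as tf

zip_file = tf.keras.utils.get_file(
    'flower_photos.tgz',
    'https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz',
    extract=True)
base_dir = os.path.join(os.path.dirname(zip_file), 'flower_photos')

datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1. / 255, validation_split=0.2)
train_generator = datagen.flow_from_directory(
    base_dir, target_size=(224, 224), batch_size=64, subset='training')

# One batch: 64 images of 224x224x3 and 64 one-hot labels over 5 classes.
image_batch, label_batch = next(train_generator)
print(image_batch.shape, label_batch.shape)  # (64, 224, 224, 3) (64, 5)

# The class names come from the sub-directory names, in alphabetical order.
print(train_generator.class_indices)
```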

Step 3: Create the base model from the pre-trained convnets

Create the base model from the MobileNet V2 model developed at Google, pre-trained on the ImageNet dataset, a large dataset of 1.4M web images spanning 1000 classes.

First, pick which intermediate layer of MobileNet V2 will be used for feature extraction. A common practice is to use the output of the very last layer before the flatten operation, the so-called “bottleneck layer”. The reasoning here is that the following fully-connected layers will be too specialized to the task the network was trained on, and thus the features learned by these layers won’t be very useful for a new task. The bottleneck features, however, retain much generality.

Let’s instantiate a MobileNet V2 model pre-loaded with weights trained on ImageNet. By specifying the include_top=False argument, we load a network that doesn't include the classification layers at the top, which is ideal for feature extraction.
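A sketch of loading the base:

```python
import tensorflow as tf

IMG_SHAPE = (224, 224, 3)

# MobileNet V2 without its ImageNet classifier head; what remains is the
# convolutional base ending at the 7x7x1280 bottleneck feature maps.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SHAPE, include_top=False, weights='imagenet')

print(base_model.output_shape)  # (None, 7, 7, 1280)
```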

Step 4: Feature Extraction

You will freeze the convolutional base created in the previous step and use it as a feature extractor: add a classifier on top of it and train only that top-level classifier.

> base_model.trainable = False

Add a classification head

Compile the model

You must compile the model before training it. Since there are five classes and the labels are one-hot encoded, use a categorical cross-entropy loss.
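A sketch of the head and the compile call. The exact head isn't shown here in text form; a small Conv2D + Dropout + pooling + Dense head is assumed because it yields the "Number of trainable variables = 4" reported below (two variables for the Conv2D kernel/bias, two for the Dense layer):

```python
import tensorflow as tf

# Frozen convolutional base (setup repeated so the snippet runs standalone).
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights='imagenet')
base_model.trainable = False

# Classification head: a 5-way softmax over the flower classes.
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation='softmax')])

# Five classes with one-hot labels -> categorical cross-entropy.
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

print('Number of trainable variables = {}'.format(
    len(model.trainable_variables)))  # Number of trainable variables = 4
```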

> model.summary()

Summary of Our model ❤

To get the total number of trainable variables in the model:

> print('Number of trainable variables = {}'.format(len(model.trainable_variables)))

Number of trainable variables = 4

TRAIN THE MODEL

Here is the most important part: training the model.

model.fit()
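With the generators from the input pipeline you would call `model.fit(train_generator, epochs=10, validation_data=val_generator)`. In the sketch below, small random tensors stand in for the flower batches purely so it runs in seconds; everything else mirrors the setup above:

```python
import numpy as np
import tensorflow as tf

# Frozen base + classification head, compiled as in the previous step.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights='imagenet')
base_model.trainable = False
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation='softmax')])
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss='categorical_crossentropy', metrics=['accuracy'])

# Random placeholder tensors stand in for real flower batches here;
# swap in the train/validation generators for real training.
x = np.random.rand(8, 224, 224, 3).astype('float32')
y = tf.keras.utils.to_categorical(np.random.randint(5, size=8), 5)
history = model.fit(x, y, epochs=1, verbose=0)
```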

Learning curves

Let’s take a look at the learning curves of the training and validation accuracy/loss when using the MobileNet V2 base model as a fixed feature extractor.
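A sketch of plotting those curves from the object returned by `model.fit()`. The numbers below are illustrative placeholders so the snippet runs on its own (the article's final validation accuracy was about 0.76); in a real run you would read them from `history.history`:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend, e.g. when saving from a script
import matplotlib.pyplot as plt

# Placeholder values; with a real run, use history.history instead.
hist = {'accuracy':     [0.55, 0.66, 0.72, 0.74],
        'val_accuracy': [0.60, 0.69, 0.74, 0.76],
        'loss':         [1.30, 0.95, 0.78, 0.70],
        'val_loss':     [1.15, 0.88, 0.74, 0.69]}

plt.figure(figsize=(8, 4))
plt.subplot(1, 2, 1)
plt.plot(hist['accuracy'], label='training')
plt.plot(hist['val_accuracy'], label='validation')
plt.title('Accuracy')
plt.legend()
plt.subplot(1, 2, 2)
plt.plot(hist['loss'], label='training')
plt.plot(hist['val_loss'], label='validation')
plt.title('Loss')
plt.legend()
plt.savefig('learning_curves.png')
```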

output of training curves

Up to this point we have trained our model, tracked its loss over each epoch, and checked its accuracy (val_accuracy: 0.7592).

For Part 2, please follow the link below.

Part 2: Fine Tuning and Converting model to tensorflow lite(tflite)

References:

Find more resources related to #ai #machinelearning #deeplearning #python at https://twitter.com/Ajinkya_Tweets

Ajinkya Jawale, https://www.linkedin.com/in/ajinkya-jawale-b3421a12a/

https://angel.co/ajinkya-jawale. Reach me here: ajinkyajawale14499@gmail.com

Github code: https://github.com/ajinkyajawale14/Flower_tflite

Gracias!


#Python #MachineLearning #DeepLearning #Datascience #Algorithms #FullStackDevelopment