TensorFlow: Train a Classifier 🌺

Muna Abdelrazeq
Published in Analytics Vidhya
3 min read · Dec 11, 2019

If you have been following my blogs, you know that my last blog covered the basics and types of machine learning. This blog will take a deeper dive into machine learning and go over a specific example of how to make your own classifier in TensorFlow.

Recap:

Supervised learning depends on the construction and use of classifiers: functions that take data as input and assign a label as output.

  • Step One: Collect training data, a set of known inputs and outputs used to construct the classifier. At the same time, some data should be set aside to test the classifier’s accuracy later.
  • Step Two: Train the classifier and measure its accuracy.
  • Step Three: Use the classifier to predict labels for new data.

We will be following along with the TensorFlow For Poets codelab in order to develop a classifier that can identify flowers.

(Image: the end goal, a test photo of a daisy that the finished classifier should label correctly.)

Step One:

When I initially began coding along, I ran into an error when trying to load my training data:

AttributeError: module 'tensorflow' has no attribute 'app'

To avoid this error, I changed the version of TensorFlow from 1.7 to 1.13, and that did the trick:

pip install --upgrade "tensorflow==1.13.*"
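If you want to double-check that the upgrade took effect before moving on, a quick sanity check (not part of the codelab) is to print the installed version:

python -c "import tensorflow as tf; print(tf.__version__)"

It should report something in the 1.13 series.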

Step Two:

The code for the lab can be accessed through GitHub using the command below.

git clone https://github.com/googlecodelabs/tensorflow-for-poets-2

Step Three:

cd into the cloned repository directory, then download the training data associated with the lab using the curl command below.
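If you cloned into your current working directory, changing into the repository is simply:

cd tensorflow-for-poets-2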

curl http://download.tensorflow.org/example_images/flower_photos.tgz \
| tar xz -C tf_files

This is a collection of flower photos (daisy, dandelion, roses, sunflowers, tulips) that will be used to train and test the classifier.
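If you want to confirm the download worked and see how many photos each category contains, a quick check from the repository root (assuming the curl command above succeeded) is:

# count the images in each flower category
for dir in tf_files/flower_photos/*/; do
  echo "$dir: $(ls "$dir" | wc -l)"
done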

Step Four:

IMAGE_SIZE=224
ARCHITECTURE="mobilenet_0.50_${IMAGE_SIZE}"

Run these two lines in order to set the input image resolution and pick the model architecture: mobilenet_0.50_224 is a MobileNet with a 0.50 width multiplier that expects 224 x 224 images. A higher resolution (and a larger width multiplier) generally gives a more accurate classifier, at the cost of a slower, larger model.
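The codelab lets you trade accuracy for speed here; as far as I can tell, the input resolution can be 128, 160, 192, or 224 pixels and the width multiplier 0.25, 0.50, 0.75, or 1.0. For example, a smaller, faster configuration would look like this:

IMAGE_SIZE=128
ARCHITECTURE="mobilenet_0.25_${IMAGE_SIZE}"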

Step Five:

You can monitor the training progress using the command

tensorboard --logdir tf_files/training_summaries &

To view it, go to localhost:6006 (TensorBoard's default port).
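If an older TensorBoard instance is still running from a previous attempt, you may need to stop it before starting a new one; a general way to do that from the shell (not specific to the codelab) is:

pkill -f "tensorboard"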

(Screenshots: TensorBoard before training and after training.)

Begin training by running

python -m scripts.retrain \
--bottleneck_dir=tf_files/bottlenecks \
--model_dir=tf_files/models/"${ARCHITECTURE}" \
--summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
--output_graph=tf_files/retrained_graph.pb \
--output_labels=tf_files/retrained_labels.txt \
--architecture="${ARCHITECTURE}" \
--image_dir=tf_files/flower_photos

This will run through all 4,000 training iterations. If you would like to run fewer, which will likely reduce accuracy, you can limit the number of iterations by adding

--how_many_training_steps=numberOfSteps \

on the line immediately after --bottleneck_dir=tf_files/bottlenecks \ in the command above (the order of the flags does not actually matter).
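For example, a shorter 500-step run (500 is just an example value) would look like:

python -m scripts.retrain \
--bottleneck_dir=tf_files/bottlenecks \
--how_many_training_steps=500 \
--model_dir=tf_files/models/"${ARCHITECTURE}" \
--summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
--output_graph=tf_files/retrained_graph.pb \
--output_labels=tf_files/retrained_labels.txt \
--architecture="${ARCHITECTURE}" \
--image_dir=tf_files/flower_photos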

Step Six:

Wait…

Step Seven:

Test your classifier!!
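The codelab ships with a label_image script for exactly this; a command along the following lines should work (the image path below is just a placeholder, point it at any photo you want to classify):

# replace example.jpg with an actual file from the dataset, or your own photo
python -m scripts.label_image \
--graph=tf_files/retrained_graph.pb \
--image=tf_files/flower_photos/daisy/example.jpg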

My classifier info for reference:

INFO:tensorflow:Final test accuracy = 90.3% (N=362)

Result of testing the end goal image:

Evaluation time (1-image): 0.318s
daisy (score=0.99227)
dandelion (score=0.00635)
sunflowers (score=0.00135)
roses (score=0.00003)
tulips (score=0.00000)
