CNN with Keras

Kai Wombacher
Deep Learning (Data 2040)
3 min read · May 12, 2018

https://github.com/kwombach/deep-learning

To show an example of CNN classification with Keras, I decided to use the Simpsons Characters Dataset from Kaggle, which contains a directory of images for each character, to classify images as containing either Bart or Homer Simpson. Using ‘Deep Learning with Python’ by François Chollet as a template, I start by creating training, testing, and validation sets of 750, 250, and 250 images per character, respectively.
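The split can be sketched as a small helper. The directory names and the exact slicing below are my assumptions for illustration; the post’s actual code lives in the linked repo:

```python
import os
import shutil

def make_splits(base_dir, work_dir, characters, splits):
    """Copy images from per-character folders into split directories.

    `splits` maps a split name to a (start, end) slice over the sorted
    filenames, e.g. {'train': (0, 750), 'validation': (750, 1000),
    'test': (1000, 1250)} for the counts used in the post.
    """
    for split, (start, end) in splits.items():
        for character in characters:
            src = os.path.join(base_dir, character)
            dst = os.path.join(work_dir, split, character)
            os.makedirs(dst, exist_ok=True)
            for fname in sorted(os.listdir(src))[start:end]:
                shutil.copyfile(os.path.join(src, fname),
                                os.path.join(dst, fname))
```

Keeping the slices non-overlapping ensures no image leaks between the training and evaluation sets.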

I then create my model:

Model Setup

As you can see from the summary above, the model contains 4 layers of convolution and max pooling, a flattening layer, and then two dense layers.
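A summary like that likely comes from a model along these lines, following the cats-vs-dogs architecture in ‘Deep Learning with Python’; the exact filter counts and the 150×150 input size are assumptions on my part:

```python
from tensorflow.keras import layers, models

# Four Conv2D + MaxPooling2D stages, then Flatten and two Dense layers.
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu',
                  input_shape=(150, 150, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(512, activation='relu'),
    # Single sigmoid unit: binary Bart-vs-Homer classification.
    layers.Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['acc'])
```

The sigmoid output pairs with binary cross-entropy because there are only two classes.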

I can now train the model. Depending on your available compute and your data, you might need to adjust the steps_per_epoch and epochs parameters when training. In this example, I will use 100 steps_per_epoch and 10 epochs.

Model Training
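A runnable sketch of a fit call with those parameters is below. In the post the batches come from generators over the Bart/Homer image folders; here I use random stand-in arrays and a tiny model, scaled down so the sketch finishes quickly:

```python
import numpy as np
from tensorflow.keras import layers, models

# Tiny stand-in model; the real one is the four-conv-block network above.
model = models.Sequential([
    layers.Conv2D(8, (3, 3), activation='relu', input_shape=(32, 32, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='rmsprop',
              metrics=['acc'])

# Random stand-in data; the post reads real images from directories.
x = np.random.rand(40, 32, 32, 3).astype('float32')
y = np.random.randint(0, 2, size=(40,))

# The post trains with steps_per_epoch=100 and epochs=10;
# reduced here so the example runs in seconds.
history = model.fit(x, y, batch_size=20, epochs=2, verbose=0)
```

`fit` returns a History object whose `history` dict records loss and accuracy per epoch, which is what gets plotted below.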

As you can see, our training accuracy ended at about 96%. This is strong performance, considering that a few images contained both Bart and Homer. We can now look at the model’s performance on our validation set:

Training and Validation Scores

As we can see from the visualization above, not only do we have a very high training accuracy and a very low training loss, we also get a high validation accuracy and a low validation loss. This suggests that our model is robust and not badly overfit to the training data.
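Curves like these are typically plotted straight from the History object’s `history` dict. The values below are placeholders standing in for the recorded per-epoch numbers, not the post’s actual results:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend for saving to file
import matplotlib.pyplot as plt

# Placeholder per-epoch values; in practice use history.history from fit().
hist = {
    'acc':      [0.71, 0.85, 0.92, 0.96],
    'val_acc':  [0.70, 0.84, 0.90, 0.94],
    'loss':     [0.60, 0.35, 0.20, 0.11],
    'val_loss': [0.62, 0.38, 0.25, 0.16],
}
epochs = range(1, len(hist['acc']) + 1)

plt.plot(epochs, hist['acc'], 'bo', label='Training acc')
plt.plot(epochs, hist['val_acc'], 'b', label='Validation acc')
plt.title('Training and validation accuracy')
plt.legend()
plt.savefig('training_validation_acc.png')

plt.figure()
plt.plot(epochs, hist['loss'], 'bo', label='Training loss')
plt.plot(epochs, hist['val_loss'], 'b', label='Validation loss')
plt.title('Training and validation loss')
plt.legend()
plt.savefig('training_validation_loss.png')
```

Plotting training and validation curves side by side is the quickest way to spot overfitting: the two curves diverge when the model starts memorizing the training set.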

We can also create synthetic data to get even better results: I can transform and rotate the images for each character to obtain more training examples. Below, you can see how I manipulated one image to produce multiple variants.
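In Keras this kind of augmentation is done with ImageDataGenerator. The specific ranges below are assumptions, following the cats-vs-dogs example in ‘Deep Learning with Python’, and the random array stands in for a real image:

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Random rotations, shifts, shears, zooms, and flips applied on the fly.
datagen = ImageDataGenerator(
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest')

# Stand-in for one character image; real code would load a JPEG.
img = np.random.rand(150, 150, 3)

# Each call draws fresh random parameters, so one image yields many variants.
variants = [datagen.random_transform(img) for _ in range(4)]
```

Because the transforms are sampled per batch during training, the model rarely sees the exact same pixels twice.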

I now create a second model that is the same as above — except with a dropout layer — and train it on the augmented data using the same steps_per_epoch and epochs parameters.
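The second model can be sketched as the same stack with a single Dropout layer inserted before the classifier, as in Chollet’s example; the 0.5 rate is an assumption:

```python
from tensorflow.keras import layers, models

model2 = models.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu',
                  input_shape=(150, 150, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    # Dropout randomly zeroes half the activations during training,
    # which discourages the dense layers from co-adapting.
    layers.Dropout(0.5),
    layers.Dense(512, activation='relu'),
    layers.Dense(1, activation='sigmoid'),
])
model2.compile(loss='binary_crossentropy', optimizer='rmsprop',
               metrics=['acc'])
```

Dropout is only active at training time; at inference Keras uses the full network, so validation metrics reflect the undropped model.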

Model 2 Training

As we can see from the results above, our new model with rotated images and a dropout layer has slightly worse training accuracy/loss but slightly better validation accuracy/loss.
