WEEK 6: TRAINING ON DENSENET121, SQUEEZENET, AND CONCLUSION

FATİH ERBAĞ · bbm406f19 · Jan 5, 2020 · 3 min read

Last week we trained AlexNet, VGG16, and ResNet-50. If you have not read last week's post or would like to review it, you can reach it here.

This week we will be training SqueezeNet and DenseNet121.

DenseNet121:

We trained DenseNet121 for 20 epochs with a batch size of 32 and a learning rate of 0.0001. Our training accuracy was 98%, the validation accuracy 85%, and the test accuracy 85%. Below you will find our loss graphs. You can also see how we modified the pretrained model below: we added an FC layer, dropout, ReLU, and softmax on top of it. According to this graph, we can say that the model learned properly. If you have overfitting problems, you can play with the dropout value.

[Figure: training and validation loss for DenseNet121]
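For readers who want to reproduce this setup, here is a minimal PyTorch sketch of the head swap described above. It assumes torchvision is available; num_classes, the hidden width of 512, the dropout rate of 0.5, and the Adam optimizer are illustrative placeholders rather than our exact choices (the post only fixes the epochs, batch size, and learning rate):

```python
import torch.nn as nn
import torch.optim as optim
from torchvision import models

num_classes = 10  # placeholder: set to your dataset's class count

# Load DenseNet121 pretrained on ImageNet
model = models.densenet121(pretrained=True)

# Replace the original classifier with the layers mentioned above:
# FC layer, ReLU, dropout, and a (log-)softmax output
model.classifier = nn.Sequential(
    nn.Linear(model.classifier.in_features, 512),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # raise this if you observe overfitting
    nn.Linear(512, num_classes),
    nn.LogSoftmax(dim=1),  # pair with nn.NLLLoss during training
)

# Hyperparameters from the post: lr 0.0001 (20 epochs, batch size 32);
# the choice of Adam itself is an assumption
optimizer = optim.Adam(model.parameters(), lr=0.0001)
criterion = nn.NLLLoss()
```

Note that LogSoftmax combined with NLLLoss is equivalent to using CrossEntropyLoss on raw logits; either formulation works here.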

SqueezeNet:

We trained SqueezeNet for 20 epochs with a batch size of 32 and a learning rate of 0.0001. Our training accuracy was 90%, the validation accuracy 87%, and the test accuracy 84%. Below you will find our loss graphs. You can also see how we modified the pretrained model below: here we added a convolution layer. According to this graph, we can say that the model learned properly for SqueezeNet as well.

[Figure: training and validation loss for SqueezeNet]
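Again, as a rough sketch: SqueezeNet's torchvision classifier ends in a 1×1 convolution rather than a fully connected layer, so the usual way to adapt it is to swap that convolution for one with your class count, which matches the convolution layer mentioned above. The squeezenet1_0 variant and num_classes below are assumptions; the post does not state which variant was used:

```python
import torch.nn as nn
from torchvision import models

num_classes = 10  # placeholder: set to your dataset's class count

# The variant is an assumption; squeezenet1_1 works the same way
model = models.squeezenet1_0(pretrained=True)

# SqueezeNet classifies with a 1x1 convolution instead of an FC layer;
# replace it so the output matches our number of classes
model.classifier[1] = nn.Conv2d(512, num_classes, kernel_size=1)
model.num_classes = num_classes
```

The same 20-epoch, batch size 32, learning rate 0.0001 settings from above apply to this model as well.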

Conclusion:

So far, we have trained five different models and looked at their results together. These models were ResNet-50, DenseNet121, VGG16, AlexNet, and SqueezeNet. Among these models, we obtained our highest accuracy with ResNet-50. The values obtained from these models are based on the hyperparameters we mentioned last week. If you think you can get better results, you can play with these parameters and try to improve your scores.

WRITERS:

Furkan GÜREL, Seda ORAN, Mahmut Fatih ERBAĞ
