Understand AlexNet in just 3 minutes with hands-on code using TensorFlow
In the 2012 ImageNet LSVRC-2012 competition, the AlexNet model beat all other competitors by an enormous margin (a 15.3% top-5 error rate vs. 26.2% for second place). What is the magic sauce behind AlexNet? While the academic paper can be found online, this article aims to help you demystify the magic behind AlexNet with practical code implemented in TensorFlow.
Model Architecture
AlexNet has a similar structure to LeNet, but it is deeper and has more filters per layer. There are 5 convolutional layers and 3 fully connected layers, with ReLU applied after each of them, and dropout applied in the first two fully connected layers.
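The layer stack above can be summarized as a quick sketch (layer sizes follow the reference implementation this article uses, which takes a 227x227x3 input; the two-group layers are a relic of the original two-GPU training setup):

```python
# Hedged summary of the AlexNet architecture, assuming a 227x227x3 input
# as in the bvlc_alexnet reference weights.
alexnet_layers = [
    ("conv1", "11x11, 96 filters, stride 4 + ReLU + LRN + 3x3 max-pool /2"),
    ("conv2", "5x5, 256 filters, 2 groups + ReLU + LRN + 3x3 max-pool /2"),
    ("conv3", "3x3, 384 filters + ReLU"),
    ("conv4", "3x3, 384 filters, 2 groups + ReLU"),
    ("conv5", "3x3, 256 filters, 2 groups + ReLU + 3x3 max-pool /2"),
    ("fc6",   "9216 -> 4096 + ReLU + dropout"),
    ("fc7",   "4096 -> 4096 + ReLU + dropout"),
    ("fc8",   "4096 -> 1000 class scores"),
]

for name, desc in alexnet_layers:
    print(f"{name}: {desc}")
```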
Model Definition
Here we write the definitions for convolution, pooling, LRN, dropout, and the fully connected layer as Python functions with TensorFlow:
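A minimal sketch of what those helper functions might look like (the function names and signatures here are my own; the grouped convolution mirrors how the reference implementation handles AlexNet's two-GPU split, using only standard `tf.nn` ops):

```python
import tensorflow as tf

def conv(x, weights, biases, stride, padding="SAME", groups=1):
    """Grouped 2-D convolution + bias + ReLU. AlexNet splits some layers
    into 2 groups, a relic of its original two-GPU training setup."""
    if groups == 1:
        out = tf.nn.conv2d(x, weights,
                           strides=[1, stride, stride, 1], padding=padding)
    else:
        # Split input channels and output filters, convolve each group
        # independently, then concatenate along the channel axis.
        x_parts = tf.split(x, groups, axis=3)
        w_parts = tf.split(weights, groups, axis=3)
        out = tf.concat(
            [tf.nn.conv2d(xp, wp, strides=[1, stride, stride, 1],
                          padding=padding)
             for xp, wp in zip(x_parts, w_parts)], axis=3)
    return tf.nn.relu(tf.nn.bias_add(out, biases))

def max_pool(x, ksize, stride, padding="VALID"):
    return tf.nn.max_pool2d(x, ksize=[1, ksize, ksize, 1],
                            strides=[1, stride, stride, 1], padding=padding)

def lrn(x, radius=2, alpha=2e-5, beta=0.75, bias=1.0):
    # Local response normalization, with the hyperparameters from the paper.
    return tf.nn.local_response_normalization(
        x, depth_radius=radius, alpha=alpha, beta=beta, bias=bias)

def dropout(x, keep_prob):
    # tf.nn.dropout in TF2 takes a drop *rate*, not a keep probability.
    return tf.nn.dropout(x, rate=1.0 - keep_prob)

def fc(x, weights, biases, relu=True):
    out = tf.matmul(x, weights) + biases
    return tf.nn.relu(out) if relu else out
```

Chaining these helpers in the order listed in the architecture summary reproduces the forward pass once the pretrained weights are loaded into `weights`/`biases`.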
Model Testing
Finally, let’s try the model with some implementation code written with OpenCV.
Here are the results
(Images 1–4: sample test images with the model’s predictions; figures omitted.)
Perfect! All the code for this article can be found at:
https://github.com/ykpengba/AlexNet-A-Practical-Implementation
If you would like to test the code yourself, please make sure you download the bvlc_alexnet.npy model from http://www.cs.toronto.edu/~guerzhoy/tf_alexnet/ and place it in your working directory.
Feel free to connect with me on LinkedIn at https://www.linkedin.com/in/yukpeng/. Follow me here on Medium for more practical deep learning tutorials in the future.