Wide Residual Networks with Interactive Code

Jae Duk Seo
Apr 27, 2018

My finals are finally over, and I wanted to celebrate by implementing a Wide Residual Network. This will be a fairly short post, since I am easing back into the habit of writing blog posts; I hope I don't disappoint you!

Wide Residual Network


Red Box → the increased number of feature maps in the convolutional network

The main reason the authors of the paper call it a wide residual network is the increased feature map size at each layer. As seen above, by feature map size I mean the number of channels produced by each convolution layer (keep in mind that this size can also decrease). In the paper, the widening is controlled by a factor k that multiplies the number of channels in every residual block.
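To make the widening concrete, here is a minimal sketch of a single wide residual block, assuming TensorFlow/Keras. The names wide_basic_block, base_width, and widen_factor are my own shorthand and are not taken from the post's interactive code.

```python
import tensorflow as tf
from tensorflow.keras import layers

def wide_basic_block(x, base_width, widen_factor, stride=1):
    """Pre-activation residual block whose channel count is
    base_width * widen_factor (the 'wide' part of a WRN).
    Illustrative sketch; names are not from the post's code."""
    width = base_width * widen_factor  # e.g. 16 * 10 = 160 feature maps

    out = layers.BatchNormalization()(x)
    out = layers.ReLU()(out)
    out = layers.Conv2D(width, 3, strides=stride, padding="same")(out)

    out = layers.BatchNormalization()(out)
    out = layers.ReLU()(out)
    out = layers.Conv2D(width, 3, strides=1, padding="same")(out)

    # Project the shortcut when the channel count or resolution changes.
    shortcut = x
    if stride != 1 or x.shape[-1] != width:
        shortcut = layers.Conv2D(width, 1, strides=stride, padding="same")(x)

    return layers.Add()([shortcut, out])
```

The only difference from an ordinary residual block is the multiplication by widen_factor, which is exactly the knob the paper studies.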

Network Architecture (Layer / Full / OOP)
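The post's own interactive code is not reproduced here, so what follows is only a hedged sketch of how such blocks could be stacked into a full network, following the WRN-n-k layout from the paper (three groups at base widths 16, 32, 64). It reuses the wide_basic_block sketch above, and the function name build_wrn and its defaults are illustrative rather than the post's exact code.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_wrn(depth=28, widen_factor=10, num_classes=10):
    # WRN depth convention from the paper: depth = 6n + 4, with n blocks per group.
    assert (depth - 4) % 6 == 0, "depth should have the form 6n + 4"
    n = (depth - 4) // 6

    inputs = tf.keras.Input(shape=(32, 32, 3))        # CIFAR-sized images
    x = layers.Conv2D(16, 3, padding="same")(inputs)   # initial convolution

    # Three groups of wide blocks at base widths 16, 32, 64
    # (i.e. 16*k, 32*k, 64*k channels after widening).
    for group_index, base_width in enumerate([16, 32, 64]):
        for block_index in range(n):
            stride = 2 if (group_index > 0 and block_index == 0) else 1
            x = wide_basic_block(x, base_width, widen_factor, stride)

    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs)

# Example: WRN-28-10, one of the configurations reported in the paper.
model = build_wrn(depth=28, widen_factor=10, num_classes=10)
```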
