Notes on SqueezeNet

The SqueezeNet paper provides a clever architecture as well as a quantitative analysis. At AlexNet-level accuracy, SqueezeNet can be 3 times faster and 500 times smaller.

The following chart shows the advantages of SqueezeNet.

Comparison [1]

The main ideas of SqueezeNet are:

  1. Using 1x1 (point-wise) filters to replace 3x3 filters, as the former needs only 1/9 of the computation (see the back-of-the-envelope sketch after this list).
  2. Using 1x1 filters as a bottleneck layer to reduce the depth (number of channels), which in turn reduces the computation of the following 3x3 filters.
  3. Downsampling late in the network to keep big feature maps.
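To make ideas 1 and 2 concrete, here is a back-of-the-envelope sketch in plain Python. The channel counts are arbitrary example values of mine, not numbers from the paper.

```python
def conv_params(k, c_in, c_out):
    # A k x k convolution layer has k*k weights per (input, output) channel pair.
    return k * k * c_in * c_out

c_in, c_out = 64, 64  # example channel counts, not from the paper

# Idea 1: replacing 3x3 filters with 1x1 filters cuts parameters by 9x.
print(conv_params(3, c_in, c_out))  # 36864
print(conv_params(1, c_in, c_out))  # 4096, i.e. 1/9 of the above

# Idea 2: squeezing the depth with a 1x1 bottleneck before a 3x3 layer.
s = 16  # squeezed depth, example value
direct = conv_params(3, c_in, c_out)                               # 36864
bottlenecked = conv_params(1, c_in, s) + conv_params(3, s, c_out)  # 1024 + 9216
print(direct, bottlenecked)  # 36864 10240
```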

The building block of SqueezeNet is called the fire module, which contains two layers: a squeeze layer and an expand layer. A SqueezeNet stacks a bunch of fire modules and a few pooling layers. The squeeze layer and expand layer keep the same feature map size, while the former reduces the depth to a smaller number and the latter increases it. This squeezing (bottleneck) and expansion behavior is common in neural architectures. Another common pattern is increasing depth while reducing feature map size to get higher-level abstractions.
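As a concrete illustration, here is a minimal PyTorch sketch of a fire module, following the structure described in the paper: a 1x1 squeeze layer feeding an expand layer whose 1x1 and 3x3 outputs are concatenated along the depth. The class and variable names are mine; the channel counts in the usage example match the paper's fire2 module.

```python
import torch
import torch.nn as nn

class Fire(nn.Module):
    def __init__(self, in_channels, squeeze, expand1x1, expand3x3):
        super().__init__()
        # Squeeze layer: 1x1 filters only, reducing depth to `squeeze`.
        self.squeeze = nn.Conv2d(in_channels, squeeze, kernel_size=1)
        # Expand layer: a mix of 1x1 and 3x3 filters on the squeezed input.
        self.expand1x1 = nn.Conv2d(squeeze, expand1x1, kernel_size=1)
        self.expand3x3 = nn.Conv2d(squeeze, expand3x3, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.relu(self.squeeze(x))
        # Both expand branches see the same squeezed input; their outputs
        # are concatenated along the channel dimension, so the feature map
        # size is unchanged while the depth grows again.
        return torch.cat([
            self.relu(self.expand1x1(x)),
            self.relu(self.expand3x3(x)),
        ], dim=1)

# Example: fire2 squeezes 96 input channels down to 16,
# then expands to 64 + 64 = 128 output channels.
x = torch.randn(1, 96, 55, 55)
y = Fire(96, 16, 64, 64)(x)
print(y.shape)  # torch.Size([1, 128, 55, 55])
```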

Fire module [1]
SqueezeNet [1]

As shown in the above chart, the squeeze layer only contains 1x1 filters, which means it works like a fully-connected layer operating on the feature values at each position. In other words, it has no capacity for spatial abstraction. As its name says, one of its benefits is to reduce the depth of the feature map. Reducing the depth means the following 3x3 filters in the expand layer have less computation to do, which boosts speed, since a 3x3 filter needs 9 times the computation of a 1x1 filter. Intuitively, too much squeezing limits information flow, while too few 3x3 filters limits spatial resolution. The following chart provides a quantitative analysis. Surprisingly, the influence is not as big as I expected. Does that mean the stacked layers of SqueezeNet give it abstraction power, or simply redundancy?

Quantitative Analysis [1]
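The paper parameterizes this trade-off with two knobs: the squeeze ratio SR = s1x1 / (e1x1 + e3x3), and pct3x3, the share of 3x3 filters in the expand layer. As a rough sketch of what squeezing costs, here is how a fire module's parameter count scales with SR (plain Python, reusing the fire2-like numbers from the sketch above):

```python
def fire_params(c_in, expand, sr, pct3x3):
    # Parameter count of one fire module under the paper's two knobs:
    # sr = s1x1 / (e1x1 + e3x3), pct3x3 = e3x3 / (e1x1 + e3x3).
    s = int(sr * expand)        # squeeze depth s1x1
    e3 = int(pct3x3 * expand)   # number of 3x3 expand filters
    e1 = expand - e3            # number of 1x1 expand filters
    return c_in * s + s * e1 + 9 * s * e3

# A fire2-like module: 96 input channels, 128 expand filters, half of them 3x3.
for sr in (0.125, 0.25, 0.5, 0.75):
    print(sr, fire_params(96, 128, sr, 0.5))
# 0.125 -> 11776, 0.25 -> 23552, 0.5 -> 47104, 0.75 -> 70656
```

Parameters grow roughly linearly with SR, yet in the paper's experiments accuracy plateaus well before SR reaches 1.0, which is what makes the small influence noted above so striking.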

References

  1. Iandola et al., "SqueezeNet: AlexNet-Level Accuracy with 50x Fewer Parameters and <0.5MB Model Size". https://arxiv.org/pdf/1602.07360.pdf