Addition-based skip connections vs. concatenation-based skip connections

David Kim (Changhun Kim)
2 min read · Mar 28, 2023


I’ve seen that some skip connections are implemented using just +, whereas others use concatenation. Can you explain the difference, and which method is used in architectures like UNet, SegNet, ResNet, etc.?

Skip connections are a technique used in deep learning to connect earlier layers of a network directly to later layers, bypassing intermediate layers. They can be implemented using either addition or concatenation.

Addition-based skip connections add the output of an earlier layer element-wise to the output of a later layer. This requires the two feature maps to have exactly the same shape, since corresponding elements are summed.
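As a minimal sketch (PyTorch is used here purely for illustration, and the tensor shapes are made up):

```python
import torch

# Two feature maps taken from different depths of the network.
# Element-wise addition requires identical shapes: (batch, channels, height, width).
earlier = torch.randn(1, 64, 56, 56)
later = torch.randn(1, 64, 56, 56)

# Addition-based skip connection: element-wise sum, shape is unchanged.
out = later + earlier  # (1, 64, 56, 56)
```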

Concatenation-based skip connections concatenate the output of an earlier layer with the output of a later layer, usually along the channel dimension. This works even when the two layers have different numbers of channels, as long as their spatial dimensions match, because the feature maps are stacked together rather than summed.
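A corresponding sketch of concatenation along the channel dimension (again in PyTorch, with illustrative shapes):

```python
import torch

# Spatial sizes must match, but the channel counts may differ.
earlier = torch.randn(1, 64, 56, 56)
later = torch.randn(1, 128, 56, 56)

# Concatenation-based skip connection: stack along the channel dimension (dim=1).
out = torch.cat([later, earlier], dim=1)  # (1, 192, 56, 56)
```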

In encoder-decoder architectures used for semantic segmentation, such as UNet, skip connections are typically implemented using concatenation. These architectures have a “symmetric” structure in which feature maps are downsampled and then upsampled to produce the final segmentation map, and concatenation-based skip connections help preserve spatial detail during upsampling. (SegNet is a related encoder-decoder model, but instead of concatenating encoder feature maps it passes the max-pooling indices from the encoder to the decoder to guide upsampling.)
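A simplified UNet-style decoder step might look like the following; the module layout and channel arithmetic are illustrative, not the exact configuration from the original UNet paper:

```python
import torch
import torch.nn as nn

class UpBlock(nn.Module):
    """Illustrative UNet-style decoder block: upsample the decoder features,
    concatenate the encoder feature map (skip connection), then convolve."""

    def __init__(self, in_channels, skip_channels, out_channels):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_channels, in_channels // 2,
                                     kernel_size=2, stride=2)
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels // 2 + skip_channels, out_channels,
                      kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )

    def forward(self, x, skip):
        x = self.up(x)                   # upsample the decoder feature map
        x = torch.cat([skip, x], dim=1)  # concatenation-based skip connection
        return self.conv(x)
```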

In contrast, in architectures such as ResNet, skip connections are typically implemented using addition. This is because ResNet uses residual blocks, which involve adding the output of an earlier layer to the output of a later layer, so that the network learns residual functions that make it easier to optimize deeper architectures.
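A simplified ResNet-style residual block, showing the identity shortcut added back to the block output (a sketch, not the exact block definition from the ResNet paper):

```python
import torch.nn as nn

class BasicBlock(nn.Module):
    """Simplified residual block: the input is added back to the block's
    output (identity shortcut), so the two shapes must match."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity   # addition-based skip connection
        return self.relu(out)
```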

However, there is no hard and fast rule about when to use addition-based or concatenation-based skip connections. The choice often depends on the specific architecture and the problem being solved. In general, addition-based skip connections can be useful for improving the optimization of deep architectures, while concatenation-based skip connections can be useful for preserving spatial information and improving the accuracy of segmentation tasks.
