The #paperoftheweek 3 was: Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization

This week’s paper selection was inspired by the recent StyleGAN video: http://stylegan.xyz/video. To look deeper into its architecture, we reviewed the adaptive instance normalization (AdaIN) paper, since AdaIN is one of the key techniques adopted in StyleGAN.

Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization makes three main contributions. First, the authors provide a theoretical analysis of the effects of batch normalization, instance normalization, and conditional instance normalization on style transfer networks. They argue that by normalizing the channel-wise mean and variance of the image features, they effectively normalize the style of the image. Building on this insight, the authors propose AdaIN, which aligns the channel-wise statistics of the two input images (content and style), producing high-quality style transfer results. Second, the proposed style transfer runs in real time, allowing the processing of large collections of images and videos. Last but not least, unlike earlier fast style transfer methods, which are tied to a fixed set of styles, this approach works with an unlimited number of styles.
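The statistics alignment at the heart of the paper can be sketched in a few lines of NumPy. This is a minimal illustration on a raw (C, H, W) feature map; in the actual method the operation is applied to VGG encoder features and the result is passed through a trained decoder, and `adain` here is just our own helper name:

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive instance normalization (sketch).

    Aligns the channel-wise mean and standard deviation of the
    content feature map to those of the style feature map:
        AdaIN(x, y) = sigma(y) * (x - mu(x)) / sigma(x) + mu(y)
    Both inputs are arrays of shape (C, H, W); statistics are
    computed per channel over the spatial dimensions.
    """
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps  # eps avoids div by zero
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mean) / c_std + s_mean
```

After this transform, each channel of the output has (up to the epsilon) the same mean and variance as the corresponding channel of the style features, which is exactly the "style normalization" the authors argue for.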
 
The animation below shows the pre-trained AdaIN style transfer applied to our office plants:

The paper itself is from 2017, so repositories with code and pre-trained weights are readily available online.

Original authors’ code (Torch): https://github.com/xunhuang1995/AdaIN-style

Unofficial PyTorch code ❤: https://github.com/naoto0804/pytorch-AdaIN

Unofficial TensorFlow code: https://github.com/elleryqueenhomels/arbitrary_style_transfer

Abstract:

“Gatys et al. recently introduced a neural algorithm that renders a content image in the style of another image, achieving so-called style transfer. However, their framework requires a slow iterative optimization process, which limits its practical application. Fast approximations with feed-forward neural networks have been proposed to speed up neural style transfer. Unfortunately, the speed improvement comes at a cost: the network is usually tied to a fixed set of styles and cannot adapt to arbitrary new styles. In this paper, we present a simple yet effective approach that for the first time enables arbitrary style transfer in real-time. At the heart of our method is a novel adaptive instance normalization (AdaIN) layer that aligns the mean and variance of the content features with those of the style features. Our method achieves speed comparable to the fastest existing approach, without the restriction to a pre-defined set of styles. In addition, our approach allows flexible user controls such as content-style trade-off, style interpolation, color & spatial controls, all using a single feed-forward neural network.”
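The content–style trade-off mentioned in the abstract amounts to interpolating between the content features and their AdaIN-aligned version before decoding. A minimal self-contained NumPy sketch, with `stylize_features` as a hypothetical helper name (the paper feeds the interpolated features to a trained decoder, which is omitted here):

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Align channel-wise mean/std of content features to the style's.

    Inputs are (C, H, W) feature maps; statistics are per channel
    over the spatial dimensions.
    """
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mean) / c_std + s_mean

def stylize_features(content, style, alpha=1.0):
    """Content-style trade-off: alpha = 0 keeps the content features
    unchanged, alpha = 1 uses the fully AdaIN-aligned features."""
    t = adain(content, style)
    return (1.0 - alpha) * content + alpha * t
```

The same single network thus supports a continuous degree-of-stylization control at test time, with no retraining per style.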

For more details and a good read, check out the paper: https://arxiv.org/abs/1703.06868

The article was written by Tomas Langer, Deep Learning Researcher & Developer at Brighter AI Technologies.