Style Transfer

Emily Jaekle
Deep Learning Data 2040
3 min read · May 11, 2018

https://github.com/ejaekle/deeplearning

Based on one of the notebooks from "Deep Learning with Python," we can transform images using neural style transfer. To do this we need an image to transform and a style image. The transformed image keeps the same content but changes to match the style of the style image, where "style" means the texture, colors, and patterns found in the style image.
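Under the hood, the notebook optimizes an image to minimize two losses: a content loss (squared difference between activations of the original and generated images) and a style loss (squared difference between Gram matrices of the style and generated images). Here is a minimal NumPy sketch of those losses; the function names and the scaling factor follow the book's convention, but this is an illustration, not the notebook's exact code:

```python
import numpy as np

def gram_matrix(features):
    # features: activations flattened to (height * width, channels).
    # The Gram matrix captures correlations between channels, which is
    # what "style" means in neural style transfer.
    return features.T @ features

def content_loss(base, combination):
    # Sum of squared differences between activation maps.
    return np.sum((combination - base) ** 2)

def style_loss(style_feats, comb_feats, channels, size):
    # Squared difference between Gram matrices, scaled by channel count
    # and feature-map size as in the book's notebook.
    S = gram_matrix(style_feats)
    C = gram_matrix(comb_feats)
    return np.sum((S - C) ** 2) / (4.0 * (channels ** 2) * (size ** 2))
```

In the full notebook these losses are computed on VGG19 activations and combined with a total-variation term, then minimized with respect to the pixels of the generated image.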

The only things you need to change in the code are the file path to your input image and the file path to your style image. You may also want to consider changing the number of iterations. Sometimes you can see very clear results after only one or two iterations; other times, running up to five iterations gives the best results. I found that more than five or six iterations (for example, I tried ten) did not make much of a difference in the results and only cleaned up the edges.
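The diminishing returns from extra iterations come from the optimization itself: each step of gradient descent shrinks the remaining loss by a roughly constant factor, so early steps change the image a lot and later steps barely move it. The toy sketch below illustrates this with a simple quadratic loss standing in for the content-plus-style loss; it is not the notebook's actual VGG-based optimization, and `run_iterations` and `lr` are names I made up for the illustration:

```python
import numpy as np

def run_iterations(x, target, steps, lr=0.3):
    # Plain gradient descent on a quadratic "loss", a stand-in for the
    # content + style loss minimized in real style transfer.
    losses = []
    for _ in range(steps):
        grad = 2 * (x - target)          # gradient of sum((x - target)^2)
        x = x - lr * grad                # one optimization step
        losses.append(float(np.sum((x - target) ** 2)))
    return losses
```

Plotting the returned losses shows a steep drop over the first few steps and a near-flat tail, which matches what I saw in the images: big changes in the first iterations, then only edge cleanup.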

Here are some of the results I got:

Example 1

Original Image
Style Image
Result with 5 Iterations

Example 2

Original Image
Style Image
Result with 3 Iterations

Example 3

Original Image
Style Image
Result After 1 Iteration
Result After 5 Iterations

Example 4

Original Image
Style Image
Result After 6 Iterations
Result After 10 Iterations

As you can see, the difference between 6 and 10 iterations is small. I found that certain style images work best on simple black-and-white photos with little detail; for example, the wave image works best on this simple black-and-white photo. The number of iterations needed to achieve the desired result differs for each image and style.
