Neural Style Transfer Across Artistic Styles
Building a web application that transforms photographs into artwork in one's chosen style
This article was produced as part of the final project for Harvard's AC215 Fall 2021 course. Our team consists of Benjamin Wu, Catherine Yeo, and Zev Nicolai-Scanio.
Introduction
When planning a piece, artists often have difficulty deciding on a style or visualizing the final work they want to produce. Those who aren't artists may also want to apply artistic styles to portray and embellish their everyday lives, as we see with photo editing apps or Instagram and Snapchat filters.
Our project helps both artists and non-artists achieve these goals by letting them turn a photograph into a painting or artwork in their chosen style. We achieve this using the technique of neural style transfer.
Neural style transfer is a deep learning technique that takes a content image and a style reference image and produces an output that preserves the subject of the content image while imitating the look of the style image. This is done by using convolutional neural networks (CNNs) to extract semantic (content) and stylistic (texture) information from the two images.
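To make this concrete, here is a minimal sketch of the classic style-transfer loss from Gatys et al., using a pretrained VGG19 in TensorFlow to extract features. The layer names and loss weights below are common illustrative defaults, not necessarily the exact configuration our project used:

```python
# Sketch of the Gatys-style content/style loss, assuming TensorFlow/Keras.
import tensorflow as tf

CONTENT_LAYERS = ["block5_conv2"]
STYLE_LAYERS = ["block1_conv1", "block2_conv1", "block3_conv1",
                "block4_conv1", "block5_conv1"]

def build_feature_extractor():
    # A pretrained VGG19 supplies the CNN features; deeper layers carry
    # semantic content, shallower layers carry texture/style statistics.
    vgg = tf.keras.applications.VGG19(include_top=False, weights="imagenet")
    vgg.trainable = False
    outputs = [vgg.get_layer(name).output
               for name in CONTENT_LAYERS + STYLE_LAYERS]
    return tf.keras.Model(vgg.input, outputs)

def gram_matrix(features):
    # Correlations between feature maps capture style independently of
    # spatial layout.
    result = tf.einsum("bijc,bijd->bcd", features, features)
    shape = tf.shape(features)
    num_locations = tf.cast(shape[1] * shape[2], tf.float32)
    return result / num_locations

def style_content_loss(output_features, content_targets, style_targets,
                       content_weight=1e4, style_weight=1e-2):
    # output_features: extractor outputs for the image being optimized;
    # the targets are the same features precomputed on the content and
    # style reference images.
    content_feats = output_features[:len(CONTENT_LAYERS)]
    style_feats = output_features[len(CONTENT_LAYERS):]
    content_loss = tf.add_n([tf.reduce_mean((f - t) ** 2)
                             for f, t in zip(content_feats, content_targets)])
    style_loss = tf.add_n([tf.reduce_mean((gram_matrix(f) - g) ** 2)
                           for f, g in zip(style_feats, style_targets)])
    return content_weight * content_loss + style_weight * style_loss
```

Minimizing this loss with gradient descent on the pixels of the output image pulls it toward the content image in deep-feature space and toward the style image in Gram-matrix space; note that VGG19 expects inputs preprocessed with tf.keras.applications.vgg19.preprocess_input.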
Data
Dataset
We selected the Unsplash photo dataset for our neural style transfer models to train on. It is a free, open dataset containing a large number of high-resolution images, which made it ideal for our project's purposes.
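As a rough illustration of how one might pull a working sample from the dataset, the sketch below reads the TSV index that Unsplash distributes and downloads a handful of photos. It assumes the Lite release's photos.tsv000 file (from https://unsplash.com/data) and its photo_image_url column; the output directory and sample size are hypothetical:

```python
# Hedged sketch: fetching a small sample of content images from the
# Unsplash Lite dataset's TSV index.
import csv
import pathlib
import urllib.request

OUT_DIR = pathlib.Path("data/content")  # hypothetical output directory
OUT_DIR.mkdir(parents=True, exist_ok=True)

with open("photos.tsv000", encoding="utf-8") as f:
    reader = csv.DictReader(f, delimiter="\t")
    for i, row in enumerate(reader):
        if i >= 100:  # keep the sample small for experimentation
            break
        # Unsplash image URLs accept resizing query parameters such as ?w=.
        url = row["photo_image_url"] + "?w=512"
        urllib.request.urlretrieve(url, OUT_DIR / f"{row['photo_id']}.jpg")
```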
For our style images, we curated a small selection of images covering a range of art styles, including works by Vincent van Gogh, Georgia O'Keeffe, and Jackson Pollock, as well as sketches and cartoons.
Data preprocessing
We performed exploratory data analysis (EDA) to assess our dataset of content images and identify any sets of images we did not want to use. This allowed us to get a sense of what kinds of photos were represented in our data and to compare them visually.
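The sketch below shows the kind of check this involves, with hypothetical paths: it summarizes image resolutions (to flag tiny images or extreme aspect ratios) and displays a small sample grid for visual comparison:

```python
# Hedged EDA sketch: resolution summary plus a visual spot check.
import pathlib
from PIL import Image
import matplotlib.pyplot as plt

paths = sorted(pathlib.Path("data/content").glob("*.jpg"))

# Summarize resolutions to spot images we may want to exclude.
sizes = [Image.open(p).size for p in paths]  # (width, height) pairs
widths, heights = zip(*sizes)
print(f"{len(paths)} images, width range {min(widths)}-{max(widths)}, "
      f"height range {min(heights)}-{max(heights)}")

# Compare a handful of photos side by side.
fig, axes = plt.subplots(3, 3, figsize=(9, 9))
for ax, p in zip(axes.flat, paths[:9]):
    ax.imshow(Image.open(p))
    ax.set_title(p.stem, fontsize=8)
    ax.axis("off")
plt.tight_layout()
plt.show()
```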
These were the style images we worked with: