[VIDEO] Picasso paints Picasso, with help from Pikazo.

Pikazo
Mar 8, 2016


You’ve been amazed by the images.
Seeing them in motion will blow your mind.

With Pikazo, we’ve been very lucky to help thousands of people bring to life the art they’ve always imagined but never before been able to create.

“Hudson River Castle via Hudson River School” by Jon Russek
Untitled by Bill Lambertson
“Che in Crochet” by Jonathan Kalonymus Briskin
“My happy brain” by Tracy Shintaku

But some art wants to be alive.

We’ve always marvelled at paintings, and every art lover has dreamt of living inside one: what would the world look like if everything were rendered in brushstrokes?

Many of our early adopters are filmmakers and animators, and many of them have asked how Pikazo can become part of a video workflow. We’re really interested too.

A dream of dreaming.

The neural network that powers Pikazo’s image creation engine isn’t a natural fit for video, because the rendering process is so (delightfully) unpredictable. But our head of R&D, Karl Stiefvater, has been hard at work to fix that.

Put a cushion under your jaw.

For the first time ever, you’re about to see how the world looks from the other side of the painting.

Picasso paints Picasso, with help from Pikazo

Karl has shared these notes about the technology:

Video is an obvious application of neural style rendering.

The naive approach is to render each frame from scratch, independent of the frames before and after it. Because the neural style algorithm is chaotic, slight alterations in input cause dramatic changes in output, so adjacent frames have little visual coherence. Such chaotic video is sometimes described as “noisy” and is often undesired.
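In outline, the naive pipeline looks like this (a minimal sketch: `stylize` is a stand-in for any iterative neural style renderer, not Pikazo’s actual engine, and the fresh random init per frame is what makes adjacent outputs diverge):

```python
import numpy as np

def stylize(frame, init):
    """Stand-in for a neural style renderer. The real algorithm runs an
    iterative optimization seeded from `init`; here we just perturb the
    init to mimic the chaotic sensitivity described above."""
    rng = np.random.default_rng()
    return init + rng.normal(scale=0.1, size=init.shape)

def naive_video_style(frames):
    """Render each frame independently from a fresh random init.
    Adjacent outputs share no state, so the result 'boils' frame to frame."""
    outputs = []
    for frame in frames:
        # A brand-new random starting point for every frame: no coherence.
        init = np.random.default_rng().random(frame.shape)
        outputs.append(stylize(frame, init))
    return outputs
```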

We suggest a solution to provide visual coherence and reduce “inter-frame noise”:

1. When initiating the neural style convergence process, use a custom “init” image as the starting state of the search. This init image is created from the previous frame, steering convergence toward a similar (coherent) result.

2. In cases where structures are moving from frame to frame, use optic flow to measure the motion of the pixels in the underlying video. Use that motion to warp the previous frame for use as an init image in the convergence. This technique was used effectively to create the painterly look of the film “What Dreams May Come”.

3. Use a blend between warped image and underlying image as a parametric control on “inter-frame coherence”.
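The three steps above can be sketched together (a minimal numpy-only illustration, not Pikazo’s implementation: the nearest-neighbor warp, the function names, and the `coherence` default are all assumptions; a production pipeline would compute `flow` with a real optic-flow estimator and interpolate bilinearly):

```python
import numpy as np

def warp_by_flow(prev_frame, flow):
    """Warp an image by a per-pixel optic-flow field (nearest-neighbor).

    prev_frame: (H, W, 3) float array, the previously stylized frame.
    flow: (H, W, 2) array of (dx, dy) pixel motion from the previous
          source frame to the current one.
    """
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Each output pixel samples the previous frame at the location it moved from.
    src_x = np.clip(np.round(xs - flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - flow[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]

def make_init_image(prev_stylized, flow, current_frame, coherence=0.7):
    """Blend the warped previous output with the raw current frame.

    coherence=1.0 reuses the warped previous frame entirely (smooth but
    prone to smearing); coherence=0.0 degenerates to the naive
    independent-frame approach (boiling). Values in between trade the two off.
    """
    warped = warp_by_flow(prev_stylized, flow)
    return coherence * warped + (1.0 - coherence) * current_frame
```

The resulting image is then fed to the style renderer as the starting state for the next frame’s convergence, which is what ties the parametric “coherence” control to the smear and boil seen in the test video.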

Video test of Pikazo neural style algorithm. Varying parameter values for coherence, smear, and boil. Testing optic flow.

The source video is the classic Visit to Picasso (1949). Music is “Colors,” by Adult Fur.

You can see more remarkable works of art shared by thousands of Pikazo artists in our user group, the Pikazo Salon. And look for our Android release later this week!
