Motion Brush: A Master Stroke for the Creator Community
--
Is it a photo? Is it a video? No, it’s a hybrid of the two — known as a cinemagraph. This surreal art form conventionally requires carefully recorded video footage and a painstaking analysis of the repeating motion within it. The artist then manually freezes the non-moving areas of the scene to amplify the surreal effect. It’s a work-intensive process, and the result is mesmerizing.
We wanted the best of both worlds: the dramatic visual effect, made accessible for creators.
A new perspective
Most user-recorded footage is unsuitable for creating cinemagraphs — particularly if it isn’t purpose-shot. But what if we turned the process on its head and started from a still image?
We believe that laborious workflows block raw creative power, so we set about retrofitting the cinemagraph process for photos, letting our creators focus on the intuitive, fun parts. With Motionleap, if you can take a photo, you can make a cinemagraph.
Motionleap: under the hood
From a single image, we add displacement vectors that then direct the generated motion.
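To make the idea concrete, here is a minimal sketch (our own illustration, not Motionleap’s code) of how a dense per-pixel displacement field can warp a still image into an animation frame; the warp_frame helper and the use of OpenCV’s remap are assumptions made for this example.

```python
# Illustrative sketch only: warp a still image along a per-pixel
# displacement field to synthesize one frame of the animation.
import numpy as np
import cv2  # assumed dependency, used here for fast bilinear remapping

def warp_frame(image, displacement, t):
    """Warp `image` by t * displacement.

    image:        H x W x 3 array
    displacement: H x W x 2 array of per-pixel motion vectors, in pixels
    t:            scalar phase of the motion (0 = original image)
    """
    h, w = image.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Backward mapping: each output pixel samples from where it "came from".
    map_x = (grid_x - t * displacement[..., 0]).astype(np.float32)
    map_y = (grid_y - t * displacement[..., 1]).astype(np.float32)
    return cv2.remap(image, map_x, map_y, cv2.INTER_LINEAR)
```

Calling warp_frame with increasing values of t slides the selected pixels along their vectors, which is all the looping renderer sketched further down needs.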
In the original version of Motionleap (previously Pixaloop), we gave the user three tools to control the output: displacement vectors (user-drawn arrows indicating the position and direction of movement), anchors to restrict movement, and a freeze tool to paint areas that shouldn’t move at all.
These tools work well for fluid elements such as waterfalls, snow and steaming coffee cups, where edges are softer and therefore more forgiving of inaccuracies. But what about man-made structures such as buildings?
Architectural photos often contain repeating patterns, which make them ideal for cinemagraph creation. We can simply convert the repeating texture into a repeating motion by translating each element towards the next. However, this requires significant precision: if the displacement is not aligned perfectly with the structure of the image, we get ghosting or ‘double edge’ artifacts. In the original Motionleap, even an experienced user would have to spend a lot of time carefully drawing arrows and marking frozen areas and anchor points.
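As a toy illustration of that precision requirement (hypothetical code, not part of the app), the snippet below scores how well a horizontal shift lines up with a repeating texture: shifts that match the pattern’s period produce a near-zero difference, while misaligned shifts show up as exactly the doubled edges described above.

```python
# Hypothetical illustration: measure how badly a shifted copy of the image
# misaligns with the original. A low score at a given shift means that
# translating the texture by that amount will loop without ghosting.
import numpy as np

def ghosting_score(gray, shift_x):
    """Mean absolute difference between `gray` (H x W) and a copy of it
    shifted horizontally by `shift_x` pixels."""
    shifted = np.roll(gray, shift_x, axis=1)
    return float(np.abs(gray.astype(np.float32) - shifted.astype(np.float32)).mean())
```

On a perfectly periodic facade the score dips at every multiple of the window spacing; a hand-drawn arrow that misses that spacing by even a few pixels leaves a visible double edge.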
Our new approach, Motion Brush (presented in our SIGGRAPH 2021 paper, click here to watch a video explanation of the technique), greatly simplifies the animation process — and does so without sacrificing user control. The interface consists of just two tools: a brush tool that indicates the areas the user wants to animate, and an arrow tool that controls the direction of motion.
With these inputs, we developed an algorithm that identifies the best displacement vector for each pixel in order to render a seamlessly looping video. This all runs directly on the user’s mobile device, eliminating the need for slow, expensive cloud processing.
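For illustration, one common way to turn a per-pixel displacement field into a seamless loop (we are not claiming this is Motionleap’s on-device renderer) is to crossfade two copies of the still image warped by offset phases of the motion, reusing the hypothetical warp_frame helper sketched earlier.

```python
# Illustrative loop renderer: blend a "forward" and a "backward" warp so the
# last frame lands exactly on the first, giving a seamless loop.
import numpy as np

def render_loop(image, displacement, num_frames=60):
    frames = []
    for i in range(num_frames):
        p = i / num_frames                                    # phase in [0, 1)
        forward = warp_frame(image, displacement, p)          # displaced by p * d
        backward = warp_frame(image, displacement, p - 1.0)   # displaced by (p - 1) * d
        # At p = 0 the blend is the original image; as p -> 1 it returns to it,
        # so the sequence wraps around without a visible jump.
        blend = (1.0 - p) * forward.astype(np.float32) + p * backward.astype(np.float32)
        frames.append(blend.astype(image.dtype))
    return frames
```

The crossfade hides the fact that the texture is only translated a finite distance, which is what lets a single photo loop forever.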
How Motion Brush works
Our key insight for the new algorithm was that we could break the solution into two stages and solve each of them more efficiently. First, we compute a rough estimate of the motion by analyzing repetitions in the image in 1D, allowing motion only in a single fixed direction. Then, we refine that initial estimate in 2D, allowing the pixels to move with more flexibility.
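To give a flavour of the first stage (a hypothetical sketch; the actual method is described in the SIGGRAPH 2021 paper), one could project the brushed pixels onto the arrow direction and use autocorrelation to find the distance at which the texture repeats, which then serves as the initial motion magnitude along that fixed direction.

```python
# Hypothetical sketch of a 1D repetition analysis: estimate how far the
# brushed texture repeats along the user-drawn direction.
import numpy as np

def estimate_1d_period(gray, mask, direction, max_shift=64):
    """gray: H x W float image; mask: H x W boolean brush mask;
    direction: unit vector (dx, dy) taken from the arrow tool."""
    ys, xs = np.nonzero(mask)
    # Project the masked pixels onto the motion direction (distance in pixels).
    coords = xs * direction[0] + ys * direction[1]
    bins = np.round(coords - coords.min()).astype(int)
    counts = np.maximum(np.bincount(bins), 1)
    profile = np.bincount(bins, weights=gray[ys, xs]) / counts
    profile -= profile.mean()
    # Autocorrelation: the strongest peak gives the repetition period.
    scores = [np.dot(profile[:-s], profile[s:]) for s in range(1, max_shift)]
    return 1 + int(np.argmax(scores))
```

The second stage would then relax the single fixed direction, letting each pixel’s vector bend in 2D to follow the local structure; we leave that refinement out of the sketch.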
The new algorithm works both for fluid elements such as water and smoke, and for more structured objects like buildings — all with the same set of controls. In most cases, the Motion Brush and the arrow tool are all that’s required to produce an impressive animation.
For more complex motion such as rotation animations, the user may need to use the arrow tool more than once to get it just right. But thanks to our speedy algorithm the user can interactively experiment with their creation to get their desired results. We’re currently working on using deep learning to simplify the process further, making the interaction more intuitive for complex motion patterns.
Want to see what all the hype is about? Download the iOS version of Motionleap and check out the new Motion Brush tool — we’re pretty proud of it. Tag your creations with #motionleap on Instagram.
Acknowledgements
The following people contributed to the project: Hanit Hakim, Orestis Vantzos, Gershon Hochman, Gal Nachmana, Netai Benaim, Lior Sassy, Michael Kupchik, Ofir Bibi and Ohad Fried.
Create magic with us
We’re always on the lookout for promising new talent. If you’re excited about developing groundbreaking new tools for creators, we want to hear from you. From writing code to researching new features, you’ll be surrounded by a supportive team who lives and breathes technology.
Sound like you? Apply here.