Neon uses neuroscience to clip 3-second mobile video ads optimized for engagement

Nicole Halmi · Published in Neon Open · Oct 6, 2016 · 5 min read
Introducing Spectacles | Snap

Mobile video advertising has grown 66% over the last year, drawing brands into an arms race for users’ attention. At the same time, platforms have shifted the goalposts, encouraging brands to create shorter, native video ads that can be watched on autoplay without sound (81% of video ads are now viewed without sound, and shorter ads have been shown to perform better, especially on mobile).

Advertisers are now asking the question, “How do we make a lasting impact when we only have seconds to do it?”

At Neon, we wanted to help advertisers drive maximum value on leading platforms like Google, Facebook, and Snapchat by optimizing their video content to capture attention, increase dwell time, and deliver lasting memorability and brand affinity, all in just a few seconds.

Today Neon is introducing the ability for all advertisers to take existing video ads and automatically create shorter clips or GIFs, optimized for performance on different advertising platforms and for different demographics. These clips highlight the seconds in the video that will attract users’ unconscious attention and motivate them to engage. We know that successful ads are not just about emotional attention, and that storytelling matters. With that in mind, we have worked especially hard to ensure our software preserves the ad’s story arc and sentiment in our clips.

Some of our favorite clips

A Tituss Burgess Laundry How-to | Unstopables In-Wash Scent Boosters | Downy
Pine-Sol Presents: “Mr. Boddington” | Pine-Sol
Your Future Is Not Mine | Adidas

It’s clear that one size does not always fit all in advertising. So, we generate multiple clips per video, customized for user demographics and for the platforms where the ad may run. Check out an example below. Here, we’ve pulled two different clips from Chanel’s “The One That I Want” ad by Baz Luhrmann. The first clip maximizes something we call joint attention (two or more people in a scene looking at the same thing), and the second clip maximizes action. We know that joint attention and action both drive unconscious emotional engagement, or valence, but one type of interaction may appeal more to a given demographic or work better on a given ad platform than the other.

The One That I Want | Chanel
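
For the curious, here is a tiny sketch of what picking one clip per objective might look like. The clip times, score names, and values are purely illustrative; they are not Neon’s actual model outputs or API.

```python
# Hypothetical sketch: pick the best candidate clip for each optimization
# objective (e.g. "joint_attention" vs. "action"). Illustrative values only.
from dataclasses import dataclass


@dataclass
class CandidateClip:
    start_s: float   # clip start time in the source video (seconds)
    end_s: float     # clip end time (seconds)
    scores: dict     # objective name -> engagement score in [0, 1]


def best_clip_per_objective(candidates, objectives):
    """Return the highest-scoring candidate clip for each objective."""
    return {obj: max(candidates, key=lambda c: c.scores.get(obj, 0.0))
            for obj in objectives}


candidates = [
    CandidateClip(12.0, 15.0, {"joint_attention": 0.91, "action": 0.40}),
    CandidateClip(47.5, 50.5, {"joint_attention": 0.35, "action": 0.88}),
]
print(best_clip_per_objective(candidates, ["joint_attention", "action"]))
```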

Anatomy of a Neon clip

To create effective clips from longer videos, Neon leverages our video processing technology and core deep learning model to identify and clip the most emotionally engaging segments of video. Our algorithms scan through the video, identify discrete scenes, and give each scene a NeonScore. We then check whether the length of each higher-scoring scene fits a particular brand’s needs. Alternatively, we can return clips whose length is determined by the clip content, allowing clip length to vary across clips from the same video.
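
As a rough illustration, the selection step might look something like the sketch below, assuming scene boundaries and a per-scene score (a stand-in for the NeonScore) have already been computed. The function and field names are ours, not Neon’s.

```python
# Minimal sketch of scene selection, assuming scenes and scores already exist.
def select_scenes(scenes, target_len_s=None, min_score=0.5):
    """
    scenes: list of dicts like {"start": 4.2, "end": 9.8, "score": 0.82}
    target_len_s: desired clip length in seconds; if None, each scene keeps
                  its natural length and lengths can vary per clip.
    Returns the scenes worth clipping, highest score first.
    """
    keep = [s for s in scenes if s["score"] >= min_score]
    keep.sort(key=lambda s: s["score"], reverse=True)
    if target_len_s is not None:
        # Keep only scenes long enough to fill the brand's requested length.
        keep = [s for s in keep if (s["end"] - s["start"]) >= target_len_s]
    return keep


scenes = [
    {"start": 0.0, "end": 4.2, "score": 0.35},
    {"start": 4.2, "end": 9.8, "score": 0.82},
    {"start": 9.8, "end": 12.5, "score": 0.67},
]
print(select_scenes(scenes, target_len_s=3.0))
```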

A key step in the clipping process occurs when our software takes a fine-grained look at the scene to understand the composition and the action profile. By examining the amount of visual change from frame to frame within a scene, we can identify the action high point or climax. Our software then edits the scene into a clip by modeling a story arc — rising action, climax, denouement — and clipping to highlight that arc.
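
A much-simplified version of that idea could look like the following, using frame differencing as a stand-in for our visual-change measure (frames as NumPy arrays). The 60% climax placement is an illustrative choice, not Neon’s actual parameter.

```python
# Sketch: build an "action profile" from frame-to-frame change, then cut a
# fixed-length clip around the high point so the arc rises into the climax.
import numpy as np


def action_profile(frames):
    """Mean absolute pixel change between consecutive frames (higher = more action)."""
    return [float(np.mean(np.abs(frames[i + 1].astype(np.int16)
                                 - frames[i].astype(np.int16))))
            for i in range(len(frames) - 1)]


def clip_around_climax(profile, fps, clip_len_s=3.0, climax_at=0.6):
    """Place the highest-action frame ~60% of the way into the clip, leaving
    room for rising action before it and a short denouement after it."""
    climax = int(np.argmax(profile))
    n_frames = int(clip_len_s * fps)
    start = max(0, climax - int(climax_at * n_frames))
    end = min(len(profile) + 1, start + n_frames)
    return start, end  # (start_frame, end_frame) of the clip
```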

To ensure that clips are made of scenes that effectively summarize the video, we can use information theory to look for objects within scenes. By doing this, we can make sure that, for example, a face that appears in 90% of the video shots also appears in a summary clip.
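
In code, a much-simplified coverage check along those lines might look like this, assuming per-shot object labels from an off-the-shelf detector. The 90% threshold mirrors the example above, and all names are hypothetical.

```python
# Sketch: verify that objects dominant across the whole video (e.g. a recurring
# face) also appear in the candidate summary clip.
from collections import Counter


def dominant_objects(shot_labels, min_fraction=0.9):
    """Objects that appear in at least `min_fraction` of all shots."""
    counts = Counter(obj for labels in shot_labels for obj in set(labels))
    return {obj for obj, n in counts.items()
            if n / len(shot_labels) >= min_fraction}


def clip_covers_video(clip_labels, shot_labels, min_fraction=0.9):
    """True if every dominant object in the full video also shows up in the clip."""
    seen_in_clip = {obj for labels in clip_labels for obj in labels}
    return dominant_objects(shot_labels, min_fraction) <= seen_in_clip
```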

Start Me ft. Von Miller | Madden NFL 17

We understand that storytelling is an important way for advertisers to connect with consumers. To make this automatic clip creation feature useful for advertisers, we use computer vision and machine learning to preserve the story arc and key visual elements that creatives have embedded in their video ads.

To try our automated clipping service, get in touch.

About Neon

Neon creates technology that uses neuroscience-based algorithms to personalize visual content in real-time for specific audiences, devices, and platforms. Neon was founded on over ten years of research in the science labs of Brown University, Carnegie Mellon, and Massachusetts General Hospital. Over that time, we have developed technology that combines the science of perception, deep learning, and the world’s largest and most comprehensive dataset of emotional response to images to closely mirror human visual preference.

Read more about our app, Neon Pro

Read about what our software found in Beyonce’s Lemonade

Read about how our neural net approach is different

A$AP ROCKY — L$D
