It’s Time For a Revolution In Video Monetization Latency

Eric Hoffert
Mar 3, 2016 · 17 min read

Participating in the creation of Internet video has been fantastic. It’s great to see how far we’ve come but we aren’t there yet. In this post I review what still needs to be done to stream video instantly when video advertising pays for content.

The Magic of Seeing Digital Video For the First Time

It’s Time to Create a Fluid, TV-like Experience on The Open Video Internet

Given the early work I did on video, when a QuickTime player was the size of a postage stamp and video streamed at 10–12 frames per second, many recent developments seem extraordinary in comparison; in many ways we are in a golden age of video content. Consumer broadband speeds are in the tens or hundreds of megabits per second, with literally hundreds of millions of individual video creators around the world. But there are definitely issues impacting the potential of video. One of the biggest is video latency on the open Internet — the world of independent publishers that provides a diversity of rich content and serves as a critical alternative to the walled video gardens of Facebook and YouTube. Publishers use a variety of methods to monetize this content, whether instream (pre-roll), outstream, or other video advertising formats. As video progressed on so many fronts in the last decade, it moved backwards in one key area — the time it takes to get to the content that you are supposed to watch.

As a kid I remember marveling at the flow of images on my TV, completely seamless. In today’s world of high definition, TV is 100 times better with a fluidity and richness at 1920 x 1080, or better yet in 4K — it’s truly incredible. But traditional TV is starting to fade away as online video and OTT begin to take over. To make this model work for digital video, it’s time to create a fluid, TV-like experience on the open Internet.

It’s Like Watching a Sotheby’s Auction With a Blank Screen Before Your Video Starts

Endless buffering due to long video ad loading times…

Some vendors run sequential video waterfalls and some run mediation in parallel. In many cases these vendors wait until the slowest video demand source returns a result, creating a major video bottleneck. The user experience is like hitting the play button and watching a Sotheby’s auction for a painting take place, with many bidders putting in their bids. But guess what — you just see a black screen with a spinning cursor, and maybe nobody wins the auction. So you sit there and watch nothing for five, ten, or even twenty seconds, and then, if you’re lucky, you get to watch your favorite video afterwards. This is crazy!

Source: Krishnan, S. S., Sitaraman, R. K. (November 2012). “Video Stream Quality Impacts Viewer Behavior: Inferring Causality Using Quasi-Experimental Designs.” Boston, MA: Internet Measurement Conference. This graph shows that for short form video content common on the web, with five seconds of delay 20% of users will abandon a video stream, after ten seconds 40% will depart, and a delay of 20 seconds causes 80% of viewers to stop watching the content.

It gets worse: in those cases where you wait ten seconds to see a video play, the client side process may result in no bid, and therefore no video ad impression. You waited ten seconds for nothing, and the publisher who is depending on your video impression to monetize her web site has nothing to show for it. Zilch, nada. Instead, she is out of pocket for a video impression that someone else could have monetized for her and helped to pay for the great video content that you get to watch on her site. We call this practice no-penalty video arbitrage while you watch. It means a video advertising vendor tried to sell a video impression for a publisher (video arbitrage), failed to do so, paid nothing for the failure (no penalty), and showed you a black video screen (while you watch), hurting both the video user and the publisher. Outrageous!

Video advertising latency can be the result of one or more factors as outlined above

It’s no wonder video ad blockers are on the rise. But what if we could have a world where video ads loaded instantly, ads were highly relevant to what you are interested in, and the video content that follows the ad likewise played instantly? Things might change; I doubt most users want to starve all of the video content creators whose sites they visit by removing monetization from the equation via ad blockers (if they do, it’s time to subscribe to those video sites instead). Most of the impact of this type of video latency falls on small to medium sized publishers, but the issue can be seen selectively with premium publishers monetizing programmatically.

With Video, Every Millisecond Counts

Source: NY Times

To give you an idea of the vision I’d like to see us get to for a revolution in video latency, please check out the video search engine I created as a personal hack project; a demo is in the YouTube video embedded below. The concept is to show the results of a video search as a single video stream, compositing all of the videos in a search into a single seamless video experience, with fast, MTV-like cuts showing just a second or two from each video clip in your search. This is the type of instant, back-to-back fluid experience we want to see for video on the Internet. The video experience should be instant whether the user is viewing video content, video ads, or mixtures of both.

A Call to Arms to Radically Change How Video Monetization Works

We’ve done some inventing of our own that we are excited about and have developed several new approaches to solve the client side video latency issues. Although we are keen to get this to market, we know it’s not enough; it’s just scratching the surface of the problem. No single company can solve this issue; it’s an industry-wide challenge. We want to work with our clients, partners, industry standards organizations, and yes, even our competitors, to fix these issues and upend the status quo. Why? So users, publishers, and advertisers can benefit. It’s in our mutual best interest to get to a better place for fast, fluid video. We think this is a war with many battles along the way. It’s time to radically change how video monetization works.

Let’s review a number of ideas on how to fix the broken equation of video latency.

We Need to Push The Latency Out of The Video Play Button

Run the Video Monetization Process Away from the Play Button: It’s not necessary to run a lengthy auction on the client side while a user waits to watch their favorite video content. Push that auction away from the play button and use the auction result only when it’s ready.

Run the Video Monetization Process on RTB Servers: Programmatic RTB (real-time bidding) platforms are incredibly powerful, with thousands of servers globally and low latency performance. Move more video demand to programmatic RTB instead of relying heavily on client side sourced demand. Let’s shift the demand mix from client to server (full disclosure: I work at AppNexus and RTB is a core business, but I truly believe we should have auctions running on powerful servers, not consuming the CPU of personal computers). If you look at market projections for programmatic RTB video, it’s slated to comprise a rapidly growing portion of overall online video revenues, so surf that wave.

Use Server Side Video Ad Stitching: Server side video ad stitching moves the auction process away from the client so that it happens on the server for every ad, along with compositing of the video advertising and content, delivering a single, seamless video stream. This approach also thwarts ad blockers because video is delivered as a single integrated stream of content and ads. The IAB is working to standardize this method with VAST 4.0, a promising direction. It would be great to see more of this approach in action.

Server Side Video Ad Stitching
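To make the stitching idea concrete, here is a minimal sketch of the server-side half: splicing a pre-roll ad’s segments into an HLS media playlist so the client receives one continuous stream. The function name and data shapes are illustrative, not a real stitching API; a production stitcher would also handle byte ranges, encryption tags, and timestamp alignment.

```javascript
// Sketch: splice pre-roll ad segments into an HLS media playlist on the
// server, so the player sees a single continuous stream of ad + content.
// stitchPreRoll and the segment shape are hypothetical names for illustration.
function stitchPreRoll(contentPlaylist, adSegments) {
  const lines = contentPlaylist.trim().split("\n");
  const header = [];
  const body = [];
  let inBody = false;
  for (const line of lines) {
    // Segment entries begin at the first #EXTINF tag; everything before
    // that is playlist header (version, target duration, etc.).
    if (line.startsWith("#EXTINF")) inBody = true;
    (inBody ? body : header).push(line);
  }
  const adLines = adSegments.flatMap(seg => [
    `#EXTINF:${seg.duration.toFixed(1)},ad`,
    seg.uri,
  ]);
  // #EXT-X-DISCONTINUITY tells the player that encoding parameters and
  // timestamps may change at the ad/content boundary.
  return [...header, ...adLines, "#EXT-X-DISCONTINUITY", ...body].join("\n");
}
```

Because the ad arrives as ordinary media segments in the same playlist, a client-side ad blocker has nothing distinct to block.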

Video Ad Streams Matching Network Bandwidth: VAST 2.0 allows for multiple renditions of video at different bit rates in order to match ad creative to device and network speed. However, many demand sources provide only one or two bit rates of video renditions, which can result in a mismatch between video stream data rate and network bandwidth. For example, a large VAST video file will cause high playback latency when network bandwidth is low; conversely, when a VAST video file is small but bandwidth is high, the visual quality may be sub-optimal relative to the bandwidth available. The video player should be able to select between renditions to match network speed and video quality. There is also promise in greater use of adaptive streaming formats such as Apple HLS and the open standard MPEG-DASH for lower latency video ad delivery; these formats may, however, need to be tuned to optimize for short form spots of 15 to 30 seconds, given the need to rapidly converge to a matched bitrate under dynamic network conditions.
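The player-side selection logic above can be sketched in a few lines. The MediaFile fields follow VAST attribute names, but the selection policy (highest bitrate that fits within a bandwidth budget, with headroom for jitter) is an illustrative choice, not part of the spec.

```javascript
// Sketch: pick the VAST MediaFile rendition whose bitrate best matches the
// measured network bandwidth. The headroom factor leaves room for network
// jitter; both it and the policy are illustrative assumptions.
function pickRendition(mediaFiles, bandwidthKbps, headroom = 0.8) {
  const budget = bandwidthKbps * headroom;
  const sorted = [...mediaFiles].sort((a, b) => a.bitrate - b.bitrate);
  let best = sorted[0]; // fall back to the smallest rendition
  for (const mf of sorted) {
    if (mf.bitrate <= budget) best = mf; // keep the largest that fits
  }
  return best;
}
```

With three renditions available (say 300, 1200, and 4000 kbps), a 2 Mbps connection would get the 1200 kbps file rather than stalling on the 4000 kbps one.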

Client-Side Video Auctions and What to Do About Them

Run client side video auctions in VPAID only at VPAID InitAd, and run them quickly. With respect to client side auctions, vendors run them either upon initializing VPAID (using VPAID InitAd) or when a video ad is supposed to start (VPAID StartAd). Never run an auction at the moment a user is supposed to start viewing video ads, after initialization. This is a “bad actor” scenario, and you know who you are. Change your code so that client side video auction logic is separated from the viewing of actual video advertising content; using VPAID StartAd for auctions causes the blank video players with buffering icons that so many of us can’t stand. Switch your logic to running auctions only at VPAID InitAd, and make the client side auction process run much faster, respecting your video users. It’s time to get rid of spinning icons that last for many seconds in a blank video player!
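The pattern can be sketched as a VPAID-style creative that kicks off its auction inside initAd and only consumes the result at startAd, so the user-visible start never waits on bidding from a cold start. The method names initAd/startAd mirror the VPAID interface; the auction plumbing and class name are hypothetical.

```javascript
// Sketch: a VPAID-style creative that starts bidding at initAd() so the
// auction overlaps player setup instead of blocking playback at startAd().
// runAuction is injected and hypothetical; it returns a Promise of a bid.
class AuctionAtInitCreative {
  constructor(runAuction) {
    this.runAuction = runAuction;
    this.bidPromise = null;
  }
  initAd(width, height, viewMode, desiredBitrate, creativeData, envVars) {
    // Kick off bidding immediately; do not block here waiting for it.
    this.bidPromise = this.runAuction();
  }
  startAd() {
    if (!this.bidPromise) throw new Error("initAd must run before startAd");
    // By now the auction has had the entire init window to complete, so in
    // the common case this promise is already settled.
    return this.bidPromise;
  }
}
```

The key property is that the latency budget for bidding is the gap between initAd and startAd, which the user never perceives, rather than the moment the play button is pressed.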

The video bus has to wait for the slowest kid to get on before it can leave!

Develop IAB Standards for Client Side Video Auctions: It’s time for the IAB to acknowledge the widespread use of client side video auctions and develop a standard for conducting them. By specifying precisely how auctions are conducted, video player and web site developers will be able to control client side processes and allow for a better user experience, instead of the opaque black boxes we see in today’s VPAID. An analogy: we need a client side equivalent of OpenRTB, the popular industry standard used for server side real time bidding interactions across display, video, audio, and native.

Move Flash Away from the Video Play Button: It’s well documented that Flash consumes tremendous system resources — PC World testing showed as much as 80 percent of CPU with a potential doubling of memory utilization. Unsurprisingly, this can adversely impact video start times and the frequency of video buffering. And so we ask the video advertising industry: please move as much of your video demand and creatives as possible into straight VAST with direct links to video files (mp4, webm) or to VPAID JavaScript. It is especially important for brands and agencies to train their staffs of artists and developers to move away from Flash content creation as soon as possible. There is simply too much demand on the open Internet today comprised of VPAID Flash only. Trust me, the Flashpocalypse is coming and we must collectively be ready. Plus, a move away from Flash will significantly reduce video latency. It’s been more than five years since Steve Jobs authored his famous “Thoughts on Flash,” on the critical need to transition to open web standards (i.e., HTML5) and mobile friendly environments (hello, JavaScript). It’s truly time to honor Steve’s visionary call-to-arms and accelerate the recommended transition.

A Video Latency Solution Up to 100x Faster

Run Video Auctions Away from the Play Button: Video mediation runs as a process away from the video play button. Do it before the user hits the play button. Or, if a user is watching long form video content such as full episode programming and a set of mid-roll video ads is required for an upcoming ad pod, run the mediation process after the pre-roll, with enough time to conduct the auctions before reaching the mid-roll. The same approach covers the next pre-roll before it happens: run the process after the mid-roll but before the user gets to that pre-roll. To ensure high win rates with this approach, machine learning and predictive models can be used to run auctions at the right time to maximize the likelihood of using the results.
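The scheduling decision reduces to simple arithmetic once you have an estimate of how long an auction takes. Here is a sketch; the p95 auction time and safety margin are illustrative knobs that a predictive model or publisher configuration would supply.

```javascript
// Sketch: decide when (in content time) to kick off the mid-roll auction so
// it completes before the ad pod cue. auctionStartTimeMs and its parameters
// are hypothetical names; real systems would feed p95AuctionMs from
// measured or predicted auction durations.
function auctionStartTimeMs(midRollCueMs, p95AuctionMs, safetyMarginMs = 500) {
  // Start early enough that even a slow (95th percentile) auction finishes
  // with margin to spare; if the cue is already too close, start now (0).
  return Math.max(0, midRollCueMs - p95AuctionMs - safetyMarginMs);
}
```

For a mid-roll cue at 60 seconds with a 2-second p95 auction, the auction fires at 57.5 seconds and the pod begins with the result already in hand.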

Video Ad Caching: Once a mediation process has been run, the winning VAST XML is kept in a video ad cache. This stores the VAST result, not the actual video ad, so it’s relatively lightweight from storage and bandwidth perspectives.

HTML5 for Video Ad Storage: That’s right, HTML5 local storage. The results of the video mediation process are stored in a video ad cache in the browser so they’re available for virtually instant retrieval when it’s time to play a video ad. Move the latency away from play.
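A minimal sketch of such a cache follows. Cached bids go stale, so a TTL is essential; the class name, key scheme, and TTL value are illustrative. The storage backend is injected so the sketch runs anywhere — in a browser you would pass window.localStorage, which matches the getItem/setItem interface used here.

```javascript
// Sketch: a VAST result cache with a TTL, keyed by placement. Stores only
// the winning VAST XML (lightweight), not the video file itself. The names
// here are illustrative, not a real library API.
class VastCache {
  constructor(storage, ttlMs = 5 * 60 * 1000) {
    this.storage = storage; // needs getItem/setItem, like localStorage
    this.ttlMs = ttlMs;
  }
  put(placementId, vastXml, now = Date.now()) {
    this.storage.setItem(
      `vast:${placementId}`,
      JSON.stringify({ vastXml, storedAt: now })
    );
  }
  get(placementId, now = Date.now()) {
    const raw = this.storage.getItem(`vast:${placementId}`);
    if (!raw) return null;
    const { vastXml, storedAt } = JSON.parse(raw);
    // Expire stale entries: an old bid may no longer be honored by demand.
    return now - storedAt > this.ttlMs ? null : vastXml;
  }
}
```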

Unwrap the Wrappers: VAST results returned from video mediation often have many levels of redirection. Waiting for these redirects slows the process and increases latency. By unwrapping the wrapped results ahead of time, it’s possible to get directly to the inline VAST content; storing that in local storage keeps retrieval fast.
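Unwrapping means following the chain of VASTAdTagURI redirects until an InLine document is reached, with a depth cap to stop circular or abusive chains. The sketch below injects the fetcher (in production this would be an HTTP request) and uses a regex in place of real VAST XML parsing — both are simplifications for illustration.

```javascript
// Sketch: follow VAST Wrapper redirects ahead of play time until an InLine
// ad is reached. fetchXml is injected (url -> xml string); the regex-based
// tag extraction is a stand-in for a proper XML parser.
function unwrapVast(tagUrl, fetchXml, maxDepth = 5) {
  let xml = fetchXml(tagUrl);
  for (let depth = 0; depth < maxDepth; depth++) {
    const m = xml.match(
      /<VASTAdTagURI>\s*(?:<!\[CDATA\[)?(.*?)(?:\]\]>)?\s*<\/VASTAdTagURI>/
    );
    if (!m) return xml; // no wrapper tag left: this is the inline result
    xml = fetchXml(m[1].trim()); // follow the redirect one hop
  }
  throw new Error("Too many VAST wrapper redirects");
}
```

Running this once at mediation time and caching the final InLine document means the player pays for the redirect chain zero times at the play button.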

Configurable Deadlines: By setting a timeout on the video mediation process, bad actors can’t keep running their auctions for too long; if they don’t provide a result quickly enough, they are cut off. This lets publishers control the trade-off between latency and monetization, striking a better balance between user experience and revenue.
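Deadline enforcement has two halves: filtering out bids that arrive after the cutoff, and racing each async demand source against a timer. A sketch of both, with illustrative names and field shapes:

```javascript
// Sketch: keep only bids that arrived within the publisher-configured
// timeout window. Bid objects and field names are illustrative.
function bidsWithinDeadline(bids, auctionStartMs, timeoutMs) {
  return bids.filter(bid => bid.arrivedAtMs - auctionStartMs <= timeoutMs);
}

// Sketch: the same idea for a single async demand source. Resolves with the
// bid if it beats the deadline, otherwise with null so mediation moves on.
function withDeadline(bidPromise, timeoutMs) {
  const timeout = new Promise(resolve =>
    setTimeout(() => resolve(null), timeoutMs)
  );
  return Promise.race([bidPromise, timeout]);
}
```

A publisher tuning timeoutMs downward trades a little fill rate for a guaranteed cap on how long any demand source can delay the play button.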

We are in closed testing for this approach to reduce video latency with an estimated general availability in the second quarter. The plan is to make the solution available for use with both open source and commercial video players. Our initial results have shown that in certain demand scenarios, we can reduce video latency by as much as 100x. That means reducing video latency from 5 seconds down to 50 milliseconds or less. That’s the kind of improvement we need for a revolution in video latency.

Next Steps to The Better Video Internet

Move latency away from the play button and into the cloud.

Video ad caches retrieve video fast.

Develop new IAB standards for client side auctions.

Rewrite VPAID mediation creatives so auctions run on VPAID InitAd and auctions complete rapidly.

Shift demand from VPAID Flash to VAST + video or VPAID JavaScript.

Industry Action

Source: IAB

A number of the issues are bigger than any one company can tackle, so it’s great to see standards body efforts like the IAB’s LEAN Ads (Light, Encrypted, Ad choice supported, Non-invasive) initiative, which targets improving the user experience of online advertising. The VAST 4.0 standard holds promise to reduce latency on a number of fronts (server side stitching, conditional ads, separation of viewability), but there are questions about how far it goes on this topic. It’s worth noting that open source industry projects to increase monetization and reduce display advertising latency (i.e., prebid.js) are producing positive results for publishers. For mobile there are new open source projects (i.e., Accelerated Mobile Pages) which are worth watching. There is not yet an open source project focused on fixing video latency; this could be an opportunity for the industry to work together.

In addition to the recommendations outlined here for how the industry can reduce latency for video monetization, AppNexus is coming to market in Q2 with a solution that we believe will start to help reduce video latency. We encourage other companies to challenge themselves to do the same. As mentioned, the problem is larger than any solution a single company can provide, so we look forward to collaborating with the video advertising industry to get to a better place for instant video.

Beyond the video advertising market, there are additional ways to reduce video latency more generally; these recommendations apply to makers of video players, CDNs, personal computers, and mobile devices:

Optimize video players for near-instant loading (< 250 ms).

Move home and office connections to Gigabit Internet speeds via wireless or fiber.

Minimize network traffic and API calls for advertising and content logic on publisher pages.

Use load balancing and fail-over for CDNs to ensure a fast network path to the video end user.

Push on Moore’s law to speed video across CPUs, GPUs, network bandwidth, and memory.

Explore use of high quality, low-bandwidth video codecs (Alliance for Open Media, H.265).

Measure, benchmark, and report on video latency in a way that is transparent and open in the industry.

For me, working on digital video has been a grand adventure ever since that first magical moment of seeing moving video images come to life on a personal computer. Let’s work on getting the magic back into the video experience on the open Internet, towards the goal of a fast, fluid, TV-like experience for independent publishers and users alike.