Tame macroblocking

IMAX
IMAX Technology Blog
7 min read · Mar 8, 2019

Saturday night. Leafs versus Habs. Thousands of pulses spike from coast to coast as Auston Matthews takes a pass in the slot, turns, leans into a wrist shot destined for the top shelf of the Montreal net and … suddenly TV screens nationwide dissolve into a frozen, blocky mess. The picture is lost. The live moment, gone forever.

If you’re the director of that hockey broadcast, the imaginary scenario painted above is your worst nightmare come true. It means you’ve been hit by an episode of something called “macroblocking” — a glitch that is notoriously difficult to detect and diagnose, that corrupts a video in transit and causes viewers everywhere to reach for slippers to throw at their televisions.

“If you’re the content owner, broadcast-rights holder, or the operator delivering the broadcast, and something like this happens for a prolonged amount of time, it’s huge reputational and financial exposure,” says Dr. Abdul Rehman, Chief Product Officer for IMAX.

IMAX’s Streaming and Consumer Technology division builds technology designed to safeguard a video delivery chain and monitor its quality. Its business is based on helping video companies get their images to their viewers and ensuring those viewers are happy with what they see.

“If you’re hit by macroblocking, IMAX’s products can help you quickly detect and troubleshoot the problem and quickly put you in a position to fix it,” Rehman says.

Macroblocking can result from many different causes. They can be technological, triggered by a damaged cable or faulty equipment in a transmission truck. They can result from encoding or transcoding equipment applying overly aggressive video compression, or from memory filling up, CPU utilization spiking, or hard-disk space running out. They can result from noise introduced on transmission lines, which interferes with the signal and disrupts its decoding downstream. They can result from network routing and switching equipment dropping packets due to network saturation or errors. They can be man-made, caused by someone tripping on a loose wire in the control room or transmission facility. They can even be environmental, caused by torrential rain or lightning, a sun transit, or too much snow settling on a satellite dish. The list is near-endless.

“The video workflow is vast, it’s complicated, it’s long and it is composed of many stages,” says Rehman. “It might help to think of it as an automotive plant — many different stages, and each stage has a different purpose and different processes.

“In terms of video impacts, I like to think of video itself being like water flowing through pipes. If something happens upstream, then that propagates further downstream. It flows downstream all the way to the consumer’s devices.

“And the higher up in the stream a problem takes place, the more people it affects.”

Troubleshooting an episode of macroblocking is made all the more difficult by the nature of the video delivery chain, which exists as a series of sequential stages. A broadcaster may depend on other stakeholders to move a video through a given stage of the chain and won’t have control over how that stakeholder monitors for quality — how many quality assurance monitoring points they deploy, for example, or where they are deployed in the workflow, or the kinds of monitoring instrumentation in use.

“Detection depends on where your monitoring points are set up and what type of measurements you’re taking at those monitoring points,” says Rehman. “If you’re not monitoring the right thing, at the right place, at the right time, you won’t catch the problem.”

IMAX works to simplify and standardize the challenge of video service workflow assurance. To that end, IMAX enables organizations to deploy end-to-end workflow monitoring solutions. Probes are deployed at specific junctures of the delivery chain, with a quality score available at each stage. If the score reads 95 (out of 100) at the first monitoring point but drops to, say, 75 at the second, it’s clear a problem is emerging between those two points. That allows our hockey broadcast director in the scenario painted above to quickly home in on the trouble and put a fix in place.

“You need to be able to say very quickly: ‘Here is my delineation point,’” says Rehman. “‘I know for a fact I’m picking up the issue here, which means [the problem is] happening upstream from that point.’

“Right away it allows the stakeholder to focus its investigative energy and resources in a certain spot of the workflow. The value is even more evident during high-profile events like the World Cup or Super Bowl. A macroblocking episode can have global exposure, and can result in massive advertising revenue loss or subscriber churn.”
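
To make that delineation idea concrete, here is a minimal sketch in Python. The scores, point names and threshold are invented for illustration; this is not IMAX’s API, just the basic logic of spotting where a drop appears between sequential monitoring points.

```python
# A minimal, hypothetical sketch: given quality scores from sequential
# monitoring points, report the first stage boundary where the score
# drops by more than a chosen threshold. Not IMAX's actual API.

def find_delineation_point(scores, threshold=10.0):
    """scores: dict of monitoring-point name -> 0-100 quality score, in workflow order."""
    points = list(scores.items())
    for (prev_name, prev_score), (name, score) in zip(points, points[1:]):
        if prev_score - score > threshold:
            # The problem is being introduced somewhere between these two probes.
            return f"between '{prev_name}' and '{name}'"
    return None  # no significant degradation detected

# Example: the contribution feed looks fine, but quality collapses after encoding.
scores = {"contribution": 95, "post-encode": 75, "device-playout": 74}
print(find_delineation_point(scores))  # -> between 'contribution' and 'post-encode'
```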

IMAX’s macroblocking-busting technology, he says, offers several advantages over competing solutions.

For starters, it’s the only solution whose algorithms monitor quality from the standpoint of the viewer. In other words, if one of its monitoring probes registers a quality score of 95, a viewer watching the video at that point would be quite pleased with it. If the score were 75, the viewer would be less happy.

“The value proposition of IMAX streaming products,” says Rehman, “is that we have an algorithm that is the best objective prediction of subjective human perception of video. By far.”

The second advantage IMAX offers is that it can deploy multiple monitoring points throughout the workflow in a content-correlated fashion. This enables IMAX to precisely describe the video asset’s “journey” and the video-quality impacts the asset “suffers” at each stage of the workflow, as well as the resulting viewer experience.

This is in contrast to today’s common approach of deploying one or more monitoring points without the ability to correlate frame-by-frame video quality scores across sequential workflow stages.

With IMAX’s solution, monitoring results are correlated down to the pixel and video-frame level across the workflow, allowing the IMAX platform to tell a precise end-to-end quality story as the video asset makes its way through the workflow.

For example, the platform can tell you the exact impact an encoder had on every single frame of the video, and then, in similar fashion, the platform can tell you the impact of downstream stages prior to playout on a device.

This takes place in real time, in parallel, across all video assets being delivered by the organization in question.
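
As a rough illustration of what frame-correlated scoring makes possible, the sketch below aligns made-up per-frame scores from successive stages and computes the quality cost of each stage, frame by frame. The data structures, stage names and numbers are assumptions, not IMAX’s platform.

```python
# A hypothetical sketch of frame-correlated scoring: per-frame quality scores
# from successive workflow stages, keyed by a shared frame index, are aligned
# so the quality cost of each stage can be computed frame by frame.

def per_stage_impact(stage_scores):
    """stage_scores: dict of stage name -> {frame_index: 0-100 score}, in workflow order."""
    stages = list(stage_scores.items())
    impact = {}
    for (prev_stage, prev_frames), (stage, frames) in zip(stages, stages[1:]):
        common = sorted(set(prev_frames) & set(frames))  # frame-level alignment
        impact[f"{prev_stage} -> {stage}"] = {
            i: round(prev_frames[i] - frames[i], 2) for i in common
        }
    return impact

scores = {
    "source":  {0: 98.0, 1: 97.5, 2: 98.2},
    "encoder": {0: 92.1, 1: 80.3, 2: 91.7},  # frame 1 takes a visible hit here
    "playout": {0: 91.9, 1: 79.8, 2: 91.5},
}
print(per_stage_impact(scores))
# -> {'source -> encoder': {0: 5.9, 1: 17.2, 2: 6.5}, 'encoder -> playout': {0: 0.2, 1: 0.5, 2: 0.2}}
```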

Ideally, a customer would choose what’s called a “three-point monitoring” system. The first monitoring point measures video quality at the outset, as a video begins its journey to the viewer, in order to ensure integrity and avoid what Rehman calls “garbage in, garbage out.”

“If you’re sending a bad picture quality out at the beginning of the journey, you’re going to have bad quality at the end,” says Rehman.

The second monitoring point checks on a video’s health at a mid-stage, perhaps after encoding, or compression. And the third monitoring point is deployed at the output of a viewer’s device, ensuring quality integrity from start to finish.

And the platform is flexible, meaning the customer can choose two, three, four monitoring points. Or more.

“It all depends on the customer, their workflow and their requirements,” says Rehman.
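
Purely as an illustration, a three-point layout could be described along these lines; the point names and stated purposes below are assumptions rather than a prescribed configuration.

```python
# An illustrative three-point monitoring layout (names and purposes are
# invented for this sketch; real deployments vary with the customer's workflow).

MONITORING_POINTS = [
    {"name": "contribution-ingest", "purpose": "catch 'garbage in' before it propagates"},
    {"name": "post-encode",         "purpose": "measure what compression cost the picture"},
    {"name": "device-playout",      "purpose": "confirm what the viewer actually sees"},
]

# The layout stays flexible: add or remove points to match the workflow.
for i, point in enumerate(MONITORING_POINTS, start=1):
    print(f"Point {i}: {point['name']} - {point['purpose']}")
```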

Yet another IMAX advantage is its ability to monitor both the video content and the underlying system and network layers that carry the content, or what Rehman likes to call “content and its envelope.”

Macroblocking, he says, can result from problems with either, or both.

“Some vendors will only monitor the envelope and not the content,” says Rehman.

Finally, IMAX offers what Rehman calls “apples to apples comparison.”

In other words, its tools can investigate the quality of a group of picture frames from the beginning of a transmission and check on the health of precisely that same set of video frames later on, irrespective of how the video content gets processed and delivered along the way.

“So we’re talking content-level, frame level, pixel-level correlation of scores,” says Rehman.

“This unlocks a helluva lot of value. You can actually speak precisely to the impact of the encoder, transcoder, packager, content delivery network or device playout on your content. If you don’t have frame-level pixel-level correlation, you cannot precisely talk about the impact of the various workflow stages on your content.”

That ability to do an apples-to-apples comparison opens up valuable use cases. Rehman described a scenario where a legacy content provider — a cable or satellite TV network, for example — decides to make the changes necessary to compete with the likes of Netflix or Amazon using an OTT, or over-the-top (internet), connection. IMAX products would allow that customer to run both streams with the same content and compare the quality of each.

“There are no solutions in the market that allow you to compare legacy to OTT in such a precise way,” he says. “I can make a direct comparison and say one is performing, say, seven points better. I can delineate and diagnose to the source.”
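
Here is a hypothetical sketch of that kind of side-by-side comparison, using invented per-frame scores for the same content on both paths; it captures the idea rather than IMAX’s tooling.

```python
# A hypothetical legacy-versus-OTT comparison: the same frames of the same
# content are scored on both delivery paths and the average gap is reported.

def compare_paths(legacy_scores, ott_scores):
    """Both arguments map frame_index -> 0-100 quality score for identical content."""
    common = sorted(set(legacy_scores) & set(ott_scores))
    diffs = [ott_scores[i] - legacy_scores[i] for i in common]
    avg = sum(diffs) / len(diffs)
    winner = "OTT" if avg >= 0 else "legacy"
    return f"{winner} path scores {abs(avg):.1f} points higher on average across {len(common)} frames"

legacy = {0: 88.0, 1: 87.5, 2: 88.2, 3: 86.9}
ott    = {0: 94.2, 1: 93.8, 2: 95.0, 3: 94.1}
print(compare_paths(legacy, ott))
# -> OTT path scores 6.6 points higher on average across 4 frames
```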

At the end of the day, no content provider wants to become a victim of macroblocking. The stakes are too high. Viewers want what they want when they want it. So do advertisers and shareholders.

“[Macroblocking] completely ruins the viewing experience, ruins reputations, damages advertising revenue,” says Rehman.

The best protection?

“IMAX. We’re the only ones who can say we are doing monitoring pixel-by-pixel, frame-by-frame, in real-time, at scale, in an end-to-end asset- and frame-correlated manner across multiple monitoring points, and only we can say how and where the workflow impacted the asset during its journey, and most importantly, how the end-viewer is actually perceiving those pixels.”
