The Measurement Gap

IMAX Technology
IMAX Technology Blog
4 min read · Aug 13, 2018

Quality of Delivery.

Quality of Service.

Quality of Experience.

These are the terms that the video delivery industry typically uses to talk about video quality.

Each one has a specific, yet related, meaning. Each one speaks to a different piece of the quality puzzle. Each one is based on particular measurements taken at particular stages of the journey that a video takes from its inception to its delivery to the viewer. They’re all relevant.

The problem?

“There’s a gap between what they measure and the final user experience,” says Dr. Abdul Rehman, IMAX Chief Product Officer.

That gap means that these traditional quality measurements, while relevant, don’t tell the whole story. They don’t represent the entire picture — literally the picture, in this case. Rather, they highlight isolated elements of the story.

The upshot? They don’t accurately predict whether the viewer’s experience of watching a video will be a good one or not.

“That’s why I call them pseudo quality of experience measures,” says Dr. Rehman.

A true quality of experience measurement, says Dr. Rehman, takes into consideration all these terms, all these measurements, and does so, critically, from the standpoint of the viewer’s experience.

“To put it in simple terms, it’s about experiencing the pixels, on the right device, with exactly the right resolution, the right frame rate and the right dynamic range that the user is experiencing.

“In other words, if you really want to create a quality metric, that metric has to take all these things into consideration.”
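What “all these things” means in practice is that a quality metric’s inputs must include the viewing context, not just the bitstream. Below is a minimal sketch in Python of those inputs; the field names are illustrative, not any real metric’s API.

```python
# Sketch of the inputs a true QoE metric would need, per the quote above.
# Field names are illustrative, not any real metric's API.

from dataclasses import dataclass

@dataclass
class ViewingContext:
    device: str            # e.g. "65-inch OLED TV" vs. "5-inch phone"
    resolution: tuple      # what is actually rendered, e.g. (3840, 2160)
    frame_rate: float      # frames per second as displayed
    dynamic_range: str     # "SDR", "HDR10", ...

# The same bitstream can land in very different contexts, and a true QoE
# score must differ accordingly:
tv = ViewingContext("65-inch OLED TV", (3840, 2160), 60.0, "HDR10")
phone = ViewingContext("5-inch phone", (1280, 720), 30.0, "SDR")
print(tv, phone, sep="\n")
```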

In the early days of video delivery, says Dr. Rehman, when industry people talked about quality assurance, they focused on the network that delivered the video: quality of service.

A network’s ability to deliver packets of information in a timely, reliable fashion certainly has a bearing on the final video quality.

Comparing networks, however, isn’t easy because networks aren’t necessarily the same. Everyone’s connection to the internet is different. Some have good service to their home or office, some poor. Some pay for better speed and bandwidth, others don’t.

As a result, coming up with a useful, apples-to-apples metric that measures network performance isn’t easy, or particularly relevant: It’s entirely possible for a strong, well-performing network to deliver a video of poor quality due to the myriad other factors influencing viewer experience.

There are other problems with measuring network performance. The internet itself is, to use Dr. Rehman’s words, “a complicated monster”: information is never routed the same way twice, which makes it difficult to predict a viewing experience from network behaviour alone. It is possible, for instance, to measure the number of packets that get lost, or the frequency with which they arrive out of order, but those metrics aren’t necessarily relevant to the problem of predicting the video quality a user will see. In fact, it’s possible for all the relevant bits of information to arrive on time and in order, and for the final video product to be far from ideal.
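The gap is easy to see in a toy example. The sketch below assumes each packet carries a monotonically increasing sequence number, as RTP packets do; the function and field names are illustrative, not any real product’s API. Both numbers it produces can be flawless while the decoded picture still looks bad.

```python
# Minimal sketch of the two network-level stats mentioned above: packet
# loss rate and out-of-order rate. Assumes each packet carries a
# monotonically increasing sequence number (as RTP packets do).

def qos_stats(received_seq: list[int], total_sent: int) -> dict:
    """Network-level QoS stats for one stream."""
    loss_rate = 1.0 - len(set(received_seq)) / total_sent

    # A packet is "out of order" if it arrives after a packet with a
    # higher sequence number has already been seen.
    highest_seen = -1
    reordered = 0
    for seq in received_seq:
        if seq < highest_seen:
            reordered += 1
        else:
            highest_seen = seq
    reorder_rate = reordered / len(received_seq) if received_seq else 0.0

    return {"loss_rate": loss_rate, "reorder_rate": reorder_rate}

# 3 of 10 packets lost, one arrived out of order:
print(qos_stats([0, 1, 2, 4, 3, 5, 7], total_sent=10))
# {'loss_rate': 0.30..., 'reorder_rate': 0.14...}
```

Both figures can come back at zero, in other words, and the video that reaches the screen can still be far from ideal.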

“[Quality of service] never really touches the user experience,” says Dr. Rehman.

Similar problems exist with respect to measuring quality of delivery.

Quality of delivery refers to the processes a video is subjected to as it travels from its source to its viewer. Typically, an industry player will measure losses, standards-compliance issues or delays at each of the five layers of the transport process.

“You can deploy checks at every one of these layers and provide measurements at every layer,” says Dr. Rehman.

“The problem is, yes, you can do that, but you’re not telling the story from a viewer experience perspective.”

It’s entirely possible, he says, that the information flagged by measurement tools can vary “drastically” in importance, ranging from making a video unviewable to being “something nobody would notice at the end of the day.”
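A toy sketch makes the problem concrete. The layer names and flagged issues below are illustrative stand-ins for real probe output; the point is that per-layer checks record what the pipeline did, not what the viewer saw.

```python
# Toy sketch of per-layer quality-of-delivery checks. Layer names and
# issues are illustrative stand-ins for real probe output.

from dataclasses import dataclass

@dataclass
class Flag:
    layer: str   # where in the transport stack the check fired
    issue: str   # what the probe observed

# Each layer dutifully reports everything it catches...
flags = [
    Flag("transport", "3 TS packets lost"),
    Flag("container", "PTS discontinuity"),
    Flag("codec", "non-compliant SEI message"),
]

# ...but none of these records say whether the result is a frozen frame,
# a one-pixel glitch, or nothing visible at all. The viewer-impact
# column simply does not exist at this level.
for f in flags:
    print(f"[{f.layer}] {f.issue} -> viewer impact: unknown")
```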

“The goal is that we understand the impact of any network impairments, whether those are losses, delays or syntax-level irregularities, on what’s actually presented to a viewer.”

Hence the importance, when talking about delivery problems, of starting from the point of view of the viewer’s experience and working backwards to find out how each layer affects that experience.
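As a rough illustration of that inversion, consider the toy sketch below. The per-second scores, the visibility threshold and the flag format are all invented for the example; the scores stand in for a real viewer-side perceptual metric.

```python
# Toy sketch of working backwards from the viewer. The per-second scores,
# the visibility threshold and the flag format are all invented here;
# the scores stand in for a real viewer-side perceptual metric.

VISIBLE_DROP = 60  # illustrative threshold on a 0-100 perceptual scale

# What a viewer-side metric might report, per second of playback:
scores = {0: 92, 1: 91, 2: 55, 3: 90}  # second 2 looks visibly bad

# Delivery-layer flags, each stamped with the second it affected:
flags = [
    ("transport", "packet loss", 2),
    ("codec", "non-compliant SEI message", 3),
]

# Keep only the impairments that line up with something the viewer saw.
actionable = [f for f in flags if scores.get(f[2], 100) < VISIBLE_DROP]
print(actionable)
# [('transport', 'packet loss', 2)] -- the codec issue was invisible
```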

“And this is not what the industry does at this point,” says Dr. Rehman.

IMAX’s streaming technology products, built on top of IMAX VisionScience™, on the other hand, do. They reliably “see” what a viewer would see, simultaneously delivering metrics on quality of service, quality of delivery and quality of experience, with the ability to drill down into specific data and potential problems.

“VisionScience is the only thing that covers all the formal specifications for quality of experience assessment,” says Dr. Rehman. “Not only that, IMAX is doing this at large scale, with millions of data points, and everything in real time, out of the lab, in operations.”

This, then, is what Dr. Rehman means by a “true quality of experience measure” versus a “pseudo quality of experience measure.”

“We have to let the metric experience the same thing as a human,” says Dr. Rehman.

“Because the experience of the final viewer is what matters.”
