TV’s Interface Solution: GPUs

Stuart Russell
You.i TV
May 12, 2016

All too often, TV interfaces can feel like they’re from another era.

We’re so used to fluid, immersive experiences on our phones and tablets that the difference between those and a set top box interface can be jarring.

In most cases, this boils down to the graphics processing unit (GPU) — or, more accurately, whether or not the interface was built to take full advantage of it.

When it comes to TV, the GPU is often overlooked, because it’s seen as a video-game-specific piece of hardware. If you look at the history of GPUs, it makes sense: they were built specifically for the needs of video games.

It all began with PC gaming.

Early games relied solely on the CPU to run 3D calculations and bring the action to life on the screen. As games became more complex and immersive, this stopped working: games lagged and stuttered, until a new hardware solution came along that could power incredibly performant, incredibly smooth video game experiences: the graphics card.

It freed the CPU to handle other tasks, and was built to render video game visuals: complex animations that relied on 3D motion and immersed the user in a different world.

For users, this meant a better experience at a lower cost. You could buy a $2,000 computer with a CPU powerful enough to render all of the graphics smoothly, or you could buy a $500 computer and augment it with a graphics card. Both options could power the same experience.

The video game world was turned on its head, but it wasn’t until 2008 that anyone made a successful case for leveraging the GPU for anything else.

Enter, the iPhone.

The iPhone leveraged hardware that was already on board — the GPU — to power a new kind of interface that featured motion graphics and smooth, animated transitions.

If Apple had tied interface performance to the CPU, as was the widely accepted standard at the time, they would have run into problems quickly. The CPU onboard the iPhone would have needed to be much bigger, much more expensive, or, in all likelihood, both, if it had to handle the motion design of the interface and all other processing requirements at the same time.

So they used the GPU to handle all of the motion-based interface elements that have become so prevalent since. It makes sense: motion and graphics are what the GPU does best, and this approach frees the CPU to handle all of the data and memory requirements.

Apple was the first, and they won’t be the last, but there has been some real resistance to adopting this approach to interface development.

Specifically, in TV.

Set top boxes are a classic example. Most set top boxes (STBs) have a GPU onboard, comparable to the hardware you’d find on a 10-inch tablet from a major manufacturer. So why do their interfaces look so different? HTML5.

Interfaces built on a stock HTML5/WebKit rendering engine usually don’t take advantage of the GPU to accelerate interface performance. That keeps most HTML5-based interfaces tied to the CPU, and it means that compromises need to be made between interface performance and the other tasks the CPU has to handle.
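This distinction shows up even within HTML5 itself: which CSS properties an interface animates determines whether the browser can hand the work to the GPU. As a minimal sketch (the class names here are illustrative, not from any real set top box interface), animating `transform` lets the browser composite a cached texture on the GPU, while animating a layout property like `left` forces per-frame layout and repaint work on the CPU:

```css
/* CPU-bound: animating `left` triggers layout and repaint
   on the main thread for every frame of the transition. */
.menu-cpu {
  position: absolute;
  left: -100%;
  transition: left 300ms ease-out;
}
.menu-cpu.open { left: 0; }

/* GPU-friendly: animating `transform` (and `opacity`) can be
   offloaded to the compositor, so the GPU moves a cached layer
   instead of re-laying-out the page. */
.menu-gpu {
  transform: translateX(-100%);
  transition: transform 300ms ease-out;
  will-change: transform;
}
.menu-gpu.open { transform: translateX(0); }
```

Both rules produce the same visual slide-in, but only the second gives the GPU a chance to do the heavy lifting, which is exactly the kind of trick a set top box interface tied to a default rendering path leaves on the table.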

Let’s take the consumer perspective on this.

TV is trending towards an ever-higher resolution future. 4K screens are gaining widespread adoption, and content is being delivered in 4K from major cable and OTT providers alike.

A consumer with a 4K TV in the home is getting an amazing visual experience when they’re watching content, but as soon as they pull up a guide or navigate an interface, they’re pulled right out of that great experience. The contrast between an interface powered through the CPU and a 4K video is jarring. But it doesn’t have to be.

If you did a hardware comparison between a set top box and a tablet, you’d find only minimal differences between them, beyond the hardware in the set top box that handles content delivery. They’re working with largely the same level of CPU, and they both have a GPU on board. This is not a hardware problem — the set top box is a powerful machine.

It’s a software problem.

If interfaces were written to take full advantage of the hardware that’s already on board a set top box, the possibilities are as varied as they are on any device.

Here’s just a quick look at what could be possible — yes, on a set top box.

(Embedded video: a taste of the interfaces that are possible when you unlock the GPU, including a Breaking Bad browsing concept.)

Consumers today expect interfaces to be a seamless part of their experience, and that holds true of set top boxes as well. All that’s required to deliver on those expectations is a GPU, and a software approach that maximizes the potential of the GPU to help power that interface.

And there’s already a GPU on board.

So what are we waiting for?

Stuart Russell will be speaking on this topic, and on how TV and media companies can unlock the GPU to power stunning interfaces, on stage at Imagine Park as part of INTX this week. Catch his presentation at 11:15AM on Wednesday at the Imagine Park stage.

If this is something you enjoyed reading, please hit the Recommend button so others can join in on the fun.

Twitter: You.i TV.
