Pioneering application design on TVs & TV-connected devices
Understanding the motivations for our most recent platform advancement requires some context. In the past we explored many different approaches. In 2009 we shipped a Flash Lite-based app; soon after, in 2010, we shifted to a WebKit-based app built predominantly on the QtWebKit port. Each app eventually reached a critical point where our innovation goals required us to adapt our technology stack.
Evolution of an application platform
We released our first QtWebKit app in 2010. Over the next three years our engineers shared our innovations and approaches with the WebKit community. Our platform engineers contributed our accelerated compositing implementation, while our user interface engineers shared best practices and identified rendering optimizations deep in WebCore internals. In addition, we continue to drive standardization efforts for the HTML5 Premium Video Extensions and have adopted them on desktop.
Running our core app on TVs and TV-connected devices subjects WebKit to unique use cases. We deliver a long-lived, single-page, image- and video-heavy user interface on hardware spanning a wide range of CPU speeds and addressable RAM, with varied rendering and network pipelines. The gamut of devices is considerable, with significant variations in functionality and performance.
Our technology stack innovation
We strive to provide customers with rich, innovative content discovery and playback experiences. All devices running the new Netflix TV app now run our own custom JS + native graphics framework. This framework enables us to reach our customer experience goals on the broadest set of TVs and TV-connected devices. We own the feature roadmap and tooling, and can innovate with minimal constraints. Our framework is optimized for fast 2D rendering of images, text, and color fills. We render from a tree of graphics objects; this retained-mode approach offers ease of use over immediate-mode rendering contexts such as the HTML canvas. Display property changes in these objects are aggregated, then applied en masse after user interaction.
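The batching idea above can be sketched as a tiny retained-mode tree in which property writes are queued and later applied all at once. This is an illustrative sketch only; the names (`GraphicsObject`, `set`, `flush`) are hypothetical and not Netflix's actual API.

```typescript
// Minimal retained-mode graphics tree with deferred property application.
// Property changes accumulate in `pending` and are applied en masse by flush().

type Props = { x?: number; y?: number; opacity?: number };

class GraphicsObject {
  props: Props = { x: 0, y: 0, opacity: 1 };
  private pending: Props = {};
  children: GraphicsObject[] = [];

  set(changes: Props): void {
    // Aggregate changes; nothing is rendered yet.
    Object.assign(this.pending, changes);
  }

  flush(): void {
    // Apply all queued changes at once, then recurse into children.
    Object.assign(this.props, this.pending);
    this.pending = {};
    for (const child of this.children) child.flush();
  }
}

// After handling a user interaction, one flush applies every queued
// change before the next render pass.
const root = new GraphicsObject();
const poster = new GraphicsObject();
root.children.push(poster);

poster.set({ x: 120 });
poster.set({ opacity: 0.5 });
root.flush();
console.log(poster.props); // { x: 120, y: 0, opacity: 0.5 }
```

Deferring writes this way means a burst of interaction-driven changes costs a single tree traversal rather than a render per property set.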
A bespoke rendering pipeline enables granular control over surfaces, the bitmap data representation of one or more graphics objects. Our surfaces are similar to the accelerated compositing surfaces used by modern browsers. Intelligent surface allocation reduces surface (re)creation costs and the resulting memory fragmentation over time. Additionally, we have fine-grained control of image decode activity leading up to surface creation.
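One common way to reduce (re)creation costs is to recycle surfaces of matching dimensions from a pool instead of allocating fresh ones. The sketch below assumes a hypothetical `Surface` type wrapping a fixed-size bitmap; real implementations sit on native graphics APIs and this is not Netflix's actual allocator.

```typescript
// Illustrative surface pool: released surfaces are bucketed by size and
// reused on the next acquire, avoiding repeated allocation and the
// memory fragmentation it causes over a long-lived session.

class Surface {
  constructor(public width: number, public height: number) {}
}

class SurfacePool {
  private free = new Map<string, Surface[]>();

  acquire(width: number, height: number): Surface {
    const key = `${width}x${height}`;
    const bucket = this.free.get(key);
    if (bucket && bucket.length > 0) {
      return bucket.pop()!; // reuse an existing surface of the same size
    }
    return new Surface(width, height); // fall back to a fresh allocation
  }

  release(surface: Surface): void {
    const key = `${surface.width}x${surface.height}`;
    if (!this.free.has(key)) this.free.set(key, []);
    this.free.get(key)!.push(surface);
  }
}

const pool = new SurfacePool();
const a = pool.acquire(256, 256);
pool.release(a);
const b = pool.acquire(256, 256);
console.log(a === b); // true: the surface was reused, not reallocated
```

On memory-constrained devices a pool like this would also need an eviction policy, but the reuse path is the part that matters for fragmentation.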
We also rely on WebKit's Web Inspector for debugging. Our framework integrates directly with a standalone Node server (and ultimately the Web Inspector) using the public remote debugging protocol. The Elements tab displays the tree of graphics objects, while the Sources, Network and Timeline tabs work mostly as you'd expect. Familiar tools help while we debug the app running on a reference framework implementation or on development devices.
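For context, the public remote debugging protocol exchanges JSON messages: numbered requests carrying a method name and parameters, with matching responses and asynchronous events. The shapes below mirror that convention; the specific payloads are illustrative examples, not a transcript of our integration.

```typescript
// Sketch of the JSON message shape used by WebKit's remote debugging
// protocol. A front end (like the Web Inspector) sends numbered requests
// over a WebSocket; the backend replies with responses carrying the same id.

interface ProtocolRequest {
  id: number;                        // correlates a request with its response
  method: string;                    // domain-qualified, e.g. "Debugger.enable"
  params?: Record<string, unknown>;  // optional method arguments
}

const enable: ProtocolRequest = { id: 1, method: "Debugger.enable" };
const evaluate: ProtocolRequest = {
  id: 2,
  method: "Runtime.evaluate",
  params: { expression: "1 + 1" },
};

// Each message travels as a JSON string on the wire.
const wire = JSON.stringify(evaluate);
console.log(JSON.parse(wire).method); // "Runtime.evaluate"
```

Because the protocol is plain JSON, a non-browser runtime can speak it to stock inspector tooling, which is what makes the Elements-tab view of a custom graphics tree possible.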
An A/B test of our app written in this new framework performed better than our existing app. Our future is ours to define and we’re not done having fun.
Join our team
We’re working on exciting new features, constantly improving our platform, and we’re looking for help. Our growing team is looking for experts to join us. If you’d like to apply, take a look here.
Originally published at techblog.netflix.com on December 16, 2013.