How It’s Built — NETFLIX

Streaming Netflix

These days, getting to your favorite movie is as simple as reaching for your device (phone or laptop), opening the Netflix app, and hitting that Play button. Pretty simple.

What’s not so simple is everything Netflix has to do behind the scenes to give you that viewing experience.

So this is, in short, a brief overview of how Netflix was built.

The Architecture

Microservices, Microservices, Microservices

Microservices are an approach to designing software systems in which the system is made up of small, independent services that each serve a specific purpose.

Around ten years ago, Netflix rewrote the applications that run its entire service to fit a microservices architecture, meaning each application owns its own resources and shares them with no other app. When two applications do need to talk to each other, they use an application programming interface (API).

During a conference talk in 2016, Ruslan Meshenberg, Director of Platform Engineering at Netflix, estimated that the service runs on over 500 microservices, each controlling one of the many parts that make up the whole: one microservice analyzes your watch history and uses algorithms to generate a list of movies you will like; another stores all the shows you have watched; one deducts the monthly fee from your credit card; one provides your device with the correct video files it can play; and one supplies the names and images of those movies for the list on the main menu. And that’s just scratching the surface.
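To make the idea concrete, here is a minimal sketch of microservice isolation. The service names, data, and "recommendation" rule are all hypothetical and stand in for Netflix's real (and far more complex) services; the point is only that each service owns its own data and other services reach it exclusively through its API.

```python
# Hypothetical sketch of microservice isolation: each service owns its own
# data and exposes it only through a small API, never sharing storage.

class ViewingHistoryService:
    """Owns the watch-history data; no other service touches it directly."""
    def __init__(self):
        self._history = {}  # user_id -> list of titles (private to this service)

    def record_watch(self, user_id, title):
        self._history.setdefault(user_id, []).append(title)

    def get_history(self, user_id):
        # The API boundary: other services call this instead of reading storage.
        return list(self._history.get(user_id, []))


class RecommendationService:
    """Builds suggestions by calling the history service's API."""
    def __init__(self, history_api):
        self._history_api = history_api

    def recommend(self, user_id):
        watched = self._history_api.get_history(user_id)
        # Toy rule standing in for the real algorithms: suggest "sequels".
        return [f"{title} 2" for title in watched]


history = ViewingHistoryService()
history.record_watch("alice", "Stranger Things")
print(RecommendationService(history).recommend("alice"))  # ['Stranger Things 2']
```

In a real deployment these would be separate processes calling each other over HTTP, but the boundary is the same: the recommendation service never reads the history service's storage, only its API.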

Adopting microservices lets Netflix scale its services in days rather than months, something that would be hard to pull off with a monolithic architecture.

Content Delivery Network — CDN

[Image: a buffering screen]

Let’s face it, we all hate that buffering screen: it kills the thrill and excitement of watching a movie. Here is how Netflix solved the issue.

To solve this buffering issue, Netflix built a CDN system into its service. Say, for instance, you log in from Kampala and open the Netflix website: instead of connecting you to the main Netflix server, it connects you to the CDN server closest to you in Kampala.

That CDN server already holds a copy of the content from the main Netflix server, which it serves to users in Kampala. This greatly reduces latency (the time between a request and a response), so everything loads fast. It is also part of why the video suggestions you see on Netflix differ based on the user's location.

CDNs are the reason why websites with a huge number of users like Google, Facebook, or YouTube manage to load really fast irrespective of where you are or what the Internet speed is like.

Netflix used Limelight, Level 3, and Akamai for a while. There are also CloudFront from Amazon and Cloudflare, to name a few more. But Netflix wanted the best possible streaming at the lowest possible cost. That’s why they created their own CDN: Open Connect.

Instead of relying on AWS servers, they install their very own around the world. Netflix strikes deals with internet service providers and provides them with their Open Connect box at no cost. ISPs install these along with their servers. These Open Connect boxes download the Netflix library for their region from the main servers in the US.

Think of it as hard drives around the world storing videos, and the closer they are, the faster you can get to them and load up the video.
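The "closer is faster" idea can be sketched in a few lines. This is a hypothetical stand-in for the routing logic, not Netflix's actual implementation: the node names and coordinates are made up, and real CDN routing also weighs server load, peering agreements, and network conditions, not just distance.

```python
import math

# Hypothetical Open Connect nodes: name -> (latitude, longitude).
NODES = {
    "kampala":   (0.35, 32.58),
    "frankfurt": (50.11, 8.68),
    "virginia":  (38.95, -77.45),
}

def nearest_node(user_lat, user_lon):
    """Pick the node with the smallest straight-line distance to the viewer
    (a crude stand-in for real CDN request routing)."""
    def dist(name):
        lat, lon = NODES[name]
        return math.hypot(lat - user_lat, lon - user_lon)
    return min(NODES, key=dist)

# A viewer in Kampala (roughly 0.31 N, 32.58 E) is routed to the Kampala node:
print(nearest_node(0.31, 32.58))  # kampala
```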

Adaptive Bitrate Streaming (ABS)

Have you ever noticed while watching a YouTube video (for those who don’t have stable internet anyway 😂) that at a certain point the video quality drops to match your internet connection? That’s adaptive streaming.

When encountering slow network conditions, instead of buffering your video, ABS switches dynamically to a lower-bitrate (lower-quality) version of the video, something the connection speed can handle.

Adaptive bitrate streaming also takes the viewer's screen resolution into account. Say you’re watching on a smartphone: ABS does not deliver the 4K version even if there is enough bandwidth to handle such a large file, because the display does not have enough pixels to take advantage of the higher resolution.
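The selection rule described above can be sketched as follows. The rendition ladder (bitrate and resolution pairs) below is illustrative, not Netflix's real encoding ladder, and real players also smooth over bandwidth estimates rather than reacting to a single measurement.

```python
# Hypothetical rendition ladder: (bitrate in kbps, vertical resolution).
RENDITIONS = [
    (235,    240),
    (750,    480),
    (3000,   720),
    (5800,  1080),
    (16000, 2160),  # 4K
]

def pick_rendition(bandwidth_kbps, device_height):
    """Pick the highest-bitrate rendition that fits both the measured
    bandwidth and the device's screen; fall back to the lowest one."""
    candidates = [
        (rate, height) for rate, height in RENDITIONS
        if rate <= bandwidth_kbps and height <= device_height
    ]
    return max(candidates) if candidates else RENDITIONS[0]

# A phone with a 1080-pixel screen on a fast link never gets the 4K stream:
print(pick_rendition(20000, 1080))  # (5800, 1080)
# The same screen when bandwidth dips to 1 Mbps drops to a lower rendition:
print(pick_rendition(1000, 1080))   # (750, 480)
```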

Using AWS Services

Netflix initially owned its own massive network of computer servers, but realized that the breakneck pace at which it grew, and needed to keep growing, was hard to sustain while spending time building the computer systems that support its software and constantly fixing and modifying them. So they made the bold decision to stop maintaining their own servers and move everything to the cloud: Amazon Web Services (AWS).

Netflix uses the AWS S3 service to store all of its video and media data in the cloud. It then converts those media files into formats suitable for the various streaming devices via the Amazon Elastic Transcoder service.


So from microservices to CDNs, and from adaptive streaming to AWS services, we have a collection of amazing technologies that come together to produce the final user experience we get from Netflix. On their own they might not seem that important, but it is the mix of these technologies in one architecture that truly defines that amazing user experience.

So in a nutshell, every time you hit play on that Netflix movie, smile to yourself and say “I know what’s going on there 😄”.

PS: I am aware that I didn’t cover everything, like the 30-second video preview you get on the homepage, the Netflix recommender system, or the Netflix tech stack. That was intentional: I wanted you to do your own research too. As the saying goes, give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime.