Building and running our online platform for the Olympic Games in Pyeongchang

NOS Digital · Mar 20, 2018

As a leading news and sports organisation in the Netherlands, NOS was proud and happy to bring the Winter Games in Pyeongchang to the Dutch audience, as we have done for many years. This post is about the Olympic project at NOS Digital and NPO streaming: the digital concept, the execution and high-volume live streaming!

The audience, the event and the license

When planning digital products for an event like the Winter Games, we start with our customers’ needs. Research and analytics are the foundation on which we design the first concepts. Big events like the Winter Games are driven by stories, live experiences, video and convenience.

Another thing we dig into is the event itself: the dates, the schedule, the Dutch athletes, the location. The Winter Games took place in Pyeongchang, and there is an 8-hour time difference between the Netherlands and Pyeongchang. This is a vital factor to consider when thinking about how the Dutch audience enjoys the Winter Games online.

This year the media license was also a very important consideration. Compared to previous Olympic Games, where we were able to acquire a full license from the IOC and had over 15 livestreams at our disposal, our possibilities were restricted. But even though this year’s license was a sublicense, we were very happy with the possibilities it did offer.

Shaping the Winter Games in existing apps and websites

The NOS websites and mobile apps attract millions of visitors every day, and the Olympic Games attract a huge audience of their own. We therefore decided to shape the Olympic experience within our existing websites and apps, so customers wouldn’t have to bother downloading a new app. During events like this we also hope to win over new users, so they keep using our products when the event is over. On top of this, the existing products let us reuse features we already have on board, like livestreaming, on-demand video, liveblogs and storytelling.

But this also meant we had to operate carefully, without breaking the existing product and while respecting the existing audience.

We designed a dedicated Winter Games section and made it available gradually in the weeks leading up to the start of the Games. At first, only articles about the Olympic Games would lead users to the section. A while later the Sports section also gave access, and eventually the Winter Games section was reachable from all main pages on our websites and apps.

The Olympic Games in the NOS app

Throughout the event our editorial team kept users up to date in the Olympic section with livestreams, liveblogs and the latest stories. These were the most important elements driving traffic. We typically see heavy video consumption during the Summer and Winter Games, but a large part of the audience still enjoys the Games by reading articles and liveblogs.

Daily starts of video clips on the NOS app and NOS website during the Olympic Games in Pyeongchang (marked red). Source: Comscore

50,000 requests per second

As mentioned above, the Olympic experience was part of our existing website and app infrastructure. On a regular day the websites and apps serve approximately 15,000 requests per second. During the Olympics we saw peaks of over 50,000 requests per second.

Fortunately, most data can be cached in the frontend webservers, which results in almost a 1000:1 reduction in requests that need to be serviced by the backend. Even during the highest peaks the backend, where all the heavy lifting takes place, was only as busy as it would be on a regular day and never broke a sweat.

The frontend webserver farm is also, relatively speaking, a modest operation: 12 webservers running a tuned but otherwise off-the-shelf open-source webserver, load-balanced by a stock Linux kernel-based load balancer.
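To make those numbers concrete, here is a quick back-of-the-envelope calculation (a sketch in Python, using only the figures mentioned above) of what the peak looked like per frontend server and at the backend once the caches had done their work:

```python
# Back-of-the-envelope load estimate, using only the figures quoted in this post.
PEAK_REQUESTS_PER_SECOND = 50_000   # observed peak during the Olympics
FRONTEND_SERVERS = 12               # size of the frontend webserver farm
CACHE_REDUCTION = 1000              # ~1000:1 of requests absorbed by frontend caches

per_frontend_peak = PEAK_REQUESTS_PER_SECOND / FRONTEND_SERVERS
backend_peak = PEAK_REQUESTS_PER_SECOND / CACHE_REDUCTION

print(f"Peak load per frontend server: ~{per_frontend_peak:,.0f} req/s")
print(f"Requests reaching the backend at peak: ~{backend_peak:,.0f} req/s")
# Roughly 4,200 req/s per frontend, and only a few dozen req/s hitting the
# backend -- which is why the backend never broke a sweat.
```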

A catch-up service to bridge the time difference

Prime-time events at the Winter Games took place during office hours in the Netherlands. And during the evening hours, when people in the Netherlands usually enjoy sports, everyone in Pyeongchang was asleep.

To serve our customers after the working day we designed a recap service called ‘Dit was de dag’ (rough translation: “the day in Pyeongchang”). We published it every day at a fixed time, 17:00. It recapped the most spectacular moments of that day: the highlights of the medal events, the performances of the Dutch athletes, things that happened inside and outside the Olympic arena, and other stories not to be missed. All in video.

With this service, users coming home from work could easily catch up. They could subscribe to the recap, so they were notified as soon as that day’s edition was available. The recaps were also published on social media and on our website.
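At its core, such a service boils down to one fixed daily publish moment and a push notification to every subscriber. The sketch below (Python, with hypothetical names and a placeholder URL; the actual NOS implementation is not shown in this post) captures that idea:

```python
from dataclasses import dataclass, field
from datetime import time

PUBLISH_TIME = time(hour=17)  # the fixed daily publish moment (17:00)

@dataclass
class RecapService:
    """Hypothetical sketch of the daily recap flow, not the actual NOS code."""
    subscribers: set = field(default_factory=set)

    def subscribe(self, device_token: str) -> None:
        self.subscribers.add(device_token)

    def publish(self, title: str, video_url: str) -> None:
        # In the real platform this would create the video item in the CMS and
        # hand the fan-out off to a push-notification service; here we just print.
        print(f"published: {title} ({video_url})")
        for token in self.subscribers:
            print(f"push -> {token}: Nieuw: {title}")

# usage: a scheduler would trigger publish() every day at PUBLISH_TIME
service = RecapService()
service.subscribe("device-123")
service.publish("Dit was de dag in Pyeongchang", "https://example.org/recap-day-5")
```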

Catch-up service “Dit was de dag” on the nos.nl website

Users were happy with this catch-up service: around 18,000 fans eventually subscribed to the daily update, and the recaps were heavily consumed, averaging 140,000 pageviews per edition.

We think the potential is bigger than these results. We could have done more to bring the service to fans’ attention, which is notoriously hard in a short timespan and amid a wave of content.

Dealing with schedules and reminders

NOS has been covering the Summer and Winter Olympics for many years. These events are huge: every day a lot is happening, far too much to enjoy at once. What we see happening every time is people looking for the events that matter most to them. When is the slopestyle snowboarding taking place? When are the Dutch competing? When is the 10 km speed-skating final?

We decided to meet these needs with two features:

· A schedule hand-picked by our editorial team, consisting of the most important events of each day. Most of the time we covered these events live. Customers could flag these events and get notified when the event started (a minimal sketch of this reminder flow follows after the list). A convenient service, and an immediate driver of traffic.

· An extensive schedule with all sports and all events, in which customers could look up the events and results of their interest.
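To make the reminder flow from the first bullet a bit more tangible, here is a minimal sketch (Python, with hypothetical names and a simplified data model, not the actual NOS implementation) of flagging an event and collecting notifications once it starts:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ScheduledEvent:
    """One entry in the schedule (a hypothetical model, not the NOS data model)."""
    title: str
    start: datetime
    covered_live: bool = False
    flagged_by: set = field(default_factory=set)   # users who asked for a reminder

    def flag(self, user_id: str) -> None:
        self.flagged_by.add(user_id)

def due_reminders(events, now):
    """Return (user, message) pairs for flagged events that have just started."""
    reminders = []
    for event in events:
        if event.start <= now and event.flagged_by:
            reminders.extend((user, f"Nu live: {event.title}")
                             for user in sorted(event.flagged_by))
            event.flagged_by.clear()   # notify each user only once
    return reminders

# usage
final_10km = ScheduledEvent("10 km speed skating, final",
                            datetime(2018, 2, 15, 12, 0, tzinfo=timezone.utc),
                            covered_live=True)
final_10km.flag("user-42")
print(due_reminders([final_10km], datetime.now(timezone.utc)))
```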

The hand-picked and the extensive schedule

In the events ahead of us (the Olympic Games, but also events like the FIFA World Cup) we want to build further on these choices and learnings. Getting a grip on what is happening, and on what is important to customers, is decisive if you want visitors to keep returning to your product.

Record-breaking live streaming

Due to the time difference we anticipated a lot of online live viewers and therefore scaled our entire streaming environment to a peak capacity of 320,000 concurrent viewers. That is 70,000 more than our previous peak.

We planned to realize this with the help of our CDN partner and our on-premises infrastructure, consisting of multiple datacenters with multiple uplinks to large internet exchanges, private peers and a large Linux-based compute farm powering all the streams.

However, as with every operation of this size, the first days of the event did not go as planned. One of our partners suffered an outage during the Olympic Games, which forced us to come up with a plan B. And C. And D.
So on day 3 of the Olympics we connected more compute capacity in our datacenters, added extra 100 Gbps uplinks to the internet exchanges and created extra capacity in the public cloud.

As a result, the team of Linux engineers, network engineers and of course streaming engineers was able to facilitate more than 430,000 concurrent viewers after a few long days of tremendous effort. Because of bandwidth constraints at some ISPs in the Netherlands, not all customers got the highest available bitrate, so the total traffic amounted to 398 Gbps.
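To put those streaming figures in perspective, a quick calculation (using only the numbers above; the HD bitrate in the second step is an assumed value for illustration) shows why the extra uplinks and public-cloud capacity mattered:

```python
# Streaming bandwidth, back of the envelope, using the figures quoted above.
CONCURRENT_VIEWERS = 430_000
TOTAL_TRAFFIC_GBPS = 398

average_mbps = TOTAL_TRAFFIC_GBPS * 1000 / CONCURRENT_VIEWERS
print(f"Average delivered bitrate: ~{average_mbps:.2f} Mbps per viewer")   # ~0.93 Mbps

# If every viewer had received, say, a 2.5 Mbps HD stream (an assumed bitrate),
# the required egress would have been:
ASSUMED_HD_BITRATE_MBPS = 2.5
print(f"Egress at {ASSUMED_HD_BITRATE_MBPS} Mbps for everyone: "
      f"~{CONCURRENT_VIEWERS * ASSUMED_HD_BITRATE_MBPS / 1000:,.0f} Gbps")  # ~1,075 Gbps
```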

We are now planning for new events like the FIFA World Cup and the Tour de France. Since the start of our streaming platform we have experienced enormous growth in the demand for quality (bitrates) as well as capacity. Linear TV is moving towards a dominant internet (OTT) model, and that requires more capacity in both people and technology. Smart technologies like P2P live streaming are our main focus now, to meet the ever-increasing demand for high-quality news and sports streams for the Dutch public.

Monitoring the event from our operations room
