How to deliver 410 Gbps in under 48 hours

CDN77.com & CDN hot news
5 min read · Jun 27, 2016


The story of how we constantly learn to do the impossible @ CDN77.com.

I’ve recently noticed an increasing number of unusually large bandwidth enquiries, mostly, though not exclusively, related to recent sports events. Whatever the reason, we at CDN77.com and 10Gbps.io are always open to any challenge within our industry. In fact, that’s what we find interesting about our job. We rarely say no to a customer.

About 4 weeks ago, we received a request from one of our clients:

We need 550 Gbps of stream capacity. In 48 hours. For an 8-hour streaming event.

We weren’t sure whether it was within our capabilities. But you know what they say: there’s only one way to find out. With this in mind, we went for this sweet technical and operational challenge.

Over a few emails we agreed on a price (under 15,000 EUR, one-time) and made the deal for 500 Gbps.

Yes, 0.5 Terabits per second of the internet.

By comparison, DE-CIX, the world’s largest internet exchange point by peak traffic, carries a maximum throughput of around 4 terabits per second. Simple math: we had 48 hours to deliver what amounts to roughly 12% of the world’s largest IXP.

Sounds like fun, doesn’t it?

What did the process of delivery look like in a nutshell? Well..

Time to event: 42 hours

I called our CTO and Head of Network and told them about the deal: “Boys, we have to deliver 500 Gbps this Thursday” (yes, we do address ourselves as boys and girls). They both thought I was joking at first and laughed. When they realised I was not, they agreed we could do it and got to work.

Operation room @ time of live event.

Time to event: 40 hours

We started with our own spare servers and IP capacity across 5 datacenters in Europe and the United States. We then discussed the distribution of traffic with the client and agreed on 28 dedicated servers, each with a 2 × 10 Gbps uplink, 2 × E5 v4 CPUs, 96 GB of RAM and a small local SSD cache.

30 servers × 20 Gbps each, total active uplink capacity: 600 Gbps
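Just to spell out the uplink math, here is a minimal sketch using the per-server figures above (the 500 Gbps target is the capacity we agreed with the client):

SERVERS = 30                 # dedicated servers provisioned for the event
UPLINKS_PER_SERVER = 2       # 2 x 10 Gbps uplink per server
GBPS_PER_UPLINK = 10
REQUESTED_GBPS = 500         # capacity agreed with the client

total_gbps = SERVERS * UPLINKS_PER_SERVER * GBPS_PER_UPLINK
headroom_gbps = total_gbps - REQUESTED_GBPS

print(f"Total uplink capacity: {total_gbps} Gbps")    # 600 Gbps
print(f"Headroom over request: {headroom_gbps} Gbps") # 100 Gbps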

We created a Google spreadsheet with available hardware, server IDs, uplink capacity and status of installation.

We shared it with the client and regularly updated the progress and the status of each server. This way they knew which servers were ready and what we were working on. The real-time process overview helped us avoid a long chain of emails.

Time to event: 36 hours

The first batch of 15 servers was installed in racks, with uplinks connected and the OS live.

Time to event: 30 hours

The customer received the first ready-to-use 100 Gbps.

Time to event: 24 hours

We managed to deliver 80% of the requested capacity: 400 Gbps altogether. Servers, uplinks, IPs and other settings were all ready to use and approved by the customer.

Doors of our Network department

Time to event: 12 hours

The last batch of servers was set up in Prague, our home point of presence.

Time to event: 10 hours

Total delivered capacity: 600 Gbps in 5 datacenters with the following distribution:

  • Frankfurt: ~160 Gbps
  • Amsterdam: ~140 Gbps
  • Prague: ~140 Gbps
  • Atlanta: ~80 Gbps
  • Los Angeles: ~80 Gbps

The customer approved the capacity and settings roughly 8 hours before the event. Everything worked like clockwork thanks to a team of 15 people: network guys, data center guys across 2 continents and 5 locations, sales girls and one DHL guy with a hardware package for the Frankfurt datacenter.

Time to event: 5 hours

Lastly, we finished the IP settings and manually checked the capacity between routers inside our network, even though we had been in touch with our upstream providers and they had assured us we had enough capacity. Better safe than sorry.
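As a rough illustration, a capacity check along these lines can be scripted with iperf3 running between test hosts in each location (a generic sketch, not necessarily our exact tooling; the host names below are made up):

import json
import subprocess

def measured_gbps(host: str, streams: int = 8, seconds: int = 10) -> float:
    # Requires an iperf3 server ("iperf3 -s") listening on the target host.
    out = subprocess.run(
        ["iperf3", "-c", host, "-J", "-P", str(streams), "-t", str(seconds)],
        capture_output=True, text=True, check=True,
    ).stdout
    report = json.loads(out)
    # Total received throughput across all parallel streams, in Gbps.
    return report["end"]["sum_received"]["bits_per_second"] / 1e9

# Hypothetical test hosts, one per location.
for host in ["fra-test.example.net", "ams-test.example.net", "prg-test.example.net"]:
    print(f"{host}: ~{measured_gbps(host):.1f} Gbps")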

Time to event: 3 hours

After a thorough check of all the settings and equipment, we took a breather, drank some Club Mate and waited.

Time to event: 1 hour

We stared at our terminal to be sure everything was OK.

Time to event: 0

Kick off.

Result:

Total traffic: 410 Gbps

Total capacity: 600 Gbps

Split across 30 servers, each with 20 Gbps, on two continents.

At first glance, it looked like mission impossible. However, when I look back, things haven’t really changed. It’s only the scale that has changed.

Ten years ago, we had a proportional amount of “fun” with 1 Gbps per client; now we “play” with the low hundreds of Gbps per client.

This wasn’t the only time something like this happened.

Quite the contrary, it is becoming a routine matter.

Last week, another client requested 200+ Gbps traffic in 12 hours.

We broke a sweat, but we made it. We delivered the final 280 Gbps across 3 European data centers in under 12 hours from the customer’s order.

With the 2016 Summer Olympics approaching, I feel there will be more requests like this.

If you are looking for 30–500 Gbps network capacity + servers with delivery in 24 hours, look no further.

We can do it thanks to the CDN77.com Content Delivery Service or 10Gbps.io pure dedicated servers with 2 × 10 Gbps per server.

Are you looking for something impossible to deliver? Drop us a message: sales@cdn77.com.

Zdenek Cendra, CDN77.com founder

twitter.com/zdendac
