If you’re a company that streams data, performance for you and your users is a top priority: nobody likes a choppy connection or slow streaming.
Finding the sweet spot among user connection speed, cloud hosting performance, and the buffer window you use to request data might seem challenging, but keep reading to find the winning combination for your needs!
The need for speed
Let’s set the stage: say you’re a company looking to replicate your content for an international audience. This is an exciting move, and you want to make sure that your expansion doesn’t mean a compromise in your service.
Your architecture setup is pretty simple: You’re serving data directly from Google Cloud Storage to users on mobile and desktop.
Anyone who has tried to write this type of code knows this to be true: it’s all about understanding your connection speed, the fetch size, and balancing all of that with the client performance and buffering.
A slower client might need to fetch a larger window of data, while a client on a fast connection might fetch smaller chunks at more frequent intervals, and so on.
But at its core, it’s about balancing the request size against a handful of client-side metrics.
Which raises the question: does GCS have an ideal fetch size?
Time to gather some data!
To make sure we’re on the right track, we’ll need to profile how read performance changes at various download window sizes. That means taking a file (let’s say 8MB in size) and fetching it in different-sized chunks to measure the performance of each.
**Note** For those of you playing the home game: it’s super critical to disable caching headers for your fetches during this test. Failing to do so could skew your data!
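A minimal sketch of that profiling loop, assuming you supply your own `fetch_range` callable. With the `google-cloud-storage` Python client, that could be backed by `blob.download_as_bytes(start=..., end=...)` (where `end` is inclusive); the 8MB object size and chunk sizes here are just the ones from this test.

```python
import time

def chunk_ranges(total_size, chunk_size):
    """Yield (start, end) byte ranges (start inclusive, end exclusive)
    covering an object of total_size bytes."""
    for start in range(0, total_size, chunk_size):
        yield start, min(start + chunk_size, total_size)

def benchmark_chunked_fetch(fetch_range, total_size, chunk_size):
    """Time fetching an object in fixed-size chunks.
    Returns (elapsed_seconds, number_of_requests)."""
    started = time.perf_counter()
    requests_made = 0
    for start, end in chunk_ranges(total_size, chunk_size):
        fetch_range(start, end)  # e.g. blob.download_as_bytes(start=start, end=end - 1)
        requests_made += 1
    return time.perf_counter() - started, requests_made
```

If you run the fetches over plain HTTP instead of the client library, remember to send `Cache-Control: no-cache` so intermediate caches don’t distort the timings.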
What we see here is that as chunk size decreases, our performance gets worse. This is mainly because the transactional overhead of all those fetches starts to dominate.*
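A back-of-the-envelope model shows why: if every request pays a fixed round-trip latency on top of its transfer time, effective throughput collapses as chunks shrink. The latency and bandwidth numbers below are illustrative assumptions, not GCS measurements.

```python
def effective_throughput(chunk_size, latency_s=0.05, bandwidth_bps=100e6):
    """Model effective throughput (bytes/sec) when each request pays a
    fixed per-request latency plus chunk_size / bandwidth of transfer time.
    latency_s and bandwidth_bps are illustrative assumptions."""
    time_per_chunk = latency_s + chunk_size / bandwidth_bps
    return chunk_size / time_per_chunk

# Small chunks spend most of their time on overhead, not data transfer.
for size in (64 * 1024, 256 * 1024, 1024 * 1024, 8 * 1024 * 1024):
    mbps = effective_throughput(size) / 1e6
    print(f"{size // 1024:>5} KB chunks -> {mbps:.1f} MB/s effective")
```

Under this model, a 64KB chunk spends roughly 50ms of latency to move under 1ms of data, while a 1MB+ chunk amortizes that same latency over far more bytes.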
The sweet spot
What this also tells us is that, in terms of single-stream throughput, GCS is very strong for both uploads and downloads, as long as the request is no smaller than 1MB.*
With this information, you’re in a better spot to decide what level of regional redundancy you need for serving your video content, and to target 1MB+ GCS requests to improve your buffering speeds.
Stay tuned for more optimization-ready tips from Cloud Storage Bytes!
*To avoid getting bogged down in transactional overhead, follow the steps in parallel uploads for small files.
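The idea behind parallel uploads can be sketched like this: issue many small uploads concurrently so their per-request overhead overlaps instead of stacking up serially. Here `upload_one` is a placeholder for whatever single-object upload your client provides (for example, `blob.upload_from_filename` in the `google-cloud-storage` Python client).

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_upload(upload_one, files, max_workers=8):
    """Run many small uploads concurrently. `upload_one` takes a single
    file and performs the upload; results come back in input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(upload_one, files))
```

On the command line, `gsutil -m cp` applies the same principle by parallelizing across files.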