Big video uploads

Zain
Published in OneFootball Tech
4 min read · Jul 15, 2021

OneFootball strives to be the go-to platform for all publishers to distribute their football multimedia content and reach football fans around the world. This works because we can ingest content in a variety of formats through a variety of methods. My team — aptly named the Source team — focuses on this part of our platform.

Context

We have a B2B portal that our content partners can use — among other things — to manually upload video content. Until recently, our solution had technical limitations that capped uploads at a few hundred MB per video file. Unsurprisingly, our content partners wanted to publish bigger video files on our platform, so we went back to the drawing board and designed a new solution that would allow them to upload video files up to 25 GB in size.

Then

The old solution used a single HTTP POST request to upload the video file to an endpoint of one of our backend services. The endpoint buffered the entire video file in memory, then uploaded it to an AWS S3 location on our video storage platform with a single HTTP PUT request. This meant we were limited by:

  • Maximum file size that could be reliably uploaded in a single HTTP request from a browser, which depended on:
    • The stability of the network between our users and our cloud provider.
    • Auto-scaling of our backend service containers (scale-in events can terminate long-running requests).
  • Maximum memory available to the container running the backend service.
  • Maximum object size accepted by the AWS S3 HTTP endpoint in a single PUT request (5 GB).
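The memory limitation is easiest to see in a sketch. This is a minimal, hypothetical Python illustration of the old upload path (the post doesn't state the backend's actual language or function names): the handler reads the whole request body into a buffer before forwarding it, so the largest acceptable file is bounded by the container's RAM, and the single PUT is bounded by S3's 5 GB cap.

```python
import io

# Hypothetical sketch of the old upload path: the whole video file is read
# into a memory buffer before being forwarded to S3 in one PUT request.
def handle_video_upload(body: io.BufferedIOBase, put_to_s3) -> int:
    buffer = body.read()   # entire file held in RAM at once
    put_to_s3(buffer)      # single PUT; S3 caps this at 5 GB per object
    return len(buffer)     # bytes received == memory consumed
```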

We tackled all of the above limitations by switching to multipart uploads from our B2B portal and transferring the uploaded video files to our video storage platform using a pull-based mechanism. The new solution turned out to be quite neat, but the details were a bit gory at times.

Now

We decided to take our backend service out of the data path — that is, to stop streaming the video file through it — since it was only acting as a middleman anyway. Instead, we opted to leverage AWS S3 for transferring the video file from our B2B portal to our video storage platform. The entire flow (described in detail later) was a bit more complicated than it sounds, for the following reasons:

  • Our B2B portal is an SPA, which makes it tricky to authenticate AWS S3 requests: there is no safe place to keep AWS credentials in the browser.
  • Our video storage platform requires a public URL to pull the video file.

We solved both problems with AWS S3 presigned URLs: temporary links that grant specific, time-limited access to an AWS S3 bucket or object. The new solution ended up with the following flow:

  1. User initiates video upload on our B2B portal.
  2. B2B portal requests a set of AWS S3 presigned URLs from our backend service by sending the file name & size.
  3. Backend service computes the number of parts from the file size, generates one AWS S3 presigned URL per part, and returns them to the B2B portal.
  4. B2B portal uploads the video file directly to AWS S3 in parallel chunks (with retries on failed chunks) using the presigned URLs.
  5. B2B portal requests a private URL of the uploaded video from our backend service.
  6. B2B portal requests the backend to trigger the import of the uploaded video to our video storage platform by sending the private URL of uploaded video.
  7. Backend service generates an AWS S3 presigned URL from the private URL and supplies it to our video storage platform.
  8. Video storage platform pulls the uploaded video using the AWS S3 presigned URL.
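The part planning and retry logic in steps 2–4 can be sketched as follows. This is a Python sketch with hypothetical names and a hypothetical part size; the real upload code runs as JavaScript in the browser. For context, S3 multipart uploads allow parts of 5 MB–5 GB and at most 10,000 parts per upload.

```python
import math

PART_SIZE = 100 * 1024 * 1024  # hypothetical part size: 100 MB

def plan_parts(file_size: int, part_size: int = PART_SIZE) -> list[tuple[int, int, int]]:
    """Split a file into (part_number, start, end) byte ranges.

    The backend would generate one presigned URL per planned part.
    """
    count = math.ceil(file_size / part_size)
    return [
        (i + 1, i * part_size, min((i + 1) * part_size, file_size))
        for i in range(count)
    ]

def upload_part_with_retries(upload_fn, part, max_attempts: int = 3):
    """Retry one part; upload_fn stands in for a PUT to the part's presigned URL."""
    for attempt in range(1, max_attempts + 1):
        try:
            return upload_fn(part)
        except Exception:
            if attempt == max_attempts:
                raise
```

With 100 MB parts, a 25 GB file splits into 256 parts — comfortably under S3's 10,000-part ceiling — and each part can be uploaded in parallel and retried independently.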

In addition to the above flow, we had to clean up our AWS S3 bucket after uploads were imported or abandoned, which we achieved with AWS S3 lifecycle rules that perform the cleanup automatically.
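For illustration, such a lifecycle configuration might look like the following (the rule ID, prefix, and day counts are hypothetical; the post doesn't state the exact rules). `Expiration` removes staged objects after a grace period, and `AbortIncompleteMultipartUpload` reclaims parts of uploads that were started but never completed:

```json
{
  "Rules": [
    {
      "ID": "expire-staged-uploads",
      "Status": "Enabled",
      "Filter": { "Prefix": "uploads/" },
      "Expiration": { "Days": 7 },
      "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 1 }
    }
  ]
}
```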

Next

One future improvement that we’ve identified is to support the pause & resume of video uploads from our B2B portal. It’s something that we think can be supported with our current solution, so we don’t expect another revamp in the near future.

Outcome

The result is a more reliable and scalable solution that can — in theory — support uploads of video files up to 25 GB in size (a limit imposed by our video storage platform). It has worked fine so far, and should keep working until some thrill-seeking soul attempts to upload a 26 GB video file.

You can also work on such challenges and many more, by becoming a part of OneFootball. Check out our jobs page!

Zain
Senior Engineering Manager | Video Platform | OneFootball