Create and serve HLS video content to your app with Google Cloud CDN

Dario Banfi
Google Cloud - Community
6 min read · Dec 27, 2022

In this tutorial, we will create HLS video streams from raw media files and stream them to a Flutter mobile application.

Video content is everywhere nowadays and users have come to expect low latency and quick loading times. There are multiple protocols that let you do adaptive streaming, but undoubtedly one of the most popular is HTTP Live Streaming (HLS).

We will make use of different Google Cloud products like Cloud Storage, Cloud CDN and Transcoder API to fully automate the process.

All the steps in this guide are available in the following GitHub repo as Terraform scripts, which can be quickly deployed in your Google Cloud account.

This is a high-level overview of what we are going to build:

Solution Architecture

Video editors will upload raw files to a Cloud Storage bucket, which will automatically trigger the Transcoder API through a Cloud Function. The resulting .m3u8 playlist will be streamable by the mobile app.

We are going to use Cloud CDN here because we plan to stream short video files. If you plan to do large-scale VOD streaming, you should look into Media CDN. Here’s an overview of the difference between the two.

Setting up the transcoding pipeline

The first step is building the transcoding pipeline. We want to automate the process so that every time a video is uploaded to Cloud Storage, transcoding starts automatically and takes care of generating the .m3u8 playlist.

For this, we will use Google’s Transcoder API, which enables you to do multiple operations on video files like transcoding, cropping, generating thumbnails and much more.

Enable APIs

The first step is to enable all the APIs we are going to use for this solution:

gcloud services enable \
transcoder.googleapis.com \
cloudfunctions.googleapis.com \
cloudbuild.googleapis.com \
pubsub.googleapis.com \
logging.googleapis.com \
eventarc.googleapis.com \
artifactregistry.googleapis.com \
run.googleapis.com \
compute.googleapis.com \
--quiet

Let’s also store the project ID in a variable:

export PROJECT_ID=hls-streaming-gcp
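If your active gcloud configuration points at a different project, you may also want to switch to it now (this assumes the project already exists), so that all the following commands run against the right one:

gcloud config set project ${PROJECT_ID}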

Create Cloud Storage Buckets

Next, we create the Cloud Storage bucket where we will upload the raw video content:

gsutil mb -b on -l europe-west1 gs://hls-streaming-gcp-raw-files-${PROJECT_ID}/

We will also create our bucket for the processed video stream:

gsutil mb -b on -l europe-west1 gs://hls-streaming-gcp-processed-files-${PROJECT_ID}/

And while we are at it, let’s also make the processed bucket public:

gsutil iam ch allUsers:objectViewer gs://hls-streaming-gcp-processed-files-${PROJECT_ID}/

Create and configure Service Account

We will now configure the service account for the Cloud Function. It will need Transcoder Admin permissions to call the Transcoder API and Storage Admin permissions to access the input and output buckets.

gcloud iam service-accounts create transcoder-service-account \
--display-name="Transcoder service account"

And give it the necessary permissions:

gcloud projects add-iam-policy-binding ${PROJECT_ID} --member="serviceAccount:transcoder-service-account@${PROJECT_ID}.iam.gserviceaccount.com" --role="roles/run.invoker"
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member="serviceAccount:transcoder-service-account@${PROJECT_ID}.iam.gserviceaccount.com" --role="roles/transcoder.admin"
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member="serviceAccount:transcoder-service-account@${PROJECT_ID}.iam.gserviceaccount.com" --role="roles/storage.admin"
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member="serviceAccount:transcoder-service-account@${PROJECT_ID}.iam.gserviceaccount.com" --role="roles/eventarc.eventReceiver"

Next, we also need to grant the Pub/Sub Publisher role to the Cloud Storage service account so that it can publish upload notifications to Eventarc:

GCS_SERVICE_ACCOUNT="$(gsutil kms serviceaccount -p ${PROJECT_ID})"
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member="serviceAccount:${GCS_SERVICE_ACCOUNT}" \
--role="roles/pubsub.publisher"

Create a Cloud Function that calls the Transcoder API whenever a video is uploaded

Now, let’s create a cloud function that will kick off the transcoding process.

Cloud Functions are excellent for glue code and event-driven use cases like this one.

You can find the source in the git repo.

Or you can create it yourself, as it is very short.

mkdir transcoding_function

echo 'import functions_framework
import os
from google.cloud.video import transcoder_v1
from google.cloud.video.transcoder_v1.services.transcoder_service import (
    TranscoderServiceClient,
)


@functions_framework.cloud_event
def handle_gcs_event(cloud_event):
    # The event payload carries the bucket and object name of the uploaded file
    data = cloud_event.data
    bucket = data["bucket"]
    name = data["name"]
    project_id = os.environ.get("PROJECT_ID")
    region = os.environ.get("REGION")
    # Transcode from the raw bucket into a per-video folder in the processed bucket
    input_uri = f"gs://{bucket}/{name}"
    output_uri = f"gs://hls-streaming-gcp-processed-files-{project_id}/{name}/"
    preset = "preset/web-hd"
    # Create a Transcoder job using the built-in web-hd preset
    client = TranscoderServiceClient()
    parent = f"projects/{project_id}/locations/{region}"
    job = transcoder_v1.types.Job()
    job.input_uri = input_uri
    job.output_uri = output_uri
    job.template_id = preset
    response = client.create_job(parent=parent, job=job)
    print(response)
' > transcoding_function/main.py

echo "google-cloud-video-transcoder==1.4.0" > transcoding_function/requirements.txt

Now we are ready to deploy! Make sure to use the service account previously created:

gcloud functions deploy transcoding-function \
--gen2 \
--region=europe-west1 \
--runtime=python310 \
--source=./transcoding_function \
--entry-point=handle_gcs_event \
--set-env-vars PROJECT_ID=${PROJECT_ID},REGION=europe-west1 \
--trigger-bucket=hls-streaming-gcp-raw-files-${PROJECT_ID} \
--service-account=transcoder-service-account@${PROJECT_ID}.iam.gserviceaccount.com
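Once deployed, you can tail the function’s logs at any time, which is handy if uploads later don’t seem to trigger a job. Depending on your gcloud version, something like the following should show recent invocations:

gcloud functions logs read transcoding-function \
--gen2 \
--region=europe-west1 \
--limit=20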

Testing the setup

Perfect, now that we have everything in place we can test our setup.

Let’s upload a raw video file to the bucket to check. You can use any video file; just upload it to the bucket with:

gsutil cp sample.mov gs://hls-streaming-gcp-raw-files-${PROJECT_ID}

After a short time, we should be able to see our processed file in the hls-streaming-gcp-processed-files bucket.
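To see exactly what was written, you can list the per-video folder in the processed bucket. With the web-hd preset used above you should find a manifest.m3u8 playlist alongside the generated renditions (the exact file names depend on the preset):

gsutil ls gs://hls-streaming-gcp-processed-files-${PROJECT_ID}/sample.mov/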

You can always monitor the job progress with:

gcloud transcoder jobs list --location=europe-west1

And inspect a specific job with:

gcloud transcoder jobs describe JOB_ID --location=europe-west1
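If you want to test the Transcoder API independently of the Cloud Function, you can also create a job directly from the CLI. This is a sketch using the same preset and buckets as the function:

gcloud transcoder jobs create \
--location=europe-west1 \
--input-uri="gs://hls-streaming-gcp-raw-files-${PROJECT_ID}/sample.mov" \
--output-uri="gs://hls-streaming-gcp-processed-files-${PROJECT_ID}/sample.mov/" \
--template-id="preset/web-hd"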

Configuring the load balancer and CDN

Now that we have our m3u8 playlist generated and saved in Cloud Storage, we want to expose it to our users. For this, we will set up a load balancer and use Cloud CDN to cache the content and serve it as close as possible to our users.

Let’s first reserve an external IP address for our load balancer:

gcloud compute addresses create load-balancer-ip \
--ip-version=IPV4 \
--global

And let’s store the address we just created in a variable:

export LB_IP_ADDRESS=$(gcloud compute addresses describe load-balancer-ip --format="get(address)" --global)

Next, we create a backend bucket, which will be the target for our load balancer:

gcloud compute backend-buckets create hls-streaming-bucket \
--gcs-bucket-name=hls-streaming-gcp-processed-files-${PROJECT_ID} \
--enable-cdn \
--cache-mode=CACHE_ALL_STATIC \
--default-ttl=2419200 \
--max-ttl=2419200

We use CACHE_ALL_STATIC to cache all static content.

We set both the default TTL and the max TTL to 2,419,200 seconds (28 days) because we don’t expect the content to change and we want to maximize the cache hit ratio, which reduces our egress costs.
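One side effect of such long TTLs is that if you ever do replace a file, Cloud CDN will keep serving the cached copy until it expires. In that case you can invalidate it explicitly against the URL map we create in the next step (the path below is just an example):

gcloud compute url-maps invalidate-cdn-cache hls-streaming-load-balancer \
--path="/sample.mov/*"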

Let’s now configure the URL map and target proxy with:

gcloud compute url-maps create hls-streaming-load-balancer \
--default-backend-bucket=hls-streaming-bucket
gcloud compute target-http-proxies create hls-streaming-load-balancer-proxy \
--url-map=hls-streaming-load-balancer

Configure the forwarding rules with:

gcloud compute forwarding-rules create hls-streaming-load-balancer-forwarding-rule \
--load-balancing-scheme=EXTERNAL_MANAGED \
--network-tier=PREMIUM \
--address=load-balancer-ip \
--global \
--target-http-proxy=hls-streaming-load-balancer-proxy \
--ports=80

Let’s give the load balancer ~5 minutes to propagate the forwarding rules.

After that, we will be able to test it with:

curl http://${LB_IP_ADDRESS}/sample.mov/manifest.m3u8
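To confirm that Cloud CDN is actually serving from cache, you can request the manifest twice and inspect the response headers of the second request; a cache hit should carry an Age header (exact behavior depends on your cache mode and TTLs):

curl -s -o /dev/null -D - http://${LB_IP_ADDRESS}/sample.mov/manifest.m3u8
curl -s -o /dev/null -D - http://${LB_IP_ADDRESS}/sample.mov/manifest.m3u8 | grep -i "^age"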

Amazing, our endpoint is ready to be streamed to our applications!

Streaming to the mobile application

Why stop here? Let’s deploy a simple Flutter application and test it out.

You can find a sample video player application on GitHub.

For this part, we’ll assume you already have a Flutter development environment set up; if not, you can follow this guide.
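If you are unsure whether your environment is ready, flutter doctor will point out anything that is missing:

flutter doctor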

Let’s clone the repository and run our application.

git clone https://github.com/dariobanfi/hls-streaming-gcp
cd hls-streaming-gcp/flutter_app
flutter run

Now open the file flutter_app/lib/main.dart and set the videoSource variable to the URL of your m3u8 playlist.

And here’s the result, a video player with HLS live streaming! 🎉
