Deploy machine learning models to edge servers with Cloud Deploy and Cloud Workflows

Background

To test the model, I need to:

  1. Update my Kubernetes deployment yaml file with the latest image tag.
  2. Connect to my Kubernetes cluster.
  3. kubectl apply the yaml file.
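Manually, that looks something like the following sketch (the cluster name, zone, and file names are hypothetical):

# Point the deployment at the newly pushed image tag (hypothetical names)
sed -i "s|image: .*|image: us-east1-docker.pkg.dev/my-project/my-repository/my-image:${NEW_TAG}|" deployment.yaml
# Fetch credentials for the target cluster
gcloud container clusters get-credentials my-cluster --zone=us-east1-b
# Apply the updated manifest
kubectl apply -f deployment.yaml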

This is a perfect use case for creating a workflow to run this automatically.

Requirements

  • When a new container image is pushed to Artifact Registry, it automatically triggers a workflow to deploy the container to target Anthos Clusters (Testing and Staging).
  • The solution must be able to handle the mapping between multiple container images and services. That is, if I have modelA and modelB pushed to the registry, I want them to be automatically deployed to the Anthos cluster as ServiceA and ServiceB.
  • It requires my approval to deploy to the next environment.

Solution — Cloud Pub/Sub and Cloud Build

  • When a new container image is pushed to the registry, it fires a notification event, and the event is routed to a Cloud Pub/Sub topic.
  • The event triggers a Cloud Build pipeline, which:
  • Pulls Kubernetes manifest template files from Cloud Source Repositories.
  • Updates the template files.
  • Deploys to the target environments.

Challenges

  • It does not support approvals, so each release is deployed to the target environment as soon as the workflow completes.
  • It requires a Cloud Build trigger, which in turn requires a Git or Cloud Source Repositories repository. In my case, I do not need a Git repository. In addition, if I want to honor resource location restrictions, Cloud Source Repositories only provides a preview “regional instance”.

Solution — Cloud Deploy and Cloud Workflows

In this architecture, when a new container image is pushed to the registry:

  • It fires a notification event. An Eventarc trigger receives the event and triggers a Cloud Workflows workflow.
  • The workflow invokes the Cloud Deploy API to kick off a delivery pipeline with Skaffold manifest files.
  • The Cloud Deploy delivery pipeline renders the Kubernetes manifest files and creates a release.

To make the end-to-end flow work, I:

  • Create a Skaffold configuration with target environment configurations.
  • Create a delivery pipeline configuration.
  • Upload the Kubernetes manifest template file for this model container to a Cloud Storage bucket.
  • Create a Cloud Workflows workflow, which invokes the Cloud Deploy API to kick off the delivery pipeline created above. In the API call I configure skaffoldConfigUri and skaffoldConfigPath so the delivery pipeline knows where to pull the Skaffold configuration.
  • Create an Eventarc trigger which listens for the specific Artifact Registry container image and triggers the Cloud Workflows workflow created above.

Create Cloud Deploy Delivery Pipeline

  • Create a Skaffold yaml file

Skaffold is used to render manifests for different environments. In my case I use the same model in each environment, so this one is straightforward. Here is my skaffold.yaml.
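A minimal sketch of that file, assuming the Kubernetes manifest is named viai-model.yaml (matching the upload step later in this post) and that staging and prod profiles map to the two pipeline targets:

apiVersion: skaffold/v2beta16
kind: Config
deploy:
  kubectl:
    manifests:
      - viai-model.yaml
profiles:
  # Both profiles render the same manifest, since the same model runs in each environment
  - name: staging
    deploy:
      kubectl:
        manifests:
          - viai-model.yaml
  - name: prod
    deploy:
      kubectl:
        manifests:
          - viai-model.yaml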

  • Create the delivery pipeline definition file.

This is where you define what your pipeline looks like. In my case, I have two environments, staging and production. Both are Anthos attached clusters.

I have also created a service account, deploy-service@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com, and granted it the required roles:

  • roles/eventarc.eventReceiver
  • roles/workflows.invoker
  • roles/eventarc.serviceAgent

And below is the definition of the delivery pipeline.
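A sketch of that definition, with pipeline, target, and membership names as assumptions; Anthos clusters are referenced through their fleet membership, and requireApproval on the production target provides the manual gate from the requirements:

apiVersion: deploy.cloud.google.com/v1
kind: DeliveryPipeline
metadata:
  name: viai-model-pipeline
serialPipeline:
  stages:
    - targetId: staging
      profiles: [staging]
    - targetId: prod
      profiles: [prod]
---
apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: staging
anthosCluster:
  membership: projects/${GOOGLE_CLOUD_PROJECT}/locations/global/memberships/staging-cluster
---
apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: prod
# A human must approve the rollout before it reaches production
requireApproval: true
anthosCluster:
  membership: projects/${GOOGLE_CLOUD_PROJECT}/locations/global/memberships/prod-cluster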

  • Create the Kubernetes manifest files

These are kubernetes yaml files that will be rendered by Skaffold and eventually applied to the target cluster.
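For example, something like the sketch below; the names are assumptions, but note that the container image is referenced by the short name inference-model, which Skaffold replaces at render time with the full image location passed in buildArtifacts (see the workflow later in this post):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: inference-model
spec:
  replicas: 1
  selector:
    matchLabels:
      app: inference-model
  template:
    metadata:
      labels:
        app: inference-model
    spec:
      containers:
        - name: inference-model
          # Skaffold substitutes the full image URL for this short name at render time
          image: inference-model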

  • Update template files

Each delivery pipeline is bound to a specific environment configuration, so I wrote a simple script to substitute the environment variables whenever I have a new release.
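Something like the following, assuming the template uses shell-style ${VAR} placeholders (the file names are hypothetical):

# Render the pipeline definition from its template by substituting environment variables
export GOOGLE_CLOUD_PROJECT=my-project
envsubst < pipeline.yaml.tpl > pipeline.yaml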

  • Create the delivery pipeline
gcloud deploy apply --file pipeline.yaml --region="${GOOGLE_CLOUD_DEFAULT_REGION}" --project="${GOOGLE_CLOUD_PROJECT}"

Create Cloud Workflows

In my case, the Cloud Workflows workflow does nothing but invoke the Cloud Deploy API. I need this because Cloud Deploy integrates with external applications and CI solutions via API calls.
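A hedged sketch of the workflow; the pipeline name, region, and the exact shape of the decode_message step are assumptions:

main:
  params: [event]
  steps:
    - decode_message:
        assign:
          # The Eventarc audit-log event carries the pushed image's resource name;
          # deriving the full image location from it is simplified here.
          - image_location: ${event.data.protoPayload.resourceName}
          - release_id: ${"release-" + string(int(sys.now()))}
    - create_release:
        call: http.post
        args:
          url: ${"https://clouddeploy.googleapis.com/v1/projects/" + sys.get_env("GOOGLE_CLOUD_PROJECT_ID") + "/locations/us-central1/deliveryPipelines/viai-model-pipeline/releases?releaseId=" + release_id}
          auth:
            type: OAuth2
          body:
            # Where the delivery pipeline pulls the Skaffold configuration from
            skaffoldConfigUri: ${"gs://" + sys.get_env("GOOGLE_CLOUD_PROJECT_ID") + "_cloudbuild/skaffold.tar.gz"}
            skaffoldConfigPath: skaffold.yaml
            # Map the short image name in the manifests to the pushed image
            buildArtifacts:
              - image: inference-model
                tag: ${image_location}
        result: release_resp
    - finish:
        return: ${release_resp.body}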

Note that here we configure buildArtifacts to tell Skaffold that inference-model needs to be rendered with the value of ${image_location}, a variable we created and computed in the workflow’s decode_message step.

Deploy the Cloud Workflows workflow
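Assuming the workflow source is saved as workflow.yaml and named deploy-model-workflow (both hypothetical):

gcloud workflows deploy deploy-model-workflow \
  --source=workflow.yaml \
  --location="${GOOGLE_CLOUD_DEFAULT_REGION}" \
  --service-account="deploy-service@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com"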

Upload Skaffold configuration files

tar -czvf skaffold.tar.gz skaffold.yaml viai-model*
gsutil cp skaffold.tar.gz "gs://${GOOGLE_CLOUD_PROJECT}_cloudbuild/${PROD_CLUSTER}/"

Enable Audit Log
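Eventarc triggers for Artifact Registry are built on Cloud Audit Logs, so Data Access audit logs must be enabled for the Artifact Registry API (in the console under IAM & Admin > Audit Logs, or by adding an auditConfigs entry to the project IAM policy). A sketch of that policy fragment:

auditConfigs:
  - service: artifactregistry.googleapis.com
    auditLogConfigs:
      - logType: ADMIN_READ
      - logType: DATA_READ
      - logType: DATA_WRITE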

Create Eventarc Trigger

Eventarc has a “path pattern” feature that lets you set a resource filter on the Eventarc trigger. In my case, I want to listen for a specific container image pushed to Artifact Registry. Say the image URL is us-east1-docker.pkg.dev/my-project/my-repository/my-image:tag. Then the path pattern is resourceName=/projects/my-project/locations/us-east1/repositories/my-repository/dockerImages/my-image*

To create an Eventarc trigger that runs the Cloud Workflows workflow when a container image is pushed:
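A sketch of the command; the trigger and workflow names are hypothetical, and the path pattern reuses the example above. The methodName filter matches the audit log entry written when a Docker image is pushed:

gcloud eventarc triggers create deploy-model-trigger \
  --location="${GOOGLE_CLOUD_DEFAULT_REGION}" \
  --service-account="deploy-service@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com" \
  --destination-workflow=deploy-model-workflow \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=artifactregistry.googleapis.com" \
  --event-filters="methodName=Docker-PutManifest" \
  --event-filters-path-pattern="resourceName=/projects/my-project/locations/us-east1/repositories/my-repository/dockerImages/my-image*"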

Test the Workflow

To test the end-to-end scenario, simply push a new image to Artifact Registry. Note that the image must be pushed to the exact URL you specified in the Eventarc trigger path pattern filter.
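For example, with the hypothetical names used above:

docker tag inference-model us-east1-docker.pkg.dev/my-project/my-repository/my-image:v2
docker push us-east1-docker.pkg.dev/my-project/my-repository/my-image:v2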
