CI/CD pipeline to deploy applications on Google Kubernetes Engine (GKE) using Cloud Build and Cloud Deploy.

Vishal Bulbule
Google Cloud - Community
5 min read · Feb 20, 2024

Introduction

In the modern software development landscape, Continuous Integration/Continuous Deployment (CI/CD) pipelines play a pivotal role in automating the process of building, testing, and deploying applications.

In this blog post, we’ll deep dive into the implementation of a robust CI/CD pipeline to deploy applications on Google Kubernetes Engine (GKE) using Google Cloud Build and Cloud Deploy services.

Requirements:

To achieve our goal, we have the following requirements:
- Deployment of two simple Flask applications (app1 & app2) on the GKE clusters.
- Automation of the entire deployment process, triggered by a developer’s code push.
- Dev-cluster deployment precedes production deployment, allowing for review before promoting to the prod-cluster.

Architecture

Source: Author’s Creation

Technical Stack Summary:

This CI/CD pipeline leverages several key components:
- Google Kubernetes Engine (GKE): Google’s managed Kubernetes service, providing a scalable and reliable platform for deploying containerized applications.
- Cloud Build: A fully managed continuous integration and continuous delivery platform that allows developers to build, test, and deploy applications on Google Cloud Platform.
- Cloud Deploy: A service for continuous delivery that automates the deployment of containerized applications to Google Kubernetes Engine and other platforms.
- GitHub: A popular platform for hosting and version-controlling code, where developers collaborate on and manage their codebase.

Solution:

We will implement the CI/CD pipeline for deploying applications on GKE using the following steps:

1. Create two simple Flask applications (app1 & app2).
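A minimal sketch of what app1 could look like (app2 would be analogous, listening on port 8081 to match its manifest). The file name main.py and the route are illustrative assumptions; each app directory will also need a Dockerfile, since the Cloud Build steps later build ./app1 and ./app2.

app1/main.py (illustrative sketch)

# Minimal Flask app for app1; port 8080 matches containerPort in app1.yaml below.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from app1!"

if __name__ == "__main__":
    # Bind to all interfaces so the app is reachable inside its container.
    app.run(host="0.0.0.0", port=8080)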

2. Set up a GitHub repository and push the application code.
3. Create two GKE clusters: dev-cluster and prod-cluster, using Google Kubernetes Engine.
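The clusters can be created from the console or with gcloud. A sketch, assuming the zone us-central1-c used later in the Cloud Deploy target files (the node count here is an arbitrary choice for illustration):

# Create the dev and prod GKE clusters in us-central1-c
gcloud container clusters create dev-cluster \
    --zone us-central1-c \
    --num-nodes 2

gcloud container clusters create prod-cluster \
    --zone us-central1-c \
    --num-nodes 2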

4. Create Kubernetes manifest files in the kubernetes folder to deploy each application and expose it as a Service.

app1.yaml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app1
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-app1
  template:
    metadata:
      labels:
        app: my-app1
    spec:
      containers:
      - name: my-container
        image: us-central1-docker.pkg.dev/prj-poc-001/gke-repo/quickstart-image
        ports:
        - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: my-service1
spec:
  selector:
    app: my-app1
  ports:
  - protocol: TCP
    port: 8080
    targetPort: 8080
  type: LoadBalancer

app2.yaml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app2
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-app2
  template:
    metadata:
      labels:
        app: my-app2
    spec:
      containers:
      - name: flask-container
        image: us-central1-docker.pkg.dev/prj-poc-001/gke-repo/flask-image
        ports:
        - containerPort: 8081
---
apiVersion: v1
kind: Service
metadata:
  # Named my-service2 so it does not clash with app1's Service in the same namespace.
  name: my-service2
spec:
  selector:
    app: my-app2
  ports:
  - protocol: TCP
    port: 8081
    targetPort: 8081
  type: LoadBalancer

5. Now create the skaffold.yaml file.

skaffold.yaml

apiVersion: skaffold/v4beta9
kind: Config
build:
  tagPolicy:
    gitCommit: {}
  local: {}
manifests:
  rawYaml:
  - ./kubernetes/*
deploy:
  kubectl: {}
  logs:
    prefix: container

Don’t be confused by the number of files; your directory structure should look like the one below.

6. Now create a cloudbuild.yaml file to build and push Docker images for both applications to Artifact Registry, and configure a Cloud Build trigger to initiate the pipeline on code push events in the GitHub repository.

cloudbuild.yaml

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'us-central1-docker.pkg.dev/<your-project-id>/gke-repo/quickstart-image', './app1']
  id: 'Build Docker Image'

- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'us-central1-docker.pkg.dev/<your-project-id>/gke-repo/quickstart-image']
  id: 'Push Docker Image'

- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'us-central1-docker.pkg.dev/prj-poc-001/gke-repo/flask-image', './app2']
  id: 'Build Docker Image2'

- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'us-central1-docker.pkg.dev/prj-poc-001/gke-repo/flask-image']
  id: 'Push Docker Image2'

- name: 'google/cloud-sdk:latest'
  entrypoint: 'sh'
  args:
  - -xe
  - -c
  - |
    gcloud deploy apply --file deploy/pipeline.yaml --region=us-central1
    gcloud deploy apply --file deploy/dev.yaml --region=us-central1
    # Registering the prod target here is an assumption; the prod target must exist before a release can be promoted to it.
    gcloud deploy apply --file deploy/prod.yaml --region=us-central1
    gcloud deploy releases create 'app-release-${SHORT_SHA}' --delivery-pipeline=gke-cicd-pipeline --region=us-central1 --skaffold-file=skaffold.yaml

options:
  logging: CLOUD_LOGGING_ONLY
Cloud Build trigger
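The trigger can be configured in the console (as in the screenshot above) or with gcloud. A sketch, assuming the GitHub repository is already connected to Cloud Build; the trigger name, repository name, owner, and branch pattern below are placeholders:

gcloud builds triggers create github \
    --name=gke-cicd-trigger \
    --repo-name=<your-repo> \
    --repo-owner=<your-github-user> \
    --branch-pattern='^main$' \
    --build-config=cloudbuild.yaml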

7. Implement the necessary code to define the Cloud Deploy pipeline and targets for both dev-cluster and prod-cluster.

Here we will create three YAML files: one for the delivery pipeline, one for the dev target GKE cluster, and one for the prod target GKE cluster, as shown below.
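A minimal sketch of the pipeline definition (deploy/pipeline.yaml, which cloudbuild.yaml applies), assuming the pipeline name gke-cicd-pipeline used in the release step and a serial dev → prod stage order:

pipeline.yaml

apiVersion: deploy.cloud.google.com/v1
kind: DeliveryPipeline
metadata:
  name: gke-cicd-pipeline
description: CI/CD pipeline deploying app1 and app2 to GKE
serialPipeline:
  stages:
  - targetId: dev
  - targetId: prod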

dev.yaml

apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: dev
  annotations: {}
  labels: {}
description: dev
gke:
  cluster: projects/<your_project_id>/locations/us-central1-c/clusters/dev-cluster

prod.yaml

apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: prod
  annotations: {}
  labels: {}
description: prod
gke:
  cluster: projects/<your_project_id>/locations/us-central1-c/clusters/prod-cluster

8. Push the updated code to the GitHub repository, triggering the Cloud Build and Cloud Deploy processes.

9. Monitor the progress of the Cloud Build and Cloud Deploy pipelines for successful deployment.
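Besides the console pages, progress can also be checked from the CLI. A sketch, reusing the pipeline name from cloudbuild.yaml:

# Recent Cloud Build runs
gcloud builds list --limit=5

# Releases created on the delivery pipeline
gcloud deploy releases list \
    --delivery-pipeline=gke-cicd-pipeline \
    --region=us-central1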

By default, the Cloud Deploy pipeline triggers a deployment to the first target and provides a manual option to promote to the next target. If approval is required for the next target, you have to promote, review, and approve the deployment.

Cloud Deploy Pipeline
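If you want an explicit approval gate on production, Cloud Deploy targets support a requireApproval flag. A sketch of how prod.yaml could be extended (this flag is not part of the files above):

apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: prod
description: prod
requireApproval: true
gke:
  cluster: projects/<your_project_id>/locations/us-central1-c/clusters/prod-cluster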

10. After reviewing the deployment on the dev-cluster, promote the application to the prod-cluster using the automated CI/CD pipeline.
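Promotion can be done from the Cloud Deploy console or from the CLI. A sketch, reusing the release naming and pipeline name from cloudbuild.yaml (substitute the short SHA of your build):

gcloud deploy releases promote \
    --release=app-release-<SHORT_SHA> \
    --delivery-pipeline=gke-cicd-pipeline \
    --region=us-central1 \
    --to-target=prod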

If you encounter any confusion, please refer to the YouTube video linked at the beginning of the blog.

By following these steps, developers can seamlessly push code changes, triggering an automated CI/CD pipeline that ensures the deployment of updated applications to the production environment while maintaining a smooth transition from development to production.

About Me

As an experienced, fully certified (11x) Google Cloud Architect and Google Cloud Champion Innovator with over 7 years of expertise in Google Cloud networking, data, DevOps, security, and ML, I am passionate about technology and innovation. As a Champion Innovator and Google Cloud Architect, I am always exploring new ways to leverage cloud technologies to deliver innovative solutions that make a difference.

If you have any queries or would like to get in touch, you can reach me at my email address vishal.bulbule@techtrapture.com or connect with me on LinkedIn at https://www.linkedin.com/in/vishal-bulbule/. For a more personal connection, you can also find me on Instagram at https://www.instagram.com/vishal_bulbule/?hl=en.

Additionally, please check out my YouTube Channel at https://www.youtube.com/@techtrapture for tutorials and demos on Google Cloud.
