Deploy Application on Kubernetes Using Jenkins Multicloud-Jenkins Pipeline with GCR

Siddhesh Patil
6 min read · Mar 31, 2023


Adopting a multi-cloud strategy frees organizations from dependence on a single cloud vendor. It also gives customers leverage to negotiate better rates and service-level agreements with providers. Cloud providers operate data centers across many regions, so a multi-cloud approach lets businesses distribute workloads among several providers, cutting latency and improving the user experience for clients in different geographies.

Figure: Architecture diagram for the implementation of the proposed solution

This architecture shows how quickly an application can be deployed to multiple clouds using Jenkins. It also reflects how today’s businesses are moving from monolithic to microservice architectures to improve their operations.

High-level Steps

  1. AWS and GCP infra setup by Aniket Kumavat.
  2. EKS creation and Jenkins pipeline setup in AWS by Bhavesh Dhande.
  3. Jenkins pipeline setup in GCP by Siddhesh Patil.
  4. GKE creation and routing traffic between GKE and EKS using Route 53 by Rushabh Mahale.

Note: Starting from step 3, I’ll walk through the Jenkins pipeline setup with GCR.

Prerequisites -

  1. GCP infra setup.
  2. Service account (SA) credentials with the “Storage Admin” role.

What is Jenkins Pipeline?

The Jenkins Pipeline plugin for the Jenkins automation server lets you define and manage your build, test, and deployment pipelines as code. A pipeline is defined as a script written in Groovy; the script describes the pipeline for your application, including the build, test, packaging, and deployment phases.
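As a minimal sketch, a declarative pipeline written in a Jenkinsfile looks like this (the stage names and echo steps are illustrative placeholders, not part of this article’s setup):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Compile and package the application here'
            }
        }
        stage('Test') {
            steps {
                echo 'Run the test suite here'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploy the packaged artifact here'
            }
        }
    }
}
```

Each stage appears as its own column in the Jenkins UI, which is what makes the pipeline status visualization mentioned below possible.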
Jenkins Pipeline provides many features, including:

  • Declarative and scripted syntax options for defining pipelines
  • Integration with source control systems like Git
  • Ability to define and reuse stages and steps across multiple pipelines
  • Support for parallel and sequential stages
  • Rich visualization of pipeline status and progress
  • Integration with external tools and services through plugins

Overall, Jenkins Pipeline is a powerful tool for building and managing complex continuous delivery pipelines, and it can help teams streamline their software development and delivery processes.

Setup of Jenkins-pipeline for GCR

Before we start, we need to install some plugins and add the service account credentials so that Jenkins has permission to perform the necessary tasks.

Required Plugins

  • Google Container Registry Auth Plugin
  • Google OAuth Credentials plugin
  • Docker Pipeline

Service Account

In IAM, we create a service account to grant a particular service permission to perform some task, or to enable communication between two services. I have already created a service account and granted it the Storage Admin role.

Note: Aniket Kumavat has already shown the node connection, so I will start directly with the Jenkins job configuration.

Jenkins Job configuration

In our architecture, the first Jenkins pipeline is for ECR (AWS side) and the second is for GCR (GCP side). Bhavesh Dhande has already shown the first pipeline, so let’s start with the second one.

My gcp-node is already connected to the Jenkins master:

Next, I have created a pipeline-based job named GCP Node:

In this job, go to Configure; in the Build Triggers section, select “Build after other projects are built” and choose your main job, AWS Master. Under the options, I am choosing “Trigger even if the build fails”.
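The same trigger can also be expressed in the pipeline code itself via the declarative triggers directive; this is a sketch, with the job name AWS Master taken from the setup above and the FAILURE threshold mirroring “Trigger even if the build fails”:

```groovy
pipeline {
    agent { label 'gcp-node' }
    triggers {
        // Fire after the 'AWS Master' job finishes; a FAILURE threshold
        // means this job runs even when the upstream build fails
        upstream(upstreamProjects: 'AWS Master', threshold: hudson.model.Result.FAILURE)
    }
    stages {
        stage('placeholder') {
            steps {
                echo 'pipeline stages go here'
            }
        }
    }
}
```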

Now for the main part, the pipeline code. I already have the Groovy code with me, and I will explain it step by step:

pipeline {
    agent { label 'gcp-node' }

The whole Jenkins pipeline is written inside the ‘pipeline’ block; the curly brackets ‘{}’ are Groovy syntax, and that is the way it is supposed to be written. The ‘agent’ directive specifies the virtual machine/instance/node on which the code runs; we have a node labelled ‘gcp-node’, so this pipeline will run on that machine.

    stages {
        stage('checkout') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/GCP']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/bhaveshdhande/multicloud-deploy.git']]])
            }
        }

The pipeline code is organized into stages, and each stage is described individually. Instead of ‘checkout’ you can give a stage any name; it is just a label indicating what that particular stage is about. Inside a stage, depending on your build, you write steps, scripts, and so on.
In a Jenkins pipeline, the ‘checkout’ step fetches your application data or files from your GitHub repository.
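When the extra options of the GitSCM class are not needed, the same checkout can be written with the shorter ‘git’ step; this is an equivalent sketch using the branch and URL from the snippet above:

```groovy
stage('checkout') {
    steps {
        // Shorthand equivalent of the GitSCM checkout above
        git branch: 'GCP', url: 'https://github.com/bhaveshdhande/multicloud-deploy.git'
    }
}
```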

        stage('docker build & push') {
            steps {
                script {
                    withDockerRegistry(credentialsId: 'gcr:<project-id>', url: 'https://asia.gcr.io/<project-id>/gcp-app') {
                        def customImage = docker.build("asia.gcr.io/<project-id>/gcp-app:${env.BUILD_ID}")
                        customImage.push('latest')
                    }
                }
            }
        }
    }
}

This stage builds an image from the Dockerfile fetched in the checkout stage and names it ‘asia.gcr.io/<project-id>/gcp-app’, my repository location, tagged with the current build number. The call customImage.push(‘latest’) re-tags the current build’s image with the ‘latest’ tag and pushes it to the GCR repository, authenticating with the credentials stored under credentialsId ‘gcr:<project-id>’.
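A common variation worth knowing, sketched here from the same stage, is to push the build-number tag as well as ‘latest’, so every build remains retrievable in GCR instead of only the most recent one:

```groovy
stage('docker build & push') {
    steps {
        script {
            withDockerRegistry(credentialsId: 'gcr:<project-id>', url: 'https://asia.gcr.io/<project-id>/gcp-app') {
                def customImage = docker.build("asia.gcr.io/<project-id>/gcp-app:${env.BUILD_ID}")
                // Push the immutable build-number tag as well as 'latest'
                customImage.push("${env.BUILD_ID}")
                customImage.push('latest')
            }
        }
    }
}
```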

Full code:

pipeline {
    agent { label 'gcp-node' }

    stages {
        stage('checkout') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/GCP']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/bhaveshdhande/multicloud-deploy.git']]])
            }
        }
        stage('docker build & push') {
            steps {
                script {
                    withDockerRegistry(credentialsId: 'gcr:<project-id>', url: 'https://asia.gcr.io/<project-id>/gcp-app') {
                        def customImage = docker.build("asia.gcr.io/<project-id>/gcp-app:${env.BUILD_ID}")
                        customImage.push('latest')
                    }
                }
            }
        }
    }
}

After this pipeline runs successfully, your image will be stored in GCR:
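To confirm the push without leaving Jenkins, you could append a verification stage that lists the tags in the repository. This sketch assumes the gcloud CLI is installed and authenticated on the gcp-node agent:

```groovy
stage('verify image') {
    steps {
        // List the most recent tags of the pushed image; requires gcloud on the agent
        sh 'gcloud container images list-tags asia.gcr.io/<project-id>/gcp-app --limit=5'
    }
}
```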

What is GCR?

Google Container Registry, also known as GCR, is a managed service for storing Docker container images, offered by Google Cloud Platform. It lets you store, manage, and deploy your Docker container images within the Google Cloud Platform ecosystem, and it provides a secure and dependable place to host them.

One advantage of GCR is its integration with other Google Cloud Platform services: you can easily deploy your container images to Google Kubernetes Engine (GKE), or to any other service offered by the Google Cloud Platform, such as Cloud Run or App Engine.

Overall, GCR is an effective solution for managing your Docker container images within the Google Cloud Platform ecosystem. It can also assist you in streamlining the workflows involved in developing and deploying container-based applications.

As you can see, the Jenkins pipeline has pushed my image to the GCR repository, at the same path given in the Groovy code: asia.gcr.io/<project-id>/gcp-app.

Conclusion

There are numerous tools on the market that support developers’ workflows; some are free and open source, while others are paid. As we have seen, Jenkins is one of the best-known and most widely used open-source tools: it is used to build Docker containers, build code, and deploy to staging and production environments. I hope this article helped you understand Jenkins and its pipelines.

In case of any questions regarding this article, please feel free to comment in the comments section or contact me via LinkedIn.

I’d like to give a shout-out to my team at Guysinthecloud for all the support.

Thank You !!
