A to Z of Google Cloud Platform a personal selection — J is for Jenkins
Jenkins is hugely popular, and it's no surprise that you can easily use Jenkins (and Git) as part of a CI/CD pipeline targeting deployments on the GCP compute services, namely Compute Engine, App Engine and Container Engine.
And, as I like to point out, GCS (Cloud Storage) is a valid target too; I will cover that in detail in the second half of this post!
There are a number of plugins that specifically target GCP, and in the first part of this post I'm simply rounding them up for you, together with a few that are just really useful anyway, like the Docker ones.
To integrate Jenkins into your CI/CD deployment pipeline you will need a combination of plugins specifically targeted at Google Cloud Platform.
If you know of any plugins I’ve missed out please let me know so I can keep the list updated.
Google Container Registry Auth Plugin — provides the credential provider to use Google Cloud Platform OAuth credentials (provided by the Google OAuth Credentials plugin) to access Docker images from Google Container Registry (GCR)
Google Cloud Storage Plugin — This plugin provides the “Google Cloud Storage Uploader” post-build step for publishing build artifacts to Google Cloud Storage
Google OAuth Plugin — This plugin implements the OAuth Credentials interfaces for surfacing Google Service Accounts to Jenkins
Google Source Plugin — provides the credential provider to use Google Cloud Platform OAuth Credentials (provided by the Google OAuth Credentials plugin) to access source code from https://source.developer.google.com as well as https://*.googlesource.com. It supports both kinds of credentials provided by Google OAuth Credentials plugin
Google Metadata Plugin — provides a basic framework for steps in a build’s lifecycle to attach JSON-serializable metadata to a build
The Docker image here can be used to create a Jenkins Docker instance with the plugins described above automatically installed.
JClouds Plugin — provides the capability to launch Jenkins slaves on any cloud provider supported by JClouds. You can use it to spin up Jenkins slaves running on Google Compute Engine.
There is also a series of Docker- and Kubernetes-focused plugins that are worth configuring:
Kubernetes Plugin — allows you to use multiple Docker hosts managed by Kubernetes to dynamically provision a slave (using Kubernetes scheduling mechanisms to optimise the loads), run a single build, then tear down that slave.
Docker plugin — uses a Docker host to dynamically provision a slave, run a single build, then tear down that slave.
CloudBees Docker Custom Build Environment Plugin — lets you configure your build to run inside a Docker container
Jenkins workflow plugin — A Groovy based domain-specific language (DSL) to allow the creation of scripted workflow steps
CloudBees Docker workflow plugin — This provides a convenient domain-specific language (DSL) for performing some of the most commonly needed Docker operations in a continuous-deployment pipeline from a Workflow script.
Okay, if you've been following this series you'll know I said I'd conclude the topic of hosting static sites here. Although the section above about plugins can be read as a standalone entry, like the previous entries in this series, the rest of this post assumes that you have read, or are at least familiar with, the entry for H. This is a hands-on walkthrough, so if you do want to follow along you may want to save the rest of this post for when you're sitting in front of a machine.
Note: from here on in I'm making a number of assumptions if you decide to follow along:
- You already have a GCP account and have the gcloud command-line SDK installed.
- You have set up a static site generator and a website bucket (for the purposes of this walkthrough you don't need a DNS CNAME record to follow along). See my post for H.
- You know how to use Jenkins. You don't have to, but I won't be spending time on the fundamentals; I'll be focusing on the steps required to set up the pipeline.
- You have familiarity with Git.
- Note: this will incur charges, so do shut down or delete your project if you do not intend to keep the configuration.
If you're going to follow along I'd suggest creating a new project first.
Ensure you are using the correct project by typing the following to check the project you are working in:
$ gcloud config list
If it's not the right project then type:
$ gcloud config set project your-project-id
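If you are starting fresh, the whole project setup can be sketched as one sequence (the project ID my-static-site-demo is a placeholder; project IDs must be globally unique, so pick your own):

```shell
# Create a new project (placeholder ID; substitute your own)
gcloud projects create my-static-site-demo

# Make it the active project for subsequent commands
gcloud config set project my-static-site-demo

# Confirm the active project
gcloud config list
```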
Configuring Git with your content and Cloud repositories
The steps assume you are using Hugo, as that is what I used, but you can use any static site generator you want; just adjust the instructions below to reflect the location of your content.
If you do not already have some web pages set up, then take time to use a static site generator to create some local pages. In the entry for H I discuss using Hugo to accomplish this.
Note: running the hugo command without the server argument will create content in a public folder in your working directory.
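A minimal sketch of that generation step, assuming Hugo is installed and you are in your site's working directory:

```shell
# Generate the static site; output lands in ./public by default
hugo

# Inspect the generated pages
ls public
```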
The next thing to do is to configure Git (it's beyond the scope of this already rule-busting post to detail installing Git; follow the link for that).
Change to your Hugo working directory and initialise Git in the public folder by typing the following commands:
$ git init public
$ cd public
Log into the Cloud platform development console
When you create a project you automatically get access to a Cloud repository. This repository needs to be initialised, so from the side menu, under Tools, select Development, then Source Code.
Click the Get started button.
Select "Push code from a local Git repository to your Cloud repository".
From the public folder, follow the instructions you see as shown above.
Note: doing it this way round will reinitialise Git so that it synchronises with your Cloud repository.
Follow through the various questions, responding appropriately.
Make sure you choose the following for the two questions below (This is because we already have content in our local public folder)
Pick configuration to use:
 Re-initialize this configuration [default] with new settings
 Create a new configuration
Please enter your numeric choice: 1
This project has one or more associated git repositories.
Pick git repository to clone to your local machine:
 Do not clone
Please enter your numeric choice: 2
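For reference, at the time of writing the console's instructions boil down to something like the following (the project ID is a placeholder; copy the exact commands the console shows you, as they may differ):

```shell
# Let Git authenticate via the Cloud SDK
git config credential.helper gcloud.sh

# Add your Cloud repository as a remote named 'google' (substitute your project ID)
git remote add google https://source.developers.google.com/p/your-project-id/r/default
```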
Then use git to add and commit the files
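From the public folder, a minimal add-and-commit sequence looks like this (the commit message is just an example):

```shell
# Stage everything in the public folder and commit it
git add .
git commit -m "Initial site content"
```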
Then use the following command to push the changes (in this case all the files you added with the git add command) to your remote repository:
$ git push --all google
Now if you look at the repository you will see the files you pushed.
Create a bucket to use for hosting a static website
Create a target bucket to host your static website. See the entry for H in this series for a discussion.
For the purposes of this walkthrough we do not need to set up a CNAME, as it's the mechanics of the pipeline we are looking at.
Create a bucket
Leave it empty at this stage
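From the command line, creating the empty bucket is a one-liner with gsutil (the bucket name below is a placeholder; bucket names are globally unique, and for CNAME-based serving the name would normally match your domain):

```shell
# Create an empty bucket to act as the deployment target
gsutil mb gs://your-site-bucket
```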
Setting up Jenkins to use GCP resources
Firstly, for this stage I hope you are familiar with Jenkins, as its UI is all over the place and finding the right place to do stuff can be frustrating, so there are plenty of screenshots in this section!
A BIG WARNING: WHEN YOU SET UP YOUR JENKINS INSTANCE, CHANGE THE PASSWORD IMMEDIATELY (yes, I shouted that on purpose). Try not to lock yourself out, as I managed to do, having not touched Jenkins for a while... just saying :-)
Google has published an article and a detailed walkthrough on GitHub on setting up a highly available Jenkins configuration. As I am familiar with that, I used it to fire up a Jenkins server. You may find it simpler to use the Bitnami Jenkins image.
The Jenkins configuration steps I outline below should work regardless of how you decide to deploy Jenkins, but I have not tested the Bitnami image, so don't quote me :-)
Once you have Jenkins started:
Make a note of your Cloud repository URL, or keep a tab open on the repository settings page in the console.
Launch the Jenkins image.
Once it is ready for use:
Note: both approaches have the required plugins already installed, so all you need to do is configure them.
First create a credential by clicking the Credentials link in the left nav, then the Global credentials link, then the "How about adding some credentials?" link.
Select "Google Service Account from metadata".
Click "create new jobs" if you see that, or click "New Item" in the left nav. Select "Freestyle project" and enter a job name in the item name box.
Under Source Code Management select Git and paste in your repository URL from the repository settings page.
Configure the post build steps so that it looks like the following:
So at this stage we have your web pages in your repository (plus the build logs; for a production environment you would push the logs to a different bucket), and your bucket is empty.
Now select Build Now from your newly created job's page.
Assuming you have followed the steps above, the build artifacts (in this case the files in the public folder of your Hugo working folder) have now been copied to the bucket you created earlier.
You can see what has been copied to the bucket from within Jenkins by looking at the Google Cloud Storage upload report, and also by looking at the bucket itself.
Now that you have some content, you can configure the bucket to serve static pages as discussed in the post for H in this series.
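As a reminder from the H entry, the website configuration can be applied with gsutil (the bucket name is a placeholder, and I'm assuming your generator produced index.html and 404.html pages):

```shell
# Serve index.html as the default page and 404.html for missing objects
gsutil web set -m index.html -e 404.html gs://your-site-bucket

# Make the objects publicly readable so they can be served
gsutil -m acl ch -R -u AllUsers:R gs://your-site-bucket
```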
Jenkins can be set to poll for changes and automatically push them to the target bucket, or you can invoke a manual build and push.
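If you go the polling route, the job's "Poll SCM" option under Build Triggers takes a cron-style schedule; for example, checking the repository for new commits roughly every five minutes:

```
H/5 * * * *
```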
You may well need to tweak things a little, but you should see that if you create additional files in your public folder, do a git commit and push, then trigger a build (either manually or automatically), the updated files get pushed to your target bucket.
Adding Jenkins and a Cloud repository to your pipeline allows you to:
- Treat your website content like code, giving you version control and the ability to roll back to a specific point in time.
- Easily add contributors: give them access to the project and they can use Git to keep local copies in sync and push changes up to the repository.