Using GitLab and Cloud Build to achieve CI/CD for Cloud Run
A client came to me and said “I have this Python code that I want to run on a regular schedule so I figured I would package it in Cloud Run and use Cloud Scheduler to invoke it. This works. However, I don’t want to keep manually re-building container images each time I tweak the source of the program. Can we automate these tasks to achieve CI/CD?”.
I knew the theory of how it all should work but hadn’t had the opportunity to use it in practice. After hitting the books and doing a little study, I found that it worked as advertised but there were a lot of steps. This article walks us through the activities to get it running from scratch. If you wish, you can follow along or else watch the narrated video found at the end of the article.
The overall architecture is as follows:
The way to interpret this diagram: when an application makes a REST request, it is Cloud Run that executes the Python code. Cloud Run expects a Docker image to execute and asks Artifact Registry for that image; Artifact Registry then serves it up. It is the Docker image that actually contains the Python code for execution.
Let us now look at where the Docker image can come from. We assume that the client is using GitLab as a source code management system. Code can be checked out, edited, and committed back into the GitLab repository once source file changes have been made. The act of pushing the changes back into GitLab can be used to trigger Cloud Build to do its part. Cloud Build is the heart of the CI/CD story: it clones the latest source from GitLab, produces a new Docker image from the code, stores the newly built image in Artifact Registry, and finally instructs Cloud Run to begin using that image. All of this is automated, meaning that the developer who makes changes to the code need know nothing about the process that causes it to be deployed.
In this article we will:
- Create a GCP project
- Enable a set of GCP APIs
- Create an Artifact Registry Repository
- Create a Cloud Build trigger
- Create SSH keys
- Create GCP Secret Manager secrets
- Install and configure GitLab from Cloud Marketplace
- Wire it all together and test
And now … the recipe …
1. Create a Project
We need a GCP project in which to work. For this demonstration we will assume a completely new project and create one here.
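If you prefer the command line, project creation can be sketched with gcloud; the project ID my-cicd-project is a placeholder you would replace with a globally unique ID of your own:

```shell
# Create a brand new project (the ID must be globally unique).
gcloud projects create my-cicd-project --name="CICD Demo"

# Make it the default project for subsequent gcloud commands.
gcloud config set project my-cicd-project
```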
2. Create a VPC
Following Google best practices, we won’t assume the use of a default VPC network but rather we will create one for ourselves. First we enable the Compute Engine API:
Next we create a VPC that we call my-vpc. We are working in the us-central1 region.
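The gcloud equivalent can be sketched as follows; the subnet name and IP range are assumptions you would adapt to your own environment:

```shell
# Enable the Compute Engine API before any VPC work.
gcloud services enable compute.googleapis.com

# Create a custom-mode VPC and a subnet in us-central1.
gcloud compute networks create my-vpc --subnet-mode=custom
gcloud compute networks subnets create my-subnet \
  --network=my-vpc --region=us-central1 --range=10.0.0.0/24
```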
3. Create a firewall rule allowing port 22 (SSH)
In this exercise, we need to enable one firewall setting for our new VPC. This will allow ingress (incoming connections) using the SSH protocol. When we install GitLab, we will be using the SSH protocol to interact with it.
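One way to express this rule with gcloud is shown below; the rule name is illustrative, and in a real environment you would tighten the source range rather than allow all addresses:

```shell
# Allow inbound SSH (TCP port 22) into my-vpc from anywhere.
gcloud compute firewall-rules create allow-ssh \
  --network=my-vpc --direction=INGRESS \
  --allow=tcp:22 --source-ranges=0.0.0.0/0
```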
4. Install GitLab CE from Bitnami
While we could assume that the client already has GitLab installed and configured, for this exercise we wanted a GitLab sandbox that we could use for testing the linkage between code changes and Cloud Build execution. GCP Marketplace allows us to install a fully configured GitLab Community Edition in just a few clicks.
During installation, we will be asked to enable some GCP APIs.
Once we have agreed to use the APIs, a configuration page is shown for the GitLab instance we will create. We choose the GCP zone (us-central1-a) into which it will be installed. We also specify how large a machine we want to use to run it.
When we click Deploy, the GCP Marketplace environment creates a Compute Engine instance and installs and configures GitLab into that VM. Unfortunately, in my tests, the install appears to stall … however, everything is working fine and I suspect this is a result of my company’s organization policies being very restrictive.
It takes about 10 minutes for the installation and configuration to complete.
5. Note the IP of the GitLab machine (eg. 34.68.171.109)
At this point, we have an instance of GitLab up and running. In order to use it, we must determine the public IP address of the instance. We visit the Compute Engine page in Cloud Console and look for a VM with the name of our GitLab instance. Make a note of the public External IP address. We will need this value many times in the subsequent steps.
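The external IP can also be retrieved from Cloud Shell; the instance name gitlab-1-vm below is hypothetical and should be replaced with whatever name the Marketplace deployment chose:

```shell
# List all VMs along with their external IPs.
gcloud compute instances list

# Or pull just the external IP of the GitLab VM.
gcloud compute instances describe gitlab-1-vm --zone=us-central1-a \
  --format='get(networkInterfaces[0].accessConfigs[0].natIP)'
```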
6. Login with root/password
In an incognito browser window, visit http://<IP Address>, where the IP address is that of the VM instance. You may get an SSL warning; simply accept and move on. Log in with the userid of “root” and the admin password shown in the Marketplace install screen (eg. as seen in our screenshot, a17XHUq71y4f).
7. Change the root password
The root password is random; this step has us change it to a memorable value.
Select Menu > Admin and then Overview > Users
Edit the Administrator user and supply a new password:
You will be immediately signed out. Sign in again using root and the new password.
8. Create a GitLab user called gcpbuild
Go to Menu > Admin and then Overview > Users
Click the New user button:
Give the username as gcpbuild and specify a junk email address for the Email field.
Once created, go back to Overview > Users and edit the new “gcpbuild” user. You will now find that you can set a password.
9. Become gcpbuild
Log out of the root user and log in as the gcpbuild user. You will be prompted to set a new password. I find that I can set the password to the same password I just entered.
10. Create a GitLab project called myproject
Now that we have GitLab up and running and a user with which we can log in, we can create a project to host our source files. In our example we called our project myproject.
11. Open a GCP Cloud Shell
In the next step, we are going to create some SSH keys, and we will need an environment in which we can edit source code. I suggest opening a GCP Cloud Shell.
12. Create some SSH keys
GitLab requires that requests sent to it be authenticated. The easiest way to achieve this is to use SSH authentication. This means that we create an SSH key pair (public and private). We will give the public key to GitLab and hold on to the private key. The following commands create a key and a config file in the .ssh directory. The config file then needs to be edited to include the GitLab IP address:
cd ~/.ssh
ssh-keygen -t ed25519 -f gitlab_key -q -N ""
cat << EOF >> config
Host [GITLAB_IP]
IdentityFile ~/.ssh/gitlab_key
EOF
Edit the config file and set the IP address of the GitLab server, replacing “[GITLAB_IP]”:
Host [GITLAB_IP]
IdentityFile ~/.ssh/gitlab_key
13. Add the SSH keys (public key)
We need to associate the public key that we just created (the content of the file gitlab_key.pub) with the gcpbuild user. Back in the GitLab browser window, we will see a button to “Add SSH Key”:
Paste the content of gitlab_key.pub (the public part of the SSH key pair) into the key text area and click “Add key”.
We can test that security is setup correctly by running the following command from Cloud Shell:
ssh -T git@[GITLAB_IP]
14. Make a working directory
We are now going to start populating the GitLab project with files. Create a working directory into which we will place our artifacts.
cd ~
mkdir gitlab
cd gitlab
15. Checkout the project
We can now clone our project.
git clone git@[GITLAB_IP]:gcpbuild/myproject.git
This will create a local folder called myproject. Change into that folder:
cd myproject
16. Add the following files to the GitLab project.
We are going to add a Dockerfile, a Python app (app.py), and a Python requirements file (requirements.txt).
Dockerfile
FROM python:3
WORKDIR /usr/src/app
COPY app.py ./
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
CMD exec gunicorn --bind :$PORT --workers 1 --threads 8 --timeout 0 app:app
app.py
from google.cloud import logging
import os
from flask import Flask

app = Flask(__name__)

@app.route('/')
def runApp():
    print("Starting!")
    logging_client = logging.Client()
    log_name = "my-log"
    logger = logging_client.logger(log_name)
    text = "Hello, world! - said the Python app!"
    logger.log_text(text)
    print("Logged a record to the log called my-log")
    return "All Done"

if __name__ == '__main__':
    server_port = os.environ.get('PORT', '8080')
    app.run(debug=False, port=server_port, host='0.0.0.0')
requirements.txt
google-cloud-logging
Flask
gunicorn
The sample app logs a record into Google Cloud Logging and returns a text string to the caller. What the app does isn’t nearly as important as what we are illustrating in terms of CI/CD.
17. Commit and push the files
git add .
git commit -m "Initial commit"
git push
18. Enable Cloud Run API
We are now ready to start fleshing out the CI/CD tasks. One of these is enabling Cloud Run so that it is ready to start serving up the app for client calls.
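Assuming the gcloud CLI is pointed at this project, enabling Cloud Run is a single command:

```shell
# Enable the Cloud Run API for the current project.
gcloud services enable run.googleapis.com
```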
19. Enable Secret Manager API
As part of the configuration of our story, we will create two GCP secrets. One is used to validate that an incoming request is entitled to start a Cloud Build run; this will arrive from GitLab when the code changes. The second secret is used to hold the private key for the GitLab user called gcpbuild; this will be used by Cloud Build to retrieve the latest version of the source.
20. Create a secret called gitlab-key
We create the first of the secrets, which is called gitlab-key.
21. Upload the gitlab private key file or copy in its data
We copy the private key file data into the value of the secret. This is the secret that GCP keeps safe for use with Cloud Build.
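In gcloud terms, this might look like the following; the key path assumes the Cloud Shell steps earlier in this article:

```shell
# Enable the Secret Manager API (if not already enabled).
gcloud services enable secretmanager.googleapis.com

# Create the secret container, then store the private key as its
# first version.
gcloud secrets create gitlab-key --replication-policy=automatic
gcloud secrets versions add gitlab-key --data-file="$HOME/.ssh/gitlab_key"
```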
22. Enable the Cloud Build API
Before we can use Cloud Build, we must enable its API.
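From the command line, this is again a one-liner:

```shell
gcloud services enable cloudbuild.googleapis.com
```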
23. Change the settings to allow Cloud Run and Cloud Secrets
Cloud Build may need additional permissions in order to perform its tasks. We need to enable “Cloud Run” and “Secret Manager”. This grants IAM permissions for Cloud Build to create/modify Cloud Run definitions and retrieve values from Secret Manager.
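The same grants can be sketched with gcloud; this is a sketch of the roles involved, assuming the default Cloud Build service account naming convention:

```shell
PROJECT_ID=$(gcloud config get-value project)
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" \
  --format='value(projectNumber)')
CB_SA="${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com"

# Allow Cloud Build to create/modify Cloud Run services.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${CB_SA}" --role=roles/run.admin

# Allow Cloud Build to read values from Secret Manager.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${CB_SA}" --role=roles/secretmanager.secretAccessor

# Deploying to Cloud Run also requires acting as its runtime
# service account.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${CB_SA}" --role=roles/iam.serviceAccountUser
```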
24. Enable Artifact Registry API.
We will be storing the built Docker Image in a repository in the Artifact Registry. We must enable the registry API before we can use it.
25. Create an Artifact repository called my-repo.
With the Artifact Registry enabled, we now create a named repository within it that can hold Docker Images.
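Steps 24 and 25 together can be expressed as:

```shell
# Enable the Artifact Registry API.
gcloud services enable artifactregistry.googleapis.com

# Create a Docker-format repository in us-central1.
gcloud artifacts repositories create my-repo \
  --repository-format=docker --location=us-central1
```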
26. Create a WebHook trigger in Cloud Build
We are getting close to the end. We wish Cloud Build to execute and build a Docker image when a request arrives from GitLab informing us that the source has changed. GitLab has the ability to invoke a WebHook when it detects changes. We create a trigger in Cloud Build that we will associate with GitLab. The trigger also specifies the recipe that we wish Cloud Build to run. In this article we aren’t going to go through the details of Cloud Build, but it is important to understand that you must modify the _GITLAB_IP and _PROJECT_NUMBER values to correspond to the IP address of your GitLab instance and your GCP project number.
At the highest level, the steps contained in the recipe are:
- Copy in the private key to allow Git running in Cloud Build permission to clone the project.
- Clone the project from GitLab to the local directory.
- Build a Docker Image from the Docker file and artifacts in the GitLab project.
- Push the resulting Docker Image to Artifact Registry
- Deploy the new Docker Image to our Cloud Run environment
Name: mytrigger
Event: Webhook URL
Secret: CREATE SECRET
Configuration: Inline with the following
steps:
- name: gcr.io/cloud-builders/git
args:
- '-c'
- |
echo "$$SSHKEY" > /root/.ssh/id_rsa
chmod 400 /root/.ssh/id_rsa
ssh-keyscan ${_GITLAB_IP} > /root/.ssh/known_hosts
entrypoint: bash
secretEnv:
- SSHKEY
volumes:
- name: ssh
path: /root/.ssh
- name: gcr.io/cloud-builders/git
args:
- clone
- 'git@${_GITLAB_IP}:${_GITLAB_NAME}.git'
- .
volumes:
- name: ssh
path: /root/.ssh
- name: gcr.io/cloud-builders/docker
args:
- build
- '-t'
- '${_ARTIFACT_REPO}'
- .
- name: gcr.io/cloud-builders/docker
args:
- push
- '${_ARTIFACT_REPO}'
- name: gcr.io/google.com/cloudsdktool/cloud-sdk
args:
- run
- deploy
- my-service
- '--image'
- '${_ARTIFACT_REPO}'
- '--region'
- us-central1
- '--allow-unauthenticated'
entrypoint: gcloud
substitutions:
_PROJECT_NUMBER: '689284316634'
_GITLAB_IP: 34.123.168.233
_ARTIFACT_REPO: 'us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REPO_NAME}/myimg'
_GITLAB_NAME: gcpbuild/myproject
_ARTIFACT_REPO_NAME: my-repo
availableSecrets:
secretManager:
- versionName: 'projects/${_PROJECT_NUMBER}/secrets/gitlab-key/versions/1'
env: SSHKEY
27. Grant [PROJECT_NUMBER]-compute@developer.gserviceaccount.com the Logs Writer Role
The application we wish to run in Cloud Run wishes to write to Cloud Logging. We must thus permit Cloud Run to write to Cloud Logging.
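A gcloud sketch of this grant, which looks up the project number rather than hard-coding it:

```shell
PROJECT_ID=$(gcloud config get-value project)
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" \
  --format='value(projectNumber)')

# Let the Compute Engine default service account (used by the Cloud
# Run service) write to Cloud Logging.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${PROJECT_NUMBER}-compute@developer.gserviceaccount.com" \
  --role=roles/logging.logWriter
```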
28. Create a WebHook trigger in GitLab
We created the trigger in Cloud Build that, when called, will rebuild the image and deploy to Cloud Run. Now we must configure GitLab to invoke the WebHook and hence trigger Cloud Build:
Under Settings > Webhooks
Grab the URL from the Cloud Build trigger:
29. Test the webhook from within GitLab.
GitLab has its own test environment to unit test a WebHook.
30. Run the Cloud Run function and show that it works.
When the re-build is complete, we can run the Cloud Run function and see that it works.
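One way to invoke the deployed service from Cloud Shell; the service name my-service and region come from the Cloud Build recipe above:

```shell
# Look up the service URL and call it. The sample app should
# respond with "All Done".
URL=$(gcloud run services describe my-service \
  --region=us-central1 --format='value(status.url)')
curl "$URL"
```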
31. Change code and view that the Cloud Build runs and changes take place.
In GitLab, we can now change the source code and see that the new code is redeployed to Cloud Run.
As we saw, there were a lot of steps, but each served an important purpose in our illustrative story. As an aid, here again is the same story, this time as a recorded walkthrough executed on GCP with additional commentary.