Continuous Deployment of Node.js applications on Google Cloud Run using Dagger and GitHub Actions

Siddharth Satpute
Google Cloud - Community
Jul 19, 2023 · 11 min read

Context:

Today's Cloud DevOps transformation has changed the way applications are built and delivered to customers, with quality and velocity as the main focus areas.

Many tools now give developers full control over CI/CD for their applications. For example, HashiCorp Waypoint acts as a PaaS (Platform as a Service), Harness provides CI/CD, and Pulumi has recently joined the club, strengthening developer productivity by letting developers write their delivery logic in a familiar programming language.

Amid all these tools and their ecosystem complexity sits a platform called Dagger.io, from Solomon Hykes, co-founder of Docker, the famous container king.

What is Dagger?

Dagger is an open source dev kit for CI/CD. It originated with Cue, a powerful configuration language made by Google that helps to validate and define text-based, dynamic configurations, and it now ships SDKs for general-purpose languages. To put it in a sentence: Dagger is a programmable CI/CD engine that runs your pipelines in containers.

Who should use Dagger.io?

Dagger may be a good fit if you are…

  • A developer who wishes your CI pipelines were code instead of YAML, defined in a familiar language such as JavaScript, Python, or Go and testable like any other code.
  • Your team’s designated DevOps person, looking to upgrade from artisanal shell scripts to powerful, maintainable automation without tying that logic to a single CI vendor.
  • A platform engineer writing custom tooling to unify continuous delivery across organisational silos, on top of a consistent, containerised execution model.
  • A cloud-native developer advocate or solutions engineer who needs to demonstrate a complex integration on short notice, using a pipeline that runs identically on a laptop and in CI.

Features of Dagger.io:

Dagger.io offers several powerful features that enhance the development of CI/CD pipelines. Here’s an overview of the key functionalities:

  • CI/CD Pipelines as Code: Dagger.io allows you to develop your CI/CD pipelines using the same programming language as your application.
  • Execution with OCI Containers: Dagger.io executes your pipelines entirely using standard OCI containers. This approach provides isolation, reproducibility, and consistency across different environments.
  • Cross-Language Instrumentation: Dagger.io enables teams to utilise tools developed in different programming languages without requiring them to learn each other’s languages.
  • Compatibility with the Docker Ecosystem: Dagger.io integrates seamlessly with the Docker ecosystem. You can incorporate any containerised tool or service into your pipeline. This compatibility allows you to leverage the vast array of existing Docker containers and tools to enhance your CI/CD workflows.

Typical Workflow of Dagger.io

Workflow chart for Dagger.io CI/CD

Implementing CI/CD of a Node.js Application Using Dagger.io with GitHub Actions and Google Cloud

Prerequisites for the implementation:

  • You have a basic understanding of JavaScript and Node.js.
  • You have a basic understanding of GitHub Actions.
  • You have a Node.js development environment.
  • You have Docker installed and running on the host system.
  • You have the Google Cloud CLI installed.
  • You have a Google Cloud account and a Google Cloud project with billing enabled.
  • You have a GitHub account and a GitHub repository containing a Node.js Web application. This repository should also be cloned locally in your development environment.

Demo:

Step 1 : Create a GitHub repository with an example express application

Follow the steps below to create a GitHub repository and commit an example Express application to it.

  1. Log in to GitHub using the GitHub CLI:
gh auth login

2. Create a directory for the Express application and scaffold a skeleton Express application inside it:

mkdir myapp
cd myapp
npx express-generator

3. Make a minor modification to the application’s index page:

sed -i -e 's/Express/Dagger/g' routes/index.js

4. Initialize a local Git repository for the application, add a .gitignore file, and commit the application code:

git init
echo node_modules >> .gitignore
git add .
git commit -a -m "Initial commit"

5. Create a private repository in your GitHub account and push the changes to it:

gh repo create myapp --push --source . --private

Step 2 : Create a Google Cloud service account

  1. Create a service account using the gcloud command:

gcloud iam service-accounts create [SA-NAME] --description="[SA-DESCRIPTION]" --display-name="[SA-DISPLAY-NAME]"

2. Assign the service account the Editor and Service Account Token Creator roles:

gcloud projects add-iam-policy-binding [PROJECT-ID] --member="serviceAccount:[SA-NAME]@[PROJECT-ID].iam.gserviceaccount.com" --role="roles/editor"
gcloud projects add-iam-policy-binding [PROJECT-ID] --member="serviceAccount:[SA-NAME]@[PROJECT-ID].iam.gserviceaccount.com" --role="roles/iam.serviceAccountTokenCreator"

3. Create and download a JSON key file for the service account; this key is used for authentication in the steps that follow:

gcloud iam service-accounts keys create key.json --iam-account="[SA-NAME]@[PROJECT-ID].iam.gserviceaccount.com"

Step 3: Configure Google Cloud APIs and the Cloud Run service.

  1. From the navigation menu in the Google Cloud console, search for “APIs & Services” and enable the IAM Service Account Credentials API and the Cloud Run API.
  2. Create a Google Cloud Run service with a corresponding public URL endpoint. This service will eventually host the container deployed by the Dagger pipeline. For example (using a placeholder starter image):

gcloud run deploy myapp --image=gcr.io/cloudrun/hello --region=us-central1 --allow-unauthenticated

Step 4: Create the Dagger Pipeline.

In this step we will create a Dagger pipeline to do the heavy lifting: build a container image of the application, release it to Google Container Registry and deploy it on Google Cloud Run.

  1. In the application directory, install the Dagger SDK and the Google Cloud Run client library as development dependencies:
npm install @dagger.io/dagger@latest @google-cloud/run --save-dev

2. Create a new sub-directory named ci. Within the ci directory, create a file named index.mjs and add the following code to it. Replace the PROJECT placeholder with your Google Cloud project identifier and adjust the region (us-central1) and service name (myapp) if you specified different values when creating the Google Cloud Run service in Step 3.

import { connect } from "@dagger.io/dagger"
import { ServicesClient } from "@google-cloud/run"

const GCR_SERVICE_URL = 'projects/PROJECT/locations/us-central1/services/myapp'
const GCR_PUBLISH_ADDRESS = 'gcr.io/PROJECT/myapp'

// initialize Dagger client
connect(async (daggerClient) => {
  // get reference to the project directory
  const source = daggerClient.host().directory(".", { exclude: ["node_modules/", "ci/"] })

  // get Node image
  const node = daggerClient.container({ platform: "linux/amd64" }).from("node:16")

  // mount cloned repository into Node image
  // install dependencies
  const c = node
    .withMountedDirectory("/src", source)
    .withWorkdir("/src")
    .withExec(["cp", "-R", ".", "/home/node"])
    .withWorkdir("/home/node")
    .withExec(["npm", "install"])
    .withEntrypoint(["npm", "start"])

  // publish container to Google Container Registry
  const gcrContainerPublishResponse = await c.publish(GCR_PUBLISH_ADDRESS)

  // print ref
  console.log(`Published at: ${gcrContainerPublishResponse}`)

  // initialize Google Cloud Run client
  const gcrClient = new ServicesClient()

  // define service request
  const gcrServiceUpdateRequest = {
    service: {
      name: GCR_SERVICE_URL,
      template: {
        containers: [
          {
            image: gcrContainerPublishResponse,
            ports: [
              {
                name: "http1",
                containerPort: 3000
              }
            ]
          }
        ],
      },
    }
  }

  // update service
  const [gcrServiceUpdateOperation] = await gcrClient.updateService(gcrServiceUpdateRequest)
  const [gcrServiceUpdateResponse] = await gcrServiceUpdateOperation.promise()

  // print ref
  console.log(`Deployment for image ${gcrContainerPublishResponse} now available at ${gcrServiceUpdateResponse.uri}`)
}, { LogOutput: process.stdout })

This file performs the following operations:

  • It imports the Dagger and Google Cloud Run client libraries.
  • It creates a Dagger client with connect(). This client provides an interface for executing commands against the Dagger engine.
  • It uses the client’s host().directory() method to obtain a reference to the current directory on the host, excluding the node_modules and ci directories. This reference is stored in the source variable.
  • It uses the client’s container().from() method to initialize a new container from a base image. The additional platform argument to the container() method instructs Dagger to build for a specific architecture. In this example, the base image is the node:16 image and the architecture is linux/amd64, which is one of the architectures supported by Google Cloud. This method returns a Container representing an OCI-compatible container image.
  • It uses the previous Container object's withMountedDirectory() method to mount the host directory into the container at the /src mount point, and the withWorkdir() method to set the working directory in the container.
  • It chains the withExec() method to copy the contents of the working directory to the /home/node directory in the container and then uses the withWorkdir() method to change the working directory in the container to /home/node.
  • It chains the withExec() method again to install dependencies with npm install and sets the container entrypoint using the withEntrypoint() method.
  • It uses the container object’s publish() method to publish the container to Google Container Registry, and prints the SHA identifier of the published image.
  • It creates a Google Cloud Run client, updates the Google Cloud Run service defined in Step 2 to use the published container image, and requests a service update.
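The shape of the Cloud Run service update request used above can be sketched as a plain object builder and inspected without any cloud calls. The buildUpdateRequest helper below is a name invented for this sketch, not part of the Dagger or Cloud Run SDKs:

```javascript
// Illustrative sketch: construct the updateService request body used by the
// pipeline, given a published image reference. No cloud credentials needed.
function buildUpdateRequest(serviceName, imageRef, port = 3000) {
  return {
    service: {
      name: serviceName, // fully qualified Cloud Run service name
      template: {
        containers: [
          {
            image: imageRef, // digest-pinned reference returned by publish()
            ports: [{ name: "http1", containerPort: port }],
          },
        ],
      },
    },
  };
}

const req = buildUpdateRequest(
  "projects/PROJECT/locations/us-central1/services/myapp",
  "gcr.io/PROJECT/myapp@sha256:b1cf..."
);
console.log(req.service.template.containers[0].image);
// prints "gcr.io/PROJECT/myapp@sha256:b1cf..."
```

Pinning the image by digest, as publish() returns it, means each deployment references exactly the bytes that were just built, rather than a mutable tag.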

Step 5: Test the Dagger pipeline on the local host.

Configure credentials for the Google Cloud SDK on the local host, as follows:

  1. Configure Docker credentials for Google Container Registry on the local host using the following commands. Replace the SERVICE-ACCOUNT-ID placeholder with the email address of the service account created in Step 2, and the SERVICE-ACCOUNT-KEY-FILE placeholder with the location of the service account JSON key file downloaded in Step 2.
gcloud auth activate-service-account SERVICE-ACCOUNT-ID --key-file=SERVICE-ACCOUNT-KEY-FILE
gcloud auth configure-docker

2. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of the service account JSON key file, replacing the SERVICE-ACCOUNT-KEY-FILE placeholder in the following command. This variable is used by the Google Cloud Run client library during the client authentication process.

export GOOGLE_APPLICATION_CREDENTIALS=SERVICE-ACCOUNT-KEY-FILE
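Before running the pipeline, you can optionally sanity-check that this variable points at a readable JSON key. The credentialStatus helper below is an invented name for illustration and is not part of the tutorial's pipeline:

```javascript
import { readFileSync } from "node:fs";

// Returns a human-readable status for the configured key file path.
function credentialStatus(keyPath) {
  if (!keyPath) return "GOOGLE_APPLICATION_CREDENTIALS is not set";
  try {
    const key = JSON.parse(readFileSync(keyPath, "utf8"));
    return `Using service account: ${key.client_email ?? "unknown"}`;
  } catch (err) {
    return `Key file unreadable or not JSON: ${err.message}`;
  }
}

console.log(credentialStatus(process.env.GOOGLE_APPLICATION_CREDENTIALS));
```

Running this before node ci/index.mjs catches a missing or malformed key early, instead of partway through a pipeline run.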

Once credentials are configured, test the Dagger pipeline by running the command below:

node ci/index.mjs

Dagger performs the operations defined in the pipeline script, logging each operation to the console. At the end of the process, the built container is deployed to Google Cloud Run and a message similar to the one below appears in the console output:

Deployment for image gcr.io/PROJECT/myapp@sha256:b1cf... now available at https://...run.app

Browse to the URL shown in the deployment message to see the running application.

Step 6: Create a GitHub Actions workflow.

Dagger executes your pipelines entirely as standard OCI containers. This means that the same pipeline will run the same way, whether on your local machine or a remote server.

This also means that it’s very easy to move your Dagger pipeline from your local host to GitHub Actions — all that’s needed is to commit and push the pipeline script from your local clone to your GitHub repository, and then define a GitHub Actions workflow to run it on every commit.

  1. Commit and push the pipeline script and related changes to the application’s GitHub repository:
git add .
git commit -a -m "Added pipeline"
git push

2. In the GitHub repository, create a new workflow file at .github/workflows/main.yml with the following content:

name: 'ci'

on:
  push:
    branches:
      - master

jobs:
  dagger:
    runs-on: ubuntu-latest
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
      -
        id: 'auth'
        name: 'Authenticate to Google Cloud'
        uses: 'google-github-actions/auth@v1'
        with:
          token_format: 'access_token'
          credentials_json: '${{ secrets.GOOGLE_CREDENTIALS }}'
      -
        name: Login to Google Container Registry
        uses: docker/login-action@v2
        with:
          registry: gcr.io
          username: oauth2accesstoken
          password: ${{ steps.auth.outputs.access_token }}
      -
        name: Setup node
        uses: actions/setup-node@v3
        with:
          node-version: 16.13.x
          cache: npm
      -
        name: Install
        run: npm install
      -
        name: Release and deploy with Dagger
        run: node ci/index.mjs

This workflow runs on every commit to the repository master branch. It consists of a single job with six steps, as below:

  • The first step uses the Checkout action to check out the latest source code from the master branch to the GitHub runner.
  • The second step uses the Authenticate to Google Cloud action to authenticate to Google Cloud. It requires a service account key in JSON format, which it expects to find in the GOOGLE_CREDENTIALS GitHub secret. This step sets various environment variables (including the GOOGLE_APPLICATION_CREDENTIALS variable required by the Google Cloud Run SDK) and returns an access token as output, which is used to authenticate the next step.
  • The third step uses the Docker Login action and the access token from the previous step to authenticate to Google Container Registry from the GitHub runner. This is necessary because Dagger relies on the host’s Docker credentials and authorizations when publishing to remote registries.
  • The fourth and fifth steps install Node.js and the required dependencies (such as the Dagger SDK and the Google Cloud Run SDK) on the GitHub runner.
  • The sixth and final step executes the Dagger pipeline.

The Authenticate to Google Cloud action looks for a JSON service account key in the GOOGLE_CREDENTIALS GitHub secret. Create this secret as follows:

  1. Navigate to the Settings -> Secrets -> Actions page in the GitHub Web interface.
  2. Click New repository secret to create a new secret.
  3. Configure the secret with the following inputs:
  • Name: GOOGLE_CREDENTIALS
  • Secret: The contents of the service account JSON key file downloaded in Step 2.

4. Click Add secret to save the secret.

Step 7: Test the Dagger pipeline on GitHub.

Test the Dagger pipeline by committing a change to the GitHub repository.

git pull
sed -i 's/Dagger/Dagger on GitHub/g' routes/index.js
git add routes/index.js
git commit -a -m "Update welcome message"
git push

The commit triggers the GitHub Actions workflow defined in Step 6. The workflow runs the various steps of the dagger job, including the pipeline script.

At the end of the process, a new version of the built container image is released to Google Container Registry and deployed on Google Cloud Run. A message similar to the one below appears in the GitHub Actions log:

Deployment for image gcr.io/PROJECT/myapp@sha256:h4si... now available at https://...run.app

Browse to the URL shown in the deployment message to see the running application. If you deployed the example application with the additional modification above, you see a page similar to that shown below:

Updated Deployment page with newer welcome note

Conclusion

This blog walked you through the process of creating a Dagger pipeline to continuously build and deploy a Node.js application on Google Cloud Run. It used the Dagger SDKs and explained key concepts, objects and methods available in the SDKs to construct a Dagger pipeline. Dagger executes your pipelines entirely as standard OCI containers. This means that pipelines can be tested and debugged locally, and that the same pipeline will run consistently on your local machine, a CI runner, a dedicated server, or any container hosting service. This portability is one of Dagger’s key advantages, and this tutorial demonstrated it in action by using the same pipeline on the local host and on GitHub.

Drop your comments, feedback, or suggestions below, or connect with me directly on LinkedIn: https://www.linkedin.com/in/siddharth-satpute/

Happy learning !!!
