Published in The Startup
Github Package Lifecycle: Tag, Publish & Usage in Native Docker Build

GitHub Package Registry, announced in early 2019, is a fully GitHub-integrated package management service where we can store npm, gem, Maven, NuGet, and Gradle packages. By using GitHub Actions and Packages together effectively, we can build a smooth release and packaging flow.

In this post, we are going to walk through the whole lifecycle of an npm package. Here’s what we will achieve:

  1. Create a Node.js project as a module
  2. Use published module in app
  3. Configure Github Actions for new tag and publish package with auto-increment in version
  4. Use the npm package in docker build
  5. Create docker image on Google Cloud Build

So, let’s dive in. All the code and configs I am going to use are available in this repo: https://github.com/dwdraju/github-npm-package-demo

The only additional step apart from git is generating a GitHub token. Head over to Personal access tokens under your profile's Developer settings and create a new token with the repo, read:packages, and write:packages scopes.

Create a Node.js project as a module

The first step is to create a package.json file and a simple module which can be used in our real app.
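The module itself can be as small as a single exported function. Here is a minimal sketch of what index.js might contain; the printMsg name and message are inferred from the demo output later in the post:

```javascript
// index.js — a one-function module to publish as a package
function printMsg() {
  return "Hey! Thanks for using my awesome package!!";
}

module.exports = { printMsg };
```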

Here, the publishConfig should point the registry key at the GitHub Package registry URL.
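A minimal package.json for the module might look like the sketch below; the publishConfig block is the important part, and the name/version are taken from the publish output further down (the actual file is in the demo repo):

```json
{
  "name": "@dwdraju/github-npm-package-demo",
  "version": "1.0.0",
  "main": "index.js",
  "publishConfig": {
    "registry": "https://npm.pkg.github.com/"
  },
  "repository": "git://github.com/dwdraju/github-npm-package-demo.git"
}
```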

Time to publish! Let’s try publishing our package:

$ npm publish
npm notice
npm notice 📦 github-npm-package-demo@1.0.0
npm notice === Tarball Contents ===
.....
npm notice
npm ERR! code E401
npm ERR! Unable to authenticate, need: Basic realm="GitHub Package Registry"

This is expected, as we haven’t configured access to publish the package. We need to log in with a scope for the user/GitHub org package.

npm login --registry=https://npm.pkg.github.com --scope=@USER/ORG

Enter your username and token to authenticate. You can check the ~/.npmrc file to see what the above command did:

$ cat ~/.npmrc 
@dwdraju:registry=https://npm.pkg.github.com/
//npm.pkg.github.com/:_authToken=5d4*****14b1**413cf*******5cd****a6

Now, publishing the package:

$ npm publish
npm notice
npm notice 📦 github-npm-package-demo@1.0.0
npm notice === Tarball Contents ===
npm notice 13B app/.dockerignore
npm notice 391B app/Dockerfile
npm notice 123B app/app.js
npm notice 114B index.js
npm notice 425B app/package.json
npm notice 410B package.json
npm notice 1.0kB .github/workflows/cicd.yaml
npm notice === Tarball Details ===
npm notice name: github-npm-package-demo
npm notice version: 1.0.0
npm notice package size: 1.3 kB
npm notice unpacked size: 2.5 kB
npm notice shasum: 84a1778a580dd763146b9f43d31b7ca9dd7f90a9
npm notice integrity: sha512-gMDyZuUK9kziX[...]ZxnK3GUm3l+jQ==
npm notice total files: 7
npm notice
+ github-npm-package-demo@1.0.0

Now, if we go to the Packages section of our GitHub repo, we can see version 1.0.0 of the package is published.

Use published module in app

There are two cases here: for a public package we don’t need any authentication, but for private packages in private repositories, we need read:packages access.
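For the private case, the consuming project needs an .npmrc that maps the scope to the GitHub registry and supplies a token with the read:packages scope — roughly like this (the token value is a placeholder):

```
@dwdraju:registry=https://npm.pkg.github.com/
//npm.pkg.github.com/:_authToken=YOUR_GITHUB_TOKEN
```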

We have our nodejs application inside app folder. The content of package.json looks like this:

{
  "name": "node-app",
  "description": "My app",
  "version": "1.1.0",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "node app.js"
  },
  "repository": {
    "type": "git",
    "url": "https://git@github.com:dwdraju/github-npm-package-demo.git"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@dwdraju/github-npm-package-demo": "1.0.0"
  }
}

The dependencies section is where we specify the package we published.

We can do npm install and it will pull the package. Here's the usage in the app.js file:

var npmPackage = require('@dwdraju/github-npm-package-demo');
console.log(`Words from package: ${npmPackage.printMsg()}`);

And we get a nice response from the module:

$ node app.js 
Words from package: Hey! Thanks for using my awesome package!!

Configure Github Actions

Time for automation!

We published and used the package, but only from our local system with entirely manual steps. Wouldn’t it be awesome if we could create a release of the module and publish the package with semantic versioning, all without any external tools? Here’s the flow we plan:

  1. Merge to master branch of the module
  2. Create a new Github tag auto-incrementing the version
  3. Publish Github Package based on tag
  4. Use GitHub Actions for all of these processes

We add a new workflow file at .github/workflows/cicd.yaml:

https://github.com/dwdraju/github-npm-package-demo/blob/master/.github/workflows/cicd.yaml
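The actual workflow lives at the link above; a rough sketch of what such a workflow looks like follows, assuming the anothrNick/github-tag-action action for version bumping and the built-in GITHUB_TOKEN for publishing (the exact action and inputs in the repo may differ):

```yaml
name: cicd
on:
  push:
    branches: [master]

jobs:
  tag-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      # Create a new tag; defaults to a minor bump, overridable
      # with #major or #patch in the commit message
      - name: Bump version and push tag
        uses: anothrNick/github-tag-action@1.26.0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          DEFAULT_BUMP: minor

      # Publish the package to GitHub Package Registry
      - uses: actions/setup-node@v1
        with:
          node-version: 12
          registry-url: https://npm.pkg.github.com/
      - run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```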

We have used a version bump action which auto-increments the GitHub tag and the package version. For example, it starts at v0.0.0 and gradually moves towards v0.1.0, v0.2.0, which are minor releases. If we need to publish a patch, just including #patch in the merge commit message will bump to v0.2.1. Likewise, for a major release, including #major will bump to the next major version, e.g. v1.0.0, and so on.

Use the npm package in docker build

For an immutable build and deployment, we need GitHub authentication to use the package if it is private. Even for public packages, we still need the .npmrc file.

Here, we pass the GitHub token as a build arg, write it to an .npmrc file, and delete that file right after npm install, so that the token can’t be recovered by inspecting the image layers.
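The app's Dockerfile could look roughly like this. It is a sketch, not the exact file from the repo; note that the .npmrc is written and removed within a single RUN instruction so the token never lands in a committed layer:

```dockerfile
FROM node:12-alpine

# Token supplied at build time: docker build --build-arg GITHUB_AUTH_TOKEN=...
ARG GITHUB_AUTH_TOKEN

WORKDIR /usr/src/app
COPY package*.json ./

# Write .npmrc, install dependencies, then remove the token file in one layer
RUN echo "@dwdraju:registry=https://npm.pkg.github.com/" > .npmrc && \
    echo "//npm.pkg.github.com/:_authToken=${GITHUB_AUTH_TOKEN}" >> .npmrc && \
    npm install && \
    rm -f .npmrc

COPY . .
CMD ["node", "app.js"]
```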

Building docker image inside app folder:

:app $ docker build -t app . --build-arg GITHUB_AUTH_TOKEN=my-123-github-456-token

Create docker image on Google Cloud Build

Google Cloud Build is a build, test, and deploy service from Google Cloud where we can define custom workflows and run them quickly on on-demand compute resources.

In this demo, we will build a docker image of our app on Google Cloud Build where we use our published package. So, this will cover flow for managing github token as secret and usage.

Let’s set the github token secret on Cloud KMS in GCP project.

$ gcloud kms keyrings create GIT_TOKEN --location global --project gkerocks
$ gcloud kms keys create GITHUB_AUTH_TOKEN \
    --location global --keyring GIT_TOKEN \
    --purpose encryption
$ echo -n YOUR_GITHUB_TOKEN | gcloud kms encrypt --plaintext-file=- \
    --ciphertext-file=- --location=global --keyring=GIT_TOKEN \
    --key=GITHUB_AUTH_TOKEN | base64

The next step is to grant Cloud Build access to decrypt the secret. Cloud Build uses a service account of the format 7***3*62***1@cloudbuild.gserviceaccount.com. You can find the ID under IAM and add the Cloud KMS CryptoKey Decrypter role to the account:

$ gcloud kms keys add-iam-policy-binding GITHUB_AUTH_TOKEN \
--location global \
--keyring GIT_TOKEN \
--member serviceAccount:7***3*62***1@cloudbuild.gserviceaccount.com \
--role roles/cloudkms.cryptoKeyDecrypter

Now, on to Cloud Build. We need to add a trigger, i.e. a webhook from GitHub to Google Cloud Build, that fires whenever an action is performed on the GitHub repo.

Connect Cloud Build with Github Repo

I would prefer the GitHub (mirrored) option. Connect the repo on the next page, then add a trigger for the mirrored repository.

Add Trigger for Google Cloud Build

Here, we configure a trigger for push events to the git repo, specifically the master branch, which is matched by the ^master$ regex. The build configuration will be provided by the cloudbuild.yaml file in the repo itself.

Here is the cloudbuild.yaml file, which contains the encrypted and base64-encoded GitHub token.

cloudbuild.yaml
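The file itself isn't reproduced above; here is a sketch of what it likely contains, using Cloud Build's KMS-backed secrets. The kmsKeyName path follows from the gcloud commands earlier, while the image name and app/ build context are assumptions:

```yaml
steps:
  - name: 'gcr.io/cloud-builders/docker'
    entrypoint: 'bash'
    secretEnv: ['GITHUB_AUTH_TOKEN']
    args:
      - '-c'
      - |
        docker build -t gcr.io/$PROJECT_ID/app:$SHORT_SHA \
          --build-arg GITHUB_AUTH_TOKEN=$$GITHUB_AUTH_TOKEN app/

secrets:
  - kmsKeyName: projects/gkerocks/locations/global/keyRings/GIT_TOKEN/cryptoKeys/GITHUB_AUTH_TOKEN
    secretEnv:
      GITHUB_AUTH_TOKEN: '<base64 ciphertext from the kms encrypt step>'

images:
  - 'gcr.io/$PROJECT_ID/app:$SHORT_SHA'
```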

We are done!

When we push a commit to our repo, it publishes a new package along with creating a GitHub tag. It also triggers Google Cloud Build to create a new image tagged with ${SHORT_SHA}, i.e. the commit hash, which is unique every time. Whether we want to deploy the Docker image to Google Kubernetes Engine (GKE), Cloud Run, or a Compute Engine instance, we can use that image tag; the services just need access to the Docker registry.

Do you have a more effective way, or want to see more on the topic? Feel free to drop a comment.

Say hi to me on Twitter and LinkedIn, where I keep sharing interesting updates.

Raju Dawadi
DevOps | SRE | #GDE