Gitlab-Docker-Kubernetes Magic: Integrating your microservice with Gitlab CI, Docker & Kubernetes

Mayank Kapoor
Running a Software Factory
6 min read · Aug 6, 2017
Gitlab, Docker & Kubernetes — A match made in heaven

We’re going to add Continuous Integration (CI) to a Node.js microservice, Dockerize it, and then deploy it using Kubernetes. The source code is available here. Fork and clone this repo, then check out the begin-tutorial branch to start:

$ git clone https://gitlab.com/mayankkapoor/node-with-gitlab-docker-kubernetes.git node-app
$ cd node-app/
node-app$ git checkout begin-tutorial
node-app$ ls
Procfile README.md app.json index.js package.json public

Let’s run this service on our local machine so we know it works.

node-app$ npm install
npm notice created a lockfile as package-lock.json. You should commit this file.
added 43 packages in 3.302s
node-app$ npm start
> node-js-sample@0.2.0 start /Users/mayankkapoor/Code/gitlab.com/node-app
> node index.js
Node app is running at localhost:8080

If you hit http://localhost:8080 in a browser or execute $ curl http://localhost:8080 you’ll see our service works and sends back “Hello World!”.
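
The starter app is a stock Express server; the part of index.js that produces that response looks roughly like the sketch below (your copy of the starter may differ slightly in the details):

var express = require('express')
var app = express()

app.set('port', (process.env.PORT || 8080))

// Respond to GET / with the greeting we just saw in the browser
app.get('/', function(request, response) {
  response.send('Hello World!')
})

app.listen(app.get('port'), function() {
  console.log('Node app is running at localhost:' + app.get('port'))
})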

1. Add continuous integration using Gitlab CI & Docker

Gitlab is fast becoming the git hosting system of choice for many teams, largely because its CI server, private repos, private container registry, and advanced merge request and review capabilities are all integrated into one seamless package.

Add automated tests

Let’s add automated tests that execute in Gitlab CI. Running automated unit tests before building the docker image for our app helps us catch code issues early. Create a new file test/app.js under the test folder with our unit test:

var expect = require("chai").expect;
var app = require("../index.js");

describe("App", function() {
  describe("Add two numbers", function() {
    it("Adds the two numbers given to it", function() {
      var sum = app.addTwoNumbers(2, 3);
      expect(sum).to.equal(5);
    });
  });
});

Add a function in index.js that implements the spec outlined in the unit test above to get it to pass.

var express = require('express')
var app = express()

exports.addTwoNumbers = function(number1, number2) {
  return number1 + number2;
}

app.set('port', (process.env.PORT || 8080))

Install the mocha and chai frameworks using npm, and add them to package.json automatically using the --save option:

node-app$ npm install mocha --save
node-app$ npm install chai --save
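
After these installs, npm records both packages under dependencies in package.json. The exact version ranges depend on when you run the install, but the block will look roughly like this:

"dependencies": {
  "chai": "^4.1.0",
  "express": "^4.13.3",
  "mocha": "^3.5.0"
}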

Add a test script to package.json that tests our app using the mocha framework:

"main": "index.js",
"scripts": {
"start": "node index.js",
"test": "./node_modules/.bin/mocha --reporter spec"

},
"dependencies": {

Run the tests locally to check that they pass:

node-app$ npm test
> node-js-sample@0.2.0 test /Users/mayankkapoor/Code/gitlab.com/node-app
> mocha --reporter spec
Node app is running at localhost:8080
App
Add two numbers
✓ Adds the two numbers given to it
1 passing (8ms)

Let’s make the tests run automatically with every commit using Gitlab CI by adding a .gitlab-ci.yml file with the following contents:

stages:
  - testing

test:
  stage: testing
  image: node:boron
  script:
    - npm install
    - npm test
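
Optional: you can speed up repeated pipeline runs by caching node_modules between jobs. This isn’t required for the tutorial, but adding a top-level cache block to .gitlab-ci.yml does the trick:

cache:
  paths:
    - node_modules/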

Push your changes to your Gitlab repository and make sure the Gitlab CI pipeline is executing properly:

node-app$ git add .
node-app$ git status
Changes to be committed:
(use "git reset HEAD <file>..." to unstage)
new file: .gitlab-ci.yml
modified: index.js
new file: package-lock.json
modified: package.json
new file: test/app.js
node-app$ git commit -m 'Add automated tests and .gitlab-ci.yml'
node-app$ git push <your repo remote> # remember to push to your own Gitlab repo, not mine
Gitlab CI pipeline executed our tests

Build a docker image using Gitlab CI and store it in Docker Hub

Let’s use Gitlab CI to build a docker image for our Node.js service and push it to the Docker Hub container registry automatically. You can also choose to push the image to the Gitlab private container registry that is automatically created with your repository. I’ll assume you have a working Docker installation on your machine. First Dockerize our Node.js app (reference) by creating a Dockerfile in the top level folder:

FROM node:boron

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
COPY package.json .
# For npm@5 or later, copy package-lock.json as well
COPY package-lock.json .
RUN npm install

# Bundle app source
COPY . .

EXPOSE 8080
CMD [ "npm", "start" ]

Create a .dockerignore file in the same directory as your Dockerfile with the following content to prevent your local modules and debug logs from being copied into your Docker image:

node_modules
npm-debug.log
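
Before handing the build off to CI, it’s worth sanity-checking the Dockerfile locally. The node-app tag below is just a throwaway local name:

node-app$ docker build -t node-app .
node-app$ docker run -p 8080:8080 node-app

Hitting http://localhost:8080 again should return the same “Hello World!” as before, this time served from inside the container.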

On Gitlab.com, go to your repository > Settings > Pipelines > Secret Variables and add your Docker Hub password as a secret variable for use in Gitlab CI. For example, enter DOCKERHUB_PASSWORD as the Key, your Docker Hub password as the Value, and click the “Add new variable” button.

Build the docker image in Gitlab CI by adding the docker build steps to the .gitlab-ci.yml file:

stages:
  - testing
  - build

test:
  stage: testing
  image: node:boron
  script:
    - npm install
    - npm test

build_image:
  stage: build        # build the image only after the tests pass
  only: [master]      # build and push images only for master branch commits
  image: docker:git   # image with the docker and git clients
  services:
    - docker:dind
  script:
    - docker login -u mayankkapoor -p $DOCKERHUB_PASSWORD
    - docker build -t $CI_PROJECT_PATH .
    - docker tag $CI_PROJECT_PATH $CI_PROJECT_PATH:$CI_COMMIT_SHA
    - docker push $CI_PROJECT_PATH:$CI_COMMIT_SHA

The script above logs in to Docker Hub, builds the image, tags it with the commit SHA, and pushes it to the registry. Make sure you use your own Docker Hub username instead of mine. Push your changes to Gitlab and watch as Gitlab CI tests your code, builds the docker image and pushes it to Docker Hub.
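
Once the pipeline finishes, you can verify the push by pulling the image back from Docker Hub. Substitute your own Docker Hub username and the commit SHA shown in your pipeline; the values below are placeholders:

$ docker pull <your-dockerhub-username>/node-with-gitlab-docker-kubernetes:<commit-sha>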

Gitlab CI tested and built the docker image and pushed it to the registry

Optional: You can choose to push the built image to your private Gitlab container registry instead of the public Docker Hub. Just change the script section of the build_image job to:

  script:
    - docker login -u gitlab-ci-token -p $CI_BUILD_TOKEN registry.gitlab.com
    - docker build -t registry.gitlab.com/$CI_PROJECT_PATH .
    - docker tag registry.gitlab.com/$CI_PROJECT_PATH registry.gitlab.com/$CI_PROJECT_PATH:$CI_COMMIT_SHA
    - docker push registry.gitlab.com/$CI_PROJECT_PATH:$CI_COMMIT_SHA
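
Keep in mind that the Gitlab container registry is private by default, so pulling the image outside of CI requires logging in first. From your own machine that looks roughly like this (namespace and tag are placeholders):

$ docker login registry.gitlab.com
$ docker pull registry.gitlab.com/<your-namespace>/node-with-gitlab-docker-kubernetes:<commit-sha>

If you later deploy from the private registry instead of Docker Hub, Kubernetes will also need an image pull secret; the rest of this tutorial sidesteps that by using the public Docker Hub image.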

2. Kubernetes Magic

Let’s now take the docker image we created and deploy it to real servers using Kubernetes. Docker and Kubernetes make this remarkably easy. For this, we’ll create a Kubernetes cluster on Google Cloud Platform (GCP), which makes spinning up managed clusters straightforward. After creating a project, log in to the Google Cloud Shell and create a Kubernetes cluster:

Welcome to Cloud Shell! Type "help" to get started.
$ gcloud container clusters create k0 --zone asia-southeast1-a
Creating cluster k0...done.
Created [https://container.googleapis.com/v1/projects/mayank-k8s/zones/asia-southeast1-a/clusters/k0].
kubeconfig entry generated for k0.
NAME ZONE MASTER_VERSION MASTER_IP MACHINE_TYPE NODE_VERSION NUM_NODES STATUS
k0 asia-southeast1-a 1.6.7 35.186.148.30 n1-standard-1 1.6.7 3 RUNNING
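
Since gcloud has already generated a kubeconfig entry for k0, kubectl in Cloud Shell is ready to talk to the cluster. A quick check before deploying anything:

$ kubectl get nodes

You should see the three worker nodes reported above, each in the Ready state.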

Now that your Kubernetes cluster is up and running, let’s deploy our Node.js docker image to this cluster. First use the kubectl run command to run your image, then the kubectl expose command to expose the deployment using a service.

$ kubectl run hello-node --image=mayankkapoor/node-with-gitlab-docker-kubernetes:d6f71411107faaf95a1b8520759fd82bab4a6d24
deployment "hello-node" created
$ kubectl expose deployments hello-node --port=80 --target-port=8080 --type=LoadBalancer
service "hello-node" exposed

That’s all it takes: two commands. Now let’s get the external IP of the hello-node service and check that our node service is up and running.

$ kubectl get services
NAME CLUSTER-IP EXTERNAL-IP PORT(S) AGE
hello-node 10.47.252.164 35.187.240.245 80:31555/TCP 3m
kubernetes 10.47.240.1 <none> 443/TCP 20m
$ curl http://35.187.240.245
Hello World!

Hello World, I’m live!

You’ve now added automated tests to your microservice, tested and built the service using Gitlab CI, Dockerized and pushed the docker image to Docker Hub, and deployed the image to production using Kubernetes.
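
From here, shipping a new version is just a matter of pointing the deployment at a newer image tag produced by the pipeline. A sketch, with the commit SHA as a placeholder:

$ kubectl set image deployment/hello-node hello-node=mayankkapoor/node-with-gitlab-docker-kubernetes:<new-commit-sha>

Kubernetes performs a rolling update, and kubectl rollout status deployment/hello-node lets you watch it complete. Automating that last step from Gitlab CI is a natural next exercise.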
