Docker for Your CI/CD Pipeline

Anuj Kumar
Published in kodeyoga
Nov 5, 2019

Containerization has revolutionized the software development world; every aspect of development now has a powerful technology to help it in many ways. Docker is the most popular of the container technologies available today, backed by a huge open source community.

Docker has changed the way we set up our build pipelines. It helps us at every stage of the pipeline, be it compilation, build, test, packaging or delivery. Using Docker we can isolate individual tasks from each other and, by designing the pipeline thoughtfully, achieve a great degree of parallelism in task runs. The benefits are faster builds and reduced feedback time, which ultimately increases developer productivity.

Role of Docker in CI/CD

As mentioned earlier, Docker can play an important role in all stages of the pipeline.

Build: The Docker container for this stage has all the tools required for the build, such as SDKs; it can also cache the dependencies required by our app.
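As a sketch, such a build image could be defined like this (this Dockerfile and the pom.xml layout are illustrative assumptions, not part of the example project):

```dockerfile
# Build image: JDK + Maven, with dependencies pre-fetched into a cached layer
FROM maven:3-alpine
WORKDIR /build
# Copy only the pom first so the dependency layer is reused across builds
COPY pom.xml .
RUN mvn dependency:go-offline
```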

Test: Docker is very helpful in integration and functional testing. We can create containers for services like databases, identity providers and anything else required by our app before starting the integration or functional tests. This approach also lets us parallelize the tests, as there are no issues like port conflicts when creating two instances of our app for testing. It is very cost effective and reduces the effort required for environment management, as there is no need to keep the dependent services running at all times. Lastly, we can also pre-load test data into databases for our integration/functional tests to use; this saves a significant amount of time compared to loading data as part of test setup.
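For instance, mapping container ports to different host ports lets two instances of the app run side by side for parallel test runs (a sketch; the image name and ports are illustrative):

```shell
# Two instances of the same app image, no port conflict on the host
docker run -d --name emp-app-test-1 -p 8081:8080 emp-app:1.0
docker run -d --name emp-app-test-2 -p 8082:8080 emp-app:1.0
```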

Deliver: For this stage, we can set up the Docker image to have the tools required for packaging and delivery. For example, we can use a Docker-in-Docker container to package our artifact as a Docker image.
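As a rough sketch, a Docker-in-Docker daemon can be started from the official docker:dind image (the flags here are assumptions for a local, TLS-disabled setup, not a hardened CI configuration):

```shell
# Start a Docker daemon inside a container (privileged mode is required)
docker run -d --name ci-dind --privileged -e DOCKER_TLS_CERTDIR="" docker:dind
# Run the docker CLI in a second container, pointed at that daemon
docker run --rm --link ci-dind:docker -e DOCKER_HOST=tcp://docker:2375 docker:latest docker version
```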

Let’s run through an example

Let us consider a simple example of a spring-boot application for employee management using postgres as its data store. We have some integration tests for the employee management REST APIs. By the end of this article we will have a Jenkins pipeline with 3 stages: build, integration test and package.

Prerequisites

We will not discuss setting up the spring-boot app or writing integration tests for it. You can refer to other resources if you need help with these.

This article assumes you are familiar with the basics of Docker and have some experience with Jenkins. Even if you are not familiar with Jenkins you can read through it; you will understand the concept and can apply it in any other CI/CD tool of your choice.

Docker images for the pipeline

We will use the official Docker images of maven and docker (Docker within Docker) for the build and package stages of the pipeline respectively.

Follow the steps below to create a Docker image containing postgres and employee test data for our integration tests.

First, start a container from the official postgres Docker image.

docker run --name emp-postgres -e POSTGRES_PASSWORD=postgres -p 5431:5432 -d postgres

Now, connect to this postgres container using any method you are comfortable with and load some employee data for our integration tests.
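For example, with psql from the host (the employee table and rows here are purely illustrative, not the app's actual schema):

```shell
psql -h localhost -p 5431 -U postgres <<'SQL'
CREATE TABLE employee (id SERIAL PRIMARY KEY, name TEXT NOT NULL, department TEXT);
INSERT INTO employee (name, department) VALUES ('Alice', 'Engineering'), ('Bob', 'HR');
SQL
```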

After the data is loaded, use docker exec to copy the postgres data folder to a new path inside the container, say /postgres. Once the data is copied, commit the container's state to a new image called image-for-it.

docker exec emp-postgres mkdir /postgres
docker exec emp-postgres sh -c 'cp -r /var/lib/postgresql/data/* /postgres'
docker commit emp-postgres image-for-it

Note: The above step is very important because by default the postgres data folder is a mounted volume inside the container, and volume contents are not captured by docker commit; we need to make the data part of our image for the integration tests.
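To sanity-check that the data really is baked into the image, we can start a fresh container from it with PGDATA pointing at the copied directory and list the tables (a quick manual check, not part of the pipeline):

```shell
docker run -d --name it-check -e PGDATA=/postgres -p 5433:5432 image-for-it
psql -h localhost -p 5433 -U postgres -c '\dt'
```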

We are now ready with a Docker image for the integration tests in our build pipeline. Please refer below for its usage in the Jenkinsfile.

Jenkinsfile for defining build pipeline

The Jenkinsfile below describes a Jenkins job with 3 stages: build, integration test and package. Each stage uses a different Docker container to perform that stage's specific tasks.

pipeline {
    agent none
    stages {
        stage('Build With Unit Test') {
            agent {
                docker {
                    image 'maven:3-alpine'
                    args '-v $HOME/.m2:/root/.m2'
                }
            }
            steps {
                sh 'mvn install -DskipITs'
            }
        }
        stage('Integration Test Suites') {
            agent {
                docker {
                    image 'image-for-it:latest'
                    args '--env PGDATA=/postgres'
                }
            }
            steps {
                sh 'mvn failsafe:integration-test'
                sh 'mvn failsafe:verify'
            }
        }
        stage('Package') {
            agent {
                docker {
                    image 'docker:latest'
                }
            }
            steps {
                sh 'cp $WORKSPACE/target/emp-app.jar $WORKSPACE/build/emp-app.jar'
                sh 'docker build $WORKSPACE/build -t emp-app:1.0'
            }
        }
    }
}

The last stage creates a Docker image containing our employee app; this step requires us to check in the Dockerfile needed for packaging at {root-app-directory}/build/Dockerfile.
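A minimal build/Dockerfile for this stage could look like the following (the base image and port are assumptions; adjust them to your app):

```dockerfile
# Runtime image for the packaged spring-boot app
FROM openjdk:8-jre-alpine
COPY emp-app.jar /app/emp-app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "/app/emp-app.jar"]
```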

We need to check in the Jenkinsfile with our code and configure a Jenkins job for our repository; this sets up the pipeline for us. This might take some time; please refer to https://jenkins.io/doc/tutorials for more details.

Thank you for reading; I hope this helps you in your project.
