Building an E2E DevOps Pipeline

Ahmed KHALED
19 min read · May 22, 2019


What is DevOps?

DevOps is a set of software development practices that combines software development (Dev) and information technology operations (Ops) to shorten the systems development life cycle while delivering features, fixes, and updates frequently in close alignment with business objectives.

Requirements

Before you start, I assume you have the following:

  • Docker.
  • Docker Compose.

You can find the source code of the project (the Spring Boot project with Swagger, unit tests, integration tests, etc., and all the Docker images "Jenkins, SonarQube, Nexus, Ansible, etc." necessary for our example) on my GitHub.

Pipeline Architecture

The pipeline that we will create is represented by the following diagram:

  1. The developers push their code changes to GitHub.
  2. Jenkins pulls the project into its workspace.
  3. Jenkins compiles the Maven project (generating the *.class files) and runs the static code analysis with the Checkstyle plugin, both in parallel, using Jenkins Docker agents.
  4. Running the unit tests with JUnit and Mockito.
  5. Running the integration tests with JUnit and Mockito.
  6. Generating the documentation and running a set of code analyses in parallel with SonarQube, PMD, and FindBugs.
  7. Storing the artifacts in the Nexus repository.
  8. Provisioning with Ansible, i.e. preparing the environment by installing the JDK and Tomcat on our servers.

8.1. Deploying the artifact to the staging server, which is a Docker container running Ubuntu.
8.2. Deploying the artifact to the production server, which in our case is an EC2 instance.

Blue Ocean UI
Jenkins pipeline UI

Setting up Jenkins

Jenkins is an open-source automation tool written in Java, with plugins built for Continuous Integration purposes. Jenkins is used to build and test your software projects continuously, making it easier for developers to integrate changes into the project and for users to obtain a fresh build. It also allows you to continuously deliver your software by integrating with a large number of testing and deployment technologies.

With Jenkins, organizations can accelerate the software development process through automation. Jenkins integrates development life-cycle processes of all kinds, including build, document, test, package, stage, deploy, static analysis and much more.

  • During this project we will use a Jenkins container managed with Docker Compose.
docker-compose.yml
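The compose file itself is embedded from the repository; below is a minimal sketch of the Jenkins service it defines, reconstructed from the annotations that follow (the service name, image tag, and host paths are assumptions, and the "Line #" notes refer to the original file):

version: '3'
services:
  jenkins:
    image: jenkinsci/blueocean                        # Line 6: Docker CLI + Blue Ocean preinstalled
    user: root                                        # Line 7: root, so Jenkins may run Docker commands
    ports:
      - "8085:8080"                                   # Blue Ocean UI on the host's port 8085
    volumes:
      - ./jenkins-data:/var/jenkins_home              # persists secrets, workspace, etc.
      - /var/run/docker.sock:/var/run/docker.sock     # Line 10: expose the host's Docker daemon
    networks:
      - devops
networks:
  devops: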

Line #6: The image we are going to use is jenkinsci/blueocean. Docker and the Blue Ocean plugin are preinstalled in this image.

Why do we need Docker inside the Jenkins image?

Simply to allow the Jenkins server to build images locally and to use Docker agents in the pipeline for running the stages. For more info, see Docker-in-Docker for CI.

Line #7: The default user of this image is jenkins, which does not have permission to execute Docker commands. To solve this problem, we have two options:

  • Launch the container as the root user.
  • Add the jenkins user to the docker group.

If we try to run a Docker command as the jenkins user, we get the errors shown below:

#10: Expose the host's Docker daemon socket inside the Jenkins container so that it can create containers and images on the host machine.

  • We run the following command to create the Jenkins container (see the sketch after this list).
  • Open the browser at "docker-host-ip:8085" (localhost for Linux), and you will get the following page:
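The original command is an embed; assuming the compose service is named jenkins as in the sketch above, it is presumably:

$ docker-compose up -d jenkins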

In our case you will find the password in “./jenkins-data/secrets/initialAdminPassword” since we created the jenkins-data volume in docker-compose.

  • Select "Install suggested plugins":

After installation we will get the dashboard of Jenkins and we notice the appearance of Blue Ocean.

SCM Stage

In this stage we will define the Github repository of our project which will be cloned at each launch of the pipeline.

  • In the Blue Ocean dashboard, click Create a Pipeline, then select GitHub and the repository of our project.

Blue Ocean gives us the possibility of creating our pipeline from scratch using the pipeline editor, or of picking up the pipeline declaration from a Jenkinsfile in the repository.

Note: To authenticate to GitHub, I advise you to use a personal access token instead of your email and password. The figure below shows how to generate a token on GitHub.

  • By running our pipeline, we get the following result:
  • We check that Jenkins has cloned our project under the "jenkins-data/workspace" directory:

Compile Stage

In order to verify that Jenkins has created a Maven container for the compilation of the project, we can execute the following command to visualize our running containers in real time:

$ docker stats

We note that a container with a random name ('practical_thompson' in my case) is compiling our project.

Persist the Maven dependencies:

Now, if we rerun the pipeline, we note that Jenkins finds the Maven Docker image, since it was already downloaded previously. However, Maven will re-download all the dependencies again. Why?

Because the dependencies downloaded by Maven are stored inside the container, and the container is destroyed at the end of the execution.

The solution is very simple: just cache the Maven dependencies locally. To do this, we mount a Docker volume with the -v option.
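For example, the agent declaration could look like this (a sketch, assuming a maven:3-alpine image; the exact tag and cache path may differ from the repository's Jenkinsfile):

agent {
    docker {
        image 'maven:3-alpine'
        // mount the Docker host's Maven cache so dependencies survive the container
        args '-v /root/.m2:/root/.m2'
    }
}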

Checkstyle Stage

CheckStyle is a development tool to help programmers write Java code that adheres to a coding standard. It automates the process of checking Java code to spare humans of this boring (but important) task. This makes it ideal for projects that want to enforce a coding standard.

  • We will use the Maven Checkstyle plugin as follows:
$ mvn checkstyle:checkstyle
  • After executing the command, you can verify that the 'checkstyle-result.xml' file has been generated in the target directory:

Integration of Jenkins and Checkstyle:

Installing the Checkstyle plugin from Jenkins:

Since we are going to use Maven's Checkstyle plugin, the Docker image used by the Docker agent to generate checkstyle-result.xml is Maven.

#51: reuseNode is a boolean, false by default. If true, the container runs on the node specified at the top level of the pipeline, in the same workspace, rather than on a new node entirely.

#56–63: the Jenkins Checkstyle plugin configuration. The most important thing is to specify the path of the checkstyle-result.xml file in the 'pattern' argument.
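Putting those pieces together, the stage looks roughly like this (a sketch with an assumed stage name and image tag; the checkstyle step comes from the Jenkins Checkstyle plugin installed above):

stage('Checkstyle Analysis') {
    agent {
        docker {
            image 'maven:3-alpine'
            reuseNode true                                 // #51: reuse the top-level node and workspace
        }
    }
    steps {
        sh 'mvn checkstyle:checkstyle'
        // #56-63: publish the report; 'pattern' points at the generated XML
        checkstyle pattern: '**/target/checkstyle-result.xml'
    }
}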

  • We launch our pipeline:
  • After the execution of our pipeline, we can see the results of the analysis with Jenkins plugin that has already been installed:
  • The following figure shows that we need to add the Javadoc comments of the main method.

Unit Test Stage

#69–71: This stage is executed only on the develop and master branches. You can change this; it is simply my choice, to speed up development time by not running this stage on every feature branch.

#80: Maven command to run the unit tests.

#82–86: As always, after running the tests, the results are exposed to the Jenkins/Blue Ocean dashboard using the JUnit plugin of Jenkins. The UT results can be found under the target/surefire-reports directory, with the .xml extension.
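Assembled, the stage might look like this (a sketch under the same assumptions as above):

stage('Unit Tests') {
    when {
        anyOf { branch 'develop'; branch 'master' }        // #69-71: only on these branches
    }
    agent {
        docker {
            image 'maven:3-alpine'
            reuseNode true
        }
    }
    steps {
        sh 'mvn test'                                       // #80: run the unit tests
    }
    post {
        always {
            junit 'target/surefire-reports/*.xml'           // #82-86: publish the results
        }
    }
}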

Integration Tests Stage

#88–91: This stage is executed only on the develop or master branches.

#99–101: Maven command to generate the artifact and run the integration tests.

Note: -Dsurefire.skip=true is the argument that skips the unit tests (so they are not rerun).

#102–105: As always, after running the tests, the results are exposed to the Jenkins/Blue Ocean dashboard using the JUnit plugin of Jenkins. The IT results are located under the target/failsafe-reports directory, with the .xml extension.

#107–108: save the artifact and the pom file for future use, such as deployment. This is useful if we are in a distributed environment. For example, if Jenkins has one node for deployment and another for building and testing, then instead of rerunning all the steps (compile → UT → IT → artifact) on the deployment node, it can simply recover the artifacts generated by the other node.

#110: Expose the artifacts to the Jenkins/Blue Ocean dashboard.
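A sketch of the whole stage, under the same assumptions (the stash name 'artifact' is my choice, not necessarily the repository's):

stage('Integration Tests') {
    when {
        anyOf { branch 'develop'; branch 'master' }        // #88-91
    }
    agent {
        docker {
            image 'maven:3-alpine'
            reuseNode true
        }
    }
    steps {
        // #99-101: package the artifact and run the ITs, skipping the UTs
        sh 'mvn verify -Dsurefire.skip=true'
    }
    post {
        always {
            junit 'target/failsafe-reports/*.xml'          // #102-105: publish the IT results
        }
        success {
            // #107-108: keep the artifact and pom for later stages/nodes
            stash includes: 'target/*.war,pom.xml', name: 'artifact'
            archiveArtifacts 'target/*.war'                // #110: expose the artifact
        }
    }
}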

Code Quality Analysis Stage

In this part of the pipeline, several code analysis stages will be run in parallel. There is also the generation of Javadoc documentation.

PMD:

PMD is a static source code analyzer. It finds common programming flaws like unused variables, empty catch blocks, unnecessary object creation, and so forth. It’s mainly concerned with Java and Apex, but supports six other languages.

#115: declares the stages that will be run in parallel.

#116–123: use the Maven image as the Jenkins agent.

#125: Maven command to run Maven's PMD plugin. The results are stored in the pmd.xml file.

#127: specify the path of the pmd.xml file that will be used to display the results in the Jenkins interface (PMD warnings).
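In outline, the parallel block and the PMD stage look like this (a sketch; the pmd step comes from the Jenkins PMD plugin):

stage('Code Quality Analysis') {
    parallel {                                             // #115: these stages run in parallel
        stage('PMD') {
            agent {
                docker {                                   // #116-123: Maven image as agent
                    image 'maven:3-alpine'
                    reuseNode true
                }
            }
            steps {
                sh 'mvn pmd:pmd'                           // #125: writes target/pmd.xml
            }
            post {
                always {
                    pmd pattern: '**/target/pmd.xml'       // #127: publish the PMD warnings
                }
            }
        }
        // the FindBugs, JavaDocs, and SonarQube stages described below also sit here
    }
}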

FindBugs:

FindBugs is a tool for analyzing compiled files (bytecode, *.class) to detect bugs such as unused private methods or invoking a method on a null object (NullPointerException).

Note: If the project is not compiled, FindBugs reports nothing, so you have to compile the project before running FindBugs (mvn clean compile findbugs:findbugs); findbugs:gui opens a graphical interface.

  • Install the Jenkins FindBugs plugin.
  • The following figure shows the lines to add to the Jenkinsfile.

#139: Maven command to run the Maven FindBugs plugin. The result is stored in the findbugsXml.xml file.

#141: specify to the Jenkins FindBugs plugin the path of the file that will be used to display the result in the Jenkins interface.
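The corresponding stage, sketched under the same assumptions (the findbugs step comes from the Jenkins FindBugs plugin):

stage('FindBugs') {
    agent {
        docker {
            image 'maven:3-alpine'
            reuseNode true
        }
    }
    steps {
        // compile first: FindBugs analyzes bytecode
        sh 'mvn clean compile findbugs:findbugs'           // #139: writes target/findbugsXml.xml
    }
    post {
        always {
            findbugs pattern: '**/target/findbugsXml.xml'  // #141: publish the warnings
        }
    }
}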

JavaDocs

#154: specify the path of the generated Javadoc to the Jenkins JavadocArchiver plugin.
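A sketch of this stage (the javadoc step is provided by the Javadoc plugin; the output directory is Maven's default):

stage('JavaDocs') {
    agent {
        docker {
            image 'maven:3-alpine'
            reuseNode true
        }
    }
    steps {
        sh 'mvn javadoc:javadoc'
    }
    post {
        always {
            // #154: hand the generated HTML to the JavadocArchiver plugin
            javadoc javadocDir: 'target/site/apidocs', keepAll: false
        }
    }
}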

  • By clicking on JavaDoc, we get the following page

Warnings Next Generation plugin:

As you may have noticed during the installation of the PMD, Checkstyle, and FindBugs plugins, they are marked as deprecated, and all of them have been integrated into the Warnings Next Generation plugin. So there is no need to install each plugin separately; just install Warnings Next Generation.

  • In this part, we will install this plugin and use it in our pipeline after the "Code Quality Analysis" stage.
post {
    always {
        // publish all the analysis results with the Warnings Next Generation plugin
        recordIssues aggregatingResults: true, tools: [
            javaDoc(),
            checkStyle(pattern: '**/target/checkstyle-result.xml'),
            findBugs(pattern: '**/target/findbugsXml.xml', useRankAsPriority: true),
            pmdParser(pattern: '**/target/pmd.xml')
        ]
    }
}
FindBugs Warnings
PMD warnings
CheckStyle warnings

SonarQube

SonarQube (formerly Sonar) is an open-source platform developed by SonarSource for the continuous inspection of code quality: it performs automatic reviews with static analysis of code to detect bugs, code smells, and security vulnerabilities in 20+ programming languages. SonarQube offers reports on duplicated code, coding standards, unit tests, code coverage, code complexity, comments, bugs, and security vulnerabilities.

Note:

SonarQube uses H2 as its default database.

#19: this folder holds the SQ configuration files (sonar.properties, etc.).
#20: the data files (H2 database, Elasticsearch indexes, etc.).
#21: here we find the installed plugins.
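Again, the compose entry is an embed; a minimal sketch of the service, assuming the host-side paths (the container-side paths are the SonarQube image defaults):

sonarqube:
    image: sonarqube
    ports:
        - "9000:9000"
    volumes:
        - ./sonarqube/conf:/opt/sonarqube/conf               # 19: sonar.properties, etc.
        - ./sonarqube/data:/opt/sonarqube/data               # 20: H2 database, Elasticsearch indexes
        - ./sonarqube/extensions:/opt/sonarqube/extensions   # 21: installed plugins
    networks:
        - devops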

  • To launch the SQ container, type the following command:
$ docker-compose up -d sonarqube
  • To access the SQ dashboard, just open the browser at 'docker-host-ip:9000'.
  • Login: admin, password: admin123.
  • If we try to run Maven's Sonar plugin on our project with the following command:
$ mvn sonar:sonar -Dsonar.host.url=http://192.168.99.100:9000
  • We get the following error, because no code analysis plugin is installed in SQ.
  • To solve this problem, you need to install the Java code analysis plugin from the SQ Marketplace.
  • After installation, we restart our server.
  • Alternatively, we can put the plugin directly in the '/extensions/plugins' folder.
  • To verify that the plugin has been installed, we should find it in the extensions/plugins folder.
  • Rerun this command:
$ mvn sonar:sonar -Dsonar.host.url=http://192.168.99.100:9000
  • And finally, our project has been analyzed by SQ:

Integration of SonarQube with Jenkins

  • The first step is to specify the SonarQube address and port in the environment variables.

In the environment section just put the address and port of the SQ server.

For those using Linux, just put localhost as the address. For Windows users, put the address of the Docker host, in my case 192.168.99.100.

The following command returns that IP address:

$ docker-machine ip
192.168.99.100

#166: runs Maven's Sonar plugin.
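Together, the environment variables and the stage might look like this (a sketch; the variable names are my own, not necessarily the repository's):

environment {
    SONARQUBE_URL = 'http://192.168.99.100'                // localhost for Linux users
    SONARQUBE_PORT = '9000'
}

// ... inside the parallel 'Code Quality Analysis' block ...

stage('SonarQube') {
    agent {
        docker {
            image 'maven:3-alpine'
            reuseNode true
        }
    }
    steps {
        // #166: run Maven's Sonar plugin against the SQ server
        sh 'mvn sonar:sonar -Dsonar.host.url=$SONARQUBE_URL:$SONARQUBE_PORT'
    }
}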

Jenkins Blue Ocean pipeline Dashboard
SonarQube Dashboard

Nexus 3 Repository

Nexus is a software tool designed to optimize the download and storage of binary files used and produced in software development. It centralizes the management of all the binary artifacts generated and used by the organization to overcome the complexity arising from the diversity of binary artifact types, their position in the overall workflow and the dependencies between them.

  • Open the browser on port 8081 to access the Nexus UI. Login: admin, password: admin123.
  • Default repositories:
  • maven-snapshots repository:
  • In our example, we will use the maven2 hosted repository (created by default) to store our artifacts. Otherwise, you can create your own.

Nexus integration with Jenkins

  • Install the Nexus plugin in Jenkins.
  • Create credentials to connect to the Nexus server.
  • Install the pipeline-utility-steps plugin.
  • This plugin allows us to read and extract information from the pom.xml file. For more info, click here and here.
  • Add the address, port, version, protocol, repository (where we are going to store our artifact), and the ID of the credentials (that we created earlier) for Nexus in the Jenkinsfile environment block.

Notice: I put nexus:8081 as the URL, but I could also have put 192.168.99.100 (localhost for Linux). Since, in our case, Jenkins and Nexus are in the same Docker network named "devops", DNS resolution between the containers is handled automatically by Docker, i.e. we can use the IP address, the hostname (the service name by default), or the ID of the container to ping/connect to the containers.

Nexus Stage:

#7–8: retrieve the artifact that was already generated in the "Integration Tests" stage, to avoid rerunning mvn package in the case where we have several slave nodes in Jenkins. For example, if node_1 is responsible for running the integration tests and generating the artifact, and node_2 is responsible for running "Deploy Artifact", then instead of rerunning mvn package, the unstash function retrieves the artifact.
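A sketch of the stage, assuming the environment variable names described above and the 'artifact' stash from the Integration Tests stage (readMavenPom comes from pipeline-utility-steps, nexusArtifactUploader from the Nexus plugin):

stage('Deploy Artifact To Nexus') {
    steps {
        // #7-8: recover the war and pom stashed by the Integration Tests stage
        unstash 'artifact'
        script {
            // read the Maven coordinates from pom.xml
            def pom = readMavenPom file: 'pom.xml'
            nexusArtifactUploader(
                nexusVersion: NEXUS_VERSION,
                protocol: NEXUS_PROTOCOL,
                nexusUrl: NEXUS_URL,
                repository: NEXUS_REPOSITORY,
                credentialsId: NEXUS_CREDENTIAL_ID,
                groupId: pom.groupId,
                version: pom.version,
                artifacts: [
                    [artifactId: pom.artifactId,
                     classifier: '',
                     file: "target/${pom.artifactId}-${pom.version}.war",
                     type: 'war']
                ]
            )
        }
    }
}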

  • We check that our artifact has been stored in our Nexus repository.

Ansible

Ansible is an open-source software provisioning, configuration management, and application-deployment tool. It runs on many Unix-like systems, and can configure both Unix-like systems as well as Microsoft Windows. It includes its own declarative language to describe system configuration.

Staging Server

  • In our example, we will use as a staging server a Docker container based on an image that I created on Docker Hub, named ansible-target. Below is the Dockerfile of this image.

#1: ubuntu-ssh-server is an Ubuntu image with an SSH server installed. For more details, see ubuntu-ssh-server.

#8–9: creates an "ansible" user and assigns it to the "ansible" group.

#10: sets the ansible user's password to ansible.

#11: adds the ansible user to sudoers, because Ansible uses sudo to execute commands.

#12: Ansible is based on Python, so Python has to be available in ansible-target, and by default Ansible checks for its existence under /usr/bin/python. However, Python 3 is installed by default in Ubuntu under /usr/bin/python3, so you can just create a symbolic link.
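Reconstructed from those annotations, the Dockerfile is roughly (a sketch; the base image name follows the article, the exact commands are assumptions):

FROM ubuntu-ssh-server
# 8-9: create the ansible user and assign it to the ansible group
RUN groupadd ansible && useradd -m -g ansible ansible
# 10: set the ansible user's password to "ansible"
RUN echo 'ansible:ansible' | chpasswd
# 11: let Ansible run commands through sudo
RUN echo 'ansible ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers
# 12: Ansible looks for /usr/bin/python; Ubuntu only ships python3
RUN ln -s /usr/bin/python3 /usr/bin/python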

  • The following command will launch our staging server:
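The command itself is an embed; assuming the service is declared in the same compose file, it is presumably:

$ docker-compose up -d ansible-target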

In order to connect to this server using SSH, it is necessary to add the public key of the Ansible master to the Ansible target. There are several methods to do this, but I will use the simplest (and least secure) one.

  • Execute the following command in Ansible target:
echo "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDGHBsoki/RIm9uMwp+c1LcxHOo46YWYNjypGWpNWlsVB7S+Kibh+73LiPRRxwFRhSCkUwYyi4EEG6cstd8vELA4Mggv5A2uS/siciNcMCmF7Lr28yPfJMt3yX9LjDkHRDz9W28ncaeTLE0vuGphjx8kKG8h+zc5maLEcFwzbMv31ULbd3qCqhK35rgBP/OQT/bww4TikUprgdYX6+wkx5f3QflmaVTsM1jtmeTm8ME+XqWml8Nm8mZlxmzos2Pz84F3ilxrc41eStQk/FXaGaxlLihd8LFoFoqiYO4KlIdszOTd3jq6oMrj6Fy0HSE1gqe6hW+RQqN69mH3SRPDbwX root@7aecbf9c557f" > /home/ansible/.ssh/authorized_keys

Or use the command below:

ssh-copy-id -i ~/.ssh/id_rsa.pub ansible@ip_machine_ansible_target

Notice: I retrieved the public key from the Docker image Ansible_management; just launch a container and you will find the key in /root/.ssh/id_rsa.pub.

You can also use your own keys or regenerate a key pair with the ssh-keygen command.

  • By launching our pipeline, we get the following result:
  • Logs of the Ansible playbook:
  • Check on your ansible-target that Java has been installed and that our application has been deployed on Tomcat.
java version
application deployment
Tomcat Server
Swagger documentation
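For reference, a provisioning playbook along these lines would produce the logs above (the task names, package names, and paths here are assumptions, not the repository's actual playbook):

- hosts: staging
  become: yes
  tasks:
    - name: Install the JDK
      apt:
        name: default-jdk
        state: present
        update_cache: yes
    - name: Install Tomcat
      apt:
        name: tomcat8
        state: present
    - name: Deploy the war from Nexus
      get_url:
        url: "{{ artifact_url }}"          # resolved from the Nexus repository
        dest: /var/lib/tomcat8/webapps/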

Prod Server Amazon EC2

Before we start, since we are going to use a remote EC2 server, our Nexus repository needs to be accessible from the internet, because EC2 will download our artifact (.war). For this you have several solutions: use port forwarding, create the Nexus container on EC2 instead of locally, or simply use ngrok as follows:
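The ngrok command is an embed; it is presumably along these lines (8081 being the Nexus port):

$ ngrok http 192.168.99.100:8081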

  • For Linux users, replace 192.168.99.100 with localhost.
  • ngrok provides us with a public address for reaching Nexus.
  • We check that our server is accessible using the address generated by ngrok.
  • Now we only have to change the NEXUS_URL environment variable in the Jenkinsfile to this address:
  • In the configuration of our EC2 instance, create a key pair and download it. The downloaded file is used to connect to our instance.
  • In the "Security Group" section, we add the following two rules to make our Tomcat and SSH servers accessible from any machine.
  • The following command connects to our EC2 instance (a consolidated sketch of this and the next few commands follows this list).
  • Create the ansible user on EC2.
  • Add ansible to sudoers.
  • Create a symbolic link to python3.
  • Add the public key of Ansible management so that EC2 is reachable from Ansible management over SSH.
  • Finally, add the IP address of our instance to the "hosts" file located under the "ansible_provisioning" directory.
  • Launch only the "Deploy to Prod Servers" stage to test.
  • We notice in these logs that Ansible is installing the JDK.
  • After running Ansible, we note that the deployment is complete and no errors have occurred.
  • Finally, we check the installation of Java and Tomcat and the deployment of our application on our EC2 instance.
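Gathered into one place, those shell steps might look like this (the key file name, default EC2 login, and key contents are assumptions):

# connect to the instance with the downloaded key pair
$ ssh -i mykey.pem ubuntu@<ec2-public-ip>
# create the ansible user and add it to sudoers
$ sudo useradd -m ansible
$ echo 'ansible ALL=(ALL) NOPASSWD:ALL' | sudo tee -a /etc/sudoers
# symlink python3 so Ansible finds /usr/bin/python
$ sudo ln -s /usr/bin/python3 /usr/bin/python
# authorize the Ansible management public key
$ sudo mkdir -p /home/ansible/.ssh
$ echo "<ansible-management-public-key>" | sudo tee /home/ansible/.ssh/authorized_keys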

Related Resources:

Swagger Generation With Spring Boot

Setting Up Swagger 2 with a Spring REST API

Jenkins Part 4.2: Code Quality Tests via Checkstyle

SonarQube image

Integration Tests with @SpringBootTest

Testing in Spring Boot

Mockito When/Then

Mocking Void Methods with Mockito

Nexus 3

Jenkins-plugins-in-jenkins-pipeline

Plugins offer Pipeline-compatible steps.
