Updated May 25th, 2018
Maven For Pipelining
This is part one of a three-part series describing how we use Maven as a build tool in CICD pipelines. This first part introduces our approach to using Maven and showcases a parent POM that sets up the Maven Lifecycle. The second part focuses on a child project and presents the individual Maven commands used for each pipeline step. The third part concludes the series with a discussion on test design and showcases a fully functional HTTP API.
Many people decide to move away from Maven when implementing automation and CICD pipelines. They prefer to use a tool like Gradle which does not lock them into a structured build cycle. Others simply feel configuration via XML is too cumbersome and would prefer to use code. But Maven’s maturity and vast community make it a good choice for automation. There is an immense bounty of extensions and examples ready for the taking. If there is something you want to do outside Maven’s core functionality (core plugins) — there is a good chance there is a community plugin to do it. And as we’ll see, by making a small adjustment in how we use the tool we can build a robust pipeline for any project with little effort.
Maven’s Lifecycle is set and rigid by design, because there is an order to its build process. Executing a goal at the bottom of the lifecycle forces Maven to start at the beginning and run through all the goals in between. For example, running mvn install will run through the entire lifecycle: validating the project, generating and processing source code, generating and processing resource files, compiling and processing classes and test classes, generating and processing test resources, running unit tests, packaging the project, preparing for integration tests, running integration tests, running any post-integration-test executions, and finally installing the build artifact into the local repository.
Each one of these lifecycle phases has a plugin (or plugins) bound to it providing the desired functionality. Some plugins, like the SureFire plugin which runs unit tests, are bound by default. Others, like the FailSafe plugin which is used to run integration tests, can be bound if needed.
From the command line or within an IDE, embracing Maven’s lifecycle can be advantageous: explicitly executing your final goal implicitly executes the lifecycle up to that point. But this is not very convenient for pipelining. When pipelining we want to execute a series of smaller, more granular steps, and make judgments on whether or not to proceed based on their output. A command like mvn install does too much. If the CI job running the command breaks, several things could have gone wrong. Did the code fail to compile? Did the tests fail to run? Did the tests themselves fail? And which tests failed, unit tests or integration tests? So when setting up the project’s CI pipeline we won’t use a single step that runs a command like mvn install. Instead we’ll create discrete pipeline steps which directly call their corresponding Maven plugins.
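As a rough sketch of what those discrete steps look like (part two walks through the exact commands), each step invokes plugin goals directly instead of a lifecycle phase. The goal names below are the standard ones published by Maven’s core plugins:

```shell
# Build step: compile main and test sources only
$ mvn compiler:compile compiler:testCompile

# Unit Test step: invoke SureFire directly
$ mvn surefire:test

# Integration Test step: invoke FailSafe directly, then fail the
# build on test failures
$ mvn failsafe:integration-test failsafe:verify

# Publish step: package and install the artifact
$ mvn jar:jar install:install
```

If any one step fails, the pipeline knows exactly which stage broke, rather than having to dig through the output of a single monolithic mvn install.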
The Parent POM
So how will we do all this? To start, we’ll use a parent POM to bring boilerplate functionality to all our projects. Many people believe that a parent POM is only used for multi-module projects. Not so. By using a parent POM as a separate Maven project we greatly decrease the complexity of our child projects and provide separate versioning of common concerns. Let’s take a look at the parent POM we will be using for this series. You can clone the project from GitHub.
$ git clone https://github.com/eonian-technologies/parent-pom.git
The parent POM project is a multi-module project that provides a hierarchy of POMs for specific packagings.
$ cd parent-pom
$ tree
.
├── jar-parent-pom
│ ├── pom.xml
│ └── war-parent-pom
│ └── pom.xml
└── pom.xml
pom projects would use the parent-pom (the top-level pom.xml) as their parent, jar projects would use the jar-parent-pom as their parent, and war projects would use the war-parent-pom as their parent. This allows overrides and inheritance, and places concerns in the correct location.
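For example, a war project would declare the war-parent-pom in its parent section. The groupId, version, and artifactId below are illustrative, not the published coordinates; use whatever your parent POM build actually deploys:

```xml
<!-- Hypothetical child POM for a war project. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>

    <!-- Inherit all the boilerplate from the parent hierarchy. -->
    <parent>
        <groupId>com.example</groupId>
        <artifactId>war-parent-pom</artifactId>
        <version>1.0.0</version>
    </parent>

    <artifactId>my-service</artifactId>
    <packaging>war</packaging>
</project>
```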
Dependency Management
While not specifically related to pipelining, a benefit of using a parent POM is that we can set up dependency management for all child projects. For example, if you want all your child projects to use the same version of a logging framework, you can lock to that version in the parent POM, and the child project will inherit the version in its dependency section. Look at how the jar-parent-pom sets up its dependencyManagement section to lock versions of slf4j, Logback, and Logback Contributions. These frameworks have multiple JARs, and a child project could depend on any combination of them. By using their BOM in the dependencyManagement section, we ensure that any direct or transitive dependency on any JAR in the bill of materials will use our specified version. These managed dependencies are also inherited by the war-parent-pom.
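The mechanics look like this. Importing a BOM with import scope pins the version of every artifact the BOM lists; the coordinates and version below are illustrative, not necessarily what the real parent-pom uses:

```xml
<dependencyManagement>
    <dependencies>
        <!-- Importing a BOM (type pom, scope import) manages the
             version of every artifact listed in it, including ones
             pulled in transitively. Version shown is illustrative. -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-bom</artifactId>
            <version>2.0.13</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

A child project then declares its dependencies without a version element and picks up the managed version automatically.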
Plugin Configuration
Another benefit of our parent POM (again, not necessarily related to pipelining) is to provide configuration to build plugins. For example, we can ensure that we always see compiler warnings by configuring the maven-compiler-plugin in the pluginManagement section. This plugin is bound by default to the compile phase of the Maven Lifecycle. By configuring it here we ensure that every child project (jar or war) inherits this configuration.
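A minimal sketch of that pluginManagement entry, assuming we want all warnings surfaced (version and lint flags are illustrative):

```xml
<pluginManagement>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.8.0</version>
            <configuration>
                <!-- Surface compiler warnings and deprecations
                     in every child project's build output. -->
                <showWarnings>true</showWarnings>
                <showDeprecation>true</showDeprecation>
                <compilerArgs>
                    <arg>-Xlint:all</arg>
                </compilerArgs>
            </configuration>
        </plugin>
    </plugins>
</pluginManagement>
```

Because the configuration lives in pluginManagement, children that use the plugin get it for free without re-declaring anything.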
Profiles
Our parent POMs also group properties, pluginManagement, and plugins into profiles. Many of these profiles are activated by default and allow us to add some really useful functionality to child projects without them even knowing. We enforce a dependency blacklist, write a build.properties file containing build and SCM information, set up code coverage for unit tests and integration tests, and bind the FailSafe plugin to the integration-test phase.
Notice how these profiles are activated by the existence of the src directory. Since that directory is always there, these profiles are always on when executing goals that are part of the Maven Lifecycle.
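The activation trick uses Maven’s file-based profile activation. A sketch of the shape (the profile’s contents are elided):

```xml
<profile>
    <id>unit-test</id>
    <activation>
        <!-- Active whenever the project has a src directory,
             which in practice means always. -->
        <file>
            <exists>${basedir}/src</exists>
        </file>
    </activation>
    <!-- properties, pluginManagement, and plugins go here -->
</profile>
```

Because activation is automatic, child projects never need to pass -P flags to get this behavior.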
Unit Tests
The unit-test profile is placed in the jar-parent-pom and is inherited by the war-parent-pom. It configures the JaCoCo plugin for code coverage, and the SureFire plugin to run test files that end in *Test.java. Both plugins are then bound to the Maven Lifecycle.
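A sketch of that pairing, assuming the standard JaCoCo/SureFire wiring (versions are illustrative):

```xml
<plugins>
    <!-- JaCoCo's prepare-agent goal sets the argLine property so
         SureFire's forked JVM runs with the coverage agent attached. -->
    <plugin>
        <groupId>org.jacoco</groupId>
        <artifactId>jacoco-maven-plugin</artifactId>
        <version>0.8.2</version>
        <executions>
            <execution>
                <goals>
                    <goal>prepare-agent</goal>
                </goals>
            </execution>
        </executions>
    </plugin>
    <!-- SureFire picks up only files matching *Test.java. -->
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.22.0</version>
        <configuration>
            <includes>
                <include>**/*Test.java</include>
            </includes>
        </configuration>
    </plugin>
</plugins>
```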
Integration Tests
The integration-test profile is also set up in the jar-parent-pom. It too sets up the JaCoCo plugin for code coverage, but uses the FailSafe plugin to run test files that end in *IT.java.
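The FailSafe binding looks roughly like this (version illustrative):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.22.0</version>
    <configuration>
        <includes>
            <include>**/*IT.java</include>
        </includes>
    </configuration>
    <executions>
        <execution>
            <goals>
                <!-- integration-test runs the tests; verify fails the
                     build afterwards if any of them failed, so that
                     post-integration-test cleanup still runs. -->
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```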
Integration Tests For WAR Projects
The war-parent-pom adds the javaee-web-api as a managed dependency, configures the maven-war-plugin, and reconfigures the integration-test profile to deploy the project to a local Tomcat server before running integration tests. The server is configured for code coverage, and the FailSafe plugin is used to run test files that end in *IT.java. Integration tests are written to make requests to the local server and validate their responses. After the tests have run, coverage information is dumped from the server, the server is stopped, and a coverage report is generated.
If you do not want to use the FailSafe plugin to run your integration tests, you can easily bind Newman, JMeter, or Selenium to the integration-test phase in your child project. As with Java test classes, the test files for these other platforms should be included in the project’s source. We will showcase this in later articles.
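As a hypothetical example of such an override, a child project could bind the exec-maven-plugin to the integration-test phase to run a Newman collection; the collection path, CLI arguments, and version are all illustrative:

```xml
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <!-- Run the Postman collection where FailSafe
                 would otherwise run *IT.java tests. -->
            <phase>integration-test</phase>
            <goals>
                <goal>exec</goal>
            </goals>
            <configuration>
                <executable>newman</executable>
                <arguments>
                    <argument>run</argument>
                    <argument>${basedir}/src/test/postman/collection.json</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
</plugin>
```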
Tests Are Not Optional
Testing is a major pillar of automation. If you do not have tests, or you cannot measure your tests, then you should not attempt automation. Why? Because there is no way to judge whether what worked yesterday still works today, and no way to know whether new code works as intended. And if we cannot determine these things, then we cannot guarantee a low-risk deployment. So testing is fundamental to our goal of automation.
Do all projects need integration tests? No. But if your code relies on external systems, then integration tests are a good idea. You must be able to ensure that the response you assumed a downstream system would return when you wrote your code is actually what the deployed system returns. Even if you wrote unit tests with mocks for these calls, you need to verify that the downstream system behaves as intended.
Next…
Take some time to review the parent POM project before continuing to part 2, where I’ll present the Maven commands for each discrete step in the CI pipeline (Build, Unit Test, Integration Test, Code Analysis, and Publish Artifacts).
ABOUT THE AUTHOR:
Michael Andrews is an experienced platform/cloud engineer with a passion for elegant software design and deployment automation. He is committed to developing lightweight, malleable software and decoupled, event-driven code using Domain-Driven Design principles and hexagonal architecture. He is a specialist in Kubernetes, Java, Spring, DevOps, CICD, and fully tested, low-risk, no-downtime automated deployments.