Unpopular opinion: Jenkins is still the best CI/CD tool

Dario Simonetti
Attest Product & Technology
Apr 19, 2017

The competition is getting much better, but Jenkins has a hidden killer feature that no other tools offer (with video)

I know, I know… There isn’t much love for good old Jenkins. I often hear people saying that Jenkins has been surpassed by the competition and is now history. Even ThoughtWorks recommends against using it, suggesting one of the alternatives for modelling deployment pipelines instead.

I know the UI/UX isn’t great. Even though the recent release of Jenkins Blue Ocean improves things a lot, real-time logs and updates still rely on polling the backend every now and then. All the major alternative automation servers have adopted WebSockets — it makes sense given they’ve been supported by all the major browsers for more than 3 years.

Because Jenkins has grown so quickly, its documentation isn’t great either. It’s good for the basics, but for more advanced things it just isn’t good enough, especially for some plugins. This is why some very good and powerful features don’t get much attention; the aim of this article is to uncover one of these gems.

So why Jenkins?

I’m not advocating that everyone should be using Jenkins. It really depends on your requirements and on what you want your CI/CD tool to do for you. If you don’t have many jobs (say fewer than 10) and most of them do roughly the same thing, the CI/CD tool you use really won’t make much difference. In fact, in this scenario you could be better off not using Jenkins, as other tools can do the job better.

Jenkins becomes really powerful in an environment with a lot of different projects, and if you’re building a microservice architecture this is probably the case: you’re using a variety of different technologies, but a lot of the deployment pipelines still share quite a bit of logic.

Pipelines are a fairly recent concept for Jenkins; initially, users could create them by installing specific plugins that supported them. In April 2016 the second major version of Jenkins was officially released, shipping pipelines as part of its core (well, technically they’re pre-installed plugins, but think of them as modules that make up the core logic). A pipeline is defined by a pipeline script, written in Groovy, that normally lives in a file called Jenkinsfile in the root folder of each project. Pipelines are simply a way of defining jobs as a combination of steps, as opposed to one single block of logic (more on this later).
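A minimal Jenkinsfile in scripted syntax gives a feel for this (stage names and Gradle commands here are illustrative):

```groovy
// Jenkinsfile — a minimal scripted pipeline made up of discrete steps
node {
    stage('Checkout') {
        checkout scm          // fetch the sources the job was triggered for
    }
    stage('Build') {
        sh './gradlew build'  // any shell command via the sh step
    }
    stage('Test') {
        sh './gradlew test'
    }
}
```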

At Attest we currently have 18 different pipelines and they roughly look like this:

Overview of Attest’s current pipelines

As you can see most of the jobs use Gradle as most of our stack is Java. We also have a service written in Scala and another built in Go. As we took the monolith-first strategy, we have far more libraries than services and most of these will eventually become standalone microservices. More microservices will likely arise to support new business capabilities and we’ll use the right tool for the job, which might not necessarily be Java.

The key thing is that the pipelines share quite a bit of logic, yet they can be quite different. In a setup like this, pipelines can end up with a lot of duplication, meaning that changing something requires changes across a variety of different projects. Sure, you could host your shared script somewhere and have your pipeline download and execute it with a classic wget … | sh, but what if you need to make a breaking change to it? Are you just going to rename it to *_v2.sh? Let’s be honest: it’s a hack. Jenkins 2 provides a much more elegant solution.

Enter Jenkins 2 shared libraries

As well as introducing pipelines, Jenkins 2 introduced the concept of shared libraries, which let you define steps that different pipelines share in a centralised repository. I recently gave a demo showing how powerful Jenkins 2 shared libraries are. The code supporting the demo can be found here; let’s walk through it.

First we define our shared library. Note that it might make more sense for you to have more than one shared library, especially if you have a lot of logic shared across pipelines. You could split libraries by domain, for example one for your Java logic, one for AWS, and so on. Shared libraries live in their own SCM repository and have the following file structure:
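The layout follows Jenkins’ standard convention for shared libraries (the folder names are fixed by Jenkins; the file names shown here are the ones used in the demo):

```
(library root)
├── src/                                  # library classes, standard Groovy source layout
│   └── com/
│       └── askattest/
│           └── Utils.groovy
├── vars/                                 # custom pipeline steps, one file per step
│   └── withSlackStatusReporting.groovy
└── resources/                            # optional: static files, loadable via libraryResource
```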

The src folder contains your core library logic, for example the logic for deploying a service to a specific environment. It’s written in an extended version of Groovy that lets you execute any of the steps you can execute in your pipeline. For example, you can run shell commands simply by using the sh step. You can also use any other step that exists within Jenkins 2 or within any of the installed plugins: you could use the slackSend step to send a message to Slack, as long as you have the Slack Jenkins plugin installed.

Any time you install a plugin, you’ll be able to use it in your pipeline and in your shared library. Given there are more than 1,000 Jenkins plugins you can see just how powerful that is! Jenkins 2 also provides a very useful snippet generator which lists all the available steps, providing the associated documentation for each as well as a form to generate your snippet that uses the step.

In our example we have a file in src/com/askattest/Utils.groovy that looks like this:
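A sketch reconstructed from the description that follows; the credential ID and default channel are placeholders, and one common pattern (used here) passes the script context into the class so its methods can call pipeline steps:

```groovy
package com.askattest

class Utils implements Serializable {
    private final def script

    Utils(def script) {
        this.script = script
    }

    // Sends a message to Slack via the slackSend step (provided by the
    // Slack plugin; no import needed). 'slack-token' is a placeholder for
    // credentials already set up in Jenkins.
    void notifySlack(String message, String color, String channel = '#builds') {
        script.slackSend channel: channel,
                         color: color,
                         message: message,
                         tokenCredentialId: 'slack-token'
    }
}
```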

All this does is define a method notifySlack that takes a message, a color and an optional channel with a default value, and uses the slackSend step (defined in the Slack plugin, but no need to import it) to send a message to Slack using a set of credentials that have already been set up in Jenkins. In this case it really doesn’t do much, but these methods can get really complicated, and it’s great that they all live in a centralised and source-controlled place because it keeps your pipelines DRY. It also keeps your code clean, as you can split it into separate files and methods.

Now we can define a step that makes use of this core logic. To do so, create a file in the vars folder with the same name as the pipeline step you want to define, in our case:
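A sketch of vars/withSlackStatusReporting.groovy, reconstructed from the behaviour described below (the messages and colors are illustrative):

```groovy
import com.askattest.Utils

// The method must be named `call` so the step can be invoked as
// withSlackStatusReporting { ... } from a pipeline.
def call(Closure body) {
    def utils = new Utils(this)
    utils.notifySlack("Build started: ${env.JOB_NAME} #${env.BUILD_NUMBER}", 'warning')
    try {
        body()   // run the pipeline logic passed in by the caller
        utils.notifySlack("Build succeeded: ${env.JOB_NAME} #${env.BUILD_NUMBER}", 'good')
    } catch (err) {
        // intercept the failure, report it to Slack, then rethrow so the
        // build is still marked as failed
        utils.notifySlack("Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}: ${err}", 'danger')
        throw err
    }
}
```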

This defines a method call which takes a Closure (in this context just think of it as a function) and:

  1. Sends a message to Slack to say the build has started
  2. Executes the Closure that has been passed in
  3. If the Closure executed successfully, sends a message to Slack to say the build succeeded
  4. If anything failed, intercepts the error and reports it to Slack

This is probably logic that you’ll want in every single pipeline you have, no matter what programming language the project uses. Defining the step this way lets you run that logic by simply wrapping your pipeline logic in a withSlackStatusReporting { } block, like so:
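A Jenkinsfile using it might look like this (the version tag and build steps are illustrative; the library name matches the configuration shown further down):

```groovy
// Jenkinsfile — import the shared library by name and version (a Git tag),
// then wrap the whole pipeline in the custom step.
@Library('sharedPipeline@1.0.0') _

withSlackStatusReporting {
    node {
        stage('Checkout') { checkout scm }
        stage('Build')    { sh './gradlew build' }
    }
}
```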

This will alert Slack of the build starting and succeeding/failing without any of that logic living in the pipeline itself. Pretty cool, huh? Yes, yes, I know a lot of CI/CD tools come with Slack integration, but this is just an example. The point is that I can change the definition of withSlackStatusReporting in one single place and that will automatically change all the jobs that make use of it.

The risk is that by making changes I could break all the pipelines that use it. To avoid this, Jenkins 2 shared libraries can be versioned. As you can see at the top of the pipeline definition, the shared library is imported by name and version, and for that to work you have to configure Jenkins to tell it where to find the shared library:
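In the Jenkins UI this configuration lives under Manage Jenkins → Configure System → Global Pipeline Libraries, with settings along these lines (the repository URL is a placeholder; the RefSpecs value is the one described below):

```
Name:               sharedPipeline
Default version:    master
Retrieval method:   Legacy SCM → Git
Project Repository: git@github.com:askattest/shared-pipeline.git   (placeholder)
Credentials:        <Git credentials configured in Jenkins>
RefSpecs:           +refs/tags/${library.sharedPipeline.version}:refs/remotes/origin/master
```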

In this case we’re configuring the shared library globally, meaning that all jobs will be able to use it, but it can also be configured for just one folder or organisation. All that configuration says is where to find the code for the shared library and what Git credentials to use to fetch it. The RefSpecs value is set to +refs/tags/${library.sharedPipeline.version}:refs/remotes/origin/master, which tells Jenkins to fetch the version of the shared library carrying the tag specified in the pipeline. It also gives you the flexibility to use branch names instead if you wish.

Conclusions

Jenkins 2 shared libraries allow developers to easily extend the existing DSL with custom steps that live in a centralised and source-controlled place, keeping the pipelines simple, DRY and easy to read. They also allow steps to be versioned, so that breaking changes to their definition can be made with confidence, without impacting existing pipelines.

Jenkins might look a bit dated, but it hides powerful features that the competing tools just don’t offer, either out of the box or through their extensions. To be frank, I would be happy to let Jenkins go, but I know from personal experience that that can’t happen until other tools offer this same functionality. All hail good old Jenkins!

Demo video: https://skillsmatter.com/skillscasts/10078-pipeline-as-code-with-jenkins-2-0



Technology Director at Attest. Before that Java/Scala Lead at OVO Energy.