Start continuous integration with Jenkins Pipeline

Iren Korkishko
5 min read · Jan 25, 2018


I originally posted this material on the Syndicode blog.

This article builds on earlier material about continuous integration and delivery with GitHub, Gitflow, and Jenkins. This time I will expand on continuous integration with Jenkins and dive into the details of Jenkins Pipelines. Here you will find everything you wanted to know about continuous integration with Jenkins Pipeline!

Sure, you probably know all of this already, but revisiting the basic terms never hurts. So we'll start with them.

Main terms

Jenkins is an open-source continuous integration (CI) tool that helps orchestrate development processes (build, test, and deployment) with automation. In other words, Jenkins is one of the leading tools that can help a development team industrialize its processes. It is the developer's teammate, whom you can ask to put your code into production (or staging) when you push code to specific branches (master and develop).

CI (continuous integration), as you already know, is the practice of merging all developer working copies to a shared mainline several times a day.

Jenkins is useful because it orchestrates freestyle jobs into a CI pipeline.

Pipeline (Jenkins Pipeline) is a suite of plugins which supports implementing and integrating continuous delivery pipelines into Jenkins. A continuous delivery pipeline is an automated expression of your process for getting software from version control right through to your users and customers.

Pipeline adds a powerful set of automation tools onto Jenkins. Setting up a Pipeline project means writing a script that will sequentially apply some steps of the process we want to accomplish.

Jenkinsfile is a text file that contains the definition of a Jenkins Pipeline and is checked into source control.
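For a feel of what such a file looks like, here is a minimal Jenkinsfile sketch in the Declarative syntax; the shell commands are placeholders for your project's actual build and test commands:

```groovy
// Jenkinsfile (Declarative Pipeline) - a minimal sketch
pipeline {
    agent any                  // run on any available agent
    stages {
        stage('Build') {
            steps {
                sh 'make'      // placeholder: your build command
            }
        }
        stage('Test') {
            steps {
                sh 'make test' // placeholder: your test command
            }
        }
    }
}
```

Committed at the root of the repository, this file is picked up automatically by a Multibranch Pipeline job for every branch that contains it.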

Build Jobs are the runnable tasks that are controlled and monitored by Jenkins. Examples of jobs include compiling source code, running tests, provisioning a test environment, deploying, archiving, posting build jobs such as reporting, and executing arbitrary scripts.


Jenkins Pipeline features

  • Code: Pipelines are implemented in code and typically checked into source control, giving teams the ability to edit, review, and iterate upon their delivery pipeline.
  • Durable: Pipelines can survive both planned and unplanned restarts of the Jenkins master.
  • Pausable: Pipelines can optionally stop and wait for human input or approval before continuing the Pipeline run.
  • Versatile: Pipelines support complex real-world continuous delivery requirements, including the ability to fork/join, loop, and perform work in parallel.
  • Extensible: The Pipeline plugin supports custom extensions to its DSL (Domain-Specific Language) and multiple options for integration with other plugins.

Jenkins Pipeline Terms

Step — a single task; fundamentally steps tell Jenkins what to do.

Node — most work a Pipeline performs is done in the context of one or more declared node steps; the node selects where the Pipeline will be executed. Confining the work inside of a node step does two things:

  1. Schedules the steps contained within the block to run by adding an item to the Jenkins queue. As soon as an executor is free on a node, the steps will run.
  2. Creates a workspace (a directory specific to that particular Pipeline) where work can be done on files checked out from source control.
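In the Scripted syntax, a node block looks like this (the agent label 'linux' is a hypothetical example):

```groovy
// Scripted Pipeline - a sketch of the node step
node('linux') {        // queue the enclosed steps for an executor on an agent labeled 'linux'
    checkout scm       // check sources out into the allocated workspace
    sh 'make'          // placeholder build command; runs inside the workspace directory
}
```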

Stage — a step for defining a conceptually distinct subset of the entire Pipeline, for example: “Build”, “Test”, and “Deploy”; many plugins use stages to visualize or present Jenkins Pipeline status/progress.

Declarative and Scripted Pipeline

Jenkins Pipeline uses a Domain-Specific Language (DSL) with two different syntaxes:

  • Declarative Pipeline
    presents a simpler, more opinionated syntax on top of the Pipeline sub-systems.
  • Scripted Pipeline
    follows an imperative programming model and is written in Groovy.
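To make the difference concrete, here is the same single “Build” stage sketched in both syntaxes (the two snippets are alternatives, not one file):

```groovy
// Declarative: the structure is fixed by the pipeline { ... } block
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { echo 'Building' }
        }
    }
}

// Scripted: plain Groovy code, structured with node { ... }
node {
    stage('Build') {
        echo 'Building'
    }
}
```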

Jenkinsfile

A Jenkinsfile can replace the three kinds of Jenkins build jobs commonly used:

  • Multi-branch for main integration branches: develop, release, hotfix, and master.
  • Merge request for automatic testing of GitLab merge requests.
  • Parameterized for on-demand testing.

Problems you can solve using Jenkinsfile

  • Define the CI/CD pipeline as code to make it self-documented, reproducible, and versioned.
  • Have a single definition of build steps for any type of build job, be it multi-branch, merge request, or parameterized.
  • Get away from manual configuration of build steps.
  • Make the Pipeline easily extendable. For example, it should not be complicated to add a new static analysis tool report to all the configured build jobs.

Bringing your code into production

Defining a Pipeline

  1. Set up / configure a build environment.
  2. Check out your code.
  3. Build your code. Make sure you don’t use any environment-specific settings, so the build process is independent of the environment.
  4. Perform quality controls. This step consists of two main tasks: running tests and performing code quality checks.
  5. Deploy your code on a Continuous Integration environment.
  6. Run functional tests.
  7. Deploy the code on the test environment.
  8. Deploy the code on the user acceptance environment.
  9. Deploy the code on the production environment.
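Mapped onto a Jenkinsfile, the steps above could form stages like these (the stage names and the deploy script are illustrative placeholders, not a fixed convention):

```groovy
// A sketch of the full delivery pipeline as Declarative stages
pipeline {
    agent any
    stages {
        stage('Checkout')         { steps { checkout scm } }            // 2. check out the code
        stage('Build')            { steps { sh 'make' } }               // 3. environment-independent build
        stage('Quality')          { steps { sh 'make test' } }          // 4. tests and code quality checks
        stage('Deploy to CI')     { steps { sh './deploy.sh ci' } }     // 5. hypothetical deploy script
        stage('Functional tests') { steps { sh 'make functional' } }    // 6.
        stage('Deploy to test')   { steps { sh './deploy.sh test' } }   // 7.
        stage('Deploy to UAT')    { steps { sh './deploy.sh uat' } }    // 8.
        stage('Deploy to prod')   { steps { sh './deploy.sh prod' } }   // 9.
    }
}
```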

One common way of triggering a job is to commit a change to a repository. This means that when a developer finishes a development task and pushes their changes onto the project’s repository (e.g., the Git push command if you’re using Git), the job will be automatically triggered. An easy way to do this is via the GitHub Jenkins plugin.
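If a push webhook is not available, a Pipeline can also poll the repository itself; the cron-style schedule below is just an example value:

```groovy
// Triggering a build by polling SCM instead of a webhook
pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // check the repository for new commits roughly every 5 minutes
    }
    stages {
        stage('Build') {
            steps { sh 'make' }  // placeholder build command
        }
    }
}
```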

Best practices for Jenkins deployment

  • Jenkins does not perform any security checks as part of its default configuration, so always ensure that you authenticate users and enforce access control on your Jenkins servers. Secure your Jenkins servers.
  • In a large, complex integration environment that includes multiple users that configure jobs, you should ensure that they are not running builds on the master with unrestricted access into the JENKINS_HOME directory. Careful with the master(s).
  • To make sure all configurations and activity logs are available when needed, configure regular backups.
  • Jenkins needs disk space to perform builds, store data logs, and keep archives. To keep Jenkins up and running, make sure that you reserve 10 percent or more of the total disk space for Jenkins in order to prevent fragmentation.
  • The Jenkins 2.0 release offers Pipeline as code, a new setup experience, and several UI improvements. Use it. (You will find more info under the “Pipeline as code” link at the end of the article.)

Best practices for the Jenkins Pipeline plugin

  1. Don’t use older plugins like the Build Pipeline plugin or the Build Flow plugin. Instead, use the real Jenkins Pipeline suite of plugins.
  2. Develop your pipeline as code. Store your Jenkinsfile in SCM, then version and test it like you do other software.
  3. Any non-setup work within your pipeline should occur within a stage block.
  4. Pipeline offers a straightforward syntax for branching your pipeline into parallel steps. Use it!
  5. Pipeline has an easy mechanism for timing out any given step of your pipeline. As a best practice, you should always plan for timeouts around your inputs.
  6. Set environment variables with the env global variable.
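Practices 4 through 6 can be sketched together in Scripted syntax like this (stage names and shell commands are placeholders):

```groovy
node {
    // 6. set and read environment variables via the env global variable
    env.APP_ENV = 'ci'

    stage('Parallel tests') {
        // 4. fork the pipeline into parallel branches
        parallel(
            'unit':        { sh 'make unit-tests' },        // placeholder commands
            'integration': { sh 'make integration-tests' }
        )
    }

    stage('Approval') {
        // 5. always wrap input steps in a timeout
        timeout(time: 15, unit: 'MINUTES') {
            input message: 'Deploy to production?'
        }
    }
}
```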

Useful resources

Hope this article was helpful!
