Disaster Recovery Guide for Jenkins — 3

M. Altun
Clarusway
Nov 11, 2021 · 5 min read

How to back up and restore your Jenkins data — part 3

We continue exploring how to back up and restore Jenkins data. This time we take a closer look at how to automate backing up Jenkins data to S3 and restoring it on an EC2 instance using free and open-source tools.

We have already discussed how to back up and restore Jenkins data across bare-metal public cloud servers in the first part of this series. Please see the first article at

We have also discussed how to back up your Jenkins data to S3 with a Jenkins job and restore with a Docker container running on EC2. Please see the second article at

The task at hand:

Your Jenkins is running in a Docker container on an Ubuntu EC2 instance. A Jenkins job has been backing up the Jenkins data to an S3 bucket every night.

We want to be able to restore the Jenkins data resting in that S3 bucket into a Docker container running on an EC2 instance, and we want Jenkins back up and running quickly, with one-click automation.
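For context, a minimal sketch of what that nightly backup job is assumed to run (covered in part 2); the bucket name and Docker volume path are placeholders matching the restore script later in this article:

# assumed nightly Jenkins backup job: push the Jenkins data volume to S3
# (bucket name and volume path are examples; use your own)
aws s3 sync /var/lib/docker/volumes/jenkins_jenkins-data/_data/ s3://jenkins-daily-backup-filesXX --delete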

Pre-requisites:

AWS free-tier account, a Git repo (Bitbucket or GitHub), Docker, Docker Compose, Jenkins, Terraform, Bitbucket Pipelines.

Solution:

We have decided to automate the disaster recovery process. Since we assume the Jenkins server is no longer available, we must pick another form of pipeline automation; in view of this, we have decided to use Bitbucket Pipelines. We could also use GitHub Actions, but that will probably be the subject of another article in the future.

We have also restricted ourselves to one-click disaster recovery automation. For this we must use infrastructure as code to deploy resources and restore data, so we have decided to use Terraform.

As mentioned in our previous article, the Jenkins backup data is already resting in an S3 bucket and a Jenkins job synchronises it every day.

With one click, the Bitbucket pipeline will run the Terraform scripts, and Terraform will automate the disaster recovery process as follows.

Stage 1

Write the Terraform scripts that deploy the main resources such as the VPC, subnet, EC2 instance, an IAM role for S3, security groups, the Amazon Machine Image, internet gateway, CIDR block, route table, etc. We must also provide user-data for the EC2 instance so that once it starts it can automatically clone the Bitbucket repo, change the mode of the docker-compose file, install and enable Docker, install Docker Compose, remove the secret file, install Jenkins, install the AWS CLI, copy the backup files from the S3 bucket into the Jenkins container's volume, and run Jenkins. Finally, we will also have the EC2 public IP address printed on the screen for convenience, so we can start using Jenkins immediately.

Please see the following public repo for sample terraform scripts.
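If you want to sanity-check the scripts locally before committing them, something like the following should work (a sketch, assuming Terraform is installed on your machine; terraform validate does not need AWS credentials):

# download providers without configuring a backend, then check syntax and formatting
terraform init -backend=false
terraform fmt -check
terraform validate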

Stage 2

Log on to your Bitbucket repo and create a pipeline as follows:

- Create a new repository

- Clone your repo to the local machine

$ git clone <weblink of the repository>

Add the Terraform files and related .sh files to your local folder. Remember that each variable or data source needs to be adjusted to your desired configuration and region, such as the Amazon AMI, eu-west-2, CIDR block, required ports, internet access, etc.
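As an example of that maintenance, the AMI ID differs per region. A hedged way to look up a current Ubuntu AMI for eu-west-2 with the AWS CLI (the Ubuntu 20.04 image name is an assumption about your setup; 099720109477 is Canonical's account ID):

# find the latest Ubuntu 20.04 AMI published by Canonical in eu-west-2
aws ec2 describe-images \
  --owners 099720109477 \
  --region eu-west-2 \
  --filters "Name=name,Values=ubuntu/images/hvm-ssd/ubuntu-focal-20.04-amd64-server-*" "Name=state,Values=available" \
  --query 'sort_by(Images, &CreationDate)[-1].[ImageId,Name]' \
  --output text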

Create a Bitbucket Pipelines YAML file named "bitbucket-pipelines.yml", the content of which should be as follows.

image: hashicorp/terraform
pipelines:
  custom:
    restore-jenkins-on-aws:
      - step:
          name: Deploy on AWS
          script:
            # - pipe: atlassian/git-secrets-scan:0.4.3
            - terraform init
            - terraform plan
            - terraform apply -auto-approve
          # uncomment the section below for tests
          # after-script:
          #   - echo "Sleeping"
          #   - sleep 10m
          #   - terraform destroy -auto-approve
          #   - echo "deploy completed..."

Now you can push all your files to Bitbucket:

$ git add .
$ git commit -m "Terraform files and scripts created."
$ git push

Now we have the material for the pipeline to run our disaster recovery automation, so go to the Bitbucket user interface and select

“Pipelines”

- You should see “Looks like you already have a bitbucket-pipelines.yml file in this repository” displayed

- Select "Enable pipeline"

- Add variables:

AWS_ACCESS_KEY_ID: <AWS_ACCESS_KEY_ID>
AWS_SECRET_ACCESS_KEY: <AWS_SECRET_ACCESS_KEY>
AWS_DEFAULT_REGION: 'eu-west-2'
S3_BUCKET: <bucket-name>

- Click Run pipeline
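Bitbucket exposes these repository variables to the pipeline container as environment variables, and the Terraform AWS provider picks up the AWS_* ones automatically, so no credentials need to live in the repo. If you want to dry-run the same plan locally, a minimal sketch (substitute your own values):

# export the same variables the pipeline uses, then preview the plan locally
export AWS_ACCESS_KEY_ID=<AWS_ACCESS_KEY_ID>
export AWS_SECRET_ACCESS_KEY=<AWS_SECRET_ACCESS_KEY>
export AWS_DEFAULT_REGION=eu-west-2
terraform init
terraform plan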

These files should be in the Bitbucket repo (keep the repo private; AWS credentials are provided as repository variables rather than files):

- main.tf (Bitbucket credentials included as a secret)

- secret file (so that the EC2 instance can git clone this repo)

- .sh file, the EC2 start-up script passed as user-data, which clones the repo, installs Docker and Docker Compose, runs docker-compose up, and restores the backup (restore-script.sh):

#!/bin/bash
# clone repo
cd /home/ubuntu
git clone <weblink for bitbucket repo>
cd /home/ubuntu/terraform-on-aws
# change mode of the file to be able to run as a script
chmod 766 install_docker_and_docker_compose.sh
# install docker and docker-compose with the downloaded script
sudo ./install_docker_and_docker_compose.sh
# create infrastructure with docker-compose
cd /home/ubuntu/terraform-on-aws/jenkins
sudo docker-compose up -d
# stop jenkins container before restore
sudo docker stop jenkins
# install aws cli
sudo apt install -y unzip
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
sudo rm awscliv2.zip
# the jenkins data folder: /var/lib/docker/volumes/jenkins_jenkins-data/_data/
# aws s3 sync s3://jenkins-daily-backup-files2 /var/lib/docker/volumes/jenkins_jenkins-data/_data/ --delete
# the --delete option will be enabled in live usage
# create the s3 bucket
# create a role for EC2 named "terraform-on-aws-role-for-s3" that includes AmazonS3FullAccess
# attach the role to the EC2 instance
# do not forget to change the s3 bucket name as it is unique
sudo aws s3 sync s3://jenkins-daily-backup-filesXX /var/lib/docker/volumes/jenkins_jenkins-data/_data/
# start jenkins container after restore
sudo docker start jenkins
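Optionally, once the instance has booted you can check that the restore worked before opening the UI. A quick sketch, assuming your SSH key pair and the container name jenkins used above:

# confirm the container is up and the restored data is in place
ssh -i <your-key.pem> ubuntu@<EC2 public IP address> \
  "sudo docker ps && sudo ls /var/lib/docker/volumes/jenkins_jenkins-data/_data/ | head"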

Stage 3

Copy the EC2 public IP address from the screen, then browse to <EC2 public IP address>:8080 to open the Jenkins console. You should be able to log in using the credentials from the existing Jenkins master.
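The user-data script takes a few minutes to finish, so Jenkins may not answer straight away. A small optional helper that polls port 8080 until the login page responds (assumes the security group allows 8080 from your IP):

# wait until the Jenkins login page returns a successful response
until curl -sf -o /dev/null "http://<EC2 public IP address>:8080/login"; do
  echo "Waiting for Jenkins to come up..."
  sleep 15
done
echo "Jenkins is up."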

We were planning to complete this series with the third article, but we have been asked to also write one covering GitHub Actions instead of Bitbucket Pipelines, and who are we to object to the readers' demand? We therefore plan to follow up soon with an additional article covering GitHub Actions for the very same Jenkins disaster recovery task.

See you in the fourth article.

Best regards

Authors:

M. Altun

F. Sari

S. Erdem

11 Nov 2021, London

DevOps Engineer @ Finspire Technology
