CI/CD pipeline to create a “golden image” using Packer

Aman Singh
5 min read · Feb 2, 2024


Introduction

In the dynamic landscape of DevOps, a robust CI/CD pipeline is crucial for efficient and reliable AWS AMI creation. This tutorial will guide you through the implementation of a streamlined process using GitHub for version control, Jenkins for automation, and Packer for machine image creation.

Project Overview

To begin, let’s understand why a robust CI/CD pipeline matters in AWS:

Efficiency in Development:

  • An effective CI/CD pipeline ensures rapid and consistent development cycles for AMIs, fostering agility in the software development process.
  • With automated testing and continuous integration, developers can identify and rectify issues early in the development phase, reducing the likelihood of errors reaching the production environment.

Scalability and Consistency:

  • As applications scale in AWS, maintaining consistency across AMIs becomes paramount. A CI/CD pipeline facilitates the creation of standardized and version-controlled AMIs, ensuring uniformity in deployment across multiple instances.

Faster Time-to-Market:

  • Rapid iteration and deployment through an automated pipeline significantly reduce the time it takes to bring new features or updates to market. This agility is crucial in competitive environments where time-to-market can be a decisive factor.

Outline of Tools Used:

In the development of a robust CI/CD pipeline for AWS AMIs, a strategic selection of tools is crucial. Here’s a concise overview of the key tools employed in this project:

GitHub:

  • Role: Version Control
  • Description: GitHub serves as the central repository for source code, allowing version control, collaboration, and seamless integration with CI/CD pipelines.

Jenkins:

  • Role: Automation Server
  • Description: Jenkins orchestrates the CI/CD pipeline, automating tasks such as building, testing, and deployment. Its extensibility supports integration with various plugins and tools.

Packer:

  • Role: Machine Image Creation
  • Description: Packer automates the creation of machine images, ensuring consistency and reproducibility. It supports multiple platforms, making it ideal for creating Amazon Machine Images (AMIs) for AWS.

Toolchain Workflow:

  • GitHub: Developers push code changes to GitHub repositories.
  • Webhooks: GitHub webhooks trigger the Jenkins pipeline upon code commits.
  • Jenkins: Orchestrates the CI/CD pipeline, handling tasks like image creation and deployment.
  • Packer: Executes scripts to create machine images based on predefined configurations.
  • AWS (Amazon Web Services): Hosts and deploys the generated AMIs, contributing to the scalable and dynamic infrastructure.

Implementation Steps for the project:

STEP 1: Developers push code changes to the GitHub repository.

Start by establishing a secure connection between your local system and GitHub using SSH. Follow this comprehensive guide for step-by-step instructions: Connecting Linux to GitHub using SSH: A Step-by-Step Guide.

Once you’ve successfully connected your local system to GitHub:

  1. Navigate to the GitHub repository using this link: GitHub Repository — Creating Golden Image Using Packer.
  2. Inside the repository, you’ll find a .pkr.hcl file responsible for creating the Amazon Machine Image (AMI). This file contains the necessary configurations for Packer.
  3. To install all required software and libraries in the image, a shell script is utilized. Ensure that the script is appropriately configured to meet the project’s dependencies.
  4. The project also includes a Jenkinsfile that encapsulates the details of the CI/CD pipeline. This file serves as a blueprint for Jenkins to automate tasks, including building and deploying the AMI.
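As a point of reference, a minimal `.pkr.hcl` template for building an AMI might look like the sketch below. The region, source AMI filter, and script name (`setup.sh`) are illustrative assumptions, not the repository’s exact values:

```hcl
packer {
  required_plugins {
    amazon = {
      source  = "github.com/hashicorp/amazon"
      version = ">= 1.0.0"
    }
  }
}

source "amazon-ebs" "golden" {
  region        = "us-east-1"   # assumed region
  instance_type = "t2.micro"
  source_ami_filter {
    filters = {
      name                = "amzn2-ami-hvm-*-x86_64-gp2"
      virtualization-type = "hvm"
      root-device-type    = "ebs"
    }
    owners      = ["amazon"]
    most_recent = true
  }
  ssh_username = "ec2-user"
  ami_name     = "golden-image-{{timestamp}}"
}

build {
  sources = ["source.amazon-ebs.golden"]

  # Run the shell script that installs the required software and libraries
  provisioner "shell" {
    script = "setup.sh"   # hypothetical script name
  }
}
```

The `source` block defines the base instance Packer launches, and the `provisioner` block runs the installation script before the instance is snapshotted into an AMI.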

STEP 2: GitHub webhooks trigger the Jenkins pipeline upon code commits.
Go to your GitHub repository and click on ‘Settings’.

Click on Webhooks and then click on ‘Add webhook’.

In the ‘Payload URL’ field, paste your Jenkins environment URL. At the end of this URL add /github-webhook/. In the ‘Content type’ select: ‘application/json’ and leave the ‘Secret’ field empty.
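For example, if your Jenkins server is reachable at `http://<jenkins-host>:8080` (the host is a placeholder for your own address), the payload URL would be:

```
http://<jenkins-host>:8080/github-webhook/
```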

Click on “Add webhook”.

We’re done with the configuration on GitHub’s side! Now let’s move on to Jenkins.

STEP 3: Configure Jenkins to orchestrate the CI/CD pipeline, handling tasks like image creation and deployment.

In Jenkins, click on ‘New Item’ to create a new project.

Give your project a name, then choose ‘Pipeline’ and finally, click on ‘OK’.

In the “General” section, select ‘Discard old builds’ and set a value for ‘Max # of builds to keep’.

Click on the ‘Build Triggers’ tab and then on the ‘GitHub hook trigger for GITScm polling’.

In the ‘Pipeline’ section, select “Pipeline script from SCM” in the ‘Definition’ field, provide your GitHub repository URL, select the “main” branch, and enter “Jenkinsfile” as the Script Path.
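For reference, a minimal declarative Jenkinsfile for this pipeline might look like the following sketch. The stage names are assumptions, not the repository’s exact contents, and the Jenkins agent is assumed to have Packer installed:

```groovy
pipeline {
    agent any

    stages {
        stage('Checkout') {
            steps {
                checkout scm   // pulls the repository configured in the job
            }
        }
        stage('Validate') {
            steps {
                sh 'packer init .'       // download required plugins
                sh 'packer validate .'   // syntax-check the template
            }
        }
        stage('Build AMI') {
            steps {
                sh 'packer build .'      // build and register the AMI
            }
        }
    }
}
```

Because the job uses “Pipeline script from SCM”, Jenkins reads this file from the repository on every webhook-triggered build.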

STEP 4: Provide an IAM role to the AWS EC2 instance.

To assign a role to an EC2 instance in AWS, you can follow these general steps:

  • Go to the AWS Management Console and navigate to the IAM (Identity and Access Management) service.
  • In the left navigation pane, select “Roles” and then click on “Create Role.”
  • Choose the service that will use this role. In this case, it might be “EC2.”
  • Attach policies that grant the necessary permissions to the role.
  • Name the role and provide a meaningful description.

Attach the IAM Role to EC2 Instance:

  • In the EC2 dashboard, select the instance, and then under the “Actions” dropdown, choose “Security,” and then “Modify IAM Role.”
  • Choose the IAM role you created in the previous step and save the changes.
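The same role can also be created and attached from the command line. The sketch below uses the AWS CLI with hypothetical names — the role name, instance ID, trust-policy file, and the broad managed policy are all illustrative assumptions; in practice you would scope the permissions down to what Packer needs:

```
# Create the role with an EC2 trust policy
# (trust-policy.json allows ec2.amazonaws.com to assume the role)
aws iam create-role --role-name packer-build-role \
    --assume-role-policy-document file://trust-policy.json

# Attach a policy granting the EC2 permissions Packer needs
aws iam attach-role-policy --role-name packer-build-role \
    --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess

# EC2 consumes roles through instance profiles
aws iam create-instance-profile --instance-profile-name packer-build-profile
aws iam add-role-to-instance-profile \
    --instance-profile-name packer-build-profile \
    --role-name packer-build-role

# Attach the profile to the Jenkins instance (instance ID is illustrative)
aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=packer-build-profile
```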

STEP 5: Run Packer to create machine images based on predefined configurations.

Push a commit to trigger the pipeline, or click “Build Now” in Jenkins to start the process manually; the pipeline then runs the `packer build` command.
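Concretely, assuming the template sits at the repository root, the Packer workflow the pipeline runs is roughly:

```
packer init .        # download the plugins declared in the template
packer validate .    # syntax-check the template before building
packer build .       # launch the builder, provision it, and register the AMI
```

On success, Packer prints the ID of the newly created AMI, which you can then see under EC2 → AMIs in the AWS console.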

Conclusion

In summary, this project’s significance lies in the transformation of AWS AMI creation through the implementation of a robust CI/CD pipeline. By seamlessly integrating GitHub, Jenkins, and Packer, we’ve achieved a streamlined, automated process for developing and deploying AMIs.


I'm a tech enthusiast eager to share my knowledge and insights on DevOps, cloud computing, database administration, and Linux administration through my blogs.