Streamlining Deployment with AWS, Ansible, Docker, and Jenkins

Deepti Trivedi
Strategio
Feb 5, 2024

In the world of DevOps, automation is the key to achieving efficiency and consistency in software deployment. Orchestrating the deployment of your applications with ease ensures that they run consistently across different environments. This is where the quartet of AWS, Ansible, Docker, and Jenkins comes into play.


The Need for Automation

The demands of modern software development require agile and efficient deployment processes. Manual deployments are error-prone and time-consuming, leading to downtime and frustration. This is where automation can transform the path from development to production.

Ansible is a powerful automation tool that allows you to define your infrastructure as code. It helps you automate configuration management, application deployment, and task automation. With Ansible, you can ensure that your servers and applications are always in the desired state.
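As a minimal sketch of this "desired state" idea (the host group, package, and service names here are illustrative, not part of this project), a short playbook can declare what every managed server should look like, and Ansible converges the machines to it idempotently:

```yaml
# Hypothetical baseline playbook: ensure Git is installed and the
# Docker service is running on every managed host
- name: Baseline configuration
  hosts: all
  become: true
  tasks:
    - name: Ensure Git is present
      ansible.builtin.package:
        name: git
        state: present

    - name: Ensure the Docker service is running and enabled
      ansible.builtin.service:
        name: docker
        state: started
        enabled: true
```

Running this playbook twice changes nothing the second time, which is exactly the "always in the desired state" guarantee described above.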

Docker enables containerization. It provides a consistent environment for your applications, ensuring they run the same way on any platform. Containers encapsulate your application and its dependencies, making it easy and stress-free to deploy and scale.

Jenkins, a popular automation server, orchestrates it all. It allows you to automate your build, test, and deployment pipelines. With Jenkins, you can create a seamless workflow for building and deploying your applications.

AWS Services at the Core

To build this powerful deployment pipeline, we leverage AWS services to provide a robust and scalable infrastructure.

Amazon EC2 Instances

Amazon Elastic Compute Cloud (EC2) instances serve as the backbone of our infrastructure. We utilize EC2 instances to host various components of our application, including Jenkins servers, Ansible servers, and Docker containers. These virtual servers provide us with complete control over the computing resources, making it easier to manage and scale our infrastructure as needed.

Load Balancers

Load balancers play a crucial role in distributing incoming traffic across multiple EC2 instances. By using Elastic Load Balancers (ELB), we ensure high availability and fault tolerance for our application. ELB intelligently routes traffic to healthy instances, eliminating single points of failure and improving overall system reliability.
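The health-check behavior that lets the load balancer route only to healthy instances could be expressed, for example, as a CloudFormation target group. This is a hypothetical sketch (the resource names and the `PipelineVPC` reference are illustrative; port 8888 matches the container port published later in this project):

```yaml
# Hypothetical CloudFormation snippet: target group whose health checks
# let the load balancer send traffic only to healthy instances
WebTierTargetGroup:
  Type: AWS::ElasticLoadBalancingV2::TargetGroup
  Properties:
    VpcId: !Ref PipelineVPC          # assumed VPC resource
    Protocol: HTTP
    Port: 8888                       # app port published by the Docker container
    HealthCheckPath: /
    HealthCheckIntervalSeconds: 30
    HealthyThresholdCount: 2
    UnhealthyThresholdCount: 3
```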

Autoscaling

To address varying workload demands, we implement autoscaling policies that automatically adjust the number of EC2 instances in response to traffic fluctuations. This elasticity allows our application to efficiently handle both low and high traffic volumes while optimizing resource utilization and cost-effectiveness.
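One way such a policy might look, sketched as a hypothetical CloudFormation target-tracking policy (the resource names, the referenced Auto Scaling group, and the 60% CPU target are illustrative assumptions):

```yaml
# Hypothetical CloudFormation snippet: target-tracking policy that adds or
# removes EC2 instances to keep average CPU around 60%
WebTierScalingPolicy:
  Type: AWS::AutoScaling::ScalingPolicy
  Properties:
    AutoScalingGroupName: !Ref WebTierASG   # assumed Auto Scaling group resource
    PolicyType: TargetTrackingScaling
    TargetTrackingConfiguration:
      PredefinedMetricSpecification:
        PredefinedMetricType: ASGAverageCPUUtilization
      TargetValue: 60.0
```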

Virtual Private Cloud (VPC)

The VPC serves as a secure and isolated network environment for our AWS resources. We configure network subnets, security groups, and routing tables within the VPC to control inbound and outbound traffic, enhancing security and ensuring proper communication between components.
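A security group for the pipeline hosts might look like the sketch below. The resource names are hypothetical, and the ports are assumptions: 22 for SSH, 8080 for the Jenkins UI (its common default), and 8888 for the deployed application (the host port published by the container in this project). Open CIDR ranges are used here only for brevity; a real setup would restrict them.

```yaml
# Hypothetical CloudFormation snippet: security group opening SSH,
# the Jenkins UI, and the deployed application's port
PipelineSecurityGroup:
  Type: AWS::EC2::SecurityGroup
  Properties:
    GroupDescription: Allow SSH, Jenkins, and app traffic
    VpcId: !Ref PipelineVPC          # assumed VPC resource
    SecurityGroupIngress:
      - { IpProtocol: tcp, FromPort: 22,   ToPort: 22,   CidrIp: 0.0.0.0/0 }
      - { IpProtocol: tcp, FromPort: 8080, ToPort: 8080, CidrIp: 0.0.0.0/0 }
      - { IpProtocol: tcp, FromPort: 8888, ToPort: 8888, CidrIp: 0.0.0.0/0 }
```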

The Architecture

The architecture of the project is designed for flexibility and scalability. It can adapt to the application’s needs and scale as the user base grows.

(Figure: System Architecture)

Key Steps in the Pipeline

Scenario

  • Run Ansible Playbooks from Jenkins to automate infrastructure provisioning and application deployment.

Prerequisites

  • Set up Amazon EC2 instances for Jenkins, the Ansible Control Node, and the Managed Node.
  • Create an IAM role for secure access to AWS services.
  • Enable inbound traffic on the required ports.
  • Configure the Ansible inventory and users on the Control and Managed Nodes.
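The inventory ties the host group names used later in the playbooks (`ansible_CN` and `docker_host`) to the EC2 instances. A minimal sketch in Ansible's YAML inventory format, with hypothetical hostnames and private IPs:

```yaml
# Hypothetical inventory: group names match the playbooks below
all:
  children:
    ansible_CN:
      hosts:
        control-node:
          ansible_host: 10.0.1.10    # illustrative private IP
    docker_host:
      hosts:
        managed-node:
          ansible_host: 10.0.1.20    # illustrative private IP
```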

Integration with Jenkins

  • Integrate Ansible Control Node and Docker server with Jenkins using the Publish Over SSH plugin.
  • Configure a Jenkins job for building, testing, and uploading artifacts to the Ansible Server (Control Node).

Docker Image Creation

  • Install Docker on both nodes and start the service.
  • Create a Dockerfile to build Docker images incorporating the application’s .war file.
  • Implement an Ansible Playbook to build and push Docker images to Docker Hub (the registry).
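The Dockerfile placed in the build directory (`/opt/docker` in the playbook below) might look like this sketch, assuming a Tomcat base image and an artifact named `webapp.war` (both hypothetical; adjust to your build output). Tomcat listens on 8080, which matches the container port mapped later:

```dockerfile
# Hypothetical Dockerfile: serve the application's .war with Tomcat
FROM tomcat:9-jdk11
# Copy the artifact uploaded by Jenkins into Tomcat's webapps directory
COPY webapp.war /usr/local/tomcat/webapps/
EXPOSE 8080
```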

Deployment Automation

  • Run an Ansible Playbook to pull Docker images from the registry and deploy containers on specified hosts (Managed Node).
  • Reconfigure the Jenkins job to deploy the Docker container using the Ansible Playbooks.

The Code Behind the Automation

Here are the Ansible playbooks responsible for deploying the application:

build-push-image.yaml

---
# Ansible Playbook to build and push a Docker image to the registry

- name: Playbook to build and push the Docker image
  hosts: ansible_CN
  become: true
  gather_facts: false

  tasks:
    - name: Delete existing Docker images from the Control Node
      shell: docker rmi $(docker images -q) -f
      ignore_errors: yes

    - name: Push Docker image to Registry
      docker_image:
        name: simple-docker-image
        build:
          path: /opt/docker
          pull: true
        state: present
        tag: "latest"
        force_tag: yes
        repository: <your-docker-hub-username>/simple-docker-image:latest
        push: yes
        source: build

deploy-container.yaml

---
# Ansible Playbook to pull the Docker image from the registry and run the container

- import_playbook: build-push-image.yaml

- name: Playbook to pull and run the Docker container
  hosts: docker_host
  gather_facts: false

  tasks:
    - name: Run Docker container using simple-docker-image
      docker_container:
        name: simple-docker-container
        image: <your-docker-hub-username>/simple-docker-image:latest
        state: started
        recreate: yes
        detach: true
        pull: yes
        ports:
          - "8888:8080"

The complete code for the project can be found in my GitHub repo.

A Typical Workflow

1. Code Repository

Developers commit code changes to Git, a distributed version control system, and push them to GitHub, an internet hosting platform for code repositories.

2. GitHub Webhook

GitHub is configured to trigger a webhook whenever there’s a new commit or push to the repository.

3. Jenkins Job Trigger

Jenkins, a continuous integration server, receives the webhook notification.

4. Jenkins Build

Jenkins starts a build job, which includes the following steps:

  • Pull the latest code from the GitHub repository.
  • Build the artifacts and upload them to Ansible Server.
  • Initiate an Ansible playbook for provisioning and configuring the necessary infrastructure and Docker environment.

5. Ansible Deployment

Ansible Playbooks are run to instruct Docker to:

  • Delete any existing images.
  • Package the application into a Docker image using the artifacts.
  • Push the Docker Image to Docker Hub, a service for sharing container images.
  • Deploy the Docker containers onto the prepared deployment environment by pulling the image.

6. Application Deployment

The Docker containers are started in the deployment environment, running the application with the latest code changes.

7. End Users

End users access the application via its public endpoint or URL.

8. Monitoring and Logs

Continuous monitoring and logging tools (e.g., CloudWatch, Prometheus, Grafana, ELK Stack) collect data and log information from the running containers.

Future Possibilities

As we explore this setup, we can envision a future where our deployment pipeline includes:

  • Amazon Simple Storage Service (S3): Serving as our artifact repository, storing Docker images and Ansible playbooks to ensure that our deployment artifacts are readily available and versioned.
  • Amazon Elastic Container Registry (ECR): Storing Docker images in a managed Docker container registry.
  • Amazon Simple Notification Service (SNS): Keeping the team informed about deployment statuses. Whether it’s a successful deployment or an issue that requires attention, SNS would send notifications to relevant teams or individuals through tools like Slack or email.
  • Amazon API Gateway: Providing a front-end for our applications, accessible to users.
  • Amazon DynamoDB: Serving as the back-end data store, ensuring data consistency.
  • AWS Lambda: Helping us execute code in response to events, allowing us to trigger deployments automatically.

The world of DevOps is ever-evolving, and with AWS, Ansible, Docker, and Jenkins, we’re equipped to adapt and innovate.

I hope you enjoyed a glimpse into this project.
Do comment and follow along for more exciting content on DevOps!

Deepti Trivedi is an aspiring Cloud Engineer and an AWS JAM & Cloud Quest enthusiast at Strategio.