Designing A Delivery Pipeline with Containerized Applications

BISINET
6 min read · Nov 4, 2023

Containers have gained prominence as the preferred way to ship applications. To cater to this need, software development and deployment pipelines must be designed to take full advantage of the benefits containers provide. Let’s look at how we can design a container-based delivery pipeline.

What is a Containerized Application?

Virtualization helped users to create virtual environments that share hardware resources. Containerization takes this abstraction a step further by sharing the operating system kernel. This leads to lightweight and inherently portable objects (containers) that bundle together the software code and all the required dependencies. These containers can be deployed on any supported infrastructure with a supported container runtime.

In a traditional deployment, one of the most complex tasks is configuring the deployment environment with all the required dependencies and settings. Containerized applications eliminate this requirement because the container packages everything the application needs. On top of this, containers require fewer resources and are easier to manage than virtual machines. Containerizing applications therefore simplifies deployment strategies, which can be easily automated and integrated into delivery pipelines. Containers also help avoid the classic “it works on my machine” problem.
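As a minimal sketch of this bundling, consider a Dockerfile for a hypothetical Node.js service. The base image, file names, and port below are illustrative assumptions, not a prescription:

```dockerfile
# Build stage: install dependencies in a full build environment
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .

# Runtime stage: ship only the app and its runtime dependencies
FROM node:20-alpine
WORKDIR /app
COPY --from=build /app .
EXPOSE 3000
CMD ["node", "server.js"]
```

Because everything the application needs is declared in this one file, the resulting image runs identically on a developer laptop, a staging cluster, or production.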

Combining this with an orchestration platform like Kubernetes or Rancher, users can leverage the strengths of these platforms to manage the application throughout its lifecycle while gaining greater availability, scalability, performance, and security.

What is a Continuous Delivery Pipeline?

Continuous Delivery enables software development teams to deploy software more frequently while maintaining the stability and reliability of their systems. It relies on a stack of tools such as CI/CD platforms, Infrastructure as Code (IaC), and testing frameworks, combined with automation, to facilitate frequent software delivery. Automation plays a key role in the continuous delivery pipeline by handling every task within it, from tests and infrastructure provisioning to configuration management and deployments.

In most cases, Continuous Delivery is combined with Continuous Integration to create more robust delivery pipelines, also called CI/CD pipelines. This way, organizations can cater to the complete software development process. Continuous Integration automates the building and testing of the latest code changes, while Continuous Delivery ensures that validated changes from CI are deployed to the relevant environments, typically first to a dev/staging environment and ultimately to production.

How Does it All Fit Together?

Now that we understand containerized applications and delivery pipelines, let’s see how the two combine to create more efficient software delivery.

First, let’s look at a more traditional DevOps pipeline. In general, a traditional delivery pipeline will consist of the following steps.

  • Develop the software and integrate recent changes to a centralized repository. (Version control tools come into play at this point)
  • Verify and validate the code and merge the changes.
  • Build the application with the new code changes.
  • Provision the test environment with all the configurations and dependencies and deploy the application. (Environment can be provisioned beforehand and updated as required.)
  • Carry out testing. (This can be both automated and manual, depending on the requirement.)

After all the tests are completed, deploy the application to production. This again requires provisioning resources and configuring the dependencies, along with any additional configuration required to run the application.

Most of these tasks can be automated; even infrastructure provisioning can be handled with IaC tools such as Terraform, CloudFormation, and Pulumi. Deployment can be simplified with platforms such as AWS Elastic Beanstalk and Azure App Service. However, all of these add overhead to the deployment process, and relying on provider-specific tools leads to vendor lock-in. Containerized application deployments allow us to simplify the delivery pipeline with less management overhead. A typical containerized pipeline can be summed up in the following steps.
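For instance, a Terraform fragment that provisions a container registry and a cluster for the pipeline to deploy into might look like the sketch below. The provider, region, and resource names are hypothetical:

```hcl
provider "aws" {
  region = "us-east-1" # illustrative region
}

# Registry to hold the images built by the pipeline
resource "aws_ecr_repository" "app" {
  name = "my-app" # hypothetical repository name
}

# Cluster the pipeline deploys containers into
resource "aws_ecs_cluster" "main" {
  name = "my-app-cluster"
}
```

Once this environment exists, the pipeline itself only needs to build images and point the cluster at them, rather than reconfiguring servers on every release.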

  • Develop and integrate the changes using a version control system.
  • Verify and merge the code changes. (This will include peer reviews, static code analysis, etc…)
  • Build the container. At this stage, the code repository not only has the application code but all the necessary configuration files and dependencies. All these are used to build the container.
  • Deploy the container to the staging environment.
  • Carry out the testing.
  • Deploy the same container to the production environment.
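The steps above can be sketched as a CI/CD workflow. This hypothetical GitHub Actions job uses made-up registry, service, and script names, and is a sketch of the flow rather than a complete pipeline:

```yaml
name: container-pipeline
on:
  push:
    branches: [main]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Build the container image once; the same image is promoted later
      - name: Build image
        run: docker build -t registry.example.com/my-app:${{ github.sha }} .

      - name: Push image
        run: docker push registry.example.com/my-app:${{ github.sha }}

      # Deploy to staging, test, then promote the identical image
      - name: Deploy to staging
        run: kubectl set image deployment/my-app app=registry.example.com/my-app:${{ github.sha }} -n staging

      - name: Run tests against staging
        run: ./run-integration-tests.sh # hypothetical test script

      - name: Deploy to production
        run: kubectl set image deployment/my-app app=registry.example.com/my-app:${{ github.sha }} -n production
```

The key property is that the image built in the first step is the exact artifact deployed to both environments; nothing is rebuilt between staging and production.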

Container-based Delivery Pipeline

As you can see, a container-based delivery pipeline eliminates most of the usual infrastructure and environment configuration work. However, it does not eliminate all of it: the container deployment environment must still be provisioned and configured beforehand. In most instances, this means a container orchestration platform like Kubernetes or Rancher, or a provider-specific orchestration service such as Amazon Elastic Container Service (ECS) or Azure Container Services.

The main difference between the two pipelines lies in the build step: application build versus containerization. In a traditional delivery pipeline, the goal is to create a deployable package of the application; in a containerized pipeline, a complete container image is built, with all dependencies and configurations bundled, that can be deployed to any supported environment. This eliminates the need for separate configuration or dependency management in each environment and reduces errors caused by faulty configuration, ultimately allowing the product team to move containers quickly between environments such as staging and production. And since containers standardize configuration across environments, troubleshooting can easily be scoped to either the container or external resources.

Modern application architectures such as microservices are well suited to containerization, as they decouple application functionality into separate services; with containerization, these services can be managed as individual entities without relying on external configuration.

Even with containers, some infrastructure management remains; containers simply reduce it. The most prominent requirements come from managing the container orchestration platform and external services like load balancers and firewalls. Using a managed orchestration platform such as Amazon Elastic Kubernetes Service (EKS) or Azure Kubernetes Service (AKS), or container services like AWS Elastic Container Service or Azure Container Apps, further reduces this overhead.

Container Orchestration in Delivery Pipeline

Container orchestration goes hand in hand with containerized applications. Containerization is only one part of the overall architecture; container orchestration is the process of managing containers throughout their lifecycle, from deployment to availability and scaling.

There are many orchestration platforms with differing features and complexity, but Kubernetes stands out as the de facto choice, with industry-wide support and the ability to power any type of environment, from single-node clusters hosting simple web applications to multi-cloud clusters running Apache Hadoop for large, distributed data processing. Because orchestration platforms manage containers throughout their lifecycle while ensuring availability, they eliminate the need for manual intervention. Being the industry standard also means that almost all cloud service providers offer some kind of managed service based on Kubernetes, effectively avoiding vendor lock-in while allowing users to adopt managed solutions and power multi-cloud architectures with a single platform.
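As a sketch of how an orchestration platform maintains availability without manual intervention, a Kubernetes Deployment declares a desired state that the cluster continuously reconciles toward. The service name, image, and port in this hypothetical manifest are illustrative:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app                # hypothetical service name
spec:
  replicas: 3                 # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: app
          image: registry.example.com/my-app:1.0.0 # illustrative image
          ports:
            - containerPort: 3000
```

If a node fails or a container crashes, the scheduler replaces the affected pod automatically; scaling out is a matter of changing `replicas`.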

Are Containers the Right Fit for Your Delivery Pipeline?

The simple answer is yes. Containerization benefits most application development efforts; the main exceptions are overly simple projects and legacy monolithic applications. Containers can support any environment regardless of programming language, framework, or deployment strategy, and give delivery teams the flexibility to customize their environments without affecting the delivery process.

Combining this with CI/CD pipelines can lead to streamlined development and delivery while increasing team collaboration and improving overall application quality.
