Reducing Complexity With Docker

Justin Michalicek · Mobelux · Jun 1, 2018

At Mobelux, the development projects never stop. We’re constantly working on both new and existing codebases. At any given time, there are multiple versions of Python, Ruby, NodeJS, PostgreSQL, MongoDB, and Elasticsearch all in active use.

With all these different environments running our code, it can be incredibly painful to ensure that each development and testing environment is configured to use the correct versions of software and daemons. All of these moving parts can also make it difficult to ensure that deployments to production environments happen consistently and correctly.

To simplify development and deployments we started using Docker and Docker Compose, and they’ve made our lives so much easier.

Docker Basics

In case this is the first you’re hearing of Docker, I’ll give you an overview of how it works.

Docker allows you to quickly and easily set up complex environments in a consistent, reproducible manner. It gives your software a small, isolated environment called a container which is completely independent of the other software running on your computer or in other containers.

These containers can then be configured to run any operating system and install any necessary software, allowing you to easily configure application-specific needs without affecting other software that may have conflicting needs. It’s similar to a virtual machine in many respects (for those of you more familiar with that technology) but uses far fewer system resources to run.
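For instance, you can try out two different Python versions side by side without installing either one on the host machine:

```
# Each container gets its own isolated Python, untouched by the host
docker run --rm python:2.7 python --version   # prints Python 2.7.x
docker run --rm python:3.6 python --version   # prints Python 3.6.x
```

The `--rm` flag simply removes each container once its command exits.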

Each Docker container is an executing instance of a Docker image — a pre-built definition with all the software and configuration built in so it runs exactly the same everywhere. The image itself is built from a text file called a Dockerfile that contains a simple list of commands and can be consistently rebuilt whenever necessary.

That means that once a Dockerfile is created, any user can quickly build an image from it and run a container that’s pre-configured with the correct software. You can also pre-build and distribute the image to save even more time.
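As a sketch, the Dockerfile for a simple Python web service might look like this (the base image is real, while the file names and app.py entry point are illustrative):

```
# Start from an official base image with the exact Python version we need
FROM python:3.6

# Install the project's pinned dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy in the application code and set the container's default command
COPY . .
CMD ["python", "app.py"]
```

From there, `docker build -t myproject .` produces the image, and `docker run myproject` starts a container from it.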

As you probably know, most web applications and APIs consist of more than just one application. They make use of databases, caches, and other web services running on other servers. Docker Compose, a tool distributed alongside Docker that comes pre-installed with Docker for Mac and Docker for Windows, lets you define a group of containers that all start up together and run on a private network. That way, you can simulate the entire production stack in a local, isolated, automatically configured environment.
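For example, a minimal docker-compose.yml for a web app, its database, and a cache might look like this (the service names and versions are illustrative):

```
version: "3"
services:
  web:
    build: .            # build the app image from the project's Dockerfile
    ports:
      - "8000:8000"     # expose the app on the host
    depends_on:
      - db
      - cache
  db:
    image: postgres:9.6 # pin the same version used in production
  cache:
    image: redis:3.2
```

A single `docker-compose up` then builds whatever needs building and starts the whole stack.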

Developing with Docker

When you’re working in a traditional development environment for a web platform, a developer has to configure a number of things and keep them all in sync as the platform evolves. First, the correct environment (such as a specific Python, Ruby, or NodeJS version) has to be installed. This alone can be time consuming, error prone, and just a lot to handle for someone new to the ecosystem — especially when you add in tools for segregating environments and libraries for multiple projects to exist on the same machine.

Next, the developer needs to ensure that they have the correct versions of external services such as databases and caches. If you’re working on multiple projects, you could need several versions of each, and you have to keep them up to date with whatever you use in production. Each project could also need any number of development libraries and system tools to properly build and run. Without something helping you out, the whole thing can be extremely difficult and time consuming.

That’s where Docker comes in to save the day.

Using Docker, we only have to configure everything once and then distribute it as a Docker image. Whenever a developer works on a project, they pull down the latest image or rebuild it from the project’s Dockerfile. This automates the installation of the correct versions of all software dependencies so that we can be sure everything is correct and up to date.
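Either path, pulling or rebuilding, is a single command; assuming a hypothetical registry and image name:

```
# Pull the latest pre-built image from the team's registry...
docker pull registry.example.com/myproject:latest

# ...or rebuild it locally from the project's Dockerfile
docker build -t registry.example.com/myproject:latest .
```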

By using Docker Compose, we take it a step further, configuring images and containers for the correct database, cache, search engine, and even an Amazon S3 clone. We set up these Compose stacks so that when started, all services come up, a local copy of the current code is mounted into the development container (so the developer can keep using the native code editors and tools of their primary OS), and the developer is dropped at a command line with the ability to install any release-specific dependencies and run the development code.
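Here’s a sketch of what such a development stack can look like; the service versions, the minio S3 clone, and the /app mount point are all illustrative choices:

```
version: "3"
services:
  app:
    build: .
    volumes:
      - .:/app           # mount the local checkout so host editors see changes
    stdin_open: true     # keep STDIN open and allocate a terminal...
    tty: true
    command: bash        # ...so the developer lands at a shell
  db:
    image: postgres:9.6
  search:
    image: elasticsearch:5.6
  s3:
    image: minio/minio   # local Amazon S3 clone
    command: server /data
```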

At the end of the day, this saves us hours of work configuring systems and debugging configuration problems. Setting up an environment for the first time takes minutes (or even seconds) instead of hours.

As an added bonus, bugs throughout the lifetime of the project are reduced because development is happening using the same libraries, base OS, and external services used to run the production systems. Updates to any of those services and dependencies can also easily be tested locally so you’re sure they’re working as expected.

Testing with Docker

After development comes testing, where Docker continues to be the go-to tool to make the process simpler and more accurate.

All of the same challenges with development still exist for testing and can even be compounded when not all testers come from a development or highly technical background. Docker gives us a couple of different options for managing this complexity.

A developer can provide a Docker image with the correct version of the code already baked in, so the tester doesn’t need to interact with version control or verify that they have the correct version of the code locally. Docker Compose can then run a container from that image alongside containers for the external services; the tester simply starts the Compose stack, and everything is right there.

If we do need to test specific code from version control, the tester can still clone the git repository and mount it into the running container, just as in the development process. The difference is that rather than dropping to a command line, the container with the code under test automatically installs the current dependencies, seeds the database with basic test data, and starts the service.
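One way to wire that up is a testing variant of the Compose file whose app service bootstraps itself instead of waiting at a shell; the seed_data.py and app.py scripts here are hypothetical stand-ins for the real project:

```
version: "3"
services:
  app:
    build: .
    volumes:
      - .:/app          # the tester's local clone of the code under test
    # install current dependencies, seed test data, then run the service
    command: sh -c "pip install -r requirements.txt && python seed_data.py && python app.py"
  db:
    image: postgres:9.6
```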

To make it even simpler, we’ve been using a tool called Lifeboat, a simple GUI for Docker Compose, alongside GitHub Desktop. Together they let testers easily start and stop Compose stacks without needing to navigate between projects or manage Docker and git on the command line.

Deployments with Docker

It’s finally time to deploy, and Docker continues to make things easier.

In a traditional server deployment, dependencies have to be updated manually on each server at deploy time. For most modern web platforms built with languages such as Python or Ruby, this means reaching out to external servers to download those dependencies and, in some cases, compiling them each time. Every time that happens, there’s another chance for a failure, especially since the server operating system might differ from the one where development or testing took place.

When using Docker in staging and production environments, those risks are reduced or removed. We first build an image and deploy it to staging to make sure that it builds, deploys, and functions as expected. That exact same image, which already has all of the dependencies installed and configured, is then deployed to production, removing the earlier opportunities for failure during deployment.
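The flow looks roughly like this, with a hypothetical registry and a git SHA as the image tag:

```
# Build once and tag with something immutable, such as the git SHA
docker build -t registry.example.com/myproject:abc1234 .
docker push registry.example.com/myproject:abc1234

# Deploy that tag to staging and verify it, then promote the exact
# same image to production; nothing is rebuilt along the way
docker pull registry.example.com/myproject:abc1234
docker run -d -p 80:8000 registry.example.com/myproject:abc1234
```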

From development to deployment, Docker makes the life of a developer so much easier and less frustrating. I highly recommend you give it a try.
