How Docker can improve your development environments

Kim Desrosiers
Phytochemia Tech Blog
5 min read · Oct 4, 2017

Disclaimer: I want to clarify that I won’t cover any good practices for Docker in production. I will talk about improving development environments using Docker.

Development environments are not always easy to set up, especially for complex projects that use microservices or external dependencies. For example, with even a trivial backend project, you have a good chance of having to set up a local database, and maybe a local message broker, for your development needs.

For the purposes of this post, we will define a little project that has some dependencies. The project, named “WishlistManager”, uses Python and Flask, with PostgreSQL as the database and Redis as the message broker. Here we go, we have our little project to set up locally.

Yeah, we can do it!
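For concreteness, here is a minimal sketch of what the app’s entry point could look like. The file name app.py and the placeholder route are my own assumptions for illustration, not something prescribed by the project:

  # app.py — minimal Flask entry point (hypothetical sketch)
  from flask import Flask

  app = Flask(__name__)

  @app.route("/")
  def index():
      # Placeholder endpoint; the real WishlistManager logic is out of scope here.
      return "WishlistManager is running!"

  if __name__ == "__main__":
      # Bind to 0.0.0.0 so the app is reachable from outside a container.
      app.run(host="0.0.0.0", port=5000)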

Without Docker

Here is what we need to do if we choose not to use Docker:

  1. Install PostgreSQL, Redis and Python manually or with a script (sketched below). Depending on the OS, this can be tricky.
  2. Configure each dependency. For example, we need to create a PostgreSQL database user.
  3. Start the dependency daemons.
  4. Begin to code!
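On a Debian/Ubuntu machine, steps 1 to 3 might look roughly like this; the package names and the wishlist user/database are assumptions for illustration, and they will differ on other distros:

  # Hypothetical manual setup (Debian/Ubuntu; package names vary by distro)
  sudo apt-get install postgresql redis-server python3 python3-venv

  # Create a database user and a database for the project (placeholder names)
  sudo -u postgres createuser --pwprompt wishlist
  sudo -u postgres createdb --owner wishlist wishlist

  # Make sure the daemons are running
  sudo service postgresql start
  sudo service redis-server start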

Not so bad after all. However, there are some drawbacks to this setup.

  • All services are installed system-wide. So, if I need to upgrade PostgreSQL for one project, I can break another one.
  • It can be a mess to document your development setup, since you need to handle different OSes and package managers. For example, configuration file locations change from one OS, or even one distro, to another.
  • It is time-consuming. Given the two previous points, if you have 15 developers, each of them will have to repeat all these steps to set up their environment.
  • You can’t ensure that everybody has exactly the same version of each dependency. One developer might install PostgreSQL 9.4 instead of 9.5 by mistake, and you only realize it when they commit something that is no longer supported.

But I want to keep my project setup simple and quick even as the project gets bigger. This is where Docker helps us.

With Docker

In order to simplify and isolate our development environment, we will use Docker. You will, of course, need to know the basics of Docker; the official Docker documentation is a good starting point to learn the terminology and the fundamentals.

Here are some quick definitions of important terms related to Docker to help you understand the next part:

Image — A binary file that includes everything needed to run a Docker container, including metadata describing capabilities and dependencies.

Container — A runtime instance of a Docker image. Multiple containers can run concurrently using the same Docker image.

Dockerfile — A text document that contains the commands needed to build a Docker image. An image is produced from a Dockerfile using the docker build command.

Compose — A tool for defining and running multi-container applications. The containers are all defined and configured from data in a single file. All the containers can be started or stopped with a single command.

To be honest with you, the Docker way requires more work and time the first time, in order to install and set up the Docker pieces, but this is for a good reason.

Here’s what you need to do to use Docker:

First, install Docker, of course, and Docker Compose. Note: setup on Windows and Mac is a little different, but Docker for Mac and Docker for Windows are there to smooth that over.

Then, create a file named docker-compose.yml. By the way, I always use Docker Compose to simplify my multi-container setups, and I encourage you to do so to avoid unnecessary complexity. I will explain how to use it.

Example of a docker-compose.yml
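A minimal sketch of such a file, assuming the Flask app above; the service names, pinned image tags and wishlist credentials are placeholder assumptions:

  version: "3"
  services:
    web:
      build: .                       # built from the Dockerfile below
      ports:
        - "5000:5000"                # expose Flask on localhost:5000
      environment:
        - DATABASE_URL=postgresql://wishlist:wishlist@postgres:5432/wishlist
        - REDIS_URL=redis://redis:6379/0
      depends_on:
        - postgres
        - redis
    postgres:
      image: postgres:9.5            # pin the version so everybody gets the same one
      environment:
        - POSTGRES_USER=wishlist
        - POSTGRES_PASSWORD=wishlist
        - POSTGRES_DB=wishlist
      ports:
        - "5432:5432"                # also reachable from the host
      volumes:
        - pgdata:/var/lib/postgresql/data
    redis:
      image: redis:3.2               # Redis as the message broker
      ports:
        - "6379:6379"
  volumes:
    pgdata: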

Finally, create a Dockerfile (if needed). Why “if needed”? Some of my basic projects, especially Python ones, don’t really need a Dockerfile, since I use Python virtualenvs to isolate my Python setup. However, it is always possible to replace a virtualenv with a Docker container.

Example of a Dockerfile
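Again, a sketch of what such a Dockerfile could contain for the Flask app; the base image tag and the requirements.txt file (listing Flask) are assumptions:

  # Hypothetical Dockerfile for the Flask app
  FROM python:3.6

  WORKDIR /app

  # Install dependencies first so Docker can cache this layer
  COPY requirements.txt .
  RUN pip install -r requirements.txt

  # Copy the application code
  COPY . .

  EXPOSE 5000
  CMD ["python", "app.py"]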

What is actually happening? Basically, Docker Compose simplifies the definition of our services. In one file, we can define, for each service, which Docker image to use, its port mappings, volume mappings, environment variables and more. The Dockerfile lets us build a custom Docker image, containing our application, on top of an existing one.

From there, if we want to create and start the containers, we just need to run the following command:

docker-compose up

Then, according to our example, you should be able to access your app at localhost:5000. PostgreSQL and Redis are also reachable directly from your system, without having to enter the Docker containers, since we mapped their ports to local ones.
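For instance, you can check both services from the host with their standard clients (the user and database names are the placeholders from the Compose sketch above):

  psql -h localhost -p 5432 -U wishlist wishlist
  redis-cli -h localhost -p 6379 ping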

Note: given that Docker creates a virtual local network for container communication, your Python container will be able to reach PostgreSQL by using its service name as the hostname. For example, in my app config, the database address will be set to “postgres”.
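Concretely, assuming Flask-SQLAlchemy and the placeholder credentials from the Compose sketch, the config could look like this:

  # config.py — "postgres" and "redis" resolve inside the Compose network
  SQLALCHEMY_DATABASE_URI = "postgresql://wishlist:wishlist@postgres:5432/wishlist"
  REDIS_URL = "redis://redis:6379/0"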

If we want to stop and remove our containers, we run the following command:

docker-compose down

Furthermore, Docker Compose offers many other features that I didn’t cover in this blog post. Once again, go read its documentation; it’s worth it.

So we have our Docker version. Although we have to learn the basics of Docker, and it requires a little more scaffolding, I think this setup has some advantages over the previous one.

  • Quick setup. To get somebody up and running quickly, they just clone the repository, install Docker and Docker Compose, then run docker-compose up.
  • A consistent development environment. It does not matter which OS you use; everybody gets the same libraries, language runtime, etc.
  • Isolation. I could run 20 different PostgreSQL versions side by side and everything would be fine (my CPU and my RAM would be angry, but I could).
  • It makes it easy to plug in automated test tools such as continuous integration (CI) services, as shown in the sketch after this list.
  • You have only one setup to document.
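As a rough illustration of the CI point, a build job could reuse the exact same Compose file; the pytest command is a placeholder for whatever test runner you use:

  # Hypothetical CI steps reusing the same Compose file
  docker-compose up -d                           # start the whole stack in the background
  docker-compose exec -T web python -m pytest    # placeholder test command
  docker-compose down                            # tear everything down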

In Conclusion

As we have seen above, the setup without Docker isn’t too bad, but it has some drawbacks to take into consideration. On the other hand, using Docker means we need to learn the basics and accept a little more scaffolding, but it gives us a more robust environment. Once you are comfortable using Docker, you may even be tempted to use it in production.

I encourage you to learn Docker’s more advanced features so you can get the most out of it.
