Dick Hardt’s Docker container. Photo by Cory Doctorow.

I spent 3 years using Docker for local development

Daniel Wilby

--

10 years ago I started my programming journey by making Minecraft mods on my parents’ PC. As I built on those blocks and moved to bigger and better projects, I realized there had to be a better way to keep a handle on the gargantuan amount of tooling involved in development of any kind.

Enter Docker. Docker is an application/platform that lets you run your applications inside containerized environments. This helps you deploy an application without worrying about reconfiguring it each time you launch it.

While predominantly used in production settings, you can use it to help streamline local development by keeping all your tooling inside a nice little containerized box. I’ll show you an example of when I used Docker to compile Veloren, an open source voxel game made in Rust. I’ll then explain some other ways of using Docker and some reasons why you should or shouldn’t use it.

My setup

For each new development environment I wanted, I had a few steps to follow:

  • Create the Dockerfile and build the image — if I needed to fine-tune any aspects of the build environment, i.e. the dev packages, etc., I would do it here. Here’s a link to the specific Dockerfiles I used for different projects.
  • Copy my “run” script into the project folder — the run script contained the following commands, with the relevant arguments changed after copying:
$ sudo docker run -it \
--user "$(id -u):$(id -g)" \
--name rust-build-test \
--mount type=bind,source="$(pwd)",target=/app \
--workdir /app \
rust/vulkan /bin/bash

$ sudo docker rm rust-build-test # cleans up the container after finishing
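The image referenced here can be anything you’ve built locally. As a rough sketch of what something like rust/vulkan might contain (the real Dockerfiles are in the link above; the base image and package choices below are my own assumptions), you start from the official Rust image and layer in the native build dependencies:

```dockerfile
# Hypothetical sketch of a rust/vulkan-style build image; the package
# list is illustrative, not the author’s actual Dockerfile.
FROM rust:latest

# Native dependencies a Vulkan-based Rust project may need at build time.
RUN apt-get update && apt-get install -y --no-install-recommends \
        libvulkan-dev \
        pkg-config \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
```

Building it once with sudo docker build -t rust/vulkan . makes the tag available to the run script above.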

The flags

These are what really gave my system its power. I could simply execute my run script and build whatever project I wanted. A brief explanation of the flags follows:

  • -it is actually two flags: -i keeps stdin open, and -t allocates a pseudo-TTY. (Together, -it is often called the interactive terminal flag.)
  • --user "$(id -u):$(id -g)" runs the container with your user’s permissions. If you don’t include this, files generated by the container may need root to be accessed after you quit the container.
  • --name rust-build-test just gives the container a name that’s easier for us to deal with.
  • --mount type=bind,source="$(pwd)",target=/app binds the directory you run the script from to a folder called /app inside the container.
  • --workdir /app replaces the container’s default working directory; without this, you start in the root folder.
  • rust/vulkan /bin/bash is the image we are running, followed by the command you want to run in the container. I usually like to start by opening up bash.

Obviously there are tons of flags you could set, but these are the ones I found essential. Docker’s documentation is actually pretty good, too — here’s their page on the docker run command.
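Since the only things that change between projects are the image and the container name, the copy-and-edit step can be folded into one small function. This is just a sketch (the name make_run_cmd is my own): it assembles and prints the docker run command as a dry run so the flags can be inspected, and you can pipe the output to sh to actually execute it.

```shell
# Sketch of a reusable run-script helper (the name make_run_cmd is
# hypothetical). It prints the assembled `docker run` command instead of
# executing it; pipe the output to `sh` to actually start the container.
make_run_cmd() {
    image="$1"   # e.g. rust/vulkan
    name="$2"    # e.g. rust-build-test
    printf 'docker run -it --user "%s:%s" --name %s --mount type=bind,source="%s",target=/app --workdir /app %s /bin/bash\n' \
        "$(id -u)" "$(id -g)" "$name" "$(pwd)" "$image"
}

make_run_cmd rust/vulkan rust-build-test
```

Dropping the printf in favor of calling docker directly turns it into a real launcher; the dry-run form just makes it easy to double-check the mount and user flags first.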

Why I liked it

  • Small projects or tools were easy to test out — a machine learning course I took would use Python, and a chaos theory course used Julia. I didn’t have to install either natively, and cleanly removed both from my system after I finished the courses.
  • Similar performance to native builds — the following are screenshot comparisons of building a program natively vs in a container.
Build performance not using Docker: compiling Veloren took 5 minutes and 44 seconds.
Build performance using Docker: compiling Veloren took 5 minutes and 16 seconds.

As you can see, there’s not much in it. And, I know, running a build command once does not make for a very strong benchmark. That said, after using Docker for as long as I have, I can tell you that the build times have never been noticeably longer. What’s more, I could even compile something in the container and run the binary locally on my machine, since Docker targets your platform’s architecture by default.

Why I didn’t like it

  • There is a thin added layer of overhead on every project — while the setup for a project is fairly easy, there’s a constant added to the start-up cost every time. I began to feel worn down after a while, especially when trying to quickly prototype or test.
  • While running GUI applications is possible, it does not work well — there is some nuance here that I’ll explain in the next section.

GUI applications

While intuitively it might seem as though running GUI applications in Docker wouldn’t work at all, you can pass your display in as an environment variable when running your container to make it work: the container’s applications then connect to your host’s Xorg server. This means that if you use Wayland as your default compositor, you can try installing Waypipe, or else you’re out of luck. Unfortunately, as far as I can tell, the Waypipe project is no longer active.
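Concretely, the X11 route looks something like the following. It’s a sketch with caveats: it assumes an Xorg session, and on some setups you also have to relax X access control with xhost (which has security implications). The command is stored and printed here as a dry run so it can be inspected first.

```shell
# Sketch of forwarding the host X11 display into a container (assumes
# an Xorg session). Stored and printed rather than executed; the single
# quotes keep $(id -u) etc. literal until you actually run it.
x11_run_cmd='docker run -it \
  --user "$(id -u):$(id -g)" \
  --env DISPLAY \
  --mount type=bind,source=/tmp/.X11-unix,target=/tmp/.X11-unix \
  rust/vulkan /bin/bash'

echo "$x11_run_cmd"
```

--env DISPLAY with no value passes the host’s DISPLAY variable through unchanged, and the bind mount hands the container the X11 socket that clients connect to.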

TL;DR

From the title you might have surmised that I no longer use Docker for my local development. For my use cases, I found that the cons outweighed the pros. In my eyes there are really only two cases for using Docker:

  • (1) If you work in an organization where configuring the development environment is difficult, or
  • (2) if you don’t mind the overhead and are up for constantly configuring it.

If you’re like me and just want to get stuff up and running, then don’t follow in my footsteps.

--

Daniel Wilby

Mathematics/Computer Science/UX Design — Student at the University of California, Santa Cruz.