Using Docker in Development
There are plenty of resources on Dockerizing applications for production, but it’s much harder to find good resources on using Docker in development in an effective and frictionless way (i.e. not having to constantly `docker build`).
In this article I will:
- Describe some of the benefits of using Docker in development.
- Explain the overall architecture.
- Showcase some specific tools that can make the experience smoother.
Reasons to use Docker in development.
Zero-effort Onboarding
Onboarding a new team member with a new computer should be as simple as a `git clone …` and some form of a `docker run …` command. Then new teammates can download/set up whatever tools they want to edit their code, and if that tooling requires updating to a new version of Python/NodeJS/Ruby for some plugin, it will not affect their dev environment.
Consistent dev environment.
Your whole team will be on the same “correct” version of `node`/`ruby`/etc. In the case of `node` development, you can make sure everyone has the same version of `npm` installed to prevent pesky `package-lock.json` diffs too.
And potentially much more!
- At Priceline, we use a private `npm` registry to share modules internally. Our Docker dev image is set up to already have access to it, so that’s one fewer step to getting set up.
- We also use self-signed certs for `https`. We put those in a Docker volume and have it shared in a consistent location across all our apps, so if we spin up a new app we can tell `webpack-dev-server` exactly where that is without enforcing any specific directory structure on the host laptop.
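To make that concrete, a docker-compose fragment along these lines could mount such a shared cert volume (compose is covered later in this article; the volume name `dev-certs` and the `/certs` path here are made-up examples, not our actual setup):

```yaml
# Hypothetical: one externally created volume holding the self-signed certs,
# mounted read-only at the same path in every app's container.
services:
  app:
    volumes:
      - dev-certs:/certs:ro   # point webpack-dev-server at /certs
volumes:
  dev-certs:
    external: true            # created once via `docker volume create dev-certs`
```

Because the volume is marked `external`, every app's compose file refers to the same certs without copying them around.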
The Architecture
- Use a container to wrap a consistent environment.
- Rely on the host machine to provide the code to execute. That is to say, the development image should ideally not be built with any application-specific code, as changes to the application code should not require another `docker build`.
- Make heavy use of volumes to store persistent data (such as `node_modules`), and map your application’s code from your host machine with intelligently chosen configurations (`cached`, `ro`, `delegated`) so things run fast!

Don’t do what every other blog post I could find with this title says to do and require a `docker build` every time you make changes or add a node module! Trust your host machine to bring the application code; use Docker as a wrapper for executing it.
Node Specific Example
What follows here is a NodeJS-specific implementation of the general architecture described above.
Prerequisites
- Docker: https://www.docker.com/get-started
- That’s it!
- Seriously, that is the whole point of doing this.
Use a container to wrap a consistent environment.
(Optional) Make your own Docker image
Here we make a `Dockerfile` to create a Docker image that has the exact versions of `node` and `npm` that we want. Your app may have other external dependencies, like `imagemagick`, which you would also want to install in this step.
Let’s say we have a hypothetical app that wants `node` `8.14.0` and `npm` `5.6.0`. Then we will want to create a file called `Dockerfile` with the content below:

```dockerfile
FROM node:8.14.0
RUN npm install -g npm@5.6.0
# RUN npm config set registry https://npm.fury.io/USERNAME/ # set up private NPM registry
# do any other setup steps here: installing imagemagick, etc.

# FORCE_COLOR=1 lets npm log colored output
ENV FORCE_COLOR=1

WORKDIR /app
```
You should then run `docker build -t dev-node .` against this `Dockerfile` to create your personal development image.
If you're starting a totally new project, you could potentially skip this step entirely if you wanted to just use `FROM node:10` with whatever those defaults are. In which case, feel free to have no `Dockerfile` at all, and just use `node:10` in place of `dev-node` in the later steps.
Check https://hub.docker.com/ first to see if there’s a well-maintained public image that works for your use case.
Running the container
Now inside your source directory on your host machine, we do a few things.
- We want `node_modules` to be persisted in a Docker volume that is optimized for our container to write.
- We want everything else in the source directory to be persisted on the host machine, mounted to the `/app` folder, and optimized for our container to read.
- I’m also making the assumption that our node app exposes port `3000` (if you're using a different port, just substitute it below).
In our hypothetical directory, we first want to install our `node_modules` inside of our container:

```shell
docker run --rm -v MYAPP_modules:/app/node_modules -v $(pwd):/app:cached dev-node npm install
```
Now we are ready to run our app! Assuming we have a `start` script defined in our `package.json`, we simply:

```shell
docker run --rm -it --name myapp -v MYAPP_modules:/app/node_modules -v $(pwd):/app:cached -p 3000:3000 dev-node npm start
```
Breaking this down:
- `docker run` - This creates a new container and runs the appropriate Docker image in it.
- `--rm` - This tells Docker to destroy the container upon exit.
- `--name myapp` - Names the container "myapp" (useful for other commands).
- `-v MYAPP_modules:/app/node_modules` - Creates a Docker volume mounted at `/app/node_modules`, so any files the container writes to `/app/node_modules` will be stored in a volume named `MYAPP_modules`. This lets us remove and re-create containers without needing to `npm install` each time.
- `-v $(pwd):/app:cached` - This mounts the `pwd` (present working directory) to the `/app` directory in the container using a `cached` configuration (see Docker's documentation on bind-mount consistency for more info on `cached`).
- `-p 3000:3000` - This tells Docker to forward the container's port 3000 to the host machine's port 3000, so the app is accessible at `localhost:3000`.
- `-it` - Short for `--interactive` and `--tty`. Basically enables an interactive terminal in Docker so that you can send signals/STDIN to the container (e.g. the host typing `^C` will pass through to the container).
- `dev-node` - This is the Docker image we want to run.
- `npm start` - This is the command we want to run in the container (we could replace it with anything and see it get run in the container). It will be run at the `WORKDIR` specified in the `Dockerfile` (in our case `/app`).
So as you see, we pretty much have a `docker run ...` followed by the command we would normally run locally. We can continue this pattern for basically anything: `docker run ... npm test`, etc.
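If retyping those flags gets tedious, a small shell function can wrap them (a sketch; `drun` is a made-up name, and the volume/image names assume the hypothetical app above):

```shell
# drun: wrap the boilerplate docker run flags so only the command varies.
# Assumes the MYAPP_modules volume and dev-node image from the examples above.
drun() {
  docker run --rm -it \
    --name myapp \
    -v MYAPP_modules:/app/node_modules \
    -v "$(pwd)":/app:cached \
    -p 3000:3000 \
    dev-node "$@"
}
# usage: drun npm start, drun npm test, drun bash, ...
```

Dropped in a `.bashrc` or `.zshrc`, this keeps the one-liner feel without memorizing the flags.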
That’s it!!!
Now you don’t need to make sure your whole team is always on the right version of `npm` and `node`, because `docker run` wraps everything. But this is still not completely ideal. What if someone mistypes one of those `-v ______` statements?
Well, that’s where better tooling comes in handy!
Making it easier
Using docker-compose.yml
The first thing that can be done to make this easier is to make a base `docker-compose.yml` for your application. This can function as a configuration file for the volume and port settings. For our example above, this `docker-compose.yml` probably makes the most sense:
```yaml
version: '3'
services:
  app:
    image: dev-node
    working_dir: /app
    volumes:
      - .:/app:cached
      - node_modules-vol:/app/node_modules
    command: npm start
    ports:
      - "3000:3000"
volumes:
  node_modules-vol:
```
Okay, so how does this help us?
Well, now instead of having to type `docker run --rm -it --name myapp -v MYAPP_modules:/app/node_modules -v $(pwd):/app:cached -p 3000:3000 dev-node npm start`, we should be able to type `docker-compose up` (since that’s what we set our `command:` to) for the same effect.
If you want to `npm install`, then instead of

```shell
docker run --rm -v MYAPP_modules:/app/node_modules -v $(pwd):/app:cached dev-node npm install
```

we can do a simple `docker-compose run --rm app npm install`. And so forth: `npm test` would be `docker-compose run --rm app npm test`.
We can do better!
This is already much nicer, but let’s go even further with `docker-dev-tools`.

```shell
npm i -g docker-dev-tools
```
This adds a few powerful aliases to your terminal.
Now, instead of `docker-compose run --rm app npm test`, that boilerplate `docker-compose run --rm app npm` can be replaced with three letters: `dpm`.
```shell
dpm install
dpm test
dpm start # this is a special command that runs docker-compose up
```
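If you’d rather not install another global tool, the `dpm` shorthand can be roughly approximated with a plain shell function (a sketch of the behavior described above; `docker-dev-tools` itself may differ):

```shell
# dpm: run npm inside the compose "app" service; "start" maps to docker-compose up.
dpm() {
  if [ "$1" = "start" ]; then
    docker-compose up
  else
    docker-compose run --rm app npm "$@"
  fi
}
```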
If you want to do more than just run `npm`, `dssh` creates an “ssh”-like session inside your container.
I plan to follow up this blog post with some more details specifically on `docker-dev-tools` and its many uses.