Docker Orchestra in a Symphony of Modules
Let me compose you a container…
Introduction
Hey that’s not the Docker logo, that’s Wailord from the Pokémon™ series.
Docker is basically a container for apps (hence the whale-carrying-cargo logo) that makes developers’ jobs easier. It’s like a pseudo-VM made exclusively for apps, so developers don’t have to worry about the dependencies that need to be installed on the server. Why? Docker magic: all the dependencies have already been installed in the container, or image.
When talking about Docker, we should really talk about containers, since Docker is just the brand name. A container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another.
I used the word “pseudo-VM” because virtual machines and Docker containers both isolate software, but they differ in nature; below is a comparison.
Containers:
- Abstraction at app layer that “contains” the app and its dependencies together
- Multiple containers can run on the same machine and share the OS kernel
- Much smaller than VMs (usually tens of MBs)
Virtual Machines:
- Abstraction of physical hardware
- The hypervisor allows multiple VMs to run on a single machine
- Significantly larger in size (tens of GBs)
Why bother?
Here are the benefits of using Docker:
- Standard: Docker created the industry standard for containers, so they could be portable anywhere
- Lightweight: Containers share the machine’s OS kernel and therefore do not require an OS per application, driving higher server efficiency and reducing server and licensing costs
- Secure: Applications are safer in containers and Docker provides the strongest default isolation capabilities in the industry
Using Docker
In my Software Development Project course, I used Docker to contain my NodeJS app. It really helps to manage the dependencies in the notorious /node_modules/ folder. Here’s how I did it:
- Create a Dockerfile in the root directory of the app
- Inside, the file should look like this:
FROM node:12 = the base image you want to use (I use node because the app is NodeJS)
WORKDIR /usr/src/api = the designated working directory of the app inside the image
COPY package*.json /usr/src/api/ = copies the dependency *.json files
RUN npm i = installs the dependencies of the app
EXPOSE 9000 = exposes the app on port 9000
CMD ["/bin/bash", "start.sh"] = runs the argument as a CLI command (you can put a direct command like ["node", "server.js"] to run the app; here I used a start.sh script that runs all my database migrations and then the app)
- You might also want to make a .dockerignore file and fill it with your dependencies folder (node_modules/ in NodeJS, for example) so you don’t push it into your image, along with any other files you don’t want in the image, like .env
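For a typical NodeJS project, the .dockerignore might look something like this (the exact entries depend on your project):

```
node_modules/
.env
npm-debug.log
.git/
```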
As for the Docker architecture of our app, we use a docker-compose.yml file. docker-compose.yml is generally used for wrapping more than one container; here, we use it to get a postgres container to run our database alongside the app. The keys in it are:
- services: declares which containers we use, in our case the app api and the database postgres
- image: the image used by that service
- ports: the ports that the container is bound to
- depends_on: declares that a service depends on another service
- environment: the environment variables passed to the container
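The compose file itself didn’t survive in this post, so here’s a minimal sketch of what a setup like the one described could look like; the service names, port, and credentials are illustrative assumptions, not the project’s real values:

```yaml
version: "3"
services:
  api:
    build: .                 # build from the Dockerfile above
    ports:
      - "9000:9000"          # bind container port 9000 to the host
    depends_on:
      - postgres             # start the database container first
    environment:
      DATABASE_URL: postgres://user:secret@postgres:5432/app
  postgres:
    image: postgres:12       # official postgres image
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: app
```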
In our CI/CD file, we use the postgres image for testing our database.
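As an illustration (our actual CI config isn’t shown here), a GitLab-CI-style job that spins up the postgres image as a test service could look like this; the job name, image versions, and variables are assumptions:

```yaml
test:
  image: node:12
  services:
    - postgres:12            # database container started next to the job
  variables:
    POSTGRES_USER: user
    POSTGRES_PASSWORD: secret
    POSTGRES_DB: app_test
  script:
    - npm ci
    - npm test               # tests reach the database at host "postgres"
```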
That’s basically it! But in order to actually ship the container, you’ll need to build the image and push it to hub.docker.com; more on that here.
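The build-and-push step boils down to a few standard Docker CLI commands; the image name below is a placeholder for your own Docker Hub username and repository:

```shell
# Build the image from the Dockerfile in the current directory
docker build -t yourusername/your-api:latest .

# Log in to Docker Hub, then push the tagged image
docker login
docker push yourusername/your-api:latest
```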
Conclusion
Docker is an exceptionally useful tool for developers to pack up their apps and ship them to servers without worrying about the apps’ dependencies. In reality, Docker is used for all kinds of things, not just deploying a web app like I did. So, let’s orchestrate our apps with Docker!