Introduction to Docker

Most developers working in a team are familiar with the dilemma where someone complains, 'it worked on my machine, but it's not working on your computer/server'. Sometimes you end up setting up infrastructure or preparing the runtime environment endlessly rather than doing productive work on the application itself.

‘Build, Ship, and Run Any App, Anywhere’

Here comes Docker, addressing this problem with the slogan above. Docker is simply a container solution and is increasingly popular among developers. Docker lets you create, deploy, and run applications by bundling everything the application needs into a Docker container. These containers can run on any device with Docker installed, and the container provides exactly the same environment across all of them.

Containers vs Virtual Machines

We might get the feeling that containers and virtual machines answer the same question. That's sort of correct; however, here's a comparison.

Virtual Machine vs Docker Container

From the diagram above we can clearly see the difference. A virtual machine provides an embedded guest OS for each application. A complete guest OS for an individual application is a waste of resources.

Container solutions instead share the same host OS. Isolation between the OS and the applications is handled by the container solution, as shown in the Docker diagram. Each application accesses the OS resources it needs from the host OS through the Docker Engine, somewhat like making API calls. This clean separation from the OS is the job of the Docker Engine.

Container solutions like Docker provide OS-level virtualization, compared to hypervisor-based virtual machines. A hypervisor adds significant overhead to the application. Simply comparing the boot-up times of a virtual machine and a Docker container, both running Ubuntu, shows how large that overhead is in VMs.
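As a rough, unscientific illustration of that comparison (assuming Docker is installed and you have network access to pull the image), you can time a full container life cycle:

```shell
# Pull the image first so download time doesn't skew the measurement
docker pull ubuntu

# Time a complete start -> command -> exit cycle of a container;
# this typically finishes in around a second, versus the tens of
# seconds a full Ubuntu VM needs to boot
time docker run --rm ubuntu echo "hello from a container"
```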

Let’s get started

The Docker homepage provides an easy installation process for many different platforms such as Windows, macOS, Linux, and even cloud services.

However, for the ease of this tutorial I will be using DigitalOcean for the demonstration. Nevertheless, the rest of the tutorial will work on any device with Docker installed.

I have selected a pre-installed Docker environment on Ubuntu.

To test whether Docker is running, we can simply run docker -v, which prints the installed Docker version. There are various ways to check a Docker deployment; we will use the tutum/hello-world container to test, and we can find this container on DockerHub.
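A quick sanity check might look like this (the exact version string will vary with your installation):

```shell
# Confirm the Docker CLI is installed
docker -v

# Confirm the Docker daemon itself is reachable;
# this errors out if the daemon isn't running
docker info
```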

What’s DockerHub?

DockerHub is a repository for public Docker images; simply put, it's like GitHub but for Docker images. Each Docker image has a Dockerfile, which contains the instructions telling Docker how to build the container. We will look at the Dockerfile later on.
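For example (assuming network access to DockerHub), you can search for and download an image straight from the command line:

```shell
# List public images matching a keyword
docker search ubuntu

# Download the official ubuntu image to the local machine
docker pull ubuntu
```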

We will be using this pre-configured Docker image to get a clear understanding of Docker. We will look into how to create a new Docker image later.

Here we make use of an Ubuntu VM with Docker. We will pull a container image from DockerHub that runs a simple hello-world web server.

docker run tutum/hello-world

We can see which containers are currently running with docker ps. Here we can see that our hello-world container is deployed.

However, nothing shows up when we access our IP address. That's because we haven't exposed our application to the public. We need to publish the needed port when running the container. In this scenario we are exposing port 8080.
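The port mapping can be sketched like this (the tutum/hello-world image serves on port 80 inside the container, so we map host port 8080 onto it):

```shell
# -d runs the container in the background (detached);
# -p 8080:80 maps host port 8080 to container port 80
docker run -d -p 8080:80 tutum/hello-world
```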

The output is then visible in a web browser.

docker images              -> List all local images
docker run ImageName       -> Create and start a new container from an image
docker start ContainerName -> Start an existing (stopped) container again
docker stop ContainerName  -> Stop a running container
docker ps                  -> List all running containers (add -a to include stopped ones)
docker rm ContainerName    -> Remove a container (docker rmi removes an image)

Create Docker Image

A Docker image is simply a file containing a snapshot of a container. Images are created with the build command. A Docker container, in turn, is simply a running instance of a Docker image; a container is created from an image using the run command.
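In command form, the image/container relationship looks like this (myapp is a hypothetical tag; the build assumes a Dockerfile in the current directory):

```shell
# Build an image named "myapp" from the Dockerfile in the current directory
docker build -t myapp .

# Each run creates a fresh container instance from that image
docker run myapp
```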

Docker itself is written in the Go language, but we don't need Go to build images: the procedure for making a Docker image is stored in a Dockerfile. This is the most important (and only) file needed to set up a Docker image. Take a look at the official NodeJS Dockerfile:

FROM node:boron

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
COPY package.json .
# For npm@5 or later, copy package-lock.json as well
# COPY package.json package-lock.json ./

RUN npm install

# Bundle app source
COPY . .

EXPOSE 8080
CMD [ "npm", "start" ]

Here the FROM instruction imports the base NodeJS image into the container. This can be changed as required: FROM ubuntu, FROM debian:jessie, FROM microsoft/nanoserver, or any other base image for the container.

Building on that base image, the Dockerfile provides instructions on how to set up the environment: changing the working directory with WORKDIR, copying necessary files with COPY, and running required scripts with RUN.

(imagelayers.io provides a cool way to visualize all the layers of a container from the contents of its Dockerfile.)

Building this Dockerfile on any device with Docker installed results in an identical container, regardless of differences between the host systems. Simply put, every member of the development team possesses exactly the same environment: just run docker build and come up with a running environment with no hassle. No time wasted on changing environment variables or dealing with the Linux/Windows OS dilemma.
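To try that with the Node Dockerfile above (my-node-app is a placeholder tag; the app behind this Dockerfile is assumed to listen on port 8080, matching its EXPOSE line):

```shell
# Build the image from the Dockerfile in the project directory
docker build -t my-node-app .

# Run it detached, publishing container port 8080 on host port 8080
docker run -d -p 8080:8080 my-node-app
```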

Another cool feature is integration with Git. Within the Dockerfile, a simple git pull or git clone results in a Docker container that carries the latest code changes, not just the environment. Simply install Git using RUN apt-get install git and then clone using RUN git clone repo.git.
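A hedged sketch of that idea (the repository URL is a placeholder; note that on Debian/Ubuntu base images apt-get update must run before the install):

```dockerfile
FROM ubuntu

# Install git, then pull the latest source at build time
RUN apt-get update && apt-get install -y git
RUN git clone https://example.com/your/repo.git /usr/src/app

WORKDIR /usr/src/app
```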

Another cool approach is continuous integration (automated builds): reacting to each GitHub update by building a fresh container with the new changes. You could try combining CI and version control with Docker as an exercise.

Docker Tools

Docker has produced many useful tools to manage and work with Docker images and containers. Docker for Windows works only with the Pro and Enterprise editions of Windows; for the Home edition, go with Docker Toolbox. There are various advanced tools for Docker-related tasks; Docker Compose is another cool tool for managing multi-container Docker solutions.
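As a small sketch of what Docker Compose manages (a hypothetical web app plus a Redis store; the service names and ports are illustrative):

```yaml
version: '3'
services:
  web:
    build: .          # build from the Dockerfile in this directory
    ports:
      - "8080:8080"   # host:container port mapping
  redis:
    image: redis      # pull a ready-made image from DockerHub
```

Running docker-compose up then starts both containers together with a single command.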


I hope this article was able to give you a brief idea of Docker and its place in a modern development workflow. For practice, try dockerizing your next application. Feel free to comment or point out anything!