Published in HackerNoon.com

Getting started with Dockerizing your Node.js Application

Photo by chuttersnap on Unsplash

Why use Shipping Containers?

Shipping containers revolutionized the transportation industry by standardizing how large quantities of goods are transported, whether by sea or by land. With these standard containers, we can ship many different things in one container, or large quantities of a single thing across multiple containers. The key features here are:

  • Handling: Shipping containers are standardized and hence can be handled in the same manner regardless of what’s in them or where they are.
  • Security: Shipping containers are independent and separate from each other, providing a barrier against external interference for the contents within.
  • Scalable: A shipping container can hold many different things, or large quantities of a single thing, and we can simply add more containers as the quantity of goods grows.

How does this work in Software Containerization?

Software containerization works in much the same way. I’ll get into a little more detail later, but we achieve all the key features above in the following manner:

  • Handling: These containers are standardized, so we can easily deploy them and scale them when required. The scaling and management of containers is handled by orchestration tools like Kubernetes or Docker Compose. Docker can also build and run containers consistently thanks to a standard definition file called a Dockerfile.
  • Security: These containers are immutable, so any change produces a new container, which makes for a secure setup: the code lives inside the container and is harder to access and manipulate from outside.
  • Scalable: These containers can easily be scaled up or down based on criteria we define. This is essentially what an orchestration tool like Kubernetes or Docker Compose does: it spins containers up or shuts them down and distributes the load across them.

How Does Containerization Work?

To explain this simply, I will use the shipping container example again. Each container has a manifest file that specifies its contents and how to load and unload it; the dock workers then load and unload the container following those instructions.

[Figure: The Docker process. Image from Postman]
  • Docker Image: Essentially an executable snapshot of your code and its environment, which runs inside a container when started. Because the image is a standard artifact, we get the same execution on multiple instances (laptops/servers).
  • Docker Container: A running instance of a Docker image; we can run many containers from a single image. The idea of a container is slightly complex, and I suggest you refer to the brilliant article by Preethi Kasireddy on freeCodeCamp.

Set up A Node.js Project

I’m going to use an Express application with a single API endpoint, which I will test in the browser. I won’t go into the details of the Express application itself, but you can read through the README if you need help setting up and starting it.

Set up Docker

We will first need to set up Docker. Use one of the links below.

Writing the Dockerfile

We will need to create a new file called Dockerfile to define how our image is built. Below is the full Dockerfile which we will use; I will explain what each line does.

  • ENV NODE_ENV production: Sets the NODE_ENV environment variable to production.
  • WORKDIR /usr/src/app: Sets the working directory inside the image that we will build.
  • COPY ["package.json", "./"]: When we build our Docker image, we start from a clean image containing nothing but the node:8-alpine base. As standard practice, we copy package.json in first and install the dependencies before copying in the code. Since every instruction in the Dockerfile is cached as a layer, this ordering makes builds faster.
  • RUN npm install --production --silent: Installs only the production dependencies, with minimal log output. This layer is cached (as explained above), so unless package.json changes this step is never repeated, and our builds are faster.
  • COPY . . : Now we copy all the other files into the working directory.
  • EXPOSE 8081: Documents that the service inside the container listens on port 8081. This exposes port 8081 within the container only; publishing it outside the container happens at run time.
  • CMD node index.js: Finally, we set the default command that runs when the container starts.
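Piecing the lines above together, the Dockerfile should read roughly as follows (the FROM line is implied by the node:8-alpine base mentioned above):

```dockerfile
# Base image: Node.js 8 on Alpine Linux (implied by the text above)
FROM node:8-alpine

ENV NODE_ENV production
WORKDIR /usr/src/app

# Copy package.json first so the npm install layer stays cached
COPY ["package.json", "./"]
RUN npm install --production --silent

# Copy in the rest of the application code
COPY . .

EXPOSE 8081
CMD node index.js
```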

Building the Docker Image

We have now come to the stage of building the Docker image. The -t flag tags the image with a name, and the trailing . tells Docker to use the current directory (which contains the Dockerfile) as the build context; docker images then lists the image we just built:

docker build -t docker-node-example .
docker images

Running the Docker Image

Now we need to run the Docker image. In the command below, -p 8081:8081 publishes the container’s port 8081 on port 8081 of the host, --name gives the container a readable name, and -d runs it detached in the background; docker ps then shows the running container:

docker run -p 8081:8081 --name docker-node-example -d docker-node-example
docker ps

Congrats, you have now Dockerized your Node.js application! From here, you can explore tools for running containers at scale:

  • Docker Compose
  • Kubernetes
  • Docker Swarm or another Docker Repository

Rohit Jacob Mathew

SDE at Trellix | Auth0 Ambassador | Ex Turtlemint & HackerRank | An Eccentric Coder | Musician/Beatboxer | Manchester United & Bengaluru FC fan.