Dockerizing a Node.js Web Application - Part 1

Nisal Renuja Palliyaguru
SLIIT FOSS Community
9 min read · Jun 10, 2022

What is Docker?

Docker is an operating-system-level virtualization tool that lets you run one or more containerized processes or applications inside isolated, lightweight Linux environments.

Advantages of Using Docker

  • Rapid application deployment
  • Portability across machines
  • Version control and component reuse
  • Sharing of images/dockerfiles
  • Lightweight footprint and minimal overhead
  • Simplified maintenance

Prerequisites

To follow this tutorial you'll need Node.js and npm installed on your machine, plus a GitHub account. You can find all the example code in this post in the dockerizing-nodejs repository.

Create Repository

Create an empty repository to host your code:

  1. Go to GitHub and sign up.
  2. Use the New button under Repositories to create a new repository.
  3. In Add .gitignore, select Node.
  4. Create the repository.
  5. Clone the repository to your work machine.

Directory Structure

We'll use a simple Express application as our example Node.js application to run in a Docker container. To get moving quickly, we'll create the directory structure and basic files with Express's scaffolding tool.

$ npx express-generator --no-view addressbook
$ cd addressbook
$ npm install

This produces a number of files and folders in your directory, including bin and routes. Make sure to run npm install so that all of the application's Node.js modules are installed.
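For reference, the scaffold generated by express-generator with --no-view should look roughly like this (exact contents can vary slightly between generator versions):

addressbook/
├── app.js
├── bin/
│   └── www
├── package.json
├── public/
│   ├── images/
│   ├── javascripts/
│   └── stylesheets/
│       └── style.css
└── routes/
    ├── index.js
    └── users.js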

We’ll create an addressbook API that keeps a database of people’s names.

Add a Route

Each HTTP request is handled by a route. There are a few example routes in the express starter project, and we’ll create one more to handle our API calls.

  • Create a new file called routes/persons.js with the following content:
// persons.js
var express = require('express');
var router = express.Router();
var db = require('../database');

router.get("/all", function(req, res) {
  db.Person.findAll()
    .then( persons => {
      res.status(200).send(JSON.stringify(persons));
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

router.get("/:id", function(req, res) {
  db.Person.findByPk(req.params.id)
    .then( person => {
      res.status(200).send(JSON.stringify(person));
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

router.put("/", function(req, res) {
  db.Person.create({
    firstName: req.body.firstName,
    lastName: req.body.lastName,
    id: req.body.id
  })
    .then( person => {
      res.status(200).send(JSON.stringify(person));
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

router.delete("/:id", function(req, res) {
  db.Person.destroy({
    where: {
      id: req.params.id
    }
  })
    .then( () => {
      res.status(200).send();
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

module.exports = router;

This file implements all of the API methods our application will support. We can:

  • Get all persons
  • Create a person
  • Get a single person by id
  • Delete a person

All of the routes return JSON-encoded information about the person.

Configuring the Database

All person routes require a database to store the data. We’ll use a PostgreSQL database to keep our contact details.

  1. Install the PostgreSQL node driver and sequelize ORM:
$ npm install --save pg sequelize

Sequelize handles all our SQL code for us; it will also create the initial tables in the database.

2. Create a file called database.js:

// database.js
const Sequelize = require('sequelize');

const sequelize = new Sequelize(process.env.DB_SCHEMA || 'postgres',
  process.env.DB_USER || 'postgres',
  process.env.DB_PASSWORD || '',
  {
    host: process.env.DB_HOST || 'localhost',
    port: process.env.DB_PORT || 5432,
    dialect: 'postgres',
    dialectOptions: {
      ssl: process.env.DB_SSL == "true"
    }
  });

const Person = sequelize.define('Person', {
  firstName: {
    type: Sequelize.STRING,
    allowNull: false
  },
  lastName: {
    type: Sequelize.STRING,
    allowNull: true
  },
});

module.exports = {
  sequelize: sequelize,
  Person: Person
};

The database file specifies the PostgreSQL connection parameters as well as the Person model. The model has just two fields, firstName and lastName, but you can add more if you want to experiment, as sketched below. For further information, see the Sequelize model documentation.
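For instance, a hypothetical extension of the model (not part of this tutorial's repository) could add an optional email column and use Sequelize's built-in isEmail validator:

// Hypothetical variant of the Person model in database.js
const Person = sequelize.define('Person', {
  firstName: {
    type: Sequelize.STRING,
    allowNull: false
  },
  lastName: {
    type: Sequelize.STRING,
    allowNull: true
  },
  email: {
    type: Sequelize.STRING,
    allowNull: true,
    validate: {
      isEmail: true  // rejects malformed addresses on create/update
    }
  }
});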

3. Create a new file for database migration at bin/migrate.js:

// bin/migrate.js
var db = require('../database.js');

db.sequelize.sync();
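The one-liner above is enough for our purposes, but note that sync() returns a promise. A slightly more defensive variant (a sketch, not required by this tutorial) could report the result and set a proper exit code:

// bin/migrate.js -- a more verbose variant
var db = require('../database.js');

db.sequelize.sync()
  .then(() => {
    console.log('Tables created');
    process.exit(0);
  })
  .catch((err) => {
    console.error('Migration failed:', err);
    process.exit(1);
  });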

Let’s add a test for the database. We’ll use Jest, a JavaScript testing library.

4. Install Jest:

$ npm install --save-dev jest

5. Create a new file called database.test.js:

// database.test.js
const db = require('./database');

beforeAll(async () => {
  await db.sequelize.sync({ force: true });
});

test('create person', async () => {
  expect.assertions(1);
  const person = await db.Person.create({
    id: 1,
    firstName: 'Bobbie',
    lastName: 'Draper'
  });
  expect(person.id).toEqual(1);
});

test('get person', async () => {
  expect.assertions(2);
  const person = await db.Person.findByPk(1);
  expect(person.firstName).toEqual('Bobbie');
  expect(person.lastName).toEqual('Draper');
});

test('delete person', async () => {
  expect.assertions(1);
  await db.Person.destroy({
    where: {
      id: 1
    }
  });
  const person = await db.Person.findByPk(1);
  expect(person).toBeNull();
});

afterAll(async () => {
  await db.sequelize.close();
});

6. Edit package.json and add the following lines in the scripts section:

"scripts": {
"start": "node ./bin/www",
"test": "jest",
"migrate": "node ./bin/migrate.js"
},

The test code goes through all the basic database operations:

  • Create the initial tables with sync().
  • Create a person.
  • Get the person.
  • Delete the person.

Note that these tests run against a real database, so they need a reachable PostgreSQL instance; we'll start one with Docker shortly.

Start the Application

We’re almost ready to start the application for the first time. We only need to add the new routes to the main file: app.js

  1. Create a persons router object near the index router:
// app.js
. . .
var indexRouter = require('./routes/index');
// add the following line near the indexRouter
var personsRouter = require('./routes/persons');
. . .

2. Add the persons router object to the application near the other app.use() lines:

// app.js
. . .
app.use('/', indexRouter);
// add the following line near app.use indexRouter
app.use('/persons', personsRouter);
. . .

3. To start the application:

$ npm start

Check the new application at http://localhost:3000

If you go to http://localhost:3000/persons/all you’ll see a connection error message.

That's to be expected, as we haven't given the application a database to work with yet.

We’ll use Docker to run our database in the following sections.

Setting Up PM2

While starting our Node.js application with node bin/www works fine in most cases, we want a more robust solution to keep everything running smoothly in production. It's recommended to use pm2, since it offers a lot of tunable features.

We can’t go too deep into how pm2 works or how to use it, but we will create a basic processes.json file that pm2 can use to run our application in production.

$ npm install --save pm2

To make it easier to run our Node.js application and understand what parameters we are giving to PM2, we can use an arbitrarily-named JSON file, processes.json, to set up our production configuration:

{
  "apps": [
    {
      "name": "api",
      "script": "./bin/www",
      "merge_logs": true,
      "max_restarts": 20,
      "instances": 4,
      "max_memory_restart": "200M",
      "env": {
        "PORT": 3000,
        "NODE_ENV": "production"
      }
    }
  ]
}

In processes.json we have:

  • Named our application.
  • Defined the script to run.
  • Set the number of instances along with the restart and memory limits.
  • Set the environment variables.

Finally, edit package.json to add a pm2 action; the scripts section should look like this:

"scripts": {
"pm2": "pm2 start processes.json --no-daemon",
"start": "node ./bin/www",
"test": "jest",
"migrate": "node ./bin/migrate.js"
},

To start the application with pm2:

$ npm run pm2

Installing Docker

If you don't have Docker yet, download Docker Desktop (Windows and Mac) or Docker Engine (Linux) from the official Docker documentation and follow the installation instructions for your platform.

Running Postgres With Docker

With Docker, we can run any pre-packaged application in seconds. Look how easy it is to run a PostgreSQL database:

$ docker run -it -e "POSTGRES_HOST_AUTH_METHOD=trust" -p 5432:5432 postgres

Docker will download the official PostgreSQL image and start it on your machine, with port 5432 in the container mapped to port 5432 on your local machine. The POSTGRES_HOST_AUTH_METHOD=trust variable tells Postgres to accept connections without a password, which is convenient for local development but should never be used in production.
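To double-check that the application can reach this database, here is a quick, hypothetical helper script (not part of the tutorial's repository) that uses Sequelize's standard authenticate() method; run it with node check-db.js from the addressbook directory:

// check-db.js -- hypothetical one-off connectivity check
const db = require('./database');

db.sequelize.authenticate()
  .then(() => console.log('Database reachable'))
  .catch((err) => console.error('Connection failed:', err.message))
  .finally(() => db.sequelize.close());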

Now, with the database running, open a new terminal and execute the migrations to create the table:

$ npm run migrate

The application should be fully working now:

$ npm run pm2

Try the http://localhost:3000/persons/all route again; the error message should be gone now.

Also, the database tests should be passing now:

$ npm run test

Creating a Dockerfile

We’ve used Docker to run our database without having to install it. But Docker can do much more; it can create portable images so others can run our software.

There are many ways to use Docker, but one of the most useful is through the creation of Dockerfiles. These are files that essentially give build instructions to Docker when you build a container image. This is where the magic happens.

Let's create a Dockerfile in the root of our project directory (one level above addressbook):

$ cd ..

We need to choose which base image to pull from to get started. We are essentially telling Docker “Start with this.” This can be hugely useful if you want to create a customized base image and later create other, more-specific containers that ‘inherit’ from a base container. We’ll be using the official Node image since it gives us what we need to run our application and has a small footprint.

Create a file called Dockerfile:

# Dockerfile
FROM node:16.15-alpine3.14

RUN mkdir -p /opt/app
WORKDIR /opt/app
RUN adduser -S app
COPY addressbook/ .
RUN npm install
RUN npm install --save pm2
RUN chown -R app /opt/app
USER app
EXPOSE 3000
CMD [ "npm", "run", "pm2" ]

The Dockerfile consists of the following commands:

  • FROM: tells Docker what base image to use as a starting point.
  • RUN: executes commands inside the container.
  • WORKDIR: changes the active directory.
  • USER: changes the active user for the rest of the commands.
  • EXPOSE: tells Docker which ports should be mapped outside the container.
  • CMD: defines the command to run when the container starts.

Every time an instruction is executed, it acts a bit like a git commit: Docker takes the current image, runs the command on top of it, and produces a new image layer with the committed changes. This gives the build process high granularity (any point in the build should be a valid image) and lets us think of the build atomically, where each step is self-contained.
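You can inspect these layers yourself once the image is built later in this tutorial; docker history is a standard Docker CLI command that lists one line per layer:

$ docker history addressbook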

This part is crucial for understanding how to speed up our container builds. Docker caches unchanged layers between incremental builds, so the later in the Dockerfile we place the steps that change most often, the better: Docker won't re-run a build step whose inputs have not changed.
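For example, a common variation on the Dockerfile above (a sketch under the same assumptions, trimmed to the caching-relevant lines and not required for this tutorial) copies the dependency manifests and installs packages before copying the rest of the source, so the expensive npm install layer is reused until package.json changes:

# Sketch: cache the dependency layer separately from the source code
FROM node:16.15-alpine3.14
WORKDIR /opt/app
# copying only the manifests first means the npm install layer below
# is rebuilt only when the dependencies change
COPY addressbook/package*.json ./
RUN npm install
# source changes invalidate only the layers from here down
COPY addressbook/ .
EXPOSE 3000
CMD [ "npm", "run", "pm2" ]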

Create a file called .dockerignore:

.git
.gitignore
node_modules/

The .dockerignore file is similar to a .gitignore and lets us safely exclude files or directories that shouldn't be included in the final Docker image. Excluding node_modules matters in particular: the modules are reinstalled inside the container, and packages compiled on your host machine may not be compatible with the Alpine-based image.

Bundling and Running the Docker Container

We’re almost there. To run our container locally, we need to do two things:

  • Build the container:
$ docker build -t addressbook .
  • Run the container:
$ docker run -it -p 3000:3000 addressbook

If you now go to http://localhost:3000/persons/all you’ll find the same connection error as before. This will happen even if the PostgreSQL container is running.

This shows an interesting property of containers: each one gets its own network stack. By default, the application looks for the database on localhost, but each container is its own localhost, so even though both containers run on the same machine, the application fails to connect.
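As a quick, optional experiment (specific to Docker Desktop on Mac and Windows, where the special hostname host.docker.internal resolves to the host machine), you could point the containerized app at the Postgres container published on the host's port 5432 via the DB_HOST variable that database.js already reads:

$ docker run -it -p 3000:3000 -e DB_HOST=host.docker.internal addressbook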

We could use Docker network commands to manage the container’s network details. Instead, we’ll rely on Docker Compose to manage the containers for us.

Docker Compose

Docker Compose is a tool for managing multi-container applications. It is bundled with Docker Desktop for Windows and Mac; on Linux, it has to be installed separately, so check the installation page for details.

Docker Compose can:

  • Start and stop multiple containers in sequence.
  • Connect containers using a virtual network.
  • Handle persistence of data using Docker Volumes.
  • Set environment variables.
  • Build or download container images as required.

Docker Compose uses a YAML definition file to describe the whole application.

  • Create a file called docker-compose.yml:
# docker-compose.yml
version: "3.9"

services:
  postgres:
    image: postgres
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    ports:
      - '5432:5432'
    volumes:
      - addressbook-db:/var/lib/postgresql/data
  addressbook:
    build:
      context: .
    environment:
      DB_SCHEMA: postgres
      DB_USER: postgres
      DB_PASSWORD: postgres
      DB_HOST: postgres
    depends_on:
      - postgres
    ports:
      - '3000:3000'

volumes:
  addressbook-db:

Stop the PostgreSQL container if it’s still running by pressing CTRL-C on its terminal. You can check for running containers with:

$ docker ps
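If anything is still up, you can stop it by ID (docker stop is a standard Docker command; substitute the ID shown by docker ps):

$ docker stop <container-id>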

Start Docker Compose and run the tests. Compose will build the image as needed and map the data volumes:

$ docker compose run addressbook npm test

We can start the app and use curl to test the endpoint:

$ docker compose up -d
$ curl -w "\n" \
-X PUT \
-d "firstName=Bobbie&lastName=Draper" \
localhost:3000/persons

Bobbie’s contact should have been created:

$ curl -w "\n" localhost:3000/persons/all
[{"id":1,"firstName":"Bobbie","lastName":"Draper","createdAt":"2022-06-07T15:12:38.232Z","updatedAt":"2022-06-07T15:12:38.232Z"}]

Perfect. Now that everything works, push all the new code to GitHub:

$ git add -A 
$ git commit -m "initial commit"
$ git push origin master

In Part 2 we are going to discuss how to dockerize the Node application with CI/CD. All the source code for this tutorial so far is available on GitHub.
