Writing A Microservice Using Node.js

Akinnurun Samuel
15 min read · Jan 8, 2024

JavaScript consistently ranks among the most popular programming languages. It runs in two main environments: the browser and Node.js. The latter is widely used to develop web services.

I started learning JavaScript when I began my career in IT (about eight years ago), and just a few years ago I read the magnificent “JavaScript: The Definitive Guide” by David Flanagan.

This book answered many questions and showed me some interesting JavaScript recipes I use in my everyday work. More importantly, it pushed me to practice web service development with Node.js (before that, I used JS only for frontend development). I recommend it to both new and experienced JavaScript developers.

In this post, I want to show how to build a microservice using Node.js. Drawing on my web development experience, I’ll try to create a scalable, robust, reliable, and performant solution using a popular Node.js stack.

Introduction

In this article, I want to show how to build a microservice using a Task Management web service as the example. It will provide the following API:

  • create a task with a name and description;
  • get a task by its identifier;
  • update a task’s status, name, or description.

It’s a simple API that can show how powerful Node.js is in building web applications. The development process is fast and easy.

Some application requirements:

  • A task should be created with the status ‘new’;
  • Available status transitions: from ‘new’ to ‘active’, from ‘new’ to ‘cancelled’, from ‘active’ to ‘completed’, from ‘active’ to ‘cancelled’;
  • Avoid race conditions (more details later).

The main non-functional requirements are:

  • Scalability — microservice should be able to handle an increasing amount of requests;
  • Elasticity — microservice should be able to handle spikes;
  • Performance — microservice should respond quickly for better user experience;
  • Resilience — microservice should be fault-tolerant and able to recover so that it can continue to function correctly;
  • Monitoring — microservice should provide ways to monitor its health;
  • Observability — microservice should emit a log stream and metrics so that it can be maintained and debugged;
  • Testability — microservice should be easy to test;
  • Statelessness — microservice should not store client context between requests; instead, state should be kept in the database;
  • Deployability — microservice should be easy to deploy and update.

All of this is achievable when developing a web application with Node.js. Let’s discuss how to meet these requirements step by step.

Stack

You have to choose your tech stack before you start building a web service. The very first question is, of course, the programming language. For this microservice I’m going to use Node.js, and here are a few benefits of using it to develop web services:

  • JavaScript is already the dominant language for frontend development, and it makes sense to use it for backend development as well so the same developers can develop full-stack applications;
  • The JavaScript community is huge. You can find answers to almost any question that arises during development, and a lot of libraries are developed and maintained by the community. You may find several third-party packages solving the same problem, each with its own features;
  • Node.js executes JavaScript using Google’s V8 engine, which compiles JS into machine code very quickly.

And the list goes on! But the language is only the first decision.

Database

I need to persist data between web requests. Developing scalable stateful web services is hard, so it’s recommended to keep the web application stateless and persist state in an external database instead.

To develop this service, I will use the popular document-oriented database MongoDB.

MongoDB is a NoSQL database that provides a few benefits over SQL databases:

  • Schemaless — MongoDB collection (analog to SQL table) can hold documents with different schemas. You don’t need to define your structure first before storing documents in the collection;
  • Scalability — MongoDB is designed to be scaled out across many servers;
  • Performance — MongoDB is optimized for read-heavy workloads and can store large amounts of data.

It’s a popular choice when you’re developing a Node.js web service.

Web Framework

Web Framework is needed to build web applications. It handles a lot of everyday tasks that are required when developing web services. To name a few: routing, security, binding, etc.

There are plenty of options when it comes to Node.js web frameworks. The most popular among them is express.

The most significant benefit of express is that it’s straightforward to use and requires a minimal amount of code to start a web server. Here is a ‘Hello, World!’ example written with express:

const express = require('express');

const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.send('Hello World!');
});

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});

Also, express has a huge community. You can find different libraries that extend your server functionality (usually developed as middleware).
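For instance, a middleware is just a function with the (req, res, next) signature that you register with app.use(). As a tiny illustration (my own sketch, not code from the project), this one logs every incoming request before passing control on:

// Logs each incoming request, then hands control to the next handler in the chain.
app.use((req, res, next) => {
  console.log(`${req.method} ${req.originalUrl}`);
  next();
});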

Validation

Validation is an important part of web applications because you never know how users will call your API. Attackers might break your application by providing invalid input.

To validate that the parameters provided in a web request (in the path, body, etc.) are correct, I’ll use joi.

It’s a powerful library for validating different models. Here is an example of validating one of the requests I’ll develop later:

const createTask = {
  body: Joi.object().keys({
    name: Joi.string().required(),
    description: Joi.string().optional(),
  }),
};

It validates that the request has a nested body object with two string fields: a mandatory name and an optional description.

But that’s not all. A request could still carry a dangerous MongoDB injection payload. To sanitize web requests, I will use the express-mongo-sanitize package.
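A minimal sketch of wiring it up (the defaults here follow the package’s documented behavior; your setup may differ):

const express = require('express');
const mongoSanitize = require('express-mongo-sanitize');

const app = express();
app.use(express.json());
// Removes keys starting with '$' or containing '.' from req.body, req.query and
// req.params, so operator payloads like { "$gt": "" } never reach MongoDB.
app.use(mongoSanitize());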

Configuration

You need to make your application configurable so the same build artifact can be run in different environments by changing the configuration. The standard approach is to provide configuration via environment variables.

You probably don’t want to manually set environment variables on your local machine every time before the application starts. The popular Node.js solution to this is the dotenv library.

This library reads a file named .env (the filename can be changed) and loads its contents into environment variables.
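As a rough sketch of how this can be combined with a config module (the helper name createConfig mirrors the one used in the tests later; the exact shape of the config object is my assumption):

const dotenv = require('dotenv');

// Hypothetical config factory: loads the given .env file and exposes typed settings.
function createConfig(configPath) {
  dotenv.config({ path: configPath });
  return {
    port: Number(process.env.PORT) || 8080,
    mongoUrl: process.env.MONGODB_URL,
    logLevel: process.env.LOG_LEVEL || 'info',
  };
}

module.exports = { createConfig };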

Static Analysis

With JavaScript applications, you can easily enforce a consistent code style and shared rules by installing ESLint.

ESLint improves code quality, can detect some bugs, and can even identify security vulnerabilities. You can make your team follow the agreed rules by including ESLint checks in continuous integration (CI); I’ll set this up later.
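A possible .eslintrc.js for such a project might look like this (the shareable config is an assumption; airbnb-base is a common choice and enables rules such as no-await-in-loop and no-continue, which appear in the service code later):

module.exports = {
  env: {
    node: true,
    jest: true,
  },
  extends: ['airbnb-base'],
  rules: {
    // Project-specific overrides go here.
    'no-console': 'off',
  },
};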

Testing

It’s good practice to write automated tests so that your application keeps working as expected while you change it. Different test types exist: unit, integration, load, end-to-end (E2E), etc. The more confidence you need in your application’s quality and performance, the more tests you should have.

One of the most popular libraries for testing JavaScript applications is Jest.

With Jest, I will implement unit and integration tests. They helped me refactor and improve the code at later stages and confirm that nothing was broken.
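As a taste of what a unit test looks like, here is a sketch of a test for the small pick(object, keys) utility that the validation middleware uses later (the path and exact behavior are assumptions based on that usage):

const pick = require('../../src/utils/pick');

describe('pick', () => {
  it('returns an object containing only the requested keys', () => {
    const source = { params: { id: '1' }, body: { name: 'Task' }, extra: true };
    expect(pick(source, ['params', 'body'])).toEqual({
      params: { id: '1' },
      body: { name: 'Task' },
    });
  });
});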

Logging

An application log stream helps you ‘remotely debug’ a web service. If you cover your code with logs, you can easily trace the execution path and explain how a request was handled under different circumstances.

The most popular package for collecting JavaScript logs is winston.

It’s a simple but powerful logging library that helps you collect logs using different transports (console, file, etc.). You can also change the log format (simple text, JSON, etc.).
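Here is a minimal setup sketch (the transports, format, and file path are my assumptions; the repository’s logger module may differ):

const winston = require('winston');

// JSON logs go to the console and to a file that Promtail (introduced below)
// can tail and ship to Loki.
const logger = winston.createLogger({
  level: process.env.LOG_LEVEL || 'info',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.json(),
  ),
  transports: [
    new winston.transports.Console(),
    new winston.transports.File({ filename: 'logs/app.log' }),
  ],
});

logger.info('server started', { port: 8080 });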

Metrics

With metrics, you can monitor the health of your application: the number of incoming requests, average request execution time, number of 5XX responses, etc. With metrics in place, you can set up alerts that notify you by e-mail, push notification, etc. when something goes wrong.

For my application, I will install express-prom-bundle, a Prometheus middleware that collects standard web application metrics.
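Wiring it up can be as simple as this sketch (whether the project enables these particular options is my assumption):

const express = require('express');
const promBundle = require('express-prom-bundle');

const app = express();
// Collects default HTTP metrics (request counts, duration histograms) and exposes
// them on GET /metrics for Prometheus to scrape.
app.use(promBundle({ includeMethod: true, includePath: true }));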

More about Prometheus in the next section.

Monitoring Stack

The log stream and metrics should be collected in a database so that they can later be monitored and visualized.

I will use the following stack to collect and visualize logs and metrics:

  • Prometheus — open-source monitoring and alerting toolkit that uses a pull model to collect metrics;
  • Promtail — an agent that collects and ships logs to Loki;
  • Loki — log aggregation system;
  • Grafana — visualization and observability platform.

Local Infrastructure

To create the stack needed to develop and test the application locally, I’m going to use Docker.

With Docker, you can start a local environment similar to what will be used in staging and production environments. You don’t need to install many tools on your local machine. Instead, you can execute several commands to start the stack you need.

With Docker Compose, you can define all the infrastructure with a single compose.yml file and start it with a single command:

docker compose up -d

Continuous Integration

To be sure that new commits don’t break anything, you need continuous integration (CI).

For this purpose, I will use GitHub Actions.

GitHub accounts include a free tier, so you can run simple builds to check your application code.

Develop Application

I started with the application structure and found a great repository that helped me follow a suitable project layout.

I’ve borrowed some code from this repository (e.g., the validation middleware), so I recommend checking it out.

Later, I’ll share a link to the repository, but first I want to show some interesting parts of the application.

I’ve used mongoose to integrate the application with MongoDB. First, I defined the model schema:

const mongoose = require('mongoose');

const { Schema } = mongoose;

const TaskSchema = new Schema(
  {
    name: {
      type: String,
      required: true,
    },
    description: {
      type: String,
      required: false,
    },
    status: {
      type: String,
      enum: ['new', 'active', 'completed', 'cancelled'],
      default: 'new',
    },
    createdAt: {
      type: Date,
      default: Date.now,
    },
    updatedAt: Date,
  },
  { optimisticConcurrency: true },
);

module.exports = mongoose.model('task', TaskSchema);

This model can be used to validate documents and perform MongoDB operations from your code. Here is an example of a task update:

async function updateTaskById(id, { name, description, status }) {
  if (!name && !description && !status) {
    return { error: 'at least one update required', code: AT_LEAST_ONE_UPDATE_REQUIRED_CODE };
  }

  if (status && !(status in availableUpdates)) {
    return { error: 'invalid status', code: INVALID_STATUS_CODE };
  }
  for (let retry = 0; retry < 3; retry += 1) {
    // eslint-disable-next-line no-await-in-loop
    const task = await Task.findById(id);
    if (!task) {
      return { error: 'task not found', code: TASK_NOT_FOUND_CODE };
    }
    if (status) {
      const allowedStatuses = availableUpdates[task.status];
      if (!allowedStatuses.includes(status)) {
        return {
          error: `cannot update from '${task.status}' to '${status}'`,
          code: INVALID_STATUS_TRANSITION_CODE,
        };
      }
    }
    task.status = status ?? task.status;
    task.name = name ?? task.name;
    task.description = description ?? task.description;
    task.updatedAt = Date.now();
    try {
      // eslint-disable-next-line no-await-in-loop
      await task.save();
    } catch (error) {
      logger.warn('error during save', { error });
      if (error.name === 'VersionError') {
        // eslint-disable-next-line no-continue
        continue;
      }
    }
    return task;
  }
  return { error: 'concurrency error', code: CONCURRENCY_ERROR_CODE };
}
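The function above relies on an availableUpdates map that isn’t shown. A minimal sketch matching the allowed transitions from the requirements could look like this (the repository may define it differently):

// Maps the current status to the statuses it is allowed to transition to.
const availableUpdates = {
  new: ['active', 'cancelled'],
  active: ['completed', 'cancelled'],
  completed: [],
  cancelled: [],
};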

The most interesting part is saving the model after the update. I’m using an optimistic lock to guard against race conditions.

Imagine two concurrent requests trying to complete and cancel the same task. A race condition can occur when both read the task while its status is ‘active’ and then both save the model: the status could be changed to ‘completed’ and then to ‘cancelled’ (or vice versa). This is wrong because the transitions ‘completed’→‘cancelled’ and ‘cancelled’→‘completed’ are prohibited.

Mongoose solves this with optimistic locking, a strategy databases use to handle concurrent updates. Each document carries an additional version property. When a save/update is attempted, the version is checked; if it differs from the version read during the original query, someone has already updated the document concurrently and the operation is aborted (in the code above, a VersionError is thrown, which is caught and retried).
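Here is a small illustration of the conflict, assuming the Task model defined above, an existing task id, and that we are inside an async function (this is my own sketch, not code from the repository):

// Two handlers load the same document concurrently.
const [copyA, copyB] = await Promise.all([Task.findById(id), Task.findById(id)]);

copyA.status = 'completed';
await copyA.save(); // succeeds and bumps the stored version (__v)

copyB.status = 'cancelled';
await copyB.save(); // throws a VersionError, because copyB still holds the old __v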

Document example:

{
  "_id": {
    "$oid": "654e03210948a61665b7c889"
  },
  "name": "damnatio",
  "description": "Ciminatio totus spiritus suffoco damnatio blanditiis.",
  "status": "completed",
  "createdAt": {
    "$date": "2023-11-10T10:17:05.039Z"
  },
  "__v": 2,
  "updatedAt": {
    "$date": "2023-11-10T10:17:05.064Z"
  }
}

The document above stores the version in the __v property.

The next layer is the controller. Here is a controller example:

const updateTaskById = catchAsync(async (req, res) => {
  const result = await taskService.updateTaskById(req.params.id, req.body);
  if (result.error) {
    switch (result.code) {
      case taskService.errorCodes.AT_LEAST_ONE_UPDATE_REQUIRED_CODE:
        res.status(400).json({ success: false, message: 'at least one update required' });
        return;
      case taskService.errorCodes.INVALID_STATUS_CODE:
        res.status(400).json({ success: false, message: 'invalid status' });
        return;
      case taskService.errorCodes.TASK_NOT_FOUND_CODE:
        res.status(404).json({ success: false, message: 'task not found' });
        return;
      case taskService.errorCodes.INVALID_STATUS_TRANSITION_CODE:
        res.status(400).json({ success: false, message: result.error });
        return;
      case taskService.errorCodes.CONCURRENCY_ERROR_CODE:
        res.status(500).json({ success: false, message: 'concurrency error' });
        return;
      default:
        res.status(500).json({ success: false, message: 'internal server error' });
        return;
    }
  }

  res.status(200).json({
    success: true,
    task: toDto(result),
  });
});

The controller is responsible for calling the application’s business logic and translating the result into an HTTP response. Controllers are registered in the routes module:

const { Router } = require('express');
const taskController = require('../../../controllers/task');
const taskValidation = require('../../../validation/task');
const validate = require('../../../middlewares/validate');

const router = Router();
router.get('/:id', validate(taskValidation.getTaskById), taskController.getTaskById);
router.put('/', validate(taskValidation.createTask), taskController.createTask);
router.post('/:id', validate(taskValidation.updateTaskById), taskController.updateTaskById);
module.exports = router;

/**
 * @swagger
 * tags:
 *   name: Tasks
 *   description: Task management and retrieval
 * /v1/tasks/{id}:
 *   get:
 *     summary: Get a task by id
 *     tags: [Tasks]
 *     description: Get a task by id
 *     parameters:
 *       - in: path
 *         name: id
 *         schema:
 *           type: string
 *         required: true
 *         description: Task id
 *         example: 5f0a3d9a3e06e52f3c7a6d5c
 *     responses:
 *       200:
 *         description: Task Retrieved
 *         content:
 *           application/json:
 *             schema:
 *               $ref: '#/components/schemas/TaskResult'
 *       404:
 *         description: Task not found
 *         content:
 *           application/json:
 *             schema:
 *               $ref: '#/components/schemas/TaskResult'
 *       500:
 *         description: Internal Server Error
 *   post:
 *     summary: Update a task by id
 *     tags: [Tasks]
 *     description: Update a task by id
 *     parameters:
 *       - in: path
 *         name: id
 *         schema:
 *           type: string
 *         required: true
 *         description: Task id
 *         example: 5f0a3d9a3e06e52f3c7a6d5c
 *     requestBody:
 *       required: true
 *       content:
 *         application/json:
 *           schema:
 *             $ref: '#/components/schemas/UpdateTask'
 *     responses:
 *       200:
 *         description: Task Updated
 *         content:
 *           application/json:
 *             schema:
 *               $ref: '#/components/schemas/TaskResult'
 *       404:
 *         description: Task not found
 *         content:
 *           application/json:
 *             schema:
 *               $ref: '#/components/schemas/TaskResult'
 *       500:
 *         description: Internal Server Error
 * /v1/tasks:
 *   put:
 *     summary: Create a task
 *     tags: [Tasks]
 *     description: Create a task
 *     requestBody:
 *       required: true
 *       content:
 *         application/json:
 *           schema:
 *             $ref: '#/components/schemas/CreateTask'
 *     responses:
 *       201:
 *         description: Task Created
 *         content:
 *           application/json:
 *             schema:
 *               $ref: '#/components/schemas/TaskResult'
 *       500:
 *         description: Internal Server Error
 */

At the bottom, you can see the OpenAPI specification that the Swagger middleware uses to generate the API documentation page.
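The wiring for that middleware isn’t shown above; a typical setup with swagger-jsdoc and swagger-ui-express looks roughly like this (the title, route, and file globs are assumptions):

const swaggerJsdoc = require('swagger-jsdoc');
const swaggerUi = require('swagger-ui-express');

// Build the OpenAPI document from the JSDoc comments in the route files.
const openapiSpecification = swaggerJsdoc({
  definition: {
    openapi: '3.0.0',
    info: { title: 'Task Management API', version: '1.0.0' },
  },
  apis: ['./src/routes/**/*.js'],
});

// Serve the interactive documentation page on /docs.
app.use('/docs', swaggerUi.serve, swaggerUi.setup(openapiSpecification));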

Each route registration uses two handlers: the validator and the controller method itself. The validator checks the incoming request against the schema defined for that endpoint. Here is the validator handler:

const Joi = require('joi');
const pick = require('../utils/pick');

function validate(schema) {
  return (req, res, next) => {
    const validSchema = pick(schema, ['params', 'query', 'body']);
    const object = pick(req, Object.keys(validSchema));
    const { value, error } = Joi.compile(validSchema)
      .prefs({ errors: { label: 'key' }, abortEarly: false })
      .validate(object);
    if (error) {
      const errorMessage = error.details.map((details) => details.message).join(', ');
      res.status(400).json({ success: false, message: errorMessage });
      return;
    }
    Object.assign(req, value);
    next();
  };
}

module.exports = validate;

And here is the update request validation schema:

const updateTaskById = {
  params: Joi.object().keys({
    id: objectId.required(),
  }),
  body: Joi.object().keys({
    name: Joi.string().optional(),
    description: Joi.string().optional(),
    status: Joi.string().valid('new', 'active', 'completed', 'cancelled').optional(),
  }),
};
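The objectId rule used above is a custom Joi helper; here is a sketch of how it might be defined (an assumption, since the original helper isn’t shown):

const Joi = require('joi');

// Accepts 24-character hexadecimal strings, i.e. valid MongoDB ObjectIds.
const objectId = Joi.string().custom((value, helpers) => {
  if (!/^[0-9a-fA-F]{24}$/.test(value)) {
    return helpers.message('"{{#label}}" must be a valid MongoDB ObjectId');
  }
  return value;
});

module.exports = { objectId };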

For the update method, I’ve implemented integration tests only. The integration tests start the server before all tests run and stop it afterwards:

const path = require('path');
const app = require('../../src/app');
const db = require('../../src/db');
const { createConfig } = require('../../src/config/config');
const logger = require('../../src/config/logger');

const setupServer = () => {
  let server;
  const configPath = path.join(__dirname, '../../configs/tests.env');
  const config = createConfig(configPath);

  beforeAll(async () => {
    logger.init(config);
    await db.init(config);
    await new Promise((resolve) => {
      server = app.listen(config.port, () => {
        resolve();
      });
    });
  });

  afterAll(async () => {
    await new Promise((resolve) => {
      server.close(() => {
        resolve();
      });
    });
    await db.destroy();
    logger.destroy();
  });
};

module.exports = {
  setupServer,
};

And here is a test that performs a PUT request (create a task) and then a POST request (update the task):

describe('should create & update a task', () => {
  const data = [
    {
      name: 'only status update',
      taskName: 'Task 1',
      description: 'Task 1 description',
      newStatus: 'active',
    },
    {
      name: 'english full update',
      taskName: 'Task 1',
      description: 'Task 1 description',
      newTaskName: 'Task 1 New',
      newDescription: 'Task 1 New description',
      newStatus: 'active',
    },
    {
      name: 'english only name update',
      taskName: 'Task 1',
      description: 'Task 1 description',
      newTaskName: 'Task 1 New',
    },
    {
      name: 'english only description update',
      taskName: 'Task 1',
      description: 'Task 1 description',
      newDescription: 'Task 1 New description',
    },
    {
      name: 'japanese full update',
      taskName: 'タスク 1',
      description: 'タスク 1 説明',
      newTaskName: 'タスク 1 新',
      newDescription: 'タスク 1 新 説明',
      newStatus: 'active',
    },
    {
      name: 'japanese only name update',
      taskName: 'タスク 1',
      description: 'タスク 1 説明',
      newTaskName: 'タスク 1 新',
    },
    {
      name: 'japanese only description update',
      taskName: 'タスク 1',
      description: 'タスク 1 説明',
      newDescription: 'タスク 1 新 説明',
    },
    {
      name: 'japanese only status update',
      taskName: 'タスク 1',
      description: 'タスク 1 説明',
      newStatus: 'active',
    },
    {
      name: 'chinese full update',
      taskName: '任务 1',
      description: '任务 1 描述',
      newTaskName: '任务 1 新',
      newDescription: '任务 1 新 描述',
      newStatus: 'active',
    },
    {
      name: 'chinese only name update',
      taskName: '任务 1',
      description: '任务 1 描述',
      newTaskName: '任务 1 新',
    },
    {
      name: 'chinese only description update',
      taskName: '任务 1',
      description: '任务 1 描述',
      newDescription: '任务 1 新 描述',
    },
    {
      name: 'chinese only status update',
      taskName: '任务 1',
      description: '任务 1 描述',
      newStatus: 'active',
    },
    {
      name: 'emoji full update',
      taskName: '👍',
      description: '👍',
      newTaskName: '👍 👍',
      newDescription: '👍 👍 👍',
      newStatus: 'active',
    },
    {
      name: 'emoji only name update',
      taskName: '👍',
      description: '👍',
      newTaskName: '👍 👍',
    },
    {
      name: 'emoji only description update',
      taskName: '👍',
      description: '👍',
      newDescription: '👍 👍',
    },
    {
      name: 'emoji only status update',
      taskName: '👍',
      description: '👍',
      newStatus: 'active',
    },
  ];

  data.forEach(({
    name, taskName, description, newTaskName, newDescription, newStatus,
  }) => {
    it(name, async () => {
      let response = await fetch(baseUrl, {
        method: 'put',
        body: JSON.stringify({
          name: taskName,
          description,
        }),
        headers: { 'Content-Type': 'application/json' },
      });
      expect(response.status).toEqual(201);
      const result = await response.json();
      expect(result.task).not.toBeNull();
      expect(result.success).toEqual(true);
      expect(result.task.id).not.toBeNull();

      response = await fetch(`${baseUrl}/${result.task.id}`, {
        method: 'post',
        body: JSON.stringify({
          name: newTaskName,
          description: newDescription,
          status: newStatus,
        }),
        headers: { 'Content-Type': 'application/json' },
      });
      expect(response.status).toEqual(200);
      const result2 = await response.json();
      expect(result2).toEqual({
        success: true,
        task: {
          id: result.task.id,
          name: newTaskName ?? taskName,
          description: newDescription ?? description,
          status: newStatus ?? 'new',
          createdAt: result.task.createdAt,
          updatedAt: expect.any(String),
        },
      });
      expect(new Date() - new Date(result2.task.updatedAt)).toBeLessThan(1000);
    });
  });
});

To create the application’s Docker image, I’ve defined a simple Dockerfile:

FROM node:20-alpine
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile
COPY src /app/src
CMD ["node", "./src/index.js"]

To start the application and its infrastructure, there is a compose.yml definition:

version: '3.9'
services:
  app:
    build: .
    ports:
      - '8081:80'
    depends_on:
      - mongo
    volumes:
      - ./configs/docker.env:/app/configs/.env
      - logs:/app/logs:rw
  mongo:
    image: mongo:5
    restart: always
    ports:
      - 27017:27017
    volumes:
      - mongodata:/data/db
    healthcheck:
      test: echo 'db.runCommand("ping").ok' | mongo localhost:27017/test --quiet
      interval: 10s
      timeout: 2s
      retries: 5
      start_period: 5s
  loki:
    image: grafana/loki:2.9.0
    expose:
      - 3100
    command: -config.file=/etc/loki/local-config.yaml
  promtail:
    image: grafana/promtail:2.9.0
    volumes:
      - logs:/var/log:rw
      - ./infrastructure/promtail.yml:/etc/promtail/config.yml
    command: -config.file=/etc/promtail/config.yml
  prometheus:
    image: prom/prometheus:latest
    volumes:
      - ./infrastructure/prometheus.yml:/etc/prometheus/prometheus.yml
    command:
      - '--config.file=/etc/prometheus/prometheus.yml'
    expose:
      - 9090
  grafana:
    image: grafana/grafana:latest
    volumes:
      - grafanadata:/var/lib/grafana
    environment:
      - GF_PATHS_PROVISIONING=/etc/grafana/provisioning
      - GF_AUTH_ANONYMOUS_ENABLED=true
      - GF_AUTH_ANONYMOUS_ORG_ROLE=Admin
    ports:
      - 3000:3000
volumes:
  mongodata:
  grafanadata:
  logs:

Every Git push triggers the GitHub Actions CI build. During CI, I install dependencies, run the linter, and run all tests:

name: App CI
on:
  push:
    branches:
      - "*"
jobs:
  ci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 20
          cache: "yarn"
      - run: yarn install --frozen-lockfile
      - run: yarn run lint
      - run: docker-compose up -d mongo
      - run: yarn test -- --verbose --coverage
      - run: docker-compose build
      - run: docker-compose logs
        if: always()
      - run: docker-compose down --volumes
        if: always()

Conclusions

Node.js is a powerful technology, and, more importantly, the Node.js community is huge. You could pick a different stack every time you develop a new web service, but that’s not what I recommend: if you want to become a proficient Node.js web developer, learn the features of a specific set of technologies first, and only then experiment with alternatives.

The stack I’ve used in this post is among the most popular for building Node.js web services. There is a lot of documentation and many libraries that will help you implement different features.

It’s easy to achieve all the non-functional requirements I listed above with Node.js. You can build an application Docker image and host it with Kubernetes, which lets you scale the application in and out quickly by changing the deployment definition, or even automatically based on incoming traffic. MongoDB is also designed to scale with your needs, so increasing traffic won’t be a problem.

Google’s V8 engine, which powers Node.js, helps you achieve good application performance; the speed at which it turns your source code into machine code is impressive.

You can find many articles on the Internet about building fault-tolerant Node.js applications; just follow the best practices. In the application code, I’ve also spent some time on restoring the MongoDB connection after an unexpected failure (a network issue, a MongoDB outage, anything). You can check that code and reuse it in your applications.
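As a rough idea of what that can look like with mongoose (my own sketch, not the repository’s exact code):

const mongoose = require('mongoose');

async function init(config) {
  mongoose.connection.on('disconnected', () => console.warn('MongoDB disconnected'));
  mongoose.connection.on('reconnected', () => console.info('MongoDB reconnected'));

  // Retry the initial connection a few times before giving up.
  for (let attempt = 1; attempt <= 5; attempt += 1) {
    try {
      // eslint-disable-next-line no-await-in-loop
      await mongoose.connect(config.mongoUrl);
      return;
    } catch (error) {
      console.warn(`MongoDB connection failed (attempt ${attempt})`, error);
      // eslint-disable-next-line no-await-in-loop
      await new Promise((resolve) => setTimeout(resolve, 1000 * attempt));
    }
  }
  throw new Error('could not connect to MongoDB');
}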

With Jest, you can write different tests for your application and achieve 100% test coverage. Some tricky scenarios can be emulated with Jest as well.

Happy coding!
