Node.js and Database-Per-Service: A Practical Implementation Guide

Abhishek Karia
Simform Engineering
7 min read · May 13, 2024

Join database tables efficiently using an aggregator pattern with database-per-service.

In this blog post, we will explore implementing a microservices architecture using the Database-per-Service pattern, the Messaging pattern with RabbitMQ, and the Aggregator pattern.

We’ll build the microservices using Node.js, Express.js, and MongoDB, and facilitate communication between them using RabbitMQ. (All of the code below assumes MongoDB and RabbitMQ are running locally, and that the express, mongoose, body-parser, amqplib, and axios packages are installed.)

Why should you use a database per service?

In a microservices architecture, adopting a “database per service” design is often driven by the desire to develop, deploy, and scale services independently.

This approach helps achieve a fault-tolerant and resilient architecture: individual service failures stay isolated, allowing the overall system to continue operating without a complete breakdown.

However, it’s crucial to acknowledge that the “database per service” model comes with its own set of challenges, which we will delve into in this blog.

POC Overview

In this POC, we will implement four services: User, Post, Comment, and Aggregator. We’ll also use RabbitMQ for communication.

  • The client initiates user registration and retrieves user details through interaction with the User Service.
  • After registration, users can submit social media posts using the Post Service.
  • Users can also comment on posts through the Comment Service.
  • To keep the comment count for each post up to date, the Comment Service notifies the Post Service via RabbitMQ whenever a new comment is added.
  • On receiving the notification, the Post Service increments the comment count in the corresponding post record in the post database, ensuring an accurate, up-to-date count for each post.
  • To fetch data that spans multiple databases, where a DB join would normally be needed, the client can call the Aggregator Service, which calls APIs from the User service and/or Post service and/or Comment service, combines their responses, and returns a new, consolidated response (a sample is sketched just below this list).
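For example, the consolidated response that the Aggregator eventually returns for a single user could look roughly like this (the ids and values are made up; the exact shape comes from the implementation later in this post):

{
  "user": { "_id": "u1", "name": "Alice", "email": "alice@example.com" },
  "posts": [
    {
      "_id": "p1",
      "title": "Hello world",
      "content": "My first post",
      "commentCount": 1,
      "userId": "u1",
      "comments": [
        { "_id": "c1", "post_id": "p1", "text": "Nice post!", "userId": "u1" }
      ]
    }
  ]
}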

Architecture overview (diagram): the client talks to the User, Post, Comment, and Aggregator services, while the Comment Service notifies the Post Service through RabbitMQ.

Recognizing the Necessity of Leveraging RabbitMQ for Messaging

Asynchronous messaging lets services exchange data without blocking on one another, while the Aggregator pattern handles reads that span multiple databases.

If considering REST API implementation as a viable solution, it’s essential to acknowledge and address two specific challenges:

  1. Response Time Delay: Suppose the Comment Service informs the Post Service through a synchronous API call that increments the comment counter. Every comment request then has to wait for that extra call, which adds latency.
  2. Data Loss: Even if we accept the added latency, a bigger problem remains: if the Post Service is down when the Comment Service calls it, the update is simply lost.

To address these challenges, we’ve adopted RabbitMQ for internal communication. When the Comment Service needs to inform the Post Service, it sends a message through RabbitMQ and moves on without waiting for an immediate response.

The inherent reliability of RabbitMQ ensures message delivery even if the Post Service is temporarily unavailable: messages stay in the queue until they are consumed, so they are delivered once the service is back online. This decoupled and resilient communication mechanism enhances the overall reliability and responsiveness of our system.
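The POC below relies on amqplib’s defaults. If you also want messages themselves to survive a RabbitMQ restart, they need to be published as persistent to a durable queue; a minimal sketch of that variation (an optional hardening step, not part of the POC code):

// Sketch: explicitly durable queue + persistent messages (optional hardening, not in the POC)
const amqp = require('amqplib');

let channel;

amqp.connect('amqp://localhost')
  .then((conn) => conn.createChannel())
  .then(async (ch) => {
    channel = ch;
    // durable: true => the queue definition survives a broker restart
    await channel.assertQueue('comments', { durable: true });
  })
  .catch(console.warn);

function publishComment(comment) {
  // persistent: true => the broker writes the message to disk
  channel.sendToQueue('comments', Buffer.from(JSON.stringify(comment)), { persistent: true });
}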

Recognizing the Necessity of Leveraging Aggregator Pattern

In the context of the database-per-service pattern, where joining tables across different databases poses a challenge, we’ve introduced an Aggregator Service as a practical solution.

This service removes the need for cross-database joins: it calls the APIs of multiple services and merges their responses into a single consolidated payload.

Keeping this combining logic in one place, rather than in the client or inside each individual service, makes the system easier to reason about and maintain.
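As implemented below, the Aggregator calls the three services one after another. If you prefer the fan-out to happen in parallel, a small sketch using Promise.all (assuming the same local service URLs used later in this post) could look like this:

// Sketch: parallel fan-out with Promise.all (illustrative variation, not the POC code)
const axios = require('axios');

async function fetchAggregatedData(userId) {
  const [userResponse, postsResponse, commentsResponse] = await Promise.all([
    axios.get(`http://localhost:3002/users/${userId}`),
    axios.get(`http://localhost:3001/posts/${userId}`),
    axios.get(`http://localhost:3003/comments/${userId}`),
  ]);

  return {
    user: userResponse.data.user,
    posts: postsResponse.data.posts,
    comments: commentsResponse.data.comments,
  };
}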

Implementation Details

User Service Implementation:

Here, we created a simple Express.js server connected to userdb and implemented three APIs in the User Service: one to get all registered users, one to get a user by userId, and one to create/register a user.

// user-service.js
const express = require('express');
const mongoose = require('mongoose');
const bodyParser = require('body-parser');

const app = express();
const PORT = 3002;

mongoose.connect('mongodb://localhost/userdb', { useNewUrlParser: true, useUnifiedTopology: true });
const User = mongoose.model('User', { name: String, email: String });

app.use(bodyParser.json());

// Get all registered users
app.get('/users', async (req, res) => {
  const users = await User.find();
  res.json({ users });
});

// Get a single user by id
app.get('/users/:userId', async (req, res) => {
  const userId = req.params.userId;
  const user = await User.findById(userId);
  res.json({ user });
});

// Register a new user
app.post('/users', async (req, res) => {
  const { name, email } = req.body;
  const newUser = new User({ name, email });
  await newUser.save();
  res.status(201).json(newUser);
});

app.listen(PORT, () => {
  console.log(`User Service running on port ${PORT}`);
});
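Once the User Service is running on port 3002, a quick throwaway script (illustrative only; the name and email are made up) can register a user and then list everyone:

// try-user-service.js (illustrative usage script, not part of the POC)
const axios = require('axios');

async function main() {
  // Register a user
  const created = await axios.post('http://localhost:3002/users', {
    name: 'Alice',
    email: 'alice@example.com',
  });
  console.log('Created user:', created.data);

  // Fetch all registered users
  const all = await axios.get('http://localhost:3002/users');
  console.log('All users:', all.data.users);
}

main().catch(console.error);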

Post Service Implementation:

Here, we created a simple Express.js server connected to postdb, established a RabbitMQ connection to consume messages from the Comment Service and increment the comment counter, and implemented three APIs: one to get all posts, one to get all posts for a given userId, and one to create/add a post.

// post-service.js
const express = require('express');
const mongoose = require('mongoose');
const bodyParser = require('body-parser');
const amqp = require('amqplib');

const app = express();
const PORT = 3001;

mongoose.connect('mongodb://localhost/postdb', { useNewUrlParser: true, useUnifiedTopology: true });
const Post = mongoose.model('Post', { title: String, content: String, commentCount: Number, userId: String });

let channel;

// Connect to RabbitMQ
amqp.connect('amqp://localhost').then((conn) => {
  return conn.createChannel();
}).then((ch) => {
  channel = ch;
  ch.assertQueue('comments');

  // Handle messages from the Comment Service
  // (noAck: true means the broker treats messages as acknowledged on delivery)
  channel.consume('comments', async (msg) => {
    const newComment = JSON.parse(msg.content.toString());

    // Increment the comment count for the corresponding post
    await Post.updateOne({ _id: newComment.post_id }, { $inc: { commentCount: 1 } });
  }, { noAck: true });
}).catch(console.warn);

app.use(bodyParser.json());

// Get all posts
app.get('/posts', async (req, res) => {
  const posts = await Post.find();
  res.json({ posts });
});

// Get all posts for a given user
app.get('/posts/:userId', async (req, res) => {
  const userId = req.params.userId;
  const posts = await Post.find({ userId });
  res.json({ posts });
});

// Create a new post
app.post('/posts', async (req, res) => {
  const { title, content, userId } = req.body;
  const newPost = new Post({ title, content, commentCount: 0, userId });
  await newPost.save();
  res.status(201).json(newPost);
});

app.listen(PORT, () => {
  console.log(`Post Service running on port ${PORT}`);
});
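With a user already registered, a similar throwaway script can create a post and list that user’s posts (the userId placeholder below must be replaced with a real id returned by the User Service):

// try-post-service.js (illustrative usage script, not part of the POC)
const axios = require('axios');

async function main() {
  const userId = '<existing user id>'; // placeholder: use a real id from the User Service

  // Create a post for that user
  const created = await axios.post('http://localhost:3001/posts', {
    title: 'Hello world',
    content: 'My first post',
    userId,
  });
  console.log('Created post:', created.data);

  // Fetch all posts belonging to that user
  const posts = await axios.get(`http://localhost:3001/posts/${userId}`);
  console.log('User posts:', posts.data.posts);
}

main().catch(console.error);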

Comment Service Implementation:

Here, we created a simple Express.js server connected to commentdb, established a RabbitMQ connection to produce messages that inform the Post Service whenever a new comment is added, and implemented three APIs: one to get all comments, one to get all comments made by a given userId, and one to create/add a comment.

// comment-service.js
const express = require('express');
const mongoose = require('mongoose');
const bodyParser = require('body-parser');
const amqp = require('amqplib');

const app = express();
const PORT = 3003;

mongoose.connect('mongodb://localhost/commentdb', { useNewUrlParser: true, useUnifiedTopology: true });
const Comment = mongoose.model('Comment', { post_id: String, text: String, userId: String });

app.use(bodyParser.json());

// Connect to RabbitMQ
let channel;
amqp.connect('amqp://localhost').then((conn) => {
  return conn.createChannel();
}).then((ch) => {
  channel = ch;
  return channel.assertQueue('comments');
}).catch(console.warn);

// Get all comments
app.get('/comments', async (req, res) => {
  const comments = await Comment.find();
  res.json({ comments });
});

// Get all comments made by a given user
app.get('/comments/:userId', async (req, res) => {
  const userId = req.params.userId;
  const comments = await Comment.find({ userId });
  res.json({ comments });
});

// Create a new comment
app.post('/comments', async (req, res) => {
  const { post_id, text, userId } = req.body;
  const newComment = new Comment({ post_id, text, userId });
  await newComment.save();

  // Notify the Post Service about the new comment via RabbitMQ
  channel.sendToQueue('comments', Buffer.from(JSON.stringify(newComment)));

  res.status(201).json(newComment);
});

app.listen(PORT, () => {
  console.log(`Comment Service running on port ${PORT}`);
});
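To watch the messaging flow end to end, add a comment and then re-fetch the user’s posts; once the Post Service has consumed the RabbitMQ message, commentCount should have increased by one. A rough script (the ids are placeholders):

// try-comment-service.js (illustrative usage script, not part of the POC)
const axios = require('axios');

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function main() {
  const userId = '<existing user id>'; // placeholder
  const postId = '<existing post id>'; // placeholder

  // Add a comment; the Comment Service also publishes a message to the 'comments' queue
  await axios.post('http://localhost:3003/comments', {
    post_id: postId,
    text: 'Nice post!',
    userId,
  });

  // Give the Post Service a moment to consume the message, then check the counts
  await sleep(1000);
  const posts = await axios.get(`http://localhost:3001/posts/${userId}`);
  console.log(posts.data.posts.map((p) => ({ title: p.title, commentCount: p.commentCount })));
}

main().catch(console.error);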

Aggregator Service Implementation:

Here, we created a simple Express.js server and implemented an API that calls the User, Post, and Comment services, combines their responses into a new, consolidated response, and returns it to the client. This server needs neither a database nor a RabbitMQ connection.

// aggregator-service.js
const express = require('express');
const axios = require('axios');

const app = express();
const PORT = 3000;

const POST_SERVICE_URL = 'http://localhost:3001';
const USER_SERVICE_URL = 'http://localhost:3002';
const COMMENT_SERVICE_URL = 'http://localhost:3003';

app.use(express.json());

app.get('/aggregatedData/:userId', async (req, res) => {
  try {
    const userId = req.params.userId;

    // Fetch user data
    const userResponse = await axios.get(`${USER_SERVICE_URL}/users/${userId}`);
    const user = userResponse.data.user;

    // Fetch posts for the user
    const postsResponse = await axios.get(`${POST_SERVICE_URL}/posts/${userId}`);
    const posts = postsResponse.data.posts;

    // Fetch comments made by the user
    const commentsResponse = await axios.get(`${COMMENT_SERVICE_URL}/comments/${userId}`);
    const comments = commentsResponse.data.comments;

    // Map comments to their respective posts
    const postsWithComments = posts.map(post => ({
      ...post,
      comments: comments.filter(comment => comment.post_id === post._id.toString()),
    }));

    // Aggregate data into the user object
    const aggregatedData = {
      user,
      posts: postsWithComments,
    };

    res.json(aggregatedData);
  } catch (error) {
    console.error(error);
    res.status(500).json({ error: 'Internal Server Error' });
  }
});

app.listen(PORT, () => {
  console.log(`Aggregator Service running on port ${PORT}`);
});
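With all four services and RabbitMQ running, a single call to the Aggregator returns the combined view (the userId is again a placeholder):

// try-aggregator.js (illustrative usage script, not part of the POC)
const axios = require('axios');

axios.get('http://localhost:3000/aggregatedData/<existing user id>')
  .then((res) => console.log(JSON.stringify(res.data, null, 2)))
  .catch((err) => console.error(err.message));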

Key Learnings and Takeaways

  1. Independence through “Database per service”: Adopting a “database per service” model provides the flexibility to develop, deploy, and scale services independently, contributing to a fault-tolerant and resilient architecture.
  2. Asynchronous Messaging with RabbitMQ: Leveraging RabbitMQ for internal communication enhances reliability and responsiveness by allowing services to send messages without waiting for immediate responses. This approach mitigates challenges such as response time delays and potential data loss.
  3. Aggregator Pattern for Cross-Database Joins: In the context of a database-per-service pattern, the Aggregator pattern simplifies cross-database interactions. The Aggregator Service efficiently processes responses from multiple services, overcoming the complexities of joining tables across different databases.
  4. Efficiency and Maintainability: The combination of Node.js, Express, MongoDB, and RabbitMQ showcases a scalable and maintainable microservices system. When understood and implemented correctly, this architecture addresses the dynamic demands of today’s software landscape.

You can explore the complete code in the Git repository.

For more updates on the latest tools and technologies, follow the Simform Engineering blog.

Follow Us: Twitter | LinkedIn
