Microservices with NodeJS using NestJS

Dime Jordanov
Nov 7, 2022


Nest (NestJS) is a framework for building efficient, scalable Node.js server-side applications. Nest gives beginners a clear project structure and CLI tooling to get started quickly, and it lets you scaffold a new feature with a single CLI command.

Nest offers an easy way to convert your application to the microservice architectural style. The benefits are increased scalability, better resilience and higher productivity. Nest supports several built-in transport layer implementations, called transporters, which are responsible for transmitting messages between microservice instances. Because NestJS abstracts away the implementation details, switching between transporters is straightforward. We can pick a direct form of communication (such as TCP) or use a message broker for asynchronous communication.
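
To make that abstraction concrete, here is a rough sketch of how a service could switch from direct TCP communication to RabbitMQ by changing only the transport options. The module name and queue name below are placeholders, not code from the repository:

import { NestFactory } from '@nestjs/core';
import { MicroserviceOptions, Transport } from '@nestjs/microservices';
import { AppModule } from './app.module'; // placeholder module

async function bootstrap() {
  // Direct communication over TCP would look like:
  // { transport: Transport.TCP, options: { port: 3001 } }

  // The same service over RabbitMQ; only the transport options change:
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule, {
    transport: Transport.RMQ,
    options: {
      urls: ['amqp://localhost:5672'],
      queue: 'example_queue', // placeholder queue name
    },
  });
  await app.listen();
}
bootstrap();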

The focus of this article is the microservice architecture with RabbitMQ. I will explain the general structure of the application and the microservices' functionality, so some of the other details will be left out.

For the details that are left out you can check the code on GitHub and the NestJS documentation.

https://github.com/dimitar171/solver-microservices

Application Infrastructure

I wanted to show an application with simple functionality that still covers some important microservice design principles. In this article we will design a questions-and-answers platform. It will consist of an API Gateway that handles the HTTP requests and forwards them to the microservices:

· The first service will create and read the questions. Each question will have multiple answers.

· The second service will create and read the answers related to a question.

The architecture is shown in the picture below.

Application Architecture

RabbitMQ

To establish asynchronous communication between the microservice applications, we need a message broker. RabbitMQ is an open-source, lightweight message broker which supports multiple messaging protocols. It is fast and supports complex routing and message acknowledgments. As a standard general-purpose message broker it is a good fit for our example application.

SQLite

Nest integrates well with TypeORM, the most mature Object Relational Mapper (ORM) available for TypeScript. It supports many relational and non-relational databases, but we will use SQLite for simplicity. We will create a database for each service, following the database-per-service pattern. In a production setup it would be good practice to move each database into its own container, for example using PostgreSQL.
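
As a sketch of what the database wiring for one service might look like: the SQLite file name and the entity import path are my assumptions, and controllers and providers are omitted so only the TypeORM parts remain.

import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import { Question } from './question.entity'; // assumed entity path

@Module({
  imports: [
    // Each service gets its own SQLite database file (database-per-service)
    TypeOrmModule.forRoot({
      type: 'sqlite',
      database: 'questions.sqlite', // assumed file name
      entities: [Question],
      synchronize: true, // convenient for a demo, not recommended in production
    }),
    // Exposes the Question repository for injection via @InjectRepository(Question)
    TypeOrmModule.forFeature([Question]),
  ],
})
export class QuestionsModule {}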

Docker

When we develop a microservice architecture it is recommended to containerize the applications. Each service will run in a separate Docker container, and we will use Docker Compose for local container orchestration.

Building the application

Nest offers the option to store the microservices in a single repository using the monorepo project structure. The microservice applications will share the same root tsconfig.json file and will be listed in the nest-cli.json file.

To create a monorepo we will start with the standard mode application scaffolding.

$ npm i -g @nestjs/cli

$ nest new solver-microservices

We convert to a monorepo by generating a new project inside the standard mode structure:

$ cd solver-microservices

$ nest generate app api-gateway

This creates an api-gateway directory with its own source folder and tsconfig.app.json file. We use the same process for the questions and answers services. Each directory is stored under the apps folder. We then remove the default service (solver-microservices) that was created when we initialized the project. Don't forget to remove its entry from the nest-cli.json file as well. In the end our project structure will look like this:
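
(Below is a sketch of the layout; the exact files may differ slightly from the repository.)

solver-microservices/
├── apps/
│   ├── api-gateway/
│   │   ├── src/
│   │   ├── Dockerfile
│   │   └── tsconfig.app.json
│   ├── answers/
│   │   ├── src/
│   │   ├── Dockerfile
│   │   └── tsconfig.app.json
│   └── questions/
│       ├── src/
│       ├── Dockerfile
│       └── tsconfig.app.json
├── docker-compose.yaml
├── nest-cli.json
├── package.json
└── tsconfig.json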

The Nest CLI structures the code for us. Each directory is scaffolded with main, controller, service and module files. We will define the question and answer entities in their respective directories and create data transfer object (DTO) files for them in the api-gateway folder.

@Entity()
export class Question extends BaseEntity {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  title: string;

  @Column()
  description: string;
}
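
The matching DTO in the api-gateway might look like this (a sketch; the class name follows the article, but the exact fields and any validation decorators are assumptions):

export class CreateQuestionDto {
  // Mirrors the Question entity columns the client is allowed to set
  title: string;
  description: string;
}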

Questions

In the previous section we created the questions service. Now we need to refactor it into a microservice application so it can communicate with our api-gateway. In the main function we use createMicroservice: we register the application module and configure the microservice transport with the broker URL and the service queue.

async function bootstrap() {
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(
    QuestionsModule,
    {
      transport: Transport.RMQ,
      options: {
        urls: ['amqp://rabbitmq:5672'],
        queue: 'questions_queue',
      },
    },
  );
  await app.listen();
}
bootstrap();

In the controller.ts file we use @MessagePattern() and @EventPattern() instead of the classic @Get() and @Post(). They respond when they receive a message from the gateway controller and return the results of the functions defined in the QuestionsService. We use @EventPattern() for requests from which we don't expect a response message and @MessagePattern() for requests that retrieve data.

The data from the request body is sent by a DTO.

@Controller()
export class QuestionsController {
  constructor(private questionsService: QuestionsService) {}

  @EventPattern('question_created')
  createQuestions(question: CreateQuestionDto) {
    return this.questionsService.createQuestions(question);
  }

  @MessagePattern({ cmd: 'get-all-questions' })
  getAllQuestions() {
    return this.questionsService.getAllQuestions();
  }
}

In the QuestionsService we define the functions needed to create questions and retrieve them from the questions repository. Nest creates the repository from the entity class.

@Injectable()
export class QuestionsService {
  constructor(
    @InjectRepository(Question) private questionRepo: Repository<Question>,
  ) {}

  async createQuestions(question: CreateQuestionDto) {
    // Reject duplicate titles before persisting the new question
    const exists = await this.questionRepo.findOneBy({ title: question.title });
    if (exists) {
      throw new ConflictException('Question title already exists');
    }
    const newQuestion = this.questionRepo.create(question);
    try {
      await this.questionRepo.save(newQuestion);
    } catch (error) {
      throw new InternalServerErrorException();
    }
    return newQuestion;
  }

  async getAllQuestions() {
    return this.questionRepo.find();
  }
}

Answers

The structure of the answers directory is the same as the questions directory. The difference is that we need to pass the question's ID to the controller so the answer can be related to it.

@MessagePattern({ cmd: 'get-all-answers' })
getAllAnswers(data: { questionsId: number }) {
  // The gateway sends the question id as { questionsId }
  return this.answersService.getAllAnswers(data.questionsId);
}

From the repository we retrieve the answers whose questionId matches the id from the request.

async getAllAnswers(id: number) {
  const found = await this.answerRepo.find({
    where: { questionId: id },
  });
  return found;
}
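
For completeness, the Answer entity referenced by this code might be defined roughly like this (a sketch; only the questionId column is implied by the code above, the other fields are assumptions):

@Entity()
export class Answer extends BaseEntity {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  text: string; // assumed field for the answer body

  // Foreign key linking the answer to its question
  @Column()
  questionId: number;
}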

Api-Gateway

The API gateway is the entry point for the client HTTP requests. The handled requests are sent to the RabbitMQ message broker in different queues depending on the request route. The directory consists of a DTOs folder and the main, controller and module files. The main.ts file starts the HTTP server on port 3000. In the module.ts file we import and register the microservice clients:

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'QUESTIONS_SERVICE',
        transport: Transport.RMQ,
        options: {
          urls: ['amqp://rabbitmq:5672'],
          queue: 'questions_queue',
        },
      },
      {
        name: 'ANSWERS_SERVICE',
        transport: Transport.RMQ,
        options: {
          urls: ['amqp://rabbitmq:5672'],
          queue: 'answers_queue',
        },
      },
    ]),
  ],
  controllers: [ApiGatewayController],
})
export class ApiGatewayModule {}
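
The main.ts mentioned above stays a regular HTTP application, since the gateway is the only service exposed to clients. A minimal sketch (the module file path is an assumption):

import { NestFactory } from '@nestjs/core';
import { ApiGatewayModule } from './api-gateway.module'; // assumed path

async function bootstrap() {
  // Plain HTTP server; the microservice clients are registered in the module
  const app = await NestFactory.create(ApiGatewayModule);
  await app.listen(3000);
}
bootstrap();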

The communication between the gateway and the services goes through the client proxies defined in the module and injected into the gateway controller. In the controller we also define the endpoints for the incoming client HTTP requests.

@Controller('/questions')
export class ApiGatewayController {
  constructor(
    @Inject('QUESTIONS_SERVICE') private clientQuest: ClientProxy,
    @Inject('ANSWERS_SERVICE') private clientAnsw: ClientProxy,
  ) {}

  @Post()
  async createQuestion(@Body() createQuestionDto: CreateQuestionDto) {
    return this.clientQuest.emit('question_created', createQuestionDto);
  }

  @Get()
  async getQuestions() {
    return this.clientQuest.send(
      {
        cmd: 'get-all-questions',
      },
      '',
    );
  }

  @Post('/:questionsId/answers')
  async createAnswer(
    @Body() createAnswer: CreateAnswerDto,
    @Param('questionsId', ParseIntPipe) questionsId: number,
  ) {
    createAnswer.questionId = questionsId;
    return this.clientAnsw.emit('answer_created', createAnswer);
  }

  @Get('/:questionsId/answers')
  async getAnswers(@Param('questionsId', ParseIntPipe) questionsId: number) {
    return this.clientAnsw.send(
      {
        cmd: 'get-all-answers',
      },
      { questionsId },
    );
  }
}

For the answer endpoints we pass the questionId parameter from the request route on to the answers service.

Running the application on Docker

We use a standard Node.js Dockerfile. The base image is node:alpine and the work directory is /usr/src/app. The command that starts the application points to the service's distribution folder and is different for each service's Dockerfile.

FROM node:alpine
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
CMD ["node", "dist/apps/api-gateway/main"]

In the docker-compose.yaml file we define the services. For each service we specify the Dockerfile location, its dependencies and the exposed ports.

services:
  api-gateway:
    build:
      context: .
      dockerfile: ./apps/api-gateway/Dockerfile
    command: npm run start api-gateway
    depends_on:
      - rabbitmq
    volumes:
      - .:/usr/src/app
      - /usr/src/app/node_modules
    ports:
      - '3000:3000'
  answers:
    build:
      context: .
      dockerfile: ./apps/answers/Dockerfile
    command: npm run start answers
    depends_on:
      - rabbitmq
    volumes:
      - .:/usr/src/app
      - /usr/src/app/node_modules
  questions:
    ..
  rabbitmq:
    image: rabbitmq:3-management
    container_name: rabbitmq
    hostname: rabbitmq
    volumes:
      - /var/lib/rabbitmq
    ports:
      - "5672:5672"
      - "15672:15672"

When we run docker compose up, the services run in separate containers. They establish a connection with RabbitMQ on port 5672. We can monitor the communication through the broker by accessing the RabbitMQ management UI at localhost:15672.

In the Queues tab we can see the answers and questions queues defined in our application. We will use Postman to test the application. I will show the getAnswers endpoint, which retrieves the answers to the first question.
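
If you prefer the command line over Postman, the same flow can be exercised with curl (a sketch; the request bodies are made-up examples and assume the DTO fields shown earlier):

$ curl -X POST localhost:3000/questions -H "Content-Type: application/json" -d '{"title":"What is NestJS?","description":"Looking for a short overview"}'

$ curl -X POST localhost:3000/questions/1/answers -H "Content-Type: application/json" -d '{"text":"A Node.js framework for server-side applications"}'

$ curl localhost:3000/questions/1/answers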
