Build a full-text search with NestJS, MongoDB, Elasticsearch, and Docker (Part-1)

Phat Vo
8 min read · Aug 27, 2020


We can, of course, build search on plain database queries in MySQL, MongoDB, and so on.
But for developers familiar with Elasticsearch, it is an attractive alternative. Each approach has its own perks, but Elasticsearch tends to return more accurate results, faster, than an ordinary database query.
That is because Elasticsearch is a JSON document store built on the Apache Lucene search engine, and Lucene does its magic by indexing documents according to specific rules. You may remember that databases like MySQL and MongoDB perform better on complex queries when tables are indexed; it's much the same idea.
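
To give a feel for what that looks like in practice, here is a minimal sketch of a full-text query using the official @elastic/elasticsearch client (v7-style API). The products index and the name field are assumptions for illustration; the real integration comes in the final part.

import { Client } from "@elastic/elasticsearch";

const client = new Client({ node: "http://localhost:9200" });

async function searchProducts(keyword: string) {
  // A "match" query analyzes the keyword and scores documents by relevance
  const { body } = await client.search({
    index: "products",
    body: {
      query: {
        match: { name: keyword },
      },
    },
  });
  return body.hits.hits;
}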

For Node.js developers who lean toward TypeScript and OOP (object-oriented programming), the NestJS framework is a good choice for building the application.
In this tutorial, we will build an application that exposes a full-text search API with NestJS, MongoDB, and Docker.

I assume you are already familiar with NestJS and have an application bootstrapped with it, and that you are also comfortable with 3-tier architecture, the repository pattern, and abstraction.
This tutorial builds a real-world full-text search API on top of an application that has already been bootstrapped with NestJS.
If you want to build an application with NestJS from scratch and apply a 3-tier architecture with some design patterns, have a look at my other tutorial on building an application with NestJS and MongoDB using a 3-tier architecture and design patterns.
Here is the tutorial:

https://medium.com/@phatdev/how-to-build-a-scalable-maintainable-application-with-nestjs-mongodb-apply-the-design-patterns-2f71c060652

Let's get started on our full-text search. Assume we are building an application that lets users create, update, and search products. The search functionality requires searching by keywords, and full-text search is exactly the technique suited to this requirement.

Our application has the structure shown in figure 1.1 below.

Figure 1.1: Application structure exposing the product APIs.

Again, if you are not familiar with this architecture, please have a look at my other tutorial introducing NestJS with a 3-tier architecture via the link above.

Now we are going to define a product entity with a few basic fields, following the snippet below.

product.entity.ts

import {
  Column,
  Entity,
  ObjectIdColumn
} from "typeorm";

@Entity({ name: "products" })
export class Product {
  @ObjectIdColumn()
  id: number;

  @Column({ type: "string" })
  name: string;

  @Column({ type: "string" })
  description: string;

  @Column({ type: "string" })
  price: string;

  @Column({ type: "date" })
  createdAt: any;

  @Column({ type: "date" })
  updatedAt: any;
}

The Product entity represents the products collection in MongoDB, with the basic fields defined in the snippet above.

Next, let's define the Dockerfile and docker-compose files that allow the application to run on Docker. Both live in the application's root directory.

Dockerfile

FROM node:10-alpine

# Install PM2
RUN npm install -g pm2

# Set working directory
RUN mkdir -p /var/www/nest-demo
WORKDIR /var/www/nest-demo

# add `/var/www/nest-demo/node_modules/.bin` to $PATH
ENV PATH /var/www/nest-demo/node_modules/.bin:$PATH
# create user with no password
RUN adduser --disabled-password demo

# Copy existing application directory contents
COPY . /var/www/nest-demo
# install and cache app dependencies
COPY package.json /var/www/nest-demo/package.json
COPY package-lock.json /var/www/nest-demo/package-lock.json

# grant ownership of the application directory to the demo user
RUN chown -R demo:demo /var/www/nest-demo
USER demo

# clear application caching
RUN npm cache clean --force
# install all dependencies
RUN npm install

EXPOSE 3002
# start in the production environment
#CMD [ "npm", "run", "pm2:delete" ]
#CMD [ "npm", "run", "build-docker:dev" ]

# start in the development environment
CMD [ "npm", "run", "start:dev" ]

docker-compose.yaml

# docker compose file format version
version: '3.7'
# all containers are declared under services
services:
  # App service
  demoapp:
    # the app depends on the database container being up
    depends_on:
      - db
    # build the image from the Dockerfile in this directory
    build:
      context: .
      dockerfile: Dockerfile
    # image name
    image: nest-demo-docker
    # container name
    container_name: demoapp
    # always restart the container if it stops
    restart: always
    # equivalent of docker run -t
    tty: true
    # application port, taken from the .env file
    ports:
      - "${SERVER_PORT}:${SERVER_PORT}"
    # working directory
    working_dir: /var/www/nest-demo
    # application environment
    environment:
      SERVICE_NAME: demoapp
      SERVICE_TAGS: dev
      SERVICE_DB_HOST: ${DATABASE_HOST}:${DATABASE_PORT}
      SERVICE_DB_USER: ${DATABASE_USERNAME}
      SERVICE_DB_PASSWORD: ${DATABASE_PASSWORD}
    # persist data and share it between host and container
    volumes:
      - ./:/var/www/nest-demo
      - /var/www/nest-demo/node_modules
    # application network; each service's container joins this network
    networks:
      - nest-demo-network
  # Database service
  db:
    # pull the official image from Docker Hub
    image: mongo
    # container name
    container_name: nestmongo
    # always restart the container if it stops
    restart: always
    # database credentials, taken from the .env file
    environment:
      MONGO_INITDB_DATABASE: ${DATABASE_NAME}
      MONGO_INITDB_ROOT_USERNAME: ${DATABASE_USERNAME}
      MONGO_INITDB_ROOT_PASSWORD: ${DATABASE_PASSWORD}
    # persist database data
    volumes:
      - db_data:/data/db
    # database port
    ports:
      - "${DATABASE_PORT}:${DATABASE_PORT}"
    # application network; each service's container joins this network
    networks:
      - nest-demo-network

# Docker networks
networks:
  # all containers connect through this network
  nest-demo-network:
    driver: bridge
# persisted volumes
volumes:
  db_data: {}

The .env file declares the parameters shown below; docker-compose.yaml takes its values from this file.

.env

NODE_ENV=development
SERVER_TIMEOUT=1080000
SERVER_PORT=3002

DATABASE_HOST=db
DATABASE_PORT=27017
DATABASE_USERNAME=root
DATABASE_PASSWORD=root
DATABASE_NAME=admin
DATABASE_TYPE=mongodb
DATABASE_CONNECTION_TIME_OUT=150000
DATABASE_ACQUIRE_TIME_OUT=150000
DATABASE_CONNECTION_LIMIT=20

I've left comments explaining the role of each line in both the Dockerfile and docker-compose.yaml. For more detail, have a look at the Docker documentation.
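
For reference, here is roughly how the application side might consume the DATABASE_* values from .env when configuring its TypeORM MongoDB connection. This is only a sketch, assuming the app loads .env (for example via dotenv or @nestjs/config); the file paths and the synchronize flag are assumptions, and your bootstrapped project may already wire this up differently.

app.module.ts

import { Module } from "@nestjs/common";
import { TypeOrmModule } from "@nestjs/typeorm";
import { Product } from "./product/entity/product.entity";

@Module({
  imports: [
    TypeOrmModule.forRoot({
      type: "mongodb",
      // "db" resolves to the mongo service name inside the Docker network
      host: process.env.DATABASE_HOST,
      port: Number(process.env.DATABASE_PORT),
      username: process.env.DATABASE_USERNAME,
      password: process.env.DATABASE_PASSWORD,
      database: process.env.DATABASE_NAME,
      entities: [Product],
      useUnifiedTopology: true,
      // create collections automatically in development
      synchronize: true,
    }),
  ],
})
export class AppModule {}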

After configuring Docker for our application, we can start the application with a single Docker command.

  • Run docker-compose up and the application starts successfully, as shown in the image below.

Our application is now running on Docker, and the products collection has been created in MongoDB.

Now we can create the two APIs that allow creating and updating a product stored in MongoDB.

Following the snippets below, we will implement the endpoints for creating and updating products.

create-product.dto.ts

The validation decorators defined in the CreateProductDto class let the create-product API automatically validate the input data coming from the client.

import { IsNotEmpty, IsString } from "class-validator";

export class CreateProductDto {
  @IsNotEmpty()
  @IsString()
  name: string;

  @IsNotEmpty()
  @IsString()
  description: string;

  @IsNotEmpty()
  @IsString()
  price: string;
}
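
Note that the class-validator decorators only take effect when a ValidationPipe is registered. If your bootstrapped application doesn't already enable it, a minimal sketch for main.ts looks like this (the whitelist option is my own addition, not required):

main.ts

import { NestFactory } from "@nestjs/core";
import { ValidationPipe } from "@nestjs/common";
import { AppModule } from "./app.module";

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  // reject request bodies that fail the CreateProductDto rules
  app.useGlobalPipes(new ValidationPipe({ whitelist: true }));
  await app.listen(process.env.SERVER_PORT || 3002);
}
bootstrap();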

product.service.interface.ts

import { CreateProductDto } from "../dto/create-product.dto";
import { Product } from "../entity/product.entity";

export interface ProductServiceInterface {
  create(productDto: CreateProductDto): Promise<Product>;

  update(productId: any, updateProduct: any): Promise<Product>;
}

product.service.ts

The product service references the product repository through its interface, i.e. by abstraction. I explained the repository pattern in the other tutorial mentioned above.

import { Inject, Injectable } from "@nestjs/common";
import { ProductRepositoryInterface } from "./interface/product.repository.interface";
import { ProductServiceInterface } from "./interface/product.service.interface";
import { CreateProductDto } from "./dto/create-product.dto";
import { Product } from "./entity/product.entity";

@Injectable()
export class ProductService implements ProductServiceInterface {
  constructor(
    @Inject("ProductRepositoryInterface")
    private readonly productRepository: ProductRepositoryInterface
  ) {}

  public async create(productDto: CreateProductDto): Promise<Product> {
    const product = new Product();
    product.name = productDto.name;
    product.description = productDto.description;
    product.price = productDto.price;
    return await this.productRepository.create(product);
  }

  public async update(productId: any, updateProduct: any): Promise<Product> {
    const product = await this.productRepository.findOneById(productId);
    product.name = updateProduct.name;
    product.description = updateProduct.description;
    product.price = updateProduct.price;
    return await this.productRepository.create(product);
  }
}
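
The ProductRepositoryInterface and its implementation are not shown in this post; they follow the generic repository approach from the other tutorial. As a rough sketch of the shape the service relies on, assuming a thin wrapper over TypeORM's MongoDB repository (the names mirror the calls above, but the details are assumptions):

product.repository.interface.ts

import { Product } from "../entity/product.entity";

export interface ProductRepositoryInterface {
  create(product: Product): Promise<Product>;

  findOneById(id: any): Promise<Product>;
}

product.repository.ts

import { Injectable } from "@nestjs/common";
import { InjectRepository } from "@nestjs/typeorm";
import { MongoRepository } from "typeorm";
import { ObjectID } from "mongodb";
import { ProductRepositoryInterface } from "./interface/product.repository.interface";
import { Product } from "./entity/product.entity";

@Injectable()
export class ProductRepository implements ProductRepositoryInterface {
  constructor(
    @InjectRepository(Product)
    private readonly repository: MongoRepository<Product>
  ) {}

  // save() inserts a new product or updates an existing one
  create(product: Product): Promise<Product> {
    return this.repository.save(product);
  }

  async findOneById(id: any): Promise<Product> {
    const product = await this.repository.findOne(new ObjectID(id));
    return product as Product;
  }
}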

product.controller.ts

import {
  Body,
  Controller,
  Inject,
  Param,
  Patch,
  Post
} from "@nestjs/common";
import { ProductServiceInterface } from "./interface/product.service.interface";
import { CreateProductDto } from "./dto/create-product.dto";
import { Product } from "./entity/product.entity";

@Controller("products")
export class ProductController {
  constructor(
    @Inject("ProductServiceInterface")
    private readonly productService: ProductServiceInterface
  ) {}

  @Post()
  public async create(@Body() productDto: CreateProductDto): Promise<Product> {
    return await this.productService.create(productDto);
  }

  @Patch("/:id")
  public async update(
    @Param("id") id: string,
    @Body() updateProduct: any
  ): Promise<Product> {
    return await this.productService.update(id, updateProduct);
  }
}
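
For the @Inject("ProductServiceInterface") and @Inject("ProductRepositoryInterface") tokens to resolve, the product module has to bind those string tokens to concrete classes as custom providers. A minimal sketch of that wiring (the repository class and the file paths are the assumptions from the sketch above):

product.module.ts

import { Module } from "@nestjs/common";
import { TypeOrmModule } from "@nestjs/typeorm";
import { Product } from "./entity/product.entity";
import { ProductController } from "./product.controller";
import { ProductService } from "./product.service";
import { ProductRepository } from "./product.repository";

@Module({
  // register the entity so @InjectRepository(Product) works
  imports: [TypeOrmModule.forFeature([Product])],
  controllers: [ProductController],
  providers: [
    { provide: "ProductServiceInterface", useClass: ProductService },
    { provide: "ProductRepositoryInterface", useClass: ProductRepository },
  ],
})
export class ProductModule {}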

Finally, we have implemented the two APIs for creating and updating a product.

Figures 1.2 and 1.3 illustrate calling the create-product and update-product APIs in Postman.

Figure 1.2: Creating a product via the product API

Figure 1.3: Update product A to become product A1
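
If you prefer code over Postman, the two calls look roughly like this (assuming axios is installed, the app listens on port 3002 as configured in .env, and there is no global route prefix):

import axios from "axios";

async function demo() {
  // Figure 1.2: create a product
  const { data: created } = await axios.post("http://localhost:3002/products", {
    name: "Product A",
    description: "A demo product",
    price: "100",
  });

  // Figure 1.3: update product A to become product A1
  const { data: updated } = await axios.patch(
    `http://localhost:3002/products/${created.id}`,
    { name: "Product A1", description: "An updated demo product", price: "120" }
  );

  console.log(updated);
}

demo();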

Our application now implements and exposes these two APIs for creating and updating the product collection. But our goal is a full-text search API that lets clients search for products matching their keywords.
The plan is to build a product search index with the Elastic stack: the index picks up data from the database and stores it as JSON documents in Elasticsearch.
That means every time we create or update a product, we also have to create or update the corresponding product document in Elasticsearch.
If we simply bolt an Elasticsearch call onto every insert and update, our code becomes tightly coupled and fragile.
We can solve this with the observer pattern: in practice, an event listener that reacts to product insert and update events.
In the next section we will implement the Elasticsearch service and the observer pattern in our application, and on top of them expose a product search API using the full-text technique.
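
To make the idea concrete before the final part: one way to implement the observer is a TypeORM entity subscriber that reacts to product inserts and updates and hands the document to an Elasticsearch service. This is only a rough sketch of the concept; the actual implementation, along with the Elasticsearch service itself, is covered in the final part.

import {
  EntitySubscriberInterface,
  EventSubscriber,
  InsertEvent,
  UpdateEvent,
} from "typeorm";
import { Product } from "./entity/product.entity";

@EventSubscriber()
export class ProductSubscriber implements EntitySubscriberInterface<Product> {
  // only listen to Product events
  listenTo() {
    return Product;
  }

  afterInsert(event: InsertEvent<Product>) {
    // index the new product document in Elasticsearch (final part)
  }

  afterUpdate(event: UpdateEvent<Product>) {
    // re-index the updated product document in Elasticsearch (final part)
  }
}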

What’s Next?

  • Implement the Elasticsearch service in the application
  • Implement the observer pattern, listening to database insert and update events and pushing the data into Elasticsearch
  • Implement an API that uses Elasticsearch to perform a full-text search for products in the system

Let’s have a look at the next section.
Thanks for reading!

The next section: https://medium.com/@phatdev/build-a-full-text-search-with-nestjs-mongodb-elasticsearch-and-docker-final-part-3ff13b93f447

Github source: https://github.com/phatvo21/nestjs-elastic-search-docker

List of content:

Part 1: https://medium.com/@phatdev/build-a-full-text-search-with-nestjs-mongodb-elasticsearch-and-docker-part-1-48449667507d

Final Part: https://medium.com/@phatdev/build-a-full-text-search-with-nestjs-mongodb-elasticsearch-and-docker-final-part-3ff13b93f447
