Running NestJS Queues in a Separate Process

Maintaining Dependency Injection and Scalability

Eamon O'Callaghan
S1SEVEN
4 min read · Sep 26, 2024


Multiple workers working alongside a main process.

Queues are crucial for handling long-running tasks like certificate notarization and email sending in distributed systems. At S1Seven, we rely heavily on queues. However, as the load on our systems grew, we encountered performance bottlenecks — our queue-heavy process began overloading and stopped accepting new requests.

By default, NestJS queues run in the same process as the application, which can cause performance issues if tasks become CPU-intensive. In this article, we’ll explore how to split queues into separate processes in NestJS while maintaining dependency injection, allowing for greater scalability and performance so your application can handle increased load smoothly.

The default NestJS approach

According to the current documentation at https://docs.nestjs.com/techniques/queues, when creating a NestJS app with a queue, the queue shares the same process as the application by default. That means that if you have CPU-intensive code in the queue, it could block the event loop, preventing your application from handling new requests.
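To see why this matters, here is a minimal sketch (not from the NestJS docs) of a CPU-bound function standing in for heavy job logic. While it spins, the event loop cannot service incoming HTTP requests, timers, or queue callbacks in the same process:

```typescript
// A synchronous, CPU-bound loop: nothing else on this process's event
// loop (HTTP handlers, timers, queue callbacks) can run until it returns.
function blockFor(ms: number): number {
  const end = Date.now() + ms;
  let spins = 0;
  while (Date.now() < end) {
    spins++; // busy-wait stands in for real CPU-heavy work
  }
  return spins;
}
```

If a job processor calls code like this in the same process as the web server, every concurrent request is stalled for the duration.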

Ideally, queues should be run on a separate process, allowing the main process to handle any incoming requests.

Splitting queues out to run on a separate process

The NestJS documentation touches on this point but only provides a few details. If you look at their example of a queue that runs on a separate process (reproduced below), you will notice that it is just a function rather than a class.

Here is what the NestJS documentation currently recommends:

// app.module.ts

import { Module } from '@nestjs/common';
import { BullModule } from '@nestjs/bullmq';
import { join } from 'path';

@Module({
  imports: [
    BullModule.forRootAsync({
      useFactory: () => ({
        connection: {
          url: process.env.REDIS_URL || 'redis://localhost:6379',
        },
      }),
    }),
    BullModule.registerQueue({
      name: 'audio',
      processors: [join(__dirname, 'processor.js')],
    }),
  ],
})
export class AppModule {}

// processor.ts

// Here we use bullmq instead of bull, as bull is now in maintenance mode
import { Job } from 'bullmq';

export default async function (job: Job): Promise<string> {
  // here you have access to the full job object
  return `processed job ${job.id}`;
}

The file path to this processor function is specified in the app.module.ts file. The advantage of this approach is that NestJS takes care of forking the process and running the file for you. However, because the function runs in a forked process, it doesn't share the memory space of the main app, including the DI container, so dependency injection is not available.

That means that instead of interfacing with @nestjs/bullmq, you have to interact directly with BullMQ. You also lose access to ConfigService and any other providers you may need in your application.

The recommended approach

A workaround is to create a new app with a new entry point, which can be run manually in parallel. This method allows you to use dependency injection and the @Processor decorator from NestJS, but it requires you to start the process manually.

By running the worker in a separate process, you’re adopting a microservice pattern. This allows you to scale worker processes independently from your main application and isolate failures.

Platforms like Heroku and Docker make it easy to run separate processes for workers and web servers by letting you specify different entry points or commands. For instance, Heroku's Procfile can define a worker process, and Docker Compose can define multiple services.
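As a sketch, a Heroku Procfile for this setup might look like the following (the dist/ paths assume a standard Nest build output):

```
# Procfile
web: node dist/main.js
worker: node dist/worker.js
```

A Docker Compose file would do the same thing with two services sharing one image but running different commands, alongside a redis service.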

// worker.ts

import { NestFactory } from '@nestjs/core';
import { WorkerModule } from './worker/worker.module';

async function bootstrap() {
  // The worker doesn't serve HTTP traffic, so app.init() is enough;
  // listening on the same PORT as the main app would cause a conflict.
  const app = await NestFactory.create(WorkerModule);
  await app.init();
}
bootstrap();

// worker.module.ts

import { Module } from '@nestjs/common';
import { BullModule } from '@nestjs/bullmq';

import { AppService } from '../app/app.service';
import { AudioConsumer } from './worker.processor';

@Module({
  imports: [
    BullModule.registerQueue({
      name: 'audio',
      connection: {
        url: process.env.REDIS_URL || 'redis://localhost:6379',
      },
    }),
  ],
  providers: [AppService, AudioConsumer],
})
export class WorkerModule {}

// worker.processor.ts

import { Processor, WorkerHost } from '@nestjs/bullmq';
import { Job } from 'bullmq';

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

@Processor('audio')
export class AudioConsumer extends WorkerHost {
  async process(job: Job<any, any, string>): Promise<any> {
    // eslint-disable-next-line no-console
    console.log('worker', process.pid);

    // This is an example job that just slowly reports on progress
    // while doing no work. Replace this with your own job logic.
    let progress = 0;

    while (progress < 100) {
      await sleep(50);
      progress += 1;
      await job.updateProgress(progress);
    }

    // A job can return values that will be stored in Redis as JSON.
    // This return value is unused in this demo application.
    return { value: 'This will be stored' };
  }
}

You should also update the BullModule.registerQueue section in app.module.ts, removing the processors property:

BullModule.registerQueue({
  name: 'audio',
}),

Using the above example, you can compile the TypeScript to JavaScript, and run node main.js to start the main process and node worker.js to start the worker.

Because the audio queue is registered in both the main application and the worker application, the queue is shared, and any jobs created in the main app will be processed by the worker app.
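As an illustrative sketch (not taken from the article's repo), a service in the main app can enqueue jobs with @InjectQueue; the worker's AudioConsumer then picks them up from Redis. The 'transcode' job name and payload here are hypothetical:

```typescript
// app.service.ts — runs in the main process; it only adds jobs to the
// shared queue and never processes them itself.
import { Injectable } from '@nestjs/common';
import { InjectQueue } from '@nestjs/bullmq';
import { Queue } from 'bullmq';

@Injectable()
export class AppService {
  constructor(@InjectQueue('audio') private readonly audioQueue: Queue) {}

  async enqueueTranscode(fileId: string) {
    // The job is written to Redis; the worker process consumes it.
    await this.audioQueue.add('transcode', { fileId });
  }
}
```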

Conclusion

Running NestJS queues in a separate process resolves performance bottlenecks and provides flexibility in scaling worker processes independently. By isolating long-running tasks, your application can remain responsive to incoming requests while still managing queues efficiently.

At S1Seven, adopting this approach allowed us to solve our queue overloading issues and scale quickly. If you’re facing similar challenges, consider implementing this in your project.

See https://github.com/eamon0989/nestjs-worker-queue for a working example. It showcases how to quickly set up and run worker processes alongside a main NestJS application.

I hope you’ve found this article helpful. Check out some of my other articles on development here.



I’m a Software Engineer working at S1Seven. I mainly work with NestJS and TypeScript. I enjoy sharing what I’m learning by writing about it here!