Learning from Legacy: Transforming Callback Hell in Node.js gRPC

From optimizing microservices communication to using promises instead of callbacks, let’s explore practical insights on using gRPC with Node.js

Michael Truong
Cloud Native Daily
9 min read · Aug 3, 2023


As I was trying to learn about gRPC in Node.js and how it communicates with microservices, I realized my self-taught JavaScript foundation was not the best.

Here’s what I learned:

1. Use gRPC to communicate with microservices

gRPC is an RPC framework commonly used in microservice architectures because it is designed for high-performance communication between internal services.

Various benchmarks report gRPC being anywhere from 5 to 8 times faster than REST+JSON communication.

At its core, gRPC allows you to define a service using Protocol Buffers (protobufs), which is a language-agnostic, compact, and efficient binary serialization format.

We prefer to use gRPC over traditional HTTP for internal microservice communication due to its inherent speed advantages. By utilizing Protocol Buffers as its serialization format, gRPC significantly reduces the data size compared to JSON’s textual representation, resulting in more efficient use of bandwidth within our internal network.

Take a look at this mini food ordering system architecture:

https://blog.logrocket.com/wp-content/uploads/2023/02/demo-app-high-level-architecture.png

Here is the recipes.proto file defining the communication between the main microservice and the recipe selector.

syntax = "proto3";

service Recipes {
  rpc Find (ProductId) returns (Recipe) {}
}

message ProductId {
  uint32 id = 1;
}

message Recipe {
  uint32 id = 1;
  string title = 2;
  string notes = 3;
}

This Protocol Buffers (.proto) file defines two messages, ProductId and Recipe, along with a service, Recipes.

The Recipes service contains an RPC method named Find that takes a ProductId message as input and returns a Recipe message.

The ProductId message has a single field, id, while the Recipe message includes three fields: id, title, and notes.

Sequential numbering simplifies the parsing process and enables a more efficient binary representation.

In the Protocol Buffers (.proto) file, the numbers associated with each field (e.g., id = 1, title = 2, notes = 3) are known as field numbers. These field numbers are used to uniquely identify the fields in the serialized data. The order in which these field numbers are assigned (1, 2, 3, and so on) matters because it affects backward and forward compatibility.

Forward compatibility in the context of Protocol Buffers refers to the ability of newer versions of the data format to be understood and processed correctly by older parsers.

Backward compatibility, on the other hand, refers to the ability of older versions of the data format to be correctly interpreted and handled by newer parsers.

Using consecutive field numbers (1, 2, 3, and so on) keeps the structure simple and the binary encoding compact. However, when anticipating future changes to the data structure, it’s common to leave gaps between field numbers. Leaving gaps (e.g., using 1, 5, 10, 20) ensures that new fields can be added later without renumbering existing fields, preserving compatibility with older data formats. This approach offers greater flexibility for evolving data schemas.
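To make this concrete, here is a hypothetical evolution of the Recipe message (not from the article) that leaves gaps between field numbers and uses proto3’s reserved keyword so a removed field’s number is never reused:

```protobuf
syntax = "proto3";

// Hypothetical evolved Recipe message: gaps between field numbers
// leave room for future additions, and `reserved` prevents a removed
// field's number (and name) from being reused by accident.
message Recipe {
  reserved 3;        // the old `notes` field was removed
  reserved "notes";

  uint32 id = 1;
  string title = 2;
  string chefName = 10;        // added later, using a pre-planned gap
  uint32 prepTimeMinutes = 20; // likewise
}
```

Old parsers simply skip the unknown field numbers 10 and 20, which is what makes this kind of schema evolution safe.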

2. Stop throwing errors in callbacks

Say this is the recipe microservice:

const path = require("path");
const grpc = require("@grpc/grpc-js");
const protoLoader = require("@grpc/proto-loader");

const packageDefinition = protoLoader.loadSync(
  path.join(__dirname, "../protos/recipes.proto")
);
const recipesProto = grpc.loadPackageDefinition(packageDefinition);

const RECIPES = [
  {
    id: 100,
    productId: 1000,
    title: "Pizza",
    notes: "See video: pizza_recipe.mp4. Use oven No. 12",
  },
  {
    id: 200,
    productId: 2000,
    title: "Lasagna",
    notes: "Ask from John. Use any oven, but make sure to pre-heat it!",
  },
];

function findRecipe(call, callback) {
  const recipe = RECIPES.find((recipe) => recipe.productId == call.request.id);
  if (recipe) {
    callback(null, recipe);
  } else {
    callback({
      message: "Recipe not found",
      code: grpc.status.INVALID_ARGUMENT,
    });
  }
}

const server = new grpc.Server();
server.addService(recipesProto.Recipes.service, { find: findRecipe });
server.bindAsync(
  "0.0.0.0:50051",
  grpc.ServerCredentials.createInsecure(),
  () => {
    server.start();
  }
);

The recipe microservice sets up a gRPC server that loads the recipes.proto file, defines a list of recipes, and implements a function to find a recipe by product ID. The server responds to gRPC requests on port 50051, either returning the requested recipe or an error if the recipe is not found.

And this is the main service:

const path = require("path");
const grpc = require("@grpc/grpc-js");
const protoLoader = require("@grpc/proto-loader");
const express = require("express");

const packageDefinitionReci = protoLoader.loadSync(
  path.join(__dirname, "../protos/recipes.proto")
);
const packageDefinitionProc = protoLoader.loadSync(
  path.join(__dirname, "../protos/processing.proto")
);
const recipesProto = grpc.loadPackageDefinition(packageDefinitionReci);
const processingProto = grpc.loadPackageDefinition(packageDefinitionProc);

const recipesStub = new recipesProto.Recipes(
  "0.0.0.0:50051",
  grpc.credentials.createInsecure()
);
const processingStub = new processingProto.Processing(
  "0.0.0.0:50052",
  grpc.credentials.createInsecure()
);

const app = express();
app.use(express.json());

const restPort = 5000;
let orders = {};

function processAsync(order) {
  recipesStub.find({ id: order.productId }, (err, recipe) => {
    if (err) return;

    orders[order.id].recipe = recipe;
    const call = processingStub.process({
      orderId: order.id,
      recipeId: recipe.id,
    });
    call.on("data", (statusUpdate) => {
      orders[order.id].status = statusUpdate.status;
    });
  });
}

app.post("/orders", (req, res) => {
  if (!req.body.productId) {
    res.status(400).send("Product identifier is not set");
    return;
  }
  let orderId = Object.keys(orders).length + 1;
  let order = {
    id: orderId,
    status: 0,
    productId: req.body.productId,
    createdAt: new Date().toLocaleString(),
  };
  orders[order.id] = order;
  processAsync(order);
  res.send(order);
});

app.get("/orders/:id", (req, res) => {
  if (!req.params.id || !orders[req.params.id]) {
    res.status(400).send("Order not found");
    return;
  }
  res.send(orders[req.params.id]);
});

app.listen(restPort, () => {
  console.log(`RESTful API is listening on port ${restPort}`);
});

The main service configures an Express.js server as a REST API, listening on port 5000, that communicates with the recipe microservice over gRPC through recipes.proto. It offers endpoints for creating new orders and retrieving order details by ID, and it processes orders by fetching the corresponding recipe and updating the order status via gRPC streaming.

But as written, the code still responds as if the food order succeeded even when the recipes microservice cannot find the recipe.

Unhelpful response when productId does not map to a recipe

As a result, we fix this so that it sends back a response indicating the recipe doesn’t exist for the given productId.

Here’s what I tried first:

function processAsync(order) {
  try {
    recipesStub.find({ id: order.productId }, (err, recipe) => {
      if (err) throw err; // added throw in callback

      orders[order.id].recipe = recipe;
      const call = processingStub.process({
        orderId: order.id,
        recipeId: recipe.id,
      });
      call.on("data", (statusUpdate) => {
        orders[order.id].status = statusUpdate.status;
      });
    });
  } catch {
    return err.message;
  }
}

But I got this error!

My thought process was that if err was truthy, it would throw the error into the catch block.

However, by the time the callback runs, the try-catch block has long since gone out of scope: asynchronous code (waiting in the callback queue) only executes after all the synchronous code on the stack has finished.
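A minimal, self-contained sketch makes this visible (here setTimeout stands in for the async gRPC call; none of these names come from the article):

```javascript
// The try/catch below exits before the callback ever runs, so it
// can never catch an error thrown inside that callback.
let caughtSynchronously = false;

try {
  setTimeout(() => {
    // Throwing here would NOT reach the catch block below --
    // it would become an uncaught exception and crash the process.
  }, 0);
} catch (err) {
  caughtSynchronously = true; // never reached for callback errors
}

console.log("caught:", caughtSynchronously); // prints "caught: false"
```

By the time the timer fires, the try/catch has already completed without error, which is exactly what happened with the gRPC callback above.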

3. Async/Await, Promises > Callbacks

As a result, I figured: why not just await the gRPC call as a promise? That way we could avoid callbacks altogether and wrap everything in a try-catch.

Sadly… gRPC does not support promises! I looked everywhere in the gRPC documentation, but it was all callback-based. Even though I found some third-party packages that could promisify the gRPC functions, I wanted to see if I could do it the way it was intended.

Handle exceptions using callbacks:

function processAsync(order, callback) {
  recipesStub.find({ id: order.productId }, (err, recipe) => {
    if (err) {
      return callback(err); // Pass the error to the callback to handle it in the /orders route
    }

    orders[order.id].recipe = recipe;
    const call = processingStub.process({
      orderId: order.id,
      recipeId: recipe.id,
    });
    call.on("data", (statusUpdate) => {
      orders[order.id].status = statusUpdate.status;
    });

    // Call the callback without an error once the order processing is successful
    callback(null);
  });
}

app.post("/orders", (req, res) => {
  if (!req.body.productId) {
    res.status(400).send("Product identifier is not set");
    return;
  }
  let orderId = Object.keys(orders).length + 1;
  let order = {
    id: orderId,
    status: 0,
    productId: req.body.productId,
    createdAt: new Date().toLocaleString(),
  };
  orders[order.id] = order;
  processAsync(order, (err) => {
    if (err) {
      res.status(400).send(err.message); // Send an appropriate error message
    } else {
      res.send(order); // Send the response only when the processing is successful
    }
  });
});

In the code above, I created another parameter in the processAsync function in order to pass a callback from the /orders route to send the response accordingly.

If the argument passed was falsy, meaning there was no error and the recipe exists, it would send the response that the order went through. If the argument passed was truthy, meaning there was an error and the recipe for that productId does not exist, it would send back an error response instead.

Now, I wanted to see if I could promisify the gRPC call on my own.

Handle exceptions using Async/Await and Promises:

function findRecipeAsync(order) {
  return new Promise((resolve, reject) => {
    recipesStub.find({ id: order.productId }, (err, recipe) => {
      if (err) {
        reject(err);
      } else {
        resolve(recipe);
      }
    });
  });
}

// async so we can await the promise resolving/rejecting
async function processAsync(order) {
  try {
    const recipe = await findRecipeAsync(order);
    orders[order.id].recipe = recipe; // attach the recipe to the stored order
    const call = processingStub.process({
      orderId: order.id,
      recipeId: recipe.id,
    });
    call.on("data", (statusUpdate) => {
      orders[order.id].status = statusUpdate.status;
    });
  } catch (err) {
    console.error(err.message);
    throw err; // re-throw so the /orders route can respond with the error
  }
}

app.post("/orders", async (req, res) => {
  if (!req.body.productId) {
    res.status(400).send("Product identifier is not set");
    return;
  }

  let orderId = Object.keys(orders).length + 1;
  let order = {
    id: orderId,
    status: 0,
    productId: req.body.productId,
    createdAt: new Date().toLocaleString(),
  };
  orders[order.id] = order;

  try {
    await processAsync(order);
    res.send(order);
  } catch (err) {
    res.status(400).send(err.message);
  }
});

In the code above, I moved the find-recipe gRPC call into a separate function and wrapped it in a Promise. If the recipe is found, I resolve the promise; if it isn’t, I reject it. This way, the error can be caught by the try-catch block.

Here’s the output for both:

The correct response when productId does not map to a recipe

It’s important to note that when working with modern database libraries in JavaScript that support asynchronous operations, you typically don’t need to manually return Promises because these libraries are designed to return Promises natively. This design allows you to use async/await or Promise-based syntax directly, making asynchronous code more manageable and easier to read.

For me, I’ve rarely ever had to return Promises, so it was good for me to strengthen my JavaScript foundations.

In my opinion, use async/await and promises over callbacks because:

  • Using callbacks can make the control flow unclear and hard to debug
  • Two words. Callback hell
  • async/await allows for asynchronous code to be written synchronously

Also, somewhat unrelated, but I prefer try/catch with await over .then().catch() because of its sequential code flow and readability.

Share your thoughts in the comments :)
