Parcel Tracking Application With RabbitMQ

Arda Örkin
Apr 19 · 12 min read

In the development process, besides specifying which technologies are going to be used, it is important to determine what the project architecture is going to be. These days, the most common choice is the microservice architecture. One of the basic features of microservice architecture is that every service can be deployed separately. Thanks to this feature, every single service can use a different protocol, which means one project can contain HTTP, AMQP, and WS connection protocols at the same time.

In this article, we are going to build a service that uses HTTP, AMQP, and WS together. In this service we are going to use these technologies:

We will use the Express.js library to handle HTTP requests. When HTTP requests are received, we will execute publishers which publish a message about the event to the RabbitMQ message broker. We will also create consumers that listen for the messages sent to the RabbitMQ message broker by the publishers and save them to the MongoDB database. Finally, we will use Socket.IO in order to track every change in MongoDB and render those changes, without refreshing the page, in a frontend built with React.

Here is the final code of the project

Requirements

  • Node.js v12+

Building Development Environment

Server

  • First of all, let’s create a folder called parcel-tracking-system. In that folder, let’s create one more folder called server. Then open up a terminal, go to the server directory you just created, and run npm init -y.

Then go back to the terminal, and to install the project dependencies, run this command inside the server folder:

npm i express dotenv tortoise mongoose socket.io nodemon

To ensure the command was executed successfully and the packages were installed, look at the message returned in the terminal, and also open up the package.json file and check the dependencies section.

Also, let’s install the development environment dependencies. We need Babel so that we can use ES module syntax with Node.js; the following packages provide the @babel/env preset and the babel-node runner used in the commands below:

npm i -D @babel/core @babel/node @babel/preset-env

Later on, create a file named .babelrc and write this code in it:

{
  "presets": ["@babel/env"]
}

Now let’s create one more file named server.js and write this code in it:

//server.js
import express from "express"

const app = express()

app.use("/", (req, res) => {
  res.send("Welcome to parcel tracking system")
})

app.listen(8000, () => console.log(`Server listening on 8000`))

Execute server.js with this command:

nodemon ./server --exec babel-node -e js

After the execution, open localhost:8000 in a browser. If the “Welcome to parcel tracking system” message comes up on the page, it means the Express.js setup was successful.

Publisher and Consumer Creation

In these steps, we are going to create our first publisher and consumer. The publisher publishes a message to an exchange inside the RabbitMQ message broker, the exchange routes the message to a message queue, and the consumer that listens for the messages it cares about takes the message from the queue and does whatever we want with it.

For now, we will log the message we receive via the consumer to the console. To do that, let’s create two folders named publishers and consumers under the folder called server. In the publishers folder, create a file named shippingPublisher.js and in the consumers folder, create a file named shippingConsumer.js.

Write this code down in the shippingPublisher.js file:

//shippingPublisher.js
import Tortoise from "tortoise"
import dotenv from "dotenv";

dotenv.config()

const tortoise = new Tortoise(process.env.AMQP_URL)

tortoise
  .exchange("parcel-tracking", "topic", { durable: false })
  .publish("parcel.shipping", { name: "test", status: "shipping" });

Now, let’s write the consumer which will receive the message sent to the message broker by the publisher. To do this, write this code down in the shippingConsumer.js file:

//shippingConsumer.js
import Tortoise from "tortoise"
import dotenv from "dotenv";

dotenv.config()

const tortoise = new Tortoise(process.env.AMQP_URL)

tortoise
  .queue("", { durable: false })
  .exchange("parcel-tracking", "topic", "*.shipping", { durable: false })
  .prefetch(1)
  .json()
  .subscribe((msg, ack, nack) => {
    console.log(msg)
    ack();
  });

We could install RabbitMQ on our local machine, but in that case the installation steps would differ from operating system to operating system and we would need to deal with some network settings. Therefore, we will do this step with cloudamqp.com. Let’s create an account on cloudamqp.com. Later on, click the Create New Instance button and create a new message broker instance. We can name the instance as we wish. For the free plan, choose the Little Lemur option. Then press the buttons named Select Region > Review > Create Instance respectively. Go to the page where the message brokers are listed and click on the name of the message broker instance we just created. Copy the value of AMQP URL in the Details section of the page that comes up. After applying these steps, go back to the server folder and create a file named .env. Write this line down in the .env file:

AMQP_URL="<copied_amqp_url>"

With this process, we have created an AMQP service. Now it is time to run our publisher and consumer. To do this, run these commands in two separate terminals:

nodemon ./consumers/shippingConsumer --exec babel-node -e js          
nodemon ./publishers/shippingPublisher --exec babel-node -e js

After applying these steps, we should see this message in the terminal where shippingConsumer.js is running:

{ name: 'test', status: 'shipping' }

At this point we have a working service! This service uses AMQP as its communication protocol. The service has two ends: one is a publisher, and the other is a consumer. The publisher sends a message to the message broker; after that, the publisher’s job is done and it does not wait for any reply. The consumer only cares about the message’s routing key. In this case, the consumer cares about messages whose routing key ends with shipping. The consumer needs nothing except the routing key to decide which messages to handle.
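The binding pattern *.shipping used above is a topic pattern: * matches exactly one dot-separated word of the routing key. The matching rule can be sketched with a small stand-alone function (illustrative only; this is not RabbitMQ’s actual implementation, and RabbitMQ additionally supports #, which matches zero or more words and is not covered here):

```javascript
// Illustrative sketch of how a topic exchange matches a binding pattern
// against a routing key ("*" matches exactly one dot-separated word).
function topicMatches(pattern, routingKey) {
  const patternParts = pattern.split(".");
  const keyParts = routingKey.split(".");
  if (patternParts.length !== keyParts.length) return false;
  return patternParts.every((part, i) => part === "*" || part === keyParts[i]);
}

console.log(topicMatches("*.shipping", "parcel.shipping"));  // true
console.log(topicMatches("*.shipping", "parcel.delivered")); // false
console.log(topicMatches("*.shipping", "a.b.shipping"));     // false: "*" is one word only
```

This is why the consumer above receives parcel.shipping messages but would ignore parcel.delivered ones.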

In the next step, we are going to publish messages when an HTTP request is received.

Handling HTTP Requests

Let’s create two more publishers and two more consumers, again under the server folder. Create these files in the publishers folder:

  • onroadPublisher.js
  • deliveredPublisher.js

And create these files in the consumers folder:

  • onroadConsumer.js
  • deliveredConsumer.js

Now we are going to write all publisher files as Promises:

//shippingPublisher.js
import Tortoise from "tortoise";
import dotenv from "dotenv";

dotenv.config();

const tortoise = new Tortoise(process.env.AMQP_URL);

const shippingPublisher = (name) =>
  new Promise((resolve, reject) => {
    tortoise
      .exchange("parcel-tracking", "topic", { durable: false })
      .publish("parcel.shipping", { name, status: "shipping" });
    resolve({ name, status: "shipping" });
  });

export default shippingPublisher;

//onroadPublisher.js
import Tortoise from "tortoise";
import dotenv from "dotenv";

dotenv.config();

const tortoise = new Tortoise(process.env.AMQP_URL);

const onroadPublisher = (name) =>
  new Promise((resolve, reject) => {
    tortoise
      .exchange("parcel-tracking", "topic", { durable: false })
      .publish("parcel.onroad", { name, status: "onroad" });
    resolve({ name, status: "onroad" });
  });

export default onroadPublisher;

//deliveredPublisher.js
import Tortoise from "tortoise";
import dotenv from "dotenv";

dotenv.config();

const tortoise = new Tortoise(process.env.AMQP_URL);

const deliveredPublisher = (name) =>
  new Promise((resolve, reject) => {
    tortoise
      .exchange("parcel-tracking", "topic", { durable: false })
      .publish("parcel.delivered", { name, status: "delivered" });
    resolve({ name, status: "delivered" });
  });

export default deliveredPublisher;

Later on, let’s code consumers:

//shippingConsumer.js
import Tortoise from "tortoise";
import dotenv from "dotenv";

dotenv.config();

const tortoise = new Tortoise(process.env.AMQP_URL);

tortoise
  .queue("", { durable: false })
  .exchange("parcel-tracking", "topic", "*.shipping", { durable: false })
  .prefetch(1)
  .json()
  .subscribe((msg, ack, nack) => {
    console.log(msg)
    ack();
  });

//onroadConsumer.js
import Tortoise from "tortoise";
import dotenv from "dotenv";

dotenv.config();

const tortoise = new Tortoise(process.env.AMQP_URL);

tortoise
  .queue("", { durable: false })
  .exchange("parcel-tracking", "topic", "*.onroad", { durable: false })
  .prefetch(1)
  .json()
  .subscribe((msg, ack, nack) => {
    console.log(msg)
    ack();
  });

//deliveredConsumer.js
import Tortoise from "tortoise";
import dotenv from "dotenv";

dotenv.config();

const tortoise = new Tortoise(process.env.AMQP_URL);

tortoise
  .queue("", { durable: false })
  .exchange("parcel-tracking", "topic", "*.delivered", { durable: false })
  .prefetch(1)
  .json()
  .subscribe((msg, ack, nack) => {
    console.log(msg)
    ack();
  });

After these configurations, as the server receives HTTP requests, publishers will be executed and as these publishers send messages to the message broker, our consumers are going to listen to these messages.

Now let’s create the HTTP routes. For the sake of best practice and clean code, create a folder named routes under the server folder and create a file named index.js in it. Then write these lines down in the index.js file:

//index.js
import { Router } from "express";
import shippingPublisher from "../publishers/shippingPublisher";
import onroadPublisher from "../publishers/onroadPublisher";
import deliveredPublisher from "../publishers/deliveredPublisher";

const router = Router();

router.get("/", (req, res) => {
  res.send("Welcome to parcel-tracking system");
});

router.get("/shipping/:name", async (req, res, next) => {
  const name = req.params.name;
  const message = await shippingPublisher(name);
  res.json(message);
});

router.get("/onroad/:name", async (req, res, next) => {
  const name = req.params.name;
  const message = await onroadPublisher(name);
  res.json(message);
});

router.get("/delivered/:name", async (req, res, next) => {
  const name = req.params.name;
  const message = await deliveredPublisher(name);
  res.json(message);
});

export default router;

After creating the HTTP routes, let’s modify the whole server.js file like this:

//server.js
import express from "express";
import router from "./routes";

const app = express();
const port = process.env.PORT || 8000;

app.use(router);

app.listen(port, () => console.log(`Server listening on port ${port}`));

Later on, go to the server folder and run these commands, each in its own terminal:

nodemon ./server --exec babel-node -e js            
nodemon ./consumers/shippingConsumer --exec babel-node -e js
nodemon ./consumers/onroadConsumer --exec babel-node -e js
nodemon ./consumers/deliveredConsumer --exec babel-node -e js

After all commands are running, open these links up in a browser respectively (the :name parameter can be anything; here we use test):

  • localhost:8000/shipping/test
  • localhost:8000/onroad/test
  • localhost:8000/delivered/test

As we open up the links, we should see the messages that go to the message broker printed in the terminal in JSON format.

After checking both the terminals where the consumers are running and the JSON data returned in the browser, we can be sure that the publishers and consumers run as the server receives HTTP requests.

In the next step, we are going to save the messages to MongoDB.

MongoDB Configuration

To configure MongoDB, let’s create an account on mongodb.com/cloud. Later on, create an organization, a project in the organization, and a cluster in the project. After the cluster is created, on the cluster’s page, click Database Access. Click the ADD NEW DATABASE USER option there and create a database user. Later on, click the Clusters option on the left bar. On the page that opens, click the Connect button. In the Setup connection security step, choose the Allow Access from Anywhere option and click Choose a connection method. Then let’s click the Connect your application option and copy the connection string under Add your connection string into your application code. Let’s go back to the text editor, create a variable named MONGODB_URL in the .env file, and assign the MongoDB connection string to the variable. Replace the password placeholder with the password we just created in the Database Access step and change myFirstDatabase to parceltracking:

MONGODB_URL="mongodb+srv://username:12345@cluster0.mjh9d.mongodb.net/parceltracking?retryWrites=true&w=majority"
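The connection string packs the database user, password, host, database name, and options into a single URL. As a quick sanity check (purely illustrative; the credentials and host below are fake placeholders), Node’s built-in URL class can take it apart:

```javascript
// Illustrative: dissecting a MongoDB connection string with Node's built-in URL class.
// The credentials and host below are fake placeholders.
const url = new URL(
  "mongodb+srv://username:12345@cluster0.example.mongodb.net/parceltracking?retryWrites=true&w=majority"
);

console.log(url.username);              // "username"       -> database user
console.log(url.hostname);              // cluster host
console.log(url.pathname.slice(1));     // "parceltracking" -> database name
console.log(url.searchParams.get("w")); // "majority"       -> write concern option
```

If the database name printed here is still myFirstDatabase, the string was not updated as described above.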

After these steps, go to the server.js file and write these lines down just after the line where the routes folder is imported:

//server.js
import mongoose from "mongoose"
import dotenv from "dotenv"

dotenv.config()

mongoose.connect(process.env.MONGODB_URL, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});

const db = mongoose.connection;
db.on("error", console.error.bind(console, "connection error:"));
db.once("open", () => console.log("Connected to database"));

With these lines, we opened up a MongoDB connection. To be sure that everything is okay, let’s run server.js with this command:

nodemon ./server --exec babel-node -e js

If we see the message Connected to database in the console of our terminal, it means we did the database configuration correctly. In the next step, we are going to create a MongoDB schema and model. To do that, let’s create a folder named model in the server folder. In the model folder, create a file named Tracking.js and write this code down:

//Tracking.js
import mongoose from "mongoose";

const trackingSchema = new mongoose.Schema({
  name: String,
  status: String,
});

const Track = mongoose.model("Track", trackingSchema);

export default Track;

After creating the model, we will use it in the consumers. In shippingConsumer.js we will use the save() method from the mongoose library, while the updateOne() method will be used in the other consumers. First of all, let’s modify the shippingConsumer.js file like this:

//shippingConsumer.js
import Tortoise from "tortoise";
import mongoose from "mongoose"
import Track from "../model/Tracking";
import dotenv from "dotenv"

dotenv.config()

mongoose.connect(process.env.MONGODB_URL, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});

const db = mongoose.connection;
db.on("error", console.error.bind(console, "connection error:"));
db.once("open", () => console.log("Connected to database"));

const tortoise = new Tortoise(process.env.AMQP_URL);

tortoise
  .queue("", { durable: false })
  .exchange("parcel-tracking", "topic", "*.shipping", { durable: false })
  .prefetch(1)
  .json()
  .subscribe((msg, ack, nack) => {
    const newParcel = new Track(msg);
    newParcel.save((err, parcel) => {
      if (err) throw err;
      console.log("shipped parcel:", parcel);
    });
    ack();
  });

In this way, shippingConsumer.js will create a new record in MongoDB. Now it is time to update this record. To do that, let’s modify onroadConsumer.js and deliveredConsumer.js like this:

//onroadConsumer.js
import Tortoise from "tortoise";
import mongoose from "mongoose"
import Track from "../model/Tracking";
import dotenv from "dotenv"

dotenv.config()

mongoose.connect(process.env.MONGODB_URL, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});

const db = mongoose.connection;
db.on("error", console.error.bind(console, "connection error:"));
db.once("open", () => console.log("Connected to database"));

const tortoise = new Tortoise(process.env.AMQP_URL);

tortoise
  .queue("", { durable: false })
  .exchange("parcel-tracking", "topic", "*.onroad", { durable: false })
  .prefetch(1)
  .json()
  .subscribe(async (msg, ack, nack) => {
    // Use the promise form only; mixing await with a callback
    // would make mongoose execute the query twice.
    const onroadParcel = await Track.updateOne(
      { name: msg.name },
      { status: msg.status }
    );
    console.log("parcel is on road:", onroadParcel);
    ack();
  });

//deliveredConsumer.js
import Tortoise from "tortoise";
import mongoose from "mongoose"
import Track from "../model/Tracking";
import dotenv from "dotenv"

dotenv.config()

mongoose.connect(process.env.MONGODB_URL, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});

const db = mongoose.connection;
db.on("error", console.error.bind(console, "connection error:"));
db.once("open", () => console.log("Connected to database"));

const tortoise = new Tortoise(process.env.AMQP_URL);

tortoise
  .queue("", { durable: false })
  .exchange("parcel-tracking", "topic", "*.delivered", { durable: false })
  .prefetch(1)
  .json()
  .subscribe(async (msg, ack, nack) => {
    const deliveredParcel = await Track.updateOne(
      { name: msg.name },
      { status: msg.status }
    );
    console.log("parcel was delivered:", deliveredParcel);
    ack();
  });
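These consumers apply whatever status arrives, in whatever order it arrives. As a side note, the parcel lifecycle they imply (shipping, then onroad, then delivered) could be captured in a small helper; this is only an illustrative sketch, not part of the project code:

```javascript
// Illustrative sketch (not in the project code): the parcel lifecycle
// implied by the three consumers, expressed as a transition check.
const NEXT_STATUS = {
  shipping: "onroad",
  onroad: "delivered",
  delivered: null, // terminal state, no further transitions
};

function isValidTransition(current, next) {
  return NEXT_STATUS[current] === next;
}

console.log(isValidTransition("shipping", "onroad"));    // true
console.log(isValidTransition("shipping", "delivered")); // false: skips a step
console.log(isValidTransition("delivered", "onroad"));   // false: already terminal
```

A guard like this before calling updateOne() would protect the record against out-of-order or duplicated messages, which AMQP does not rule out by itself.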

After these configurations are done, run these commands, each in its own terminal:

nodemon ./server --exec babel-node -e js            
nodemon ./consumers/shippingConsumer --exec babel-node -e js
nodemon ./consumers/onroadConsumer --exec babel-node -e js
nodemon ./consumers/deliveredConsumer --exec babel-node -e js

If you see Connected to database in all terminals, it means you completed all the steps successfully. Now open the links below in a browser in order to test all database queries, and check the MongoDB database each time:

  • localhost:8000/shipping/test
  • localhost:8000/onroad/test
  • localhost:8000/delivered/test

Click the Collections tab on the Cluster page in MongoDB Cloud. On the page that comes up, click the collection named tracks under the parceltracking database.

If our queries are correct, each time shippingConsumer handles a message, a new record will be added to MongoDB, and each time the other consumers handle their messages, the record will be updated. To see the changes in the record, you will need to click the Refresh button on the Collections page.

Now it is time to use Web Socket in order to render the changes in the records.

Rendering Real-time Data With WebSocket

To render real-time data with WebSocket, we are going to use the Socket.IO library. Each time a consumer makes a change in the database, the new records will be rendered on the front page of the application without refreshing the page. To do this, let’s create a folder named socket in the server folder and add a file named trackerSocket.js in it. Then write these lines down in the file:

//trackerSocket.js
import socketIo from "socket.io";
import express from "express";
import http from "http";
import mongoose from "mongoose";
import Track from "../model/Tracking";
import dotenv from "dotenv";

dotenv.config();

const port = process.env.WS_PORT || 8001;

mongoose.connect(process.env.MONGODB_URL, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});

const db = mongoose.connection;
db.on("error", console.error.bind(console, "connection error:"));
db.once("open", () => console.log("Connected to database"));

const app = express();
const server = http.createServer(app);
const io = socketIo(server, {
  cors: {
    origin: "*",
    methods: ["GET", "POST"],
  },
});

let interval;

const findParcel = async (socket) => {
  const parcel = await Track.find({});
  socket.emit("parcel", parcel);
};

io.on("connection", (socket) => {
  console.log("New client connected");
  if (interval) {
    clearInterval(interval);
  }
  interval = setInterval(() => findParcel(socket), 1000);
  socket.on("disconnect", () => {
    console.log("Client disconnected");
    clearInterval(interval);
  });
});

server.listen(port, () => console.log(`Listening on port ${port}`));

After writing the lines, run this command:

nodemon ./socket/trackerSocket --exec babel-node -e js

If you see the messages Listening on port 8001 and Connected to database in the output of the command, it means everything is running successfully.
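One detail worth noting: the setInterval in trackerSocket.js emits the full parcel list every second, even when nothing changed. As a possible refinement (a sketch of one option, not part of the tutorial code), a snapshot comparison could skip redundant emits:

```javascript
// Illustrative sketch: only emit when the snapshot actually changed.
// Comparing JSON strings is a blunt but simple equality test for small lists.
let lastSnapshot = null;

function shouldEmit(parcels) {
  const snapshot = JSON.stringify(parcels);
  if (snapshot === lastSnapshot) return false;
  lastSnapshot = snapshot;
  return true;
}

console.log(shouldEmit([{ name: "test", status: "shipping" }])); // true: first snapshot
console.log(shouldEmit([{ name: "test", status: "shipping" }])); // false: unchanged
console.log(shouldEmit([{ name: "test", status: "onroad" }]));   // true: status changed
```

Inside findParcel, the emit would then become conditional: if (shouldEmit(parcel)) socket.emit("parcel", parcel).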

Now, to set up the frontend, open up a terminal, go to the folder named parcel-tracking-system which we created at the very beginning, and run:

npx create-react-app client

Later on, under the client folder, run this command in order to install socket.io-client library:

yarn add socket.io-client

After the installation process has ended, go to the src folder inside the client folder. Here, create a file named socket.js and write these lines down in it:

//socket.js
import socketIOClient from "socket.io-client";

const ENDPOINT = "http://127.0.0.1:8001";
const socket = socketIOClient(ENDPOINT);

export default socket;

After creating socket.js, open App.js in the same directory and modify it like this:

//App.js
import React from "react";
import socket from "./socket";

function App() {
  const [parcels, setParcels] = React.useState([]);

  React.useEffect(() => {
    socket.on("parcel", (data) => setParcels(data));
    // Remove the listener when the component unmounts
    return () => socket.off("parcel");
  }, []);

  return (
    <div>
      {parcels.map((parcel) => (
        <React.Fragment key={parcel._id}>
          <div>ID: {parcel._id}</div>
          <div>Name: {parcel.name}</div>
          <div>Status: {parcel.status}</div>
          <br></br>
        </React.Fragment>
      ))}
    </div>
  );
}

export default App;

To run React application run this command under the client directory in a terminal:

yarn start

On the page that automatically opens up, you should see the records in MongoDB. To add a new record and update it, open up these links below respectively (using a fresh name so a new record is created):

  • localhost:8000/shipping/test2
  • localhost:8000/onroad/test2
  • localhost:8000/delivered/test2

If you can see the changes in MongoDB with every click, congratulations, you have a little service!

Conclusion

Microservice architectures are increasingly preferred in new applications, and software teams are working to migrate existing monolithic structures to microservices. In addition, AMQP stands out as a widely used communication protocol that provides solutions to some of the problems experienced with HTTP. On the end-user side, the interest in and demand for real-time applications is also increasing, and here web sockets provide great convenience to the software developer. Mastering these tools, which are used to respond to new demands and new technologies, is a pleasure and contributes a lot to a developer. Throughout this article, I wrote a small service and tried to provide a general understanding of these structures and technologies. I hope it has helped those who read, follow, and apply it.

Nerd For Tech

From Confusion to Clarification

NFT is an Educational Media House. Our mission is to bring the invaluable knowledge and experiences of experts from all over the world to the novice. To know more about us, visit https://www.nerdfortech.org/. Don’t forget to check out Ask-NFT, a mentorship ecosystem we’ve started

Arda Örkin

Written by

I am a self-taught JavaScript Developer. I like to tell, listen and read stories about JavaScript world
