Answering Node.js interview questions for applied backend engineers — Part 1

Dmytro Harazdovskiy
Published in CodeX · 20 min read · Aug 20, 2023

The job interview process can be a hard and stressful experience in today’s competitive job market. The more you prepare, the easier it becomes to ace the interview, and the more you sharpen your engineering knowledge along the way.

While navigating through various resources, I came across some impressive Node.js interview questions, meticulously compiled by Timur Shemsedinov: Lecturer at KPI🇺🇦, Chief Technology Officer at Metarhia, and an accomplished software architect with more than ten years of experience.

There are fifty engaging questions, but to enhance readability and understanding, this article will be divided into several parts.

Questions specifically tailored for platform engineers will be placed in another piece, so be sure to follow me so you don’t miss out!

These questions were designed by a proficient Node.js architect, so they might sound complex even for experienced developers. With this in mind, you can refresh your existing knowledge and broaden your understanding of Node.js development and architecture. Let’s start!

1. What can be done using for await with the IncomingMessage object?

As is mentioned in the official doc:

An IncomingMessage object is created by http.Server or http.ClientRequest and passed as the first argument to the 'request' and 'response' event respectively

Since IncomingMessage is a readable stream, a client can write stream data to your server. The server can then accept this data and parse it using the for await...of syntax:
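Here is a minimal sketch of the server side of that idea (the port and response text are arbitrary):

import http from 'node:http';

const server = http.createServer(async (req, res) => {
  // req is an IncomingMessage; it is an async iterable, so each
  // iteration of for await yields one chunk of the request body
  const chunks = [];
  for await (const chunk of req) {
    chunks.push(chunk);
  }
  const body = Buffer.concat(chunks).toString();
  res.end(`Received ${body.length} bytes`);
});

server.listen(3000);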

2. How to natively hash passwords in Node.js and in what cases do you need dependencies?

There is an awesome native crypto module in Node.js. It supports various hash and key-derivation algorithms, the most popular being SHA-256, scrypt, and PBKDF2 (Argon2, by contrast, requires a third-party package).

For hashing passwords, we can use the pbkdf2Sync method (Password-Based Key Derivation Function 2🥲) with SHA-256 as the digest. It hashes our password with a salt and outputs a Buffer.

import crypto from 'crypto';

const hashPassword = (password, salt) => {
  // 600,000 iterations, 64-byte key, SHA-256 digest
  return crypto.pbkdf2Sync(password, salt, 600000, 64, 'sha256');
};

const compare = (input, hashedPassword, salt) => {
  const actualHashedPassword = hashPassword(input, salt);
  return actualHashedPassword.toString() === hashedPassword.toString();
};

(() => {
  const password = 'password';

  const salt = crypto.randomBytes(16);
  const hashedPassword = hashPassword(password, salt);

  console.log(compare(password, hashedPassword, salt)); // true
  console.log(compare('other password', hashedPassword, salt)); // false
})();
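One note on the comparison step: the built-in crypto.timingSafeEqual(bufferA, bufferB) is a safer choice than string comparison for equal-length hash buffers, since it runs in constant time and mitigates timing attacks.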

Now, when do you need dependencies?

  1. Bcrypt vs. Native Crypto: Using the native crypto module is generally fine for password hashing. However, some developers prefer using third-party libraries like bcrypt or bcryptjs because they abstract away some of the complexity and provide a simpler API. Bcrypt libraries also automatically handle generating and managing the salt, which is an essential aspect of secure password hashing.
  2. Password Complexity and Policies: If you have specific password complexity requirements or policies (e.g., minimum length, character requirements), you may need to use external libraries to validate or enforce those policies during password creation or updating.
  3. Security Auditing and Updates: Using well-maintained, widely-used, and regularly updated third-party libraries can provide an added layer of assurance since they are actively maintained, audited for security issues, and improved over time.

In summary, using the native crypto module for password hashing in Node.js is a viable option, but you may consider using third-party libraries like bcrypt or Argon2 for convenience, better security features, or to simplify your codebase. When choosing a library, always verify its reputation, security record, and current maintenance status.

3. What API does nodejs/undici implement?

nodejs/undici implements the HTTP/1.1 client API for Node.js. It is written from scratch rather than built on top of the core http module, and it offers an improved client API that supports pipelining, retries, cancellation, and file uploads. It’s designed to provide efficient and reliable communication with HTTP servers.

Let’s take a look at the nodejs/undici client and why it is a great HTTP client.

import { Client } from 'undici';

const client = new Client('https://flaviocopes.com');
client.request({
  path: '/',
  method: 'GET'
}, (err, data) => {
  if (err) return console.error(err);
  // Show response
  console.log(data.statusCode);
  data.body.setEncoding('utf8');
  data.body.on('data', console.log);
});

In Node.js 18, an experimental global fetch API is available by default. The implementation comes from undici and is inspired by node-fetch, which was originally based upon undici-fetch. So why pick undici?

  1. Performance: Undici is designed from the ground up for high performance. Benchmarks show that it can provide up to 10 times the throughput of Node.js’ built-in http/https modules.
  2. Full HTTP/1.1 Support: While Node’s built-in http module also speaks HTTP/1.1, Undici is built to fully exploit its capabilities, including pipelining and connection keep-alive.
  3. Advanced Features: Undici supports more advanced features such as request retries, cancellations, and connection pooling. These features can significantly improve the efficiency of communicating with HTTP servers.
  4. Flexible API: Undici offers both callback-based and promise-based APIs. This flexibility allows developers to use the style that best fits their application’s architecture.

However, whether Undici is better can depend on your specific needs. Each client has its strengths and weaknesses. For example, if your application deals with more complex cases like the SOAP protocol, multipart uploads, or OAuth, clients like axios (or the now-deprecated request) may fit better because they provide higher-level APIs for those cases.

4. What is a modern replacement for the node:domain API?

The domain module was developed as a part of Node.js to provide a way to handle multiple different I/O operations as a single group. Essentially, it was for error handling and managing multiple callbacks that share an error-handling scope.

When you have several callbacks in your application, and they can potentially throw errors, it can be tough to handle those errors correctly, especially in cases where an error in one callback can put your application’s data in an inconsistent state. The domain module was aimed to solve this issue by grouping related I/O operations together and providing a mechanism to catch and handle errors at the group level.

However, the API had some issues (error swallowing, added complexity) and caused unexpected behavior, which is why it has been deprecated and replaced by other solutions, like promises and async_hooks.

The easiest way to replace it is with a try/catch block that wraps your multiple async calls. For more complex situations, you may consider async_hooks (for example, AsyncLocalStorage).
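A minimal sketch of that grouping (the file name and URL are placeholders):

import fs from 'node:fs';

const loadEverything = async () => {
  try {
    // Both awaited calls share one error-handling scope,
    // which is what node:domain used to provide
    const config = await fs.promises.readFile('config.json', 'utf8');
    const res = await fetch('https://example.com/api'); // global fetch, Node 18+
    return { config, data: await res.json() };
  } catch (err) {
    console.error('One of the grouped operations failed:', err);
  }
};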

5. When can we use synchronous versions of file operations node:fs instead of asynchronous ones and what should we pay attention to when making this decision?

Synchronous file operations in the Node.js File System (fs) module block the execution of code until the operation is complete, meaning the entire application comes to a halt. Asynchronous versions, on the other hand, don’t block other operations from executing while reading from or writing to a file.

Given this behavior, synchronous versions of file operations are suitable for use:

  1. During the initial startup of the application where blocking the event loop has minimal impact.
  2. When you need a simple script to run operations sequentially, with each operation depending on the previous one.
  3. When the file operation tasks aren’t complex and are not expected to take much time.

However, you should always keep the following considerations in mind before deciding to use synchronous methods:

  1. Performance Impact: Node.js is single-threaded. Blocking the thread with synchronous operations can impact performance and usability, especially for high-traffic, I/O intensive, or real-time applications.
  2. Scalability: Because synchronous operations halt the execution of your code, they are not suitable for scenarios where the application needs to serve multiple requests simultaneously.
  3. Unhandled Errors: If a synchronous method throws an error and it’s not caught by a try-catch block, it can crash the Node.js process.
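For instance, a minimal sketch contrasting the two cases (config.json is a hypothetical file):

import fs from 'node:fs';

// At startup, blocking is acceptable: no requests are being served yet
const config = JSON.parse(fs.readFileSync('config.json', 'utf8'));

// Inside a request handler, prefer the non-blocking version,
// otherwise every concurrent request stalls behind this read:
// const data = await fs.promises.readFile('data.txt', 'utf8');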

6. What are the best practices for error handling in asynchronous code?

Error handling in asynchronous code is crucial in Node.js because, unlike in traditional synchronous code, errors thrown inside a callback cannot be caught outside of it with a try/catch block.

Below are some of the best practices I can suggest:

Always Handle Errors In Callbacks

In Node.js, most I/O methods expect callbacks as the last argument, and the first parameter of these callbacks is an error object. If an error occurred, this object will be populated. An unset `error` parameter means that the operation was successful.

fs.readFile('non-existence-file.txt', function (err, data) {
  if (err) {
    // Handle error here, for example:
    console.error('There was an error reading the file!', err);
    return;
  }
  // Otherwise handle successful case here...
});

Use Promises and async/await

Promises represent the completion or failure of an asynchronous operation. They provide a great way to handle async errors: instead of callbacks, Promises use .then() for successful results and .catch() for errors.

async function someFunction() {
  try {
    const data = await fs.promises.readFile('non-existence-file.txt');
    // Handle data here
  } catch (err) {
    // Handle errors here
  }
}

Use Events for Error Handling

Some Node.js objects like streams and requests are EventEmitters. You can listen to the ‘error’ event for error handling.

const readable = getReadableStreamSomehow();
readable.on('error', function (err) {
  // Handle errors here
});

Handle Unhandled Promise Rejection

You can listen for the `unhandledRejection` process event, which is emitted whenever a Promise is rejected and no error handler is attached to it.

process.on('unhandledRejection', (reason, promise) => {
  console.log('Unhandled Rejection at:', promise, 'reason:', reason);
  // Application specific logging, throwing an error, or other logic here
});

Proper Use of Error Types

Use JavaScript Error or subclass objects for passing and throwing errors. They provide a stack trace, an error message, and an error type.
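A minimal sketch of a custom error type (ValidationError is a made-up name for illustration):

class ValidationError extends Error {
  constructor(message, field) {
    super(message);
    this.name = 'ValidationError';
    this.field = field; // extra context for the handler
  }
}

try {
  throw new ValidationError('Email is required', 'email');
} catch (err) {
  if (err instanceof ValidationError) {
    console.error(`${err.name} on "${err.field}": ${err.message}`);
  } else {
    throw err; // rethrow anything we don't recognize
  }
}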

7. How can vulnerabilities appear in Node.js projects? Explain XSS, path traversal, SQL injection, and CSRF. How to protect yourself from them?

There may be different ways the security of our Node.js application can be threatened:

Concurrent model of I/O

The concurrent model means that one process/thread handles multiple requests, so data sources, configuration params, and execution permissions are shared too. After some manipulation, one request may gain access to resources that are shared at the process level but were never meant for that user.

The same goes for environment variables, which are stored per server instance while that instance handles both admin and user requests. This may lead to leaking admin credentials whenever attackers manage to breach your code.

For this case, you need to segregate the permissions and domains of your applications. Use different processes or threads for them, and fetch configs at runtime from a secrets storage service like AWS SSM. Also think about graceful shutdown, so that one failed admin-side service does not take down the other user-facing services that might run on the same machine.

Path traversal vulnerability

Path Traversal is a vulnerability that allows attackers to access files and directories that are stored outside the web root folder by manipulating variables that refer to filesystem paths.

You can mitigate path traversal by not using user input directly in filesystem APIs. When that’s unavoidable, make sure to validate or sanitize the input, and use it in a way that ensures it does not point to unintended files.
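A minimal sketch of such a guard (SAFE_ROOT is a hypothetical directory):

import path from 'node:path';

const SAFE_ROOT = '/var/app/uploads';

const resolveSafely = (userInput) => {
  const resolved = path.resolve(SAFE_ROOT, userInput);
  // Reject anything that escapes the allowed root, e.g. '../../etc/passwd'
  if (!resolved.startsWith(SAFE_ROOT + path.sep)) {
    throw new Error('Path traversal attempt blocked');
  }
  return resolved;
};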

SQL Injection

SQL injection may happen when you don’t sanitize user input before using it in queries. Sanitizing is crucial, since a user may craft malicious input that your code doesn’t handle and read more data from your storage than they are allowed to.

Use parameterized queries or prepared statements instead of string concatenation within queries. Also, keeping your database software up-to-date can provide additional layers of security.
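A sketch using, for example, the pg driver (an assumed dependency; connection settings come from environment variables):

import pg from 'pg';

const pool = new pg.Pool();

const getUserByEmail = async (email) => {
  // $1 is a placeholder: the driver sends the value separately
  // from the SQL text, so it can never be parsed as SQL
  const { rows } = await pool.query(
    'SELECT id, name FROM users WHERE email = $1',
    [email]
  );
  return rows[0];
};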

CSRF, XSS

This kind of attack is about tricking a victim into executing unwanted actions on a web application in which they’re authenticated.

Cross-Site Request Forgery: this threat may occur when someone sends your user an email with a link containing malicious query parameters. Such a parameter may trigger a malicious transaction or action on your server. Generally, to avoid this issue, your API should accept sensitive params in the body of a POST request rather than in query params, or use intermediate secure tokens (CSRF tokens) for validation.

Cross-Site Scripting: similarly to CSRF, an attacker sends you a URL whose query params enclose a script for your browser to execute on your side. The links may look the same as CSRF ones, but the goal is to interact with your logged-in account through the injected script.

To prevent XSS attacks in Node.js applications, it is essential to validate and sanitize user inputs, escape special characters and implement strict content security policies. Additionally, using templating engines with auto-escape features, enabling HTTP-only cookies, and regularly updating your application and its dependencies can further enhance security.
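As a minimal sketch of escaping, replacing HTML-significant characters in untrusted values before rendering them into markup:

const escapeHtml = (value) =>
  String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');

console.log(escapeHtml('<script>alert(1)</script>'));
// Prints: &lt;script&gt;alert(1)&lt;/script&gt;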

Other things you should consider:

  • Dependencies of node_modules
  • Memory leaks that lead to server downtime
  • Storing sensitive data like passwords without hashing and salt
  • DDoS attacks


8. How is a race condition possible in asynchronous programming? And how to protect yourself from it?

A race condition in asynchronous programming occurs when two or more operations need to access or change shared data concurrently and the outcome depends on the sequence or timing of these operations. Because of the non-deterministic nature of asynchronous programming in Node.js, there is no guarantee about the order in which asynchronous functions will be complete, leading to potential race conditions.

For example, imagine a situation where multiple asynchronous calls are made to read and update a certain piece of data. There is no guarantee which operation will complete first, so you may end up reading stale data, or updates may overwrite each other.

To protect yourself from race conditions, there are several strategies you can employ:

Mutexes and Locks: Using locking mechanisms such as mutexes or semaphores can ensure that certain pieces of code that access shared data do not execute concurrently (see the sketch after this list).

Promises and Async/Await: Use these features of JavaScript to ensure certain operations are complete before others are started.

Atomic Operations: Some systems provide operations that are guaranteed to be atomic, meaning they are uninterruptible and can be used to ensure data integrity. For instance, in databases, you should use transactions to ensure that related updates to data occur atomically.

Avoiding Shared State: If possible, avoid sharing the state between different parts of your program. This can often be done using functional programming principles or by copying data so that each function has its own private copy.

Queuing: If the nature of your operations allows, you can use queues to ensure certain operations are completed before others.
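Here is a minimal promise-based mutex sketch (the Mutex class is illustrative, not a library API; top-level await requires an ES module):

class Mutex {
  constructor() {
    this.last = Promise.resolve();
  }
  lock(fn) {
    const run = this.last.then(() => fn());
    this.last = run.catch(() => {}); // keep the chain alive on errors
    return run;
  }
}

const mutex = new Mutex();
let counter = 0;

// Without the mutex, the interleaved read/await/write would lose updates
const increment = () =>
  mutex.lock(async () => {
    const current = counter;
    await new Promise((resolve) => setTimeout(resolve, 10)); // simulate I/O
    counter = current + 1;
  });

await Promise.all([increment(), increment(), increment()]);
console.log(counter); // 3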

9. What are the pros and cons of dividing the code into .js and .d.ts type definitions?

Dividing code into .js and .d.ts type definitions can have a number of pros and cons.

Good parts

  • Improved type safety: TypeScript type definitions can help to catch errors at compile time, which can save time and frustration in the long run.
  • Better code reuse and documentation: By separating the code into .js and .d.ts files, you can easily reuse the type definitions in other projects, and they double as documentation of the module’s API.

Bad parts

  • Increased Complexity: This separation could result in increased complexity as now developers have to maintain two different types of files. The separation might also confuse beginners unfamiliar with TypeScript or the concept of types in JavaScript.
  • Syncing issues: There is a possibility of misalignment between the .js and .d.ts files. Any changes or updates made to the .js file would also have to be reflected in the .d.ts file, and keeping them in sync could become challenging.

Use .d.ts when working with external JavaScript libraries, maintaining large codebases, migrating from JS to TS, working on open-source projects, or when incrementally adopting TypeScript in your project.

You should avoid using .d.ts files when you have small or rapidly changing projects, or when your project is written in pure JavaScript and you don’t want to introduce more complexity.
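A minimal sketch of the split (math.js and math.d.ts are hypothetical file names):

// math.js — the plain JavaScript implementation
export const add = (a, b) => a + b;

// math.d.ts — the hand-maintained type definitions for the same module
export declare const add: (a: number, b: number) => number;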

10. Provide several typical Node.js design patterns (according to GoF and not only) with examples.

Observer

It is a design pattern that establishes a one-to-many dependency between objects, so that when one object changes its state, all its dependents are notified and updated automatically. This is especially useful in event-driven programming and is at the heart of Node.js, which utilizes this pattern at its core through EventEmitter methods like on and emit.

const observer = {
  subscribers: {},
  subscribe(event, callback) {
    // Method shorthand keeps `this` bound to the observer object
    // (arrow functions here would break the `this` reference)
    this.subscribers[event] = callback;
  },
  publish(event, data) {
    this.subscribers[event](data);
  },
};

observer.subscribe("greet", (data) => console.log(`Hello, ${data}!`));
observer.publish("greet", "John Doe"); // Hello, John Doe!

Module

The Module pattern is used to mimic the concept of classes in such a way that we’re able to include both public/private methods and variables inside a single object, thus shielding particular parts from the global scope. This leads to a reduction in the likelihood of our function names conflicting with other functions defined in additional scripts on the page.

const person = (() => {
  let privateAge = 25;
  return {
    getAge: () => privateAge,
    setAge: (age) => { privateAge = age; },
  };
})();

console.log(person.getAge()); // 25
person.setAge(30);
console.log(person.getAge()); // 30

Factory

This is a creational pattern that uses factory methods to deal with the problem of creating objects without having to specify the exact class of the object that will be created. This is done by creating objects by calling a factory method — either specified in an interface and implemented by child classes, or implemented in a base class and optionally overridden by derived classes — rather than by calling a constructor.

function CarMaker(type) {
  if (type === "sports") return { maxSpeed: 200 };
  else if (type === "SUV") return { maxSpeed: 120 };
}

const sportsCar = CarMaker("sports");
console.log(sportsCar.maxSpeed); // 200

Middleware

This pattern is used in many popular frameworks (like Express.js, for example). The functions are executed in the order they are declared. Each function can decide to end the request processing or pass control to the next function (handler).

const app = {
  middleware: [],
  use(fn) {
    this.middleware.push(fn);
  },
  go() {
    // Each middleware receives `next`; calling it passes control on,
    // while not calling it short-circuits the chain
    const run = (i) => {
      if (i < this.middleware.length) this.middleware[i](() => run(i + 1));
    };
    run(0);
  },
};

app.use((next) => { console.log("Middleware 1"); next(); });
app.use((next) => { console.log("Middleware 2"); next(); });
app.go();

11. What is the problem with thick controllers? (provide examples in Node.js)

Thick controllers refer to a common anti-pattern in software development where the logic within a controller becomes excessively large, complex, and difficult to manage.

This violates the principle of separation of concerns, making the codebase less maintainable, testable, and scalable. In the context of Node.js applications, thick controllers can lead to code that’s hard to understand, debug, and extend.

Here are some examples:

1. Code Duplication:

class UserController {
  async createUser(req, res) {
    // ... lots of validation, business logic, and database interactions
    const userData = req.body;
    // Validation
    if (!userData.name || !userData.email) {
      return res.status(400).json({ error: 'Name and email are required' });
    }
    // Business logic
    const newUser = await User.create(userData);
    // ...
  }

  async updateUser(req, res) {
    // ... similar logic to createUser
    const userId = req.params.id;
    const userData = req.body;
    // Validation
    if (!userData.name) {
      return res.status(400).json({ error: 'Name is required' });
    }
    // Business logic
    const updatedUser = await User.findByIdAndUpdate(userId, userData, { new: true });
    // ...
  }
}

In this example, both the createUser and updateUser methods might share similar validation logic, input sanitization, and database interactions. However, if this logic is duplicated within the controller, any changes or fixes must be applied in multiple places. This increases the risk of inconsistencies and maintenance challenges.

Solution: Separate common logic into reusable modules:

// validation.js
const validateUserInput = (userData) => {
  if (!userData.name || !userData.email) {
    throw new Error('Name and email are required');
  }
  return userData;
};

// database.js
const createUserInDatabase = async (userData) => {
  // Database logic to create user
};

const updateUserInDatabase = async (userId, userData) => {
  // Database logic to update user
};

// UserController.js
import { validateUserInput } from './validation';
import { createUserInDatabase, updateUserInDatabase } from './database';

class UserController {
  async createUser(req, res) {
    const validatedData = validateUserInput(req.body);
    const user = await createUserInDatabase(validatedData);
    // ...
  }

  async updateUser(req, res) {
    const validatedData = validateUserInput(req.body);
    const updatedUser = await updateUserInDatabase(req.params.id, validatedData);
    // ...
  }
}

2. Difficult Testing

class ProductController {
  async createProduct(req, res) {
    // ... complex logic, including database updates and external API calls
    const productData = req.body;
    // Complex logic for database updates
    const newProduct = await Product.create(productData);
    // Complex logic for external API call
    try {
      await externalAPI.sendProductInfo(newProduct);
    } catch (error) {
      // Handle API error
    }
    // ...
  }

  async updateProduct(req, res) {
    // ... more complex logic and integration with other systems
    // Similar complex logic as createProduct
    // ...
  }
}

When a controller contains complex logic involving database interactions and external API calls, it becomes challenging to write unit tests that cover different scenarios effectively.

Solution: Extract database interactions and API calls into separate services or modules that can be mocked or stubbed for testing:

// productDatabase.js
const createProductInDatabase = async (productData) => {
  // Database logic to create product
};

const updateProductInDatabase = async (productId, productData) => {
  // Database logic to update product
};

// externalAPI.js
const sendProductUpdateToExternalAPI = async (updatedProduct) => {
  // Logic to send update to external API
};

// ProductController.js
import { createProductInDatabase, updateProductInDatabase } from './productDatabase';
import { sendProductUpdateToExternalAPI } from './externalAPI';

class ProductController {
  async createProduct(req, res) {
    const productData = req.body;
    const createdProduct = await createProductInDatabase(productData);
    // ...
  }

  async updateProduct(req, res) {
    const productId = req.params.id;
    const updatedProduct = await updateProductInDatabase(productId, req.body);
    await sendProductUpdateToExternalAPI(updatedProduct);
    // ...
  }
}

3. Scalability and Maintainability

class BlogController {
  async createPost(req, res) {
    // ... extensive business logic, including file uploads and email notifications
    const postContent = req.body;
    // Complex logic for file uploads
    const uploadedFile = await uploadService.uploadFile(postContent.file);
    // Complex logic for database updates
    const newPost = await Post.create({ ...postContent, fileUrl: uploadedFile.url });
    // Complex logic for email notifications
    await emailService.sendEmail(postContent.authorEmail, 'Your post was published!');
    // ...
  }

  async updatePost(req, res) {
    // ... even more logic and potential integrations
    // Similar complex logic as createPost
    // ...
  }
}

As the application grows, the BlogController may accumulate more and more logic, making it harder to maintain and extend.

Solution: Break down the responsibilities and use modular design patterns:

// blogLogic.js
const processPostCreation = async (postContent) => {
  // Complex logic for creating a post
};

const processPostUpdate = async (postId, updatedPostData) => {
  // Complex logic for updating a post
};

// notifications.js
const sendEmailNotification = async (recipientEmail, message) => {
  // Logic to send email notification
};

// BlogController.js
import { processPostCreation, processPostUpdate } from './blogLogic';
import { sendEmailNotification } from './notifications';

class BlogController {
  async createPost(req, res) {
    const postContent = req.body;
    const createdPost = await processPostCreation(postContent);
    await sendEmailNotification(createdPost.author, 'Your post was published!');
    // ...
  }

  async updatePost(req, res) {
    const postId = req.params.id;
    const updatedPostData = req.body;
    const updatedPost = await processPostUpdate(postId, updatedPostData);
    // ...
  }
}

12. Provide examples of abstraction leaks (typical for Node.js).

You can treat Node.js as your best friend: one cool guy who helps you out by making things simpler. That simplification is what we call “abstraction”. But sometimes your buddy spills the beans, exposing things that are supposed to be hidden. Clever folks call that an “abstraction leak”.

These leaks can create a real mess by causing unexpected bugs and creating work that you didn’t sign up for. You need to understand these leaks to prevent the mess from spilling over and causing troubles in your code and debugging process.

1. Asynchronous Operations Management

This includes problems like callback hell, promises, and async/await complexities. Asynchronous programming is meant to enhance efficiency but can uncover the intricacies of event-driven programming, thus leaking abstraction.

2. Error Handling

The way Node.js handles errors (by passing them as arguments to callback functions) exposes the underlying system’s error-handling procedures. It leaks abstraction by requiring developers to manually manage these errors in every function.

3. System Performance

Both the non-blocking I/O and garbage collection points fall into this category. Both are efficiency-improvement features of Node.js that, under certain conditions, reveal the system’s low-level operations and interact directly with system resources.
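A minimal sketch of this leak: a “non-blocking” timer gets delayed by synchronous CPU work sharing the same thread.

const start = Date.now();

setTimeout(() => {
  // Scheduled for 100 ms, but fires only after the loop below finishes
  console.log(`Timer fired after ${Date.now() - start} ms`);
}, 100);

// Synchronous CPU-bound work blocks the single-threaded event loop
while (Date.now() - start < 2000) {}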

4. Database Operations

Even though there are high-level query interfaces for developers to interact with databases, there can still be situations where you’ll have to write raw SQL queries. It means the abstraction around database operation sometimes leaks, revealing the complexities of database management systems.

13. How to create a Singleton using the modularity system in Node.js?

So, a singleton is like the one and only moon that we have. No matter where you are on Earth, it’s still the same moon you’re seeing.

Creating a singleton in Node.js is like saying, “Whenever someone calls for the ‘moon’, make sure we’re all talking about the same one.”

Here’s how you’d do that:

// Let's name the file singleton.js

let instance = null;

class Singleton {
  constructor() {
    // If the instance already exists, return it instead of `this`
    if (instance) return instance;
    instance = this;

    // just an example property
    this.time = new Date();
  }
}

module.exports = Singleton;

So when anyone asks for a new Singleton (kinda like looking for a ‘moon’), they’ll get the one we first made (the ‘only moon’).

Here’s how to use it:

const Singleton = require('./singleton');
const moon1 = new Singleton(); // Someone in the US looking at the moon

const Singleton2 = require('./singleton');
const moon2 = new Singleton2(); // Someone in Australia looking at the moon

console.log(moon1 === moon2); // Logs 'true'. Yup, it's the same moon!

In the example above, you can see that even if it looks like we created two moons (moon1 and moon2), when we check if they are the same, we get ‘true’. That means they’re actually referring to the same singleton instance! So, even though it seems like we’ve made more than one, we are always getting back that first, one-of-a-kind, singleton object.
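Also worth knowing: since require() caches a module after its first load, you can lean on the module system itself and export an instance directly. A minimal sketch (db.js is a hypothetical file):

// db.js
class Database {
  constructor() {
    this.connectedAt = new Date();
  }
}

// Export an instance, not the class: the module cache guarantees
// every require('./db') receives this very same object
module.exports = new Database();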

14. How to easily implement the Strategy pattern in JavaScript (and where to use it in Node.js)?

The Strategy Pattern is a behavioral design pattern that enables selecting an algorithm at runtime. It’s like creating an easy-to-switch TV remote that can be used readily for any TV brand.

Now, implementing the Strategy Pattern in JavaScript involves creating interchangeable objects (the strategies) which implement a similar method. Let’s use an example of a Route strategy to understand it better:

class DrivingRouteStrategy {
  buildRoute(A, B) {
    return `Driving directions from ${A} to ${B}`;
  }
}

class TransitRouteStrategy {
  buildRoute(A, B) {
    return `Transit directions from ${A} to ${B}`;
  }
}

class BicyclingRouteStrategy {
  buildRoute(A, B) {
    return `Bicycling directions from ${A} to ${B}`;
  }
}

As you can see, each strategy has a `buildRoute` method but the implementation varies. Now, we can have our Navigator object which can use these strategies interchangeably:

class Navigator {
  constructor(strategy) {
    this.strategy = strategy;
  }

  setStrategy(strategy) {
    this.strategy = strategy;
  }

  buildRoute(A, B) {
    return this.strategy.buildRoute(A, B);
  }
}

It is simple to change the strategy at runtime:

const navigator = new Navigator(new DrivingRouteStrategy());
console.log(navigator.buildRoute('A', 'B'));
// Outputs: Driving directions from A to B

navigator.setStrategy(new TransitRouteStrategy());
console.log(navigator.buildRoute('A', 'B'));
// Outputs: Transit directions from A to B

navigator.setStrategy(new BicyclingRouteStrategy());
console.log(navigator.buildRoute('A', 'B'));
// Outputs: Bicycling directions from A to B

In Node.js, the Strategy Pattern can be useful in various scenarios, for instance in a payment gateway system where you can switch between different payment methods (Paypal, Credit Card, Bitcoin), each with a distinct implementation. You could also use it in authentication when there are multiple ways to authenticate a user (JWT, OAuth, Basic Auth).

Remember, the main goal of the Strategy Pattern is to let the algorithm vary independently from the clients that use it. That way, it’s easier to switch, understand, and extend the code.

15. Provide an example of the Adapter pattern from built-in Node.js libraries (there are several).

The Adapter pattern is a structural design pattern that’s all about the relationship between objects. It’s about getting unrelated classes to work together through interfaces that they both understand, much like adapting a three-pin plug to a two-pin socket.

FS

A good example in the Node.js libraries would be the streams module. Streams provide a uniform interface over I/O operations: they let us read from and write to all sorts of things in the same way. If you want to work with a file, an HTTP response, a ZIP file, or a Buffer, you can do so using the same interface. The stream is adapting each of these different underlying operations to a common interface that you can use throughout your code.

Here’s a simple example where we read data chunk by chunk from a file using a readable stream (fs is the File System module):

import fs from 'fs';

const readStream = fs.createReadStream('example.txt');

readStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});

The fs.createReadStream is effectively acting as an adapter. The consuming code doesn’t need to know where the data is coming from; all it cares about is that data is arriving. The read stream is responsible for the actual reading from the file, but it presents the data through the common stream interface.

OS

The “os” module acts as an adapter for different operating systems. It provides methods like os.userInfo() or os.platform() that work the same regardless of whether the code is running on Linux, Windows, or Mac.

import os from 'os';

console.log('OS platform:', os.platform());
console.log('User info:', os.userInfo());

The os methods adapt the OS-specific operations to a common interface, providing easy access to os-related functionalities.

Path

The path module can be considered an adapter because it works with file paths across different operating systems. For example, Windows uses the backslash `\` while Unix-based systems use the forward slash `/` in file paths.

const path = require('path');

const fullPath = path.join("/home", "user", "documents", "example.txt");
console.log(fullPath);

The path.join acts as an adapter as it’s capable of manipulating path strings in a way that’s consistent across various operating systems.

In all these examples, Node.js is adapting lower-level, more complex, or varied interfaces into higher-level, easier, and uniform interfaces, which is the core idea of the Adapter pattern.

That’s all for now!

Big thanks for reading up to this point! I really hope this Q&A helped you understand more about Node.js and grow as a backend engineer.

Clap and comment if you want to disassemble more questions like:

  • What are the approaches to logging? Their differences, pros, and cons.
  • Where to store secrets? (API keys, tokens, and database passwords)
  • What are the weaknesses of Node.js? What is bad or impossible to write in Node.js?
  • How not to block servicing other users while processing a request from one of them?
  • How do you organize the data access layer?
