Writing Async Programs in JavaScript

Learn how to write asynchronous programs using promises and async/await

AJ Meyghani
Dec 5, 2018 · 19 min read

There is no doubt that JavaScript, despite its history, has become one of the most popular programming languages today. Due to its asynchronous nature, however, JavaScript can present some challenges for those who are new to the language. In this article, we are going to write small async programs using promises and async/await. Using these examples, we are going to identify some simple patterns that you can use in your own programs.

If you are new to JavaScript, you may want to first check out my other article before reading this one.

All the code examples in this article are written for the Node environment. If you don’t have Node installed, you can see the Appendix 1 for instructions. Even though all the programs are written for Node, you can apply the same principles for scripts running in the browser. Also, all the code examples for this article are available on Gitlab.

Introduction

Whether or not people believe that JavaScript is a real programming language, the reality is that it's not going away anytime soon. If you are a web developer, you might as well spend some time and learn its good and bad parts.

JavaScript is single-threaded and favors non-blocking asynchronous flows. If you are new to the language, it can become really frustrating when things don't work the way you expect them to. Asynchronous programming demands more patience and a kind of thinking that is different from synchronous programming.

In the synchronous model, everything happens in sequence, one operation at a time, which makes programs easier to reason about. In the asynchronous model, operations can start or finish in any order at any point in time, so simply relying on a sequence is not sufficient. Asynchronous programming demands more thought in terms of program flow and design.
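This difference is easy to see in a few lines of Node code (a minimal sketch, not one of the article's scripts): the callback passed to `then` runs only after all of the current synchronous code has finished, so completion order does not match source order:

```javascript
const order = [];

order.push("first");

// This callback is queued as a microtask; it runs only after
// all of the current synchronous code has finished.
Promise.resolve().then(() => order.push("third"));

order.push("second");

// At this point order is ["first", "second"];
// "third" is appended later, once the synchronous code is done.
```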

In this article, we are going to explore a couple of short async programs. We are going to start with simple programs and work our way up to more complex ones. Below is an overview of the scripts that we are going to be writing:

  • A script that writes the content of a file to another file.

Promises and Async/await

Let’s take a moment and quickly review the basics of promises and async/await.

Promises

  • A promise is an object that represents the result of an asynchronous operation.

Async/await

  • The purpose of async/await functions is to simplify the behavior of using promises synchronously and to perform some behavior on a group of Promises. From MDN
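As a quick side-by-side sketch (the names getNumber, viaThen, and viaAwait are made up for illustration), the same flow can be written with then chains or with async/await:

```javascript
// A promise representing an eventual value.
const getNumber = () => Promise.resolve(21);

// Promise style: chain transformations with then.
const viaThen = getNumber().then(n => n * 2);

// Async/await style: the same flow reads like synchronous code.
// The function implicitly returns a promise.
async function viaAwait() {
  const n = await getNumber();
  return n * 2;
}
```

Both versions resolve to the same value; async/await simply makes the sequencing read top to bottom.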

Read & Write a Single File

In this section, we are going to write a script that reads the contents of a single file and writes the result to a new file.

You can access all the scripts for this section on Gitlab.

First, we are going to create an async function for the entry point of our program:

async function main() {
  // body goes here...
}

Then, we are going to create two promises: one that represents the contents of the input file, and another that represents the result of writing that content to a new file:

async function main() {
  const fileContent = readFile("./file.txt", "utf-8");
  const writeResult = writeFile("./file-copy.txt", fileContent);
}

In the snippet above, both readFile and writeFile are asynchronous, and they both return a promise. Because of that, first we need to make sure that we await the result of readFile so that we can use it in writeFile:

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8");
  const writeResult = writeFile("./file-copy.txt", fileContent);
}

And finally, we can decide what to return from the main function. Here we are just going to return the name of the new file that we are writing to. Note that the returned value will be automatically wrapped in a promise. But we need to make sure to await the result of writeFile before hitting the last line of the function:

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8");
  const writeResult = await writeFile("./file-copy.txt", fileContent);
  return "file-copy.txt";
}

Now, we can call the main function and log the results or any uncaught exceptions to the console:

main()
  .then(r => console.log("Result:", r))
  .catch(err => console.log("An error occurred", err));

To make the program complete, we need to require the fs module and promisify the fs.readFile and fs.writeFile. The complete script is shown below:

const util = require("util");
const fs = require("fs");
const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8");
  const writeResult = await writeFile("./file-copy.txt", fileContent);
  return "file-copy.txt";
}

main()
  .then(r => console.log("Result:", r))
  .catch(err => console.log("An error occurred", err));

In the snippet above, we promisify fs.readFile and fs.writeFile. util.promisify can turn any callback-style function into a promise-based function, provided that it follows the Node convention of taking an error-first callback as its last argument.

Now let’s talk a little about error handling. There are a couple of ways that you can approach error handling and it all depends on how much control you need. For example, in the snippet above, we are basically catching any error that could occur in the main function in the catch block. That works because inside an async function, any uncaught exception immediately causes the function to return a promise that is rejected with the exception.
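This behavior can be sketched in a couple of lines: the async function below never throws synchronously; it returns a promise that is rejected with the thrown error:

```javascript
async function failing() {
  // An uncaught exception inside an async function...
  throw new Error("boom");
}

// ...surfaces as a rejected promise, which a catch handler
// (or a try-catch around an await) can recover from.
const recovered = failing().catch(err => err.message);
```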

But let’s say you need more control and you would like to do different things depending on possible errors for each async operation. In that case, you could either use a try-catch block or use a catch block on each async operation. First, let's look at using try-catch blocks.

async function main() {
  let fileContent;
  try {
    fileContent = await readFile("./file.txt", "utf-8");
  } catch (err) {
    return {message: "Error while reading the file", error: err};
  }

  try {
    const writeResult = await writeFile("./file-copy.txt", fileContent);
  } catch (err) {
    return {message: "Error while writing the file", error: err};
  }

  return "file-copy.txt";
}

In the snippet above, we have added two try-catch blocks. Also, we have created the fileContent variable outside of the first block so that it can be visible throughout the main function. Notice that in each try-catch block, we return an object if there is an error. The error object contains a message field, and the details of the error. Now if any error happens, the function will immediately return with our custom error object. Remember that the returned object will be automatically wrapped in a promise. We can call the main function just like before, but this time we can check for error objects in the then block:

main()
  .then(r => {
    if (r.error) {
      return console.log("An error occurred, recover here. Details:", r);
    }
    return console.log("Done, no error. Result:", r);
  })
  .catch(err => console.log("An error occurred", err));

Notice that in the then block we are checking if the resolved object has an error. If it does then we handle it there. Otherwise, we simply log the results to the console. The other catch block is going to catch any runtime errors or any other errors that are not handled by the program.

In addition to try-catch blocks, we can use catch blocks associated with each promise:

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8")
    .catch(err => ({
      message: "Error while reading the file", error: err,
    }));

  if (fileContent.error) {
    return fileContent;
  }

  const writeResult = await writeFile("./file-copy.txt", fileContent)
    .then(result => ({}))
    .catch(err => ({
      message: "Error while writing the file", error: err,
    }));

  if (writeResult.error) {
    return writeResult;
  }

  return "file-copy.txt";
}

Notice that we are calling the catch method on each promise and returning a custom error object, similar to the previous example. If there is an error in any step, we simply return the result, which contains our custom error object.

However, for the second operation, we explicitly return an empty object if the write operation is successful. That's because writeFile resolves to undefined on success, and we can't access an error field on an undefined value. That's why we explicitly return a promise that resolves to an empty object when the write operation succeeds.

We can also optionally create two helper functions to save us from writing the same boilerplate code:

const call = (promise) =>
  promise.then(r => r == null ? ({result: r}) : r)
    .catch(error => ({error}));

const error = (result, msg) => ({error: result.error, message: msg});

The call function takes a promise and returns a promise that always resolves to an object: if the operation's result is null or undefined, it resolves to an object wrapping that result; otherwise, it resolves to the result itself. And if there is an error, the promise resolves to an object that contains an error field with the error's value.

The error helper function takes a result and a message, and it will return an object that has the result's error and the custom optional message. After adding the two helper functions, we can update our main function:

async function main() {
  const fileContent = await call(readFile("./file.txt", "utf-8"));
  if (fileContent.error) {
    return error(fileContent, "Error while reading the file");
  }

  const writeResult = await call(writeFile("./file-copy.txt", fileContent));

  if (writeResult.error) {
    return error(writeResult, "Error while writing the file");
  }

  return "file-copy.txt";
}

As you can see, we are passing each operation to the call function. And then we check if there is an error. If so, then we simply call our error function to return a custom error with a custom error message. The complete snippet is shown below:

const util = require("util");
const fs = require("fs");
const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);

const call = (promise) =>
  promise.then(r => r == null ? ({result: r}) : r)
    .catch(error => ({error}));

const error = (result, msg) => ({error: result.error, message: msg});

async function main() {
  const fileContent = await call(readFile("./file.txt", "utf-8"));
  if (fileContent.error) {
    return error(fileContent, "Error while reading the file");
  }

  const writeResult = await call(writeFile("./file-copy.txt", fileContent));

  if (writeResult.error) {
    return error(writeResult, "Error while writing the file");
  }

  return "file-copy.txt";
}

main()
  .then(r => {
    if (r.error) {
      return console.log("An error occurred, recover here. Details:", r);
    }
    return console.log("Done, no error. Result:", r);
  })
  .catch(err => console.log("An error occurred", err));
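Note that the call helper is not tied to fs operations; it works with any promise. Below is a small sketch using in-memory promises (call is restated so the snippet is self-contained):

```javascript
const call = (promise) =>
  promise.then(r => r == null ? ({result: r}) : r)
    .catch(error => ({error}));

// A promise that resolves with a value passes straight through.
const ok = call(Promise.resolve("hello"));

// A promise that resolves with undefined is wrapped, so checking
// the error field on the result is always safe.
const empty = call(Promise.resolve(undefined));

// A rejected promise resolves to {error} instead of throwing.
const failed = call(Promise.reject(new Error("boom")));
```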

To reduce even more boilerplate and make things a bit more modular, we can do two things:

  • We can use fs-extra and remove all the calls to util.promisify.

After that, we will have the following:

const fs = require("fs-extra");
const {error, call} = require("../call");

async function main() {
  const fileContent = await call(fs.readFile("./file.txt", "utf-8"));
  if (fileContent.error) {
    return error(fileContent, "Error while reading the file");
  }

  const writeResult = await call(fs.writeFile("./file-copy.txt", fileContent));

  if (writeResult.error) {
    return error(writeResult, "Error while writing the file");
  }

  return "file-copy.txt";
}

main()
  .then(r => {
    if (r.error) {
      return console.log("An error occurred, recover here. Details:", r);
    }
    return console.log("Done, no error. Result:", r);
  })
  .catch(err => console.log("An error occurred", err));

Notice that since we are using fs-extra, if we don't pass a callback to a method, the function returns a promise by default. That's why we removed all the promisify calls and now call the fs methods directly on the fs object. Also, we moved the two helper functions into their own file called call.js.

Read & Write Multiple Files

In this section, we are going to write a script that reads the contents of multiple files and writes the results to new files.

You can access all the scripts for this section on Gitlab.

The setup for this example is very similar to the previous one:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];
  // ...
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, first we require the fs-extra module that has all the promise-based versions of the fs methods. Then, we define the main async function as the entry point of the program. We also define an array that contains the hard-coded paths to the files that we are going to be reading from.

Next, we are going to write a for-loop that goes through the file paths and reads the contents of each file:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  for (const file of files) { // A
    const content = await fs.readFile(file, "utf-8"); // B
    console.log(content); // C
  }
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

On line A we define the for-loop. And on line B we await on the result of fs.readFile and we assign it to the content variable. Finally, on line C, we log the content to the console. Let's replace the log statement with an actual write-to-file operation:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  for (const file of files) {
    const content = await fs.readFile(file, "utf-8");
    const path = file.replace(".txt", "-copy.txt"); // A
    const writeResult = await fs.writeFile(path, content); // B
  }

  return files; // C
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, first we define the path of the new file on line A. Then, on line B, we write the content to the new path, making sure to await the result. We need to await here because we want to be sure that the write is finished before we move on to the next file. And finally, on line C, we return the input file paths.

Now, the implementation above is okay, but we can do better. In the implementation above, we process each file one at a time. That is, we wait for the read-write operation for each file to finish before moving onto the next file. We can actually run each read-write process concurrently by creating an array of promises where each promise represents the read-write operation on a file. Finally, we can use Promise.all to process all the promises concurrently:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  const readWrites = []; // A

  for (const file of files) { // B
    readWrites.push((async () => { // C
      const content = await fs.readFile(file, "utf-8"); // D
      const path = file.replace(".txt", "-copy.txt"); // E
      return await fs.writeFile(path, content); // F
    })());
  }

  return await Promise.all(readWrites); // G
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, we define an array on line A to hold the read-write promises. On line B, we start the for loop that goes through each file path. On line C, we push a self-invoking async function to the readWrites array. Inside the body of each async function, we read the content of a file and write it to a new file. On line F, we return the result of fs.writeFile, which is a promise object. Finally, on line G, we use Promise.all to process all the promises concurrently, and we await the result, which resolves to a single array holding the write results. If the write operations are successful, we should get an array of undefined values. That's because the write method resolves to undefined if no errors occur.

Even though the implementation above gets the job done, we can do a little bit better. We can use the map method on the files array, with an async function, and eliminate the need for the self-invoking async function. It will also be a little easier to follow:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  const readWrites = files.map(async file => { // A
    const content = await fs.readFile(file, "utf-8"); // B
    return await fs.writeFile(file.replace(".txt", "-copy.txt"), content); // C
  });

  return await Promise.all(readWrites); // D
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, on line A we call map on the files array and we pass it an async function. Inside the async function we simply perform the read-write operation. And finally on line D, we call Promise.all and we pass the readWrites array. The readWrites array holds promises where each promise represents the result of each read and write.
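The map-then-Promise.all pattern is not specific to file I/O; it works for any async operation. Here is a stripped-down sketch with an in-memory async function (double and processAll are made-up names for illustration):

```javascript
// A stand-in for an async operation such as a file read.
const double = async n => n * 2;

async function processAll(numbers) {
  // map with an async function starts every operation
  // immediately and collects the resulting promises.
  const promises = numbers.map(async n => {
    const doubled = await double(n);
    return doubled + 1;
  });

  // Promise.all waits for all of them and preserves input order.
  return Promise.all(promises);
}
```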

Now, let’s expand on the example above. Let’s create a folder and put all the new files into it. We will need to create an async function that handles creating the output folder for us before we move onto the read-write operations:

async function prepare() {
  await fs.remove("output"); // A
  return await fs.mkdir("output"); // B
}

In the snippet above, first we create an async function called prepare. On line A, we remove the output folder if it already exists, and we wait for the promise to be resolved before moving on to line B. On line B, we create the output folder and wait for that to finish as well. Now, we can use the prepare function inside our main function, before starting the read-write operations:

const fs = require("fs-extra");

const files = ["files/file1.txt", "files/file2.txt"];
const output = "output";

async function prepare() {
  await fs.remove(output);
  return await fs.mkdir(output);
}

async function main() {
  await prepare(); // A

  const readWrites = files.map(async file => {
    const content = await fs.readFile(file, "utf-8");
    const path = file.replace("files", output); // B
    return await fs.writeFile(path, content);
  });

  return await Promise.all(readWrites);
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

On line A, we wait for the prepare function to finish before moving on to the read-write operations. We also updated the output file paths on line B. The rest of the script is pretty much the same. We also moved the files and output variables outside of the main function. If you run the script above, you should see an output folder that contains a copy of each input file.

Format CSV Files

In this section, we are going to write a script that reads a couple of CSV files, formats them, and writes the results to another directory.

You can access all the scripts for this section on Gitlab.

Below is an overview of each task that the script needs to perform:

  • Identify CSV files in a directory (one level deep) by checking the extensions and the file stats.

Using the tasks above, we can define the following steps:

  • Start by making an output folder if it doesn’t already exist. In other words, remove the output directory regardless, and create it again.

Using the flow above, we can define the following functions for each step:

async function setup() {}

async function csvFiles(inputFolder) {}

function format(content) {}

async function formatWrite() {}

async function main() {
  const src = "input-folder";
  const [output, files] = await Promise.all([ // A
    setup(), csvFiles(src),
  ]);

  return await Promise.all( // B
    files.map(async file => formatWrite(file, output))
  );
}

  • The setup function is going to handle creating the output directory. It will return a promise that resolves to the name of the output directory.

Let’s start with the setup function. The setup function doesn't do much. It removes the output folder, whether or not it exists. Then, it creates it and returns the name of the folder:

async function setup() {
  const output = "output";
  await fs.remove(output);
  await fs.mkdir(output);
  return output;
}

Next, let’s look at the csvFiles function. This function reads the contents of a folder, figures out which entries are files, and keeps only the files with the .csv extension. It performs that check only one level deep:

async function csvFiles(inputFolder) {
  const dirContent = await fs.readdir(inputFolder); // A
  const paths = dirContent.map(c => path.join(inputFolder, c)); // B

  return await Promise.all(paths.map(async p => { // C
    const isFileAndCSV =
      ((await fs.stat(p)).isFile() && /\.csv$/.test(p)); // D
    return isFileAndCSV ? p : "";
  }))
    .then(paths => paths.filter(v => v)); // E
}

  • On line A, we call fs.readdir to read the contents of the input folder. Then we wait for the result to come back and we store the results in the dirContent variable.
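The check on line D combines a file-type test with an extension test. The extension part can be pulled out as a pure function, along with the empty-string trick from lines D and E (isCsvPath and keepCsv are hypothetical helper names, not part of the article's script):

```javascript
// The extension check from line D, isolated as a pure function.
const isCsvPath = p => /\.csv$/.test(p);

// The trick from lines D-E: map non-matching paths to "",
// then filter the falsy values out.
const keepCsv = paths =>
  paths.map(p => (isCsvPath(p) ? p : "")).filter(v => v);
```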

The next function that we are going to look at is the formatWrite function. This function reads the contents of each file, parses it with a CSV parser, and then writes the formatted content to a new file:

async function formatWrite(file, output) {
  const content = await fs.readFile(file, "utf-8"); // A
  const parsed = await csvParse(content); // B
  const formatted = format(parsed); // C
  const stringified = await csvStringify(formatted); // D
  const outPath = path.join(output, file.split("/").slice(-1)[0]); // E
  await fs.writeFile(outPath, stringified); // F
  return file;
}

  • On line A, we read the content of a given file, wait for the result, and store it in the content variable.

That’s really it. Below is the complete program including all the require statements and the implementation for the format function:

const fs = require("fs-extra");
const util = require("util");
const path = require("path");
const csvParse = util.promisify(require("csv-parse"));
const csvStringify = util.promisify(require("csv-stringify"));

async function setup() {
  const output = "output";
  await fs.remove(output);
  await fs.mkdir(output);
  return output;
}

async function csvFiles(inputFolder) {
  const dirContent = await fs.readdir(inputFolder);
  const paths = dirContent.map(c => path.join(inputFolder, c));

  return await Promise.all(paths.map(async p => {
    const isFileAndCSV =
      ((await fs.stat(p)).isFile() && /\.csv$/.test(p));
    return isFileAndCSV ? p : "";
  }))
    .then(paths => paths.filter(v => v));
}

function format(content) {
  return content.map((v, i) => {
    if (i === 0) {
      return v.map(h => h.toUpperCase());
    }
    return v;
  });
}

async function formatWrite(file, output) {
  const content = await fs.readFile(file, "utf-8");
  const parsed = await csvParse(content);
  const formatted = format(parsed);
  const stringified = await csvStringify(formatted);
  const outPath = path.join(output, file.split("/").slice(-1)[0]);
  await fs.writeFile(outPath, stringified);
  return file;
}

async function main() {
  const src = "input-folder";
  const [output, files] = await Promise.all([
    setup(), csvFiles(src),
  ]);

  return await Promise.all(
    files.map(async file => formatWrite(file, output))
  );
}

main()
  .then(console.log)
  .catch(console.log);

If we need to process a large number of files concurrently, we can limit the number of files being processed at a time. For that, we can use a module like p-limit. After we require it, we can update the main function to limit the concurrent tasks to two promises at a time:

async function main() {
  const src = "input-folder";
  const [output, files] = await Promise.all([
    setup(), csvFiles(src),
  ]);

  /* limit concurrent tasks to 2 */
  const limit = pLimit(2); // A
  return await Promise.all(
    files.map(file => limit(() => formatWrite(file, output))) // B
  );
}

On line A, we create a limit function from pLimit and we specify how many concurrent tasks we want to run at a time. On line B, we wrap our formatWrite function with limit and it will take care of the rest. You can see the complete script on Gitlab.
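p-limit does more than this, but the core idea can be sketched by hand. The following is a simplified limiter shown for illustration only, without p-limit's error handling and optimizations (makeLimit is a made-up name):

```javascript
// A minimal concurrency limiter: at most `max` of the supplied
// task functions run at the same time.
function makeLimit(max) {
  let active = 0;
  const queue = [];

  const next = () => {
    if (active >= max || queue.length === 0) return;
    active++;
    const {task, resolve, reject} = queue.shift();
    // Run the task, settle the outer promise, then free the slot.
    task().then(resolve, reject).then(() => {
      active--;
      next();
    });
  };

  // Each call queues a task and returns a promise for its result.
  return task => new Promise((resolve, reject) => {
    queue.push({task, resolve, reject});
    next();
  });
}
```

A limiter created with makeLimit(2) could stand in for pLimit(2) in the snippet above, though the real module is the more robust choice.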

Conclusion

JavaScript has definitely come a long way and promises, along with async/await, have made it much easier to write better async programs. Now that we have reached the end of the article, let’s recap some important take-aways:

  • We can divide async tasks into concurrent and sequential flows. We can capture flows in promises and decide which parts of the programs should run concurrently and which ones sequentially.

Appendix 1: Installing Node

The easiest and the most consistent way of installing Node is through a version manager like NVM. First, install NVM using the following:

curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.11/install.sh | bash

Then check your “profile” file to see if the following entries have been added:

export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion" # This loads nvm bash_completion

Then restart your terminal and make sure that you can get an output for nvm --version. After that, simply run nvm install 8 to install the latest Node 8. Afterwards, run node -v and npm -v to verify that both Node and npm are available.

JavaScript In Plain English
