Writing Async Programs in JavaScript

Learn how to write asynchronous programs using promises and async/await

There is no doubt that JavaScript, despite its history, has become one of the most popular programming languages today. JavaScript, due to its asynchronous nature, can present some challenges for those who are new to the language. In this article, we are going to write small async programs using promises and async/await. Using these examples, we are going to identify some simple patterns that you can use in your own programs.

If you are new to JavaScript, you may want to first check out my other article before reading this one.

All the code examples in this article are written for the Node environment. If you don’t have Node installed, see Appendix 1 for instructions. Even though all the programs are written for Node, you can apply the same principles to scripts running in the browser. Also, all the code examples for this article are available on Gitlab.

Introduction

Whether or not people believe that JavaScript is a real programming language, the reality is that it isn’t going anywhere anytime soon. If you are a web developer, you might as well spend some time learning its good and bad parts.

JavaScript is single-threaded and favors non-blocking asynchronous flows. If you are new to the language, it can become really frustrating when things don’t work the way you expect them to. Asynchronous programming demands more patience and a kind of thinking that’s different from synchronous programming.

In the synchronous model, everything happens in sequence, one operation at a time. Because of that, it’s easier to reason about programs. But in the asynchronous model, operations can start or finish in any order at any point in time. Because of that, simply relying on a sequence is not sufficient. Asynchronous programming demands more thought in terms of program flow and design.
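
To make this concrete, here is a minimal sketch of that idea; the delay helper and the timings are made up purely for illustration, and are not part of the programs we will write later:

function delay(ms, label) {
  return new Promise(resolve => setTimeout(() => resolve(label), ms));
}

// Both operations are started together, but the order in which they
// finish depends on their delays, not on the order they were started.
delay(200, "first").then(label => console.log("Finished:", label));
delay(100, "second").then(label => console.log("Finished:", label));
// Logs "Finished: second" before "Finished: first".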

In this article, we are going to explore a couple of short async programs. We are going to start with simple programs and work our way up to more complex ones. Below is an overview of the scripts that we are going to be writing:

  • A script that writes the content of a file to another file.
  • A script that writes the contents of multiple files to new files.
  • A script that parses and formats CSV files in a directory and outputs new CSV files to another folder.

Promises and Async/await

Let’s take a moment and quickly review the basics of promises and async/await.

Promises

  • A promise is an object that represents the result of an asynchronous operation.
  • A promise is either resolved with a “success” value or rejected with a “failure” value.
  • Generally speaking, resolved values are accessed through the callback passed to a then block, and rejected values are accessed through the callback passed to a catch block.
  • In modern JavaScript environments, you can access a promise constructor through the global object as Promise.
  • A promise can be created with the Promise constructor using the new keyword. That is:
  • const p = new Promise((r, j) => {});
  • The r callback is used to resolve the promise with a value and the j callback is used to reject the promise.
  • The Promise constructor has some useful static methods like all, race, resolve, and reject. The all method accepts an array of promises and returns a promise that resolves to an array of the resolved values once all of them have resolved (or rejects as soon as any of them rejects). The race method takes an array of promises and resolves or rejects as soon as the first promise settles. The resolve method creates a promise that is resolved with the given value. The reject method creates a promise that is rejected with the given value. (A short sketch of these methods follows this list.)
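
Below is a minimal sketch of these basics; the values and the delay are made up purely for illustration:

// Creating a promise with the constructor and consuming it with then/catch.
const p = new Promise((resolve, reject) => {
  setTimeout(() => resolve("done"), 100); // resolve after 100 ms
});

p.then(value => console.log(value))   // "done"
  .catch(err => console.log(err));    // runs only if the promise rejects

// Promise.all resolves to an array of the resolved values.
Promise.all([Promise.resolve(1), Promise.resolve(2)])
  .then(values => console.log(values)); // [1, 2]

// Promise.race settles as soon as the first promise settles.
Promise.race([p, Promise.resolve("fast")])
  .then(value => console.log(value));   // "fast"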

Async/await

  • The purpose of async/await functions is to simplify the behavior of using promises synchronously and to perform some behavior on a group of Promises. From MDN
  • Just as Promises are similar to structured callbacks, async/await is similar to combining generators and promises. From MDN
  • A function can be marked as an asynchronous function with the async keyword. That is: async function hello() {} or const hello = async() => {};.
  • An async function always returns a promise. If a value is returned from an async function, it will be implicitly wrapped in a promise.
  • If there is an uncaught exception thrown inside an async function, the promise returned is rejected with the exception.
  • The await operator can be used inside an async function before an expression that evaluates to a promise. In that case, the execution of the function is "paused" until the promise is resolved or rejected.
  • The await operator is only valid inside an async function. (A short sketch of these points follows this list.)
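
Here is a minimal sketch of these points; the function names and values are made up purely for illustration:

async function hello() {
  return "hello"; // implicitly wrapped, so hello() returns a promise
}

async function fails() {
  throw new Error("boom"); // the returned promise is rejected with this error
}

async function run() {
  const greeting = await hello(); // execution "pauses" until the promise resolves
  console.log(greeting);          // "hello"
  await fails().catch(err => console.log("Caught:", err.message)); // "Caught: boom"
}

run();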

Read & Write a Single File

In this section, we are going to write a script that reads the contents of a single file and writes the result to a new file.

You can access all the scripts for this section on Gitlab.

First, we are going to create an async function for the entry point of our program:

async function main() {
  // body goes here...
}

Then, we need to create two promises: one that represents the contents of the file, and another that represents the result of writing that content to another file:

async function main() {
  const fileContent = readFile("./file.txt", "utf-8");
  const writeResult = writeFile("./file-copy.txt", fileContent);
}

In the snippet above, both readFile and writeFile are asynchronous, and they both return a promise. Because of that, first we need to make sure that we await the result of readFile so that we can use it in writeFile:

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8");
  const writeResult = writeFile("./file-copy.txt", fileContent);
}

And finally, we can decide what to return from the main function. Here we are just going to return the name of the new file that we are writing to. Note that the returned value will be automatically wrapped in a promise. But we need to make sure to await the result of writeFile before hitting the last line of the function:

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8");
  const writeResult = await writeFile(
    "./file-copy.txt", fileContent);
  return "file-copy.txt";
}

Now, we can call the main function and log the results or any uncaught exceptions to the console:

main()
  .then(r => console.log("Result:", r))
  .catch(err => console.log("An error occurred", err));

To make the program complete, we need to require the fs module and promisify the fs.readFile and fs.writeFile. The complete script is shown below:

const util = require("util");
const fs = require("fs");
const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8");
  const writeResult = await writeFile(
    "./file-copy.txt", fileContent);
  return "file-copy.txt";
}

main()
  .then(r => console.log("Result:", r))
  .catch(err => console.log("An error occurred", err));

In the snippet above, we promisify fs.writeFile and fs.readFile. util.promisify can turn any callback-style function into a promise-based function, provided that it follows the Node convention of taking an error-first callback as its last argument.
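
As a quick illustration, here is a minimal sketch of promisifying a Node-style callback function. The readConfig function below is hypothetical and only exists for this example:

const util = require("util");

// A hypothetical Node-style function: the callback receives the error
// first and the result second.
function readConfig(name, callback) {
  setTimeout(() => callback(null, {name, debug: true}), 100);
}

const readConfigAsync = util.promisify(readConfig);

readConfigAsync("app")
  .then(config => console.log(config)) // { name: 'app', debug: true }
  .catch(err => console.log(err));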

Now let’s talk a little about error handling. There are a couple of ways that you can approach error handling and it all depends on how much control you need. For example, in the snippet above, we are basically catching any error that could occur in the main function in the catch block. That works because inside an async function, any uncaught exception immediately causes the function to return a promise that is rejected with the exception.
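
Here is a minimal, made-up sketch of that behavior:

// An uncaught exception inside an async function turns into a rejected promise.
async function explode() {
  throw new Error("something went wrong");
}

explode()
  .then(r => console.log("Result:", r))                 // never runs
  .catch(err => console.log("An error occurred", err)); // catches the rejection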

But let’s say you need more control and you would like to do different things depending on possible errors for each async operation. In that case, you could either use a try-catch block or use a catch block on each async operation. First, let's look at using try-catch blocks.

async function main() {
  let fileContent;
  try {
    fileContent = await readFile("./file.txt", "utf-8");
  } catch(err) {
    return {message: "Error while reading the file", error: err};
  }

  try {
    const writeResult = await writeFile(
      "./file-copy.txt", fileContent);
  } catch(err) {
    return {message: "Error while writing the file", error: err};
  }

  return "file-copy.txt";
}

In the snippet above, we have added two try-catch blocks. Also, we have created the fileContent variable outside of the first block so that it can be visible throughout the main function. Notice that in each try-catch block, we return an object if there is an error. The error object contains a message field, and the details of the error. Now if any error happens, the function will immediately return with our custom error object. Remember that the returned object will be automatically wrapped in a promise. We can call the main function just like before, but this time we can check for error objects in the then block:

main()
  .then(r => {
    if(r.error) {
      return console.log(
        "An error occurred, recover here. Details:", r);
    }
    return console.log("Done, no error. Result:", r);
  })
  .catch(err => console.log("An error occurred", err));

Notice that in the then block we are checking if the resolved object has an error. If it does then we handle it there. Otherwise, we simply log the results to the console. The other catch block is going to catch any runtime errors or any other errors that are not handled by the program.

In addition to try-catch blocks, we can use catch blocks associated with each promise:

async function main() {
  const fileContent = await readFile("./file.txt", "utf-8")
    .catch(err => ({
      message: "Error while reading the file", error: err,
    }));

  if (fileContent.error) {
    return fileContent;
  }

  const writeResult = await writeFile(
    "./file-copy.txt", fileContent)
    .then(result => ({}))
    .catch(err => ({
      message: "Error while writing the file", error: err,
    }));

  if(writeResult.error) {
    return writeResult;
  }

  return "file-copy.txt";
}

Notice that we are calling the catch method on each promise and returning a custom error object, similar to the previous example. If there is an error in any step, we simply return the result, which is just our custom error object.

However, for the second operation, we are explicitly returning an empty object if the write operation is successful. That's because writeFile will resolve to undefined if the operation is successful. Because of that, we wouldn't be able to access the error field on an undefined value. That's why we explicitly return a promise that resolves to an empty object if the write operation is successful.

We can also optionally create two helper functions to save us from writing the same boilerplate code:

const call = (promise) =>
  promise.then(r => r == null ? ({result: r}) : r)
    .catch(error => ({error}));

const error = (result, msg) => ({error: result.error, message: msg});

The call function takes a promise and returns a promise that resolves either to a wrapper object (if the operation's result is null or undefined) or to the operation's result itself. And if there is an error, the promise resolves to an object that contains an error field with the error's value.

The error helper function takes a result and a message, and it will return an object that has the result's error and the custom optional message. After adding the two helper functions, we can update our main function:

async function main() {
  const fileContent = await call(readFile("./file.txt", "utf-8"));
  if(fileContent.error) {
    return error(fileContent, "Error while reading the file");
  }

  const writeResult = await call(
    writeFile("./file-copy.txt", fileContent));

  if(writeResult.error) {
    return error(writeResult, "Error while writing the file");
  }

  return "file-copy.txt";
}

As you can see, we are passing each operation to the call function. And then we check if there is an error. If so, then we simply call our error function to return a custom error with a custom error message. The complete snippet is shown below:

const util = require("util");
const fs = require("fs");
const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);

const call = (promise) =>
  promise.then(r => r == null ? ({result: r}) : r)
    .catch(error => ({error}));

const error = (result, msg) => ({
  error: result.error, message: msg});

async function main() {
  const fileContent = await call(readFile("./file.txt", "utf-8"));
  if(fileContent.error) {
    return error(fileContent, "Error while reading the file");
  }

  const writeResult = await call(
    writeFile("./file-copy.txt", fileContent));

  if(writeResult.error) {
    return error(writeResult, "Error while writing the file");
  }

  return "file-copy.txt";
}

main()
  .then(r => {
    if(r.error) {
      return console.log(
        "An error occurred, recover here. Details:", r);
    }
    return console.log("Done, no error. Result:", r);
  })
  .catch(err => console.log("An error occurred", err));

To reduce even more boilerplate and make things a bit more modular, we can do two things:

  • We can use fs-extra and remove all the calls to util.promisify.
  • We can also move the two helper functions into their own file.

After that, we will have the following:

const fs = require("fs-extra");
const {error, call} = require("../call");

async function main() {
  const fileContent = await call(
    fs.readFile("./file.txt", "utf-8"));
  if(fileContent.error) {
    return error(fileContent, "Error while reading the file");
  }

  const writeResult = await call(
    fs.writeFile("./file-copy.txt", fileContent));

  if(writeResult.error) {
    return error(writeResult, "Error while writing the file");
  }

  return "file-copy.txt";
}

main()
  .then(r => {
    if(r.error) {
      return console.log(
        "An error occurred, recover here. Details:", r);
    }
    return console.log("Done, no error. Result:", r);
  })
  .catch(err => console.log("An error occurred", err));

Notice that since we are using fs-extra, if we don't pass a callback to a method, the function returns a promise by default. That's why we removed all the promisify calls and now call the fs methods directly on the fs variable. Also, we moved the two helper functions into their own file called call.js.
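
The article doesn't show call.js itself, but based on the helpers defined earlier it could look something like this (a minimal sketch):

// call.js -- a minimal sketch based on the helpers defined earlier.
const call = (promise) =>
  promise.then(r => r == null ? ({result: r}) : r)
    .catch(error => ({error}));

const error = (result, msg) => ({error: result.error, message: msg});

module.exports = {call, error};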

Read & Write Multiple Files

In this section, we are going to write a script that reads the contents of multiple files and writes the results to new files.

You can access all the scripts for this section on Gitlab.

The setup for this example is very similar to the previous one:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];
  // ...
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, first we require the fs-extra module that has all the promise-based versions of the fs methods. Then, we define the main async function as the entry point of the program. We also define an array that contains the hard-coded paths to the files that we are going to be reading from.

Next, we are going to write a for-loop that goes through the file paths and reads the contents of each file:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  for (const file of files) { // A
    const content = await fs.readFile(file, "utf-8"); // B
    console.log(content); // C
  }
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

On line A we define the for-loop. And on line B we await on the result of fs.readFile and we assign it to the content variable. Finally, on line C, we log the content to the console. Let's replace the log statement with an actual write-to-file operation:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  for (const file of files) {
    const content = await fs.readFile(file, "utf-8");
    const path = file.replace(".txt", "-copy.txt"); // A
    const writeResult = await fs.writeFile(path, content); // B
  }

  return files; // C
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, first we define the path of the file on line A. Then, on line B, we write the result to the new path, and we make sure to await on it as well. We need to await here because we want to make sure that the write is finished before we move onto the next file. And finally on line C, we return the input file paths.

Now, the implementation above is okay, but we can do better. In the implementation above, we process each file one at a time. That is, we wait for the read-write operation for each file to finish before moving onto the next file. We can actually run each read-write process concurrently by creating an array of promises where each promise represents the read-write operation on a file. Finally, we can use Promise.all to process all the promises concurrently:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  const readWrites = []; // A

  for (const file of files) { // B
    readWrites.push((async () => { // C
      const content = await fs.readFile(file, "utf-8"); // D
      const path = file.replace(".txt", "-copy.txt"); // E
      return await fs.writeFile(path, content); // F
    })());
  }

  return await Promise.all(readWrites); // G
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, we define an array on line A to hold the read-write promises. On line B, we start the for loop that goes through each file path. On line C, we push a self-invoked async function to the readWrites array. Inside the body of each async function we read the content of each file and write to a new file. On line F, we return the result of fs.writeFile, which is a promise object. Finally, on line G we use Promise.all to process all the promises concurrently. We also await the result, which resolves to a single array holding the write results. If the write operations are successful, we should get an array of undefined values. That's because the write method resolves to undefined if no errors occur.

Even though the implementation above gets the job done, we can do a little bit better. We can use the map method on the files array, with an async function, and eliminate the need for the self-invoking async function. It will also be a little easier to follow:

const fs = require("fs-extra");

async function main() {
  const files = ["files/file1.txt", "files/file2.txt"];

  const readWrites = files.map(async file => { // A
    const content = await fs.readFile(file, "utf-8"); // B
    return await fs.writeFile(
      file.replace(".txt", "-copy.txt"), content); // C
  });

  return await Promise.all(readWrites); // D
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

In the snippet above, on line A we call map on the files array and we pass it an async function. Inside the async function we simply perform the read-write operation. And finally on line D, we call Promise.all and we pass the readWrites array. The readWrites array holds promises where each promise represents the result of each read and write.

Now, let’s expand on the example above. Let’s create a folder and put all the new files into it. We will need to create an async function that handles creating the output folder for us before we move onto the read-write operations:

async function prepare() {
  await fs.remove("output"); // A
  return await fs.mkdir("output"); // B
}

In the snippet above, first we create an async function called prepare. On line A, first we remove the output folder if it already exists. We also wait for the promise to be resolved before moving onto line B. On line B, we create the output folder and we also wait for that to finish. Now, we can use the prepare function inside our main function, before starting the read-write operations:

const fs = require("fs-extra");

const files = ["files/file1.txt", "files/file2.txt"];
const output = "output";

async function prepare() {
  await fs.remove(output);
  return await fs.mkdir(output);
}

async function main() {
  await prepare(); // A

  const readWrites = files.map(async file => {
    const content = await fs.readFile(file, "utf-8");
    const path = file.replace("files", output); // B
    return await fs.writeFile(path, content);
  });

  return await Promise.all(readWrites);
}

main()
  .then(console.log)
  .catch(err => console.log("An error occurred", err));

On line A, we wait for the prepare function to finish before moving onto the read-write operations. We also updated the output file paths on line B. The rest of the script is pretty much the same. We also moved the files and output variables outside of the main function. If you run the script above, you should see an output folder that contains a copy of each input file.

Format CSV Files

In this section, we are going to write a script that reads a couple of CSV files, formats them, and writes the results to another directory.

You can access all the scripts for this section on Gitlab.

Below is an overview of each task that the script needs to perform:

  • Identify CSV files in a directory (one level deep) by checking the extensions and the file stats.
  • Read the content of each file and parse the content using a CSV parser.
  • Perform basic formatting on each file.
  • Stringify formatted result and write each one to a new CSV file and place them in an output folder.

Using the tasks above, we can define the following steps:

  • Start with a clean output folder: remove the output directory if it already exists, then create it again.
  • Then, identify the CSV files in the given directory.
  • Then, read the contents of each file, do the formatting, and write the results to a new file in the output directory.

Using the flow above, we can define the following functions for each step:

async function setup() {}

async function csvFiles(inputFolder) {}

function format(content) {}

async function formatWrite() {}

async function main() {
  const src = "input-folder";
  const [output, files] = await Promise.all([ // A
    setup(), csvFiles(src),
  ]);

  return await Promise.all( // B
    files.map(async file => formatWrite(file, output))
  );
}
  • The setup function is going to handle creating the output directory. It will return a promise that resolves to the name of the output directory.
  • The csvFiles function is going to find all the CSV files in a given input folder and return an array of file paths.
  • The format function is going to do some basic formatting given the content of a parsed CSV file.
  • The main function is the entry point of the program. First, we are going to run setup and csvFiles concurrently. But we will wait for both of them to finish. After that, we are going to create an array of promises where each promise represents reading, formatting, and writing each file. We wait for that to finish and we return the result.

Let’s start with the setup function. The setup function doesn't do much. It removes the output folder, whether or not it exists. Then, it creates it and returns the name of the folder:

async function setup() {
  const output = "output";
  await fs.remove(output);
  await fs.mkdir(output);
  return output;
}

Next, let’s look at the csvFiles function. This function is basically going to read the contents of a folder, figure out which entries are files, and then keep only the files with the .csv extension. It only looks one level deep:

async function csvFiles(inputFolder) {
  const dirContent = await fs.readdir(inputFolder); // A
  const paths = dirContent.map(c => path.join(inputFolder, c)); // B

  return await Promise.all(paths.map(async p => { // C
    const isFileAndCSV =
      ((await fs.stat(p)).isFile() && /\.csv$/.test(p)); // D
    return isFileAndCSV ? p : "";
  }))
  .then(paths => paths.filter(v => v)); // E
}
  • On line A, we call fs.readdir to read the contents of the input folder. Then we wait for the result to come back and we store the results in the dirContent variable.
  • On line B, we create an array of full paths, starting with the input folder, and the result from the previous step.
  • On line C, we start identifying the CSV files.
  • On line D, first we call fs.stat to get the stat of a path. We then wait for the result and use that to identify if the given path is a file using the isFile method. We also use a regular expression to check the extension of the given file.
  • On line E, we filter out the falsy values. The map on line C produces an array of promises where each promise resolves to either a CSV file path or an empty string. We use Promise.all to fulfill the promises concurrently and, after filtering out the empty strings, we return the array of CSV file paths.

The next function that we are going to look at is the formatWrite function. This function is basically going to read the contents of each file, parse it with a CSV parser, and then write the formatted content to a new file:

async function formatWrite(file, output) {
  const content = await fs.readFile(file, "utf-8"); // A
  const parsed = await csvParse(content); // B
  const formatted = format(parsed); // C
  const stringified = await csvStringify(formatted); // D
  const outPath = path.join(
    output, file.split("/").slice(-1)[0]); // E
  await fs.writeFile(outPath, stringified); // F
  return file;
}
  • On line A, we read the content of a given file, wait for the result, and store it in the content variable.
  • On line B, we parse the result from the previous step and wait for it to finish. Then, we store the result in the parsed variable.
  • On line C, we call a simple format function on the results from the previous step and we store that in the formatted variable.
  • On line D, we call the csvStringify function on the formatted content and we wait for that to finish. Then, we store the result in the stringified variable.
  • On line E, we define the path to the output file by joining the output path with the name of the file.
  • On line F, we write the stringified content to the path from the previous line and we wait for that to finish.
  • And finally, we return the name of the input file that we processed.

That’s really it. Below is the complete program including all the require statements and the implementation for the format function:

const fs = require("fs-extra");
const util = require("util");
const path = require("path");
const csvParse = util.promisify(require("csv-parse"));
const csvStringify = util.promisify(require("csv-stringify"));

async function setup() {
  const output = "output";
  await fs.remove(output);
  await fs.mkdir(output);
  return output;
}

async function csvFiles(inputFolder) {
  const dirContent = await fs.readdir(inputFolder);
  const paths = dirContent.map(c => path.join(inputFolder, c));

  return await Promise.all(paths.map(async p => {
    const isFileAndCSV =
      ((await fs.stat(p)).isFile() && /\.csv$/.test(p));
    return isFileAndCSV ? p : "";
  }))
  .then(paths => paths.filter(v => v));
}

function format(content) {
  return content.map((v, i) => {
    if(i === 0) {
      return v.map(h => h.toUpperCase());
    }
    return v;
  });
}

async function formatWrite(file, output) {
  const content = await fs.readFile(file, "utf-8");
  const parsed = await csvParse(content);
  const formatted = format(parsed);
  const stringified = await csvStringify(formatted);
  const outPath = path.join(output, file.split("/").slice(-1)[0]);
  await fs.writeFile(outPath, stringified);
  return file;
}

async function main() {
  const src = "input-folder";
  const [output, files] = await Promise.all([
    setup(), csvFiles(src),
  ]);

  return await Promise.all(
    files.map(async file => formatWrite(file, output))
  );
}

main()
  .then(console.log)
  .catch(console.log);

If we need to process a large number of files concurrently, we can limit the number of files being processed at a time. For that, we can use a module like p-limit. After we require it, we can update the main function to limit the concurrent tasks to two promises at a time:

const pLimit = require("p-limit");

async function main() {
  const src = "input-folder";
  const [output, files] = await Promise.all([
    setup(), csvFiles(src),
  ]);

  /* limit concurrent tasks to 2 */
  const limit = pLimit(2); // A
  return await Promise.all(
    files.map(file => limit(() => formatWrite(file, output))) // B
  );
}

On line A, we create a limit function from pLimit and we specify how many concurrent tasks we want to run at a time. On line B, we wrap our formatWrite function with limit and it will take care of the rest. You can see the complete script on Gitlab.

Conclusion

JavaScript has definitely come a long way and promises, along with async/await, have made it much easier to write better async programs. Now that we have reached the end of the article, let’s recap some important take-aways:

  • We can divide async tasks into concurrent and sequential flows. We can capture flows in promises and decide which parts of the programs should run concurrently and which ones sequentially.
  • We can use Promise.all along with the map method of arrays to create promises and process them concurrently. We can also use the await operator before Promise.all to wait for all the promises to be resolved. That is:
  • await Promise.all(inputs.map(async v => {}));
  • If we want to handle a promise rejection with a try-catch block inside an async function, we need to use the await operator before the promise value or the function call that returns the promise; otherwise the rejection won't be caught by the catch block (see the sketch after this list).
  • If system resources are a concern or if we are dealing with large inputs, it’s usually a good idea to limit concurrent tasks. We can use packages like p-limit to limit the number of concurrent tasks running at a time.
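
For example, here is a minimal sketch of the point about try-catch blocks; the file path is made up for illustration:

const fs = require("fs-extra");

async function readOrDefault() {
  try {
    // With await, a rejected promise becomes a thrown exception
    // that the catch block below can handle.
    return await fs.readFile("./missing.txt", "utf-8");
  } catch (err) {
    // Without the await above, the rejection would escape this try-catch
    // and surface later as a rejected (or unhandled) promise.
    return "default content";
  }
}

readOrDefault().then(content => console.log(content));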

Appendix 1: Installing Node

The easiest and the most consistent way of installing Node is through a version manager like NVM. First, install NVM using the following:

curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.11/install.sh | bash

Then check your “profile” file to see if the following entries have been added:

export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion" # This loads nvm bash_completion

Then restart your terminal and make sure that nvm --version produces a version number. After that, simply run nvm install 8 to install the latest Node 8. Afterwards, run node -v and npm -v to verify that both Node and npm are available.