Build Faster Apps with Concurrent API Requests

Brian Davenport
BigCommerce Developer Blog
10 min read · Oct 8, 2019

Let’s say you want to build an app to retrieve lots of data over an API quickly and safely. Maybe you want to copy this data into a spreadsheet or your own database.

Depending on your experience, you may have tried putting API requests in a loop and found your app logging a pile of exceptions as it started getting blocked by the API. You may have also been surprised when the results came back out of order. With an asynchronous runtime like Node.js, it’s very easy to find yourself in this situation: Node will fire off all the requests in a loop at virtually the same time, without waiting for any of them to finish.

A simple approach is to wait for a response before sending each request. This is responsible, and your API provider will probably appreciate how kind you are being to their infrastructure. However, this approach doesn’t take advantage of concurrency, since you would be running requests one at a time. Depending on how quickly the API responds to each request, you might only achieve a rate of about 1 to 2 requests a second.
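
For context, the one-at-a-time approach usually looks something like the sketch below, where fetchPage is a hypothetical function that requests a single page of results and returns a Promise. Each iteration waits for the previous response before sending the next request:

// A minimal sketch of the sequential approach. fetchPage is hypothetical.
async function getAllPagesSequentially(fetchPage, totalPages) {
  const results = [];
  for (let page = 1; page <= totalPages; page++) {
    // Each request waits for the previous one to finish,
    // so throughput is limited by the API's response time.
    results.push(await fetchPage(page));
  }
  return results;
}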

If you’re using an API provider like BigCommerce that supports a higher rate of requests, it is possible to safely send multiple requests at once, getting your data faster (and in the order you want). In this article, we’ll build a simple app with Node.js with a configurable rate of API requests.

What We’ll Build

Imagine we have a client with thousands of coupon codes they want to audit. Currently, there isn’t a native way to export all the coupon codes from a BigCommerce store, but the coupon API should give us all the information we need to populate a CSV file.

We’ll identify how many coupons there are on a store, then segment our API requests so we can configure the number of requests to execute at the same time. When we receive coupons from the API, we’ll write them to a CSV file.

The completed project code is available here: https://github.com/bdav87/bc-couponexporter

When making 1 API request at a time, it takes about 17 seconds to write a little over 5000 coupon codes to a CSV file.
When making 5 API requests at a time, it takes about 4 seconds to do the same thing.
If your store is on the Enterprise plan, you can set a higher rate.

Setting Up

This project was developed using Node version 10, but it also runs on Node 8 and 12. Other Node versions have not been tested.

First, install the following dependencies in your project directory:

npm install --save dotenv fast-csv node-bigcommerce

Next, you’ll need your own set of API credentials to access coupons on a store. You can learn how to generate credentials in a BigCommerce store by following the guide here: https://developer.bigcommerce.com/api-docs/getting-started/authentication#authentication_getting-api-credentials

When you create your credentials, you’ll want to make sure you at least have the Marketing scope enabled. Since we’re only retrieving coupons, you can set this to read-only. If you want to use the same credentials and expand on the app later (maybe to import coupons) you can give yourself modify permissions.

Note: if your BigCommerce login is not in the store owner role, you won’t be able to create your own API account and you’ll need to get the store owner to generate API credentials for you. Alternatively, you can start a new BigCommerce trial yourself.

Our app will read API credentials from environment variables defined in a file named .env. By reading from this file, we avoid committing credentials to version control and can easily change which store we connect to without modifying our code. Create the .env file in your project root directory with the following values:

.env

CLIENT=your client id
TOKEN=your auth token
STOREHASH=your store hash

If you don’t have a lot of coupon codes on your store, you can quickly generate up to 1000 coupons at a time using the free Coupon Manager app: https://www.bigcommerce.com/apps/coupon-manager/

The performance difference based on rate of requests will be easier to see if you have several thousand coupons on your store.

Project Structure

We are going to abstract our app functionality into 2 modules: one for setting up the API client and one to handle all coupon-related functionality. Our app’s main index.js file will ultimately have only 3 lines of code!
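
Here’s a preview of where we’re heading: once the Coupon module is built, index.js will contain just these three lines (we’ll walk through each of them later in the article):

index.js

const Coupon = require('./lib/Coupon');
const coupon = new Coupon(5);
coupon.InitExport();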

First, create a directory in your project named lib. This folder will contain our 2 modules. Create files in the lib directory named BC.js and Coupon.js.

Creating the API Client

First, we’ll set up BC.js so that we can read our authentication credentials and pass them to a BigCommerce instance from the node-bigcommerce library. This will make it easy to make API requests with very little code.

/lib/BC.js

const nodebc = require('node-bigcommerce');
const dotenv = require('dotenv');
dotenv.config();

const BigCommerce = new nodebc({
  clientId: process.env.CLIENT,
  accessToken: process.env.TOKEN,
  storeHash: process.env.STOREHASH,
  responseType: 'json'
});

module.exports = BigCommerce;

This module will read credentials from our .env file and pass them to a BigCommerce object created from the node-bigcommerce module. We’re going to import this into our Coupon.js file so that we can call the BigCommerce API.

/lib/Coupon.js

const fs = require('fs');
const csv = require('fast-csv');
const BigCommerce = require('./BC');

We are also requiring the fs (filesystem) module and the fast-csv module so that we can write to a CSV file.

Creating the Coupon Class

Most of our functionality is going to live in /lib/Coupon.js, enabling us to initiate a CSV export with a simple function call in our main index.js file. We’re going to keep track of a queue of API requests and the export progress by setting values in the class constructor:

/lib/Coupon.js

class Coupon {
  constructor(rate = 1) {
    this.bc = BigCommerce;
    this.Streams = {};
    this.state = {
      totalPages: 0,
      queue: [],
      rate: rate,
      startTime: 0
    };
  }
  // --snip--
}

module.exports = Coupon;

Whenever we instantiate a new Coupon class, we can set the rate at which requests are made by passing in an integer. For example, this would set our rate at 5 concurrent requests:

index.js

const Coupon = require('./lib/Coupon');
const coupon = new Coupon(5);

If no value is passed, a default value of 1 request at a time is used to be safe.
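
For example, instantiating the class without an argument uses that default:

const safeCoupon = new Coupon(); // no rate passed, so requests run one at a time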

From our index.js file, it would be great if we could just call code like this to start the whole process of pulling coupons from the API and writing them to a CSV:

index.js

coupon.InitExport();

We’ll go over the functionality we need to make this happen. The Coupon class methods call one another to set up the different flows, so if you need help following along you can also reference the completed version of Coupon.js at https://github.com/bdav87/bc-couponexporter/blob/master/lib/Coupon.js

In Coupon.js our InitExport method will look like this:

InitExport() {
  this.state.startTime = Date.now();
  this.Streams = this.InitExportStream();
  this.GetAllCoupons();
}

To give ourselves an idea of how long the whole process takes based on the rate of concurrency we set, we’ll keep track of the time by getting a timestamp as soon as we initiate the export. Next, we’ll set up the file streams so that we can stream data to a CSV as we receive it.

This is in our next method, InitExportStream:

InitExportStream() {
  const filename = this.CreateCSVFile();
  const writableStream = fs.createWriteStream(filename);
  const csvStream = csv.format({ headers: true });
  csvStream.pipe(writableStream);
  const Streams = {
    writableStream: writableStream,
    csvStream: csvStream,
    filename: filename
  };
  return Streams;
}

A Streams object will be added to our Coupon class, enabling us to keep track of the data flowing into our file within an instance of Coupon. You can also see that we start by setting a filename with CreateCSVFile. The CreateCSVFile method will set up the CSV in our app root directory so that we can write to it, while appending a date to the filename so it’s easier to see when an export took place:

CreateCSVFile() {
  const date = new Date().toDateString().split(' ').join('_');
  const filename = `coupon-export-${date}.csv`;
  return filename;
}

Finally, in InitExport we call GetAllCoupons, which kicks off the process of retrieving information from the BigCommerce API.

GetAllCoupons() {
  this.CountPages()
    .then(this.PrepareQueue.bind(this));
}

We have to get a count of all coupon codes on the store so we can determine how many pages to iterate over. CountPages calls a method named CouponCount, which returns the total number of coupons on the store, then divides that count by 250, since 250 is the maximum number of resources you can receive per page from the API.
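
For example, if the count endpoint reports 5,100 coupons, CountPages returns Math.ceil(5100 / 250) = 21, so we’ll need to request 21 pages.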

CouponCount() {
  return this.bc.get(`/coupons/count`);
}

CountPages() {
  return (async () => {
    const { count } = await this.CouponCount();
    return Math.ceil(count / 250);
  })();
}

After the coupons are counted and the number of pages is determined, we can pass that value to the method PrepareQueue. PrepareQueue pushes page numbers into an array so we can keep track of which pages are being requested from the API at any given time, while also enabling us to control the rate at which we make API requests.

An aside about the syntax around the call to PrepareQueue in GetAllCoupons:
.then(this.PrepareQueue.bind(this)) is equivalent to
.then(pages => this.PrepareQueue(pages))

We have to use bind because passing the method reference on its own detaches it from our Coupon instance, so this would no longer point to the Coupon object when the Promise calls PrepareQueue. (An arrow function avoids the problem because it captures this from the surrounding scope.)
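
If you’d like to see the problem bind solves in isolation, here’s a minimal illustration (unrelated to the app itself):

// Class bodies run in strict mode, so a detached method loses its `this`.
class Example {
  constructor() { this.name = 'example'; }
  hello() { return this.name; }
}
const ex = new Example();
const detached = ex.hello;
// detached();                      // would throw: `this` is undefined inside hello
const bound = ex.hello.bind(ex);
console.log(bound());               // 'example'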

The PrepareQueue method uses a simple loop to populate our queue array with numbers representing each page:

PrepareQueue(pages) {
  this.state.totalPages = pages;
  for (let pageNumber = 1; pageNumber < pages + 1; pageNumber++) {
    this.state.queue.push(pageNumber);
  }
  return this.SetupRequestBlock();
}

With the queue populated with all the pages of coupons we need to retrieve, we can go into more detail around how we’re going to segment our concurrent requests and write the responses to a CSV.

Running Concurrent Requests

After PrepareQueue finishes populating the queue, SetupRequestBlock is called to start grabbing values from our queue and setting up API requests to get each page of coupons.

/lib/Coupon.js

SetupRequestBlock() {
  const block = this.state.queue.splice(0, this.state.rate);
  const start = block[0];
  const end = block[block.length - 1] + 1;
  console.log('Queue:', this.state.queue);
  console.log('Page(s) being requested:', block);
  const requestGroup = [];
  for (let index = start; index < end; index++) {
    requestGroup.push(this.GetPage(index));
  }
  if (block.length) {
    return this.ExecuteAPIRequests(requestGroup);
  } else {
    const finishTime = Date.now();
    console.log(`Time elapsed: ${Math.floor(finishTime - this.state.startTime) / 1000} seconds`);
  }
}

We splice groups of values off the queue array based on the rate defined when the Coupon class is first created. If we set the rate to 5, then 5 page numbers will be spliced off our queue each time SetupRequestBlock is called.
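
To make that splice behavior concrete, here’s a small illustration with a rate of 5:

// Illustration only: splice removes (and returns) the first `rate` page numbers.
const queue = [1, 2, 3, 4, 5, 6, 7, 8];
const block = queue.splice(0, 5);
console.log(block); // [ 1, 2, 3, 4, 5 ]
console.log(queue); // [ 6, 7, 8 ]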

In the for loop we are passing each page number to a function called GetPage.

GetPage(page) {
  return this.bc.get(`/coupons?limit=250&page=${page}`);
}

The nice thing about making requests with the node-bigcommerce library is that every request returns a Promise. This means we end up with an array of Promises when we push these requests to the requestGroup variable.

The requestGroup array is passed to ExecuteAPIRequests, which will wait for all the Promises to resolve before writing the API responses to a CSV.

async ExecuteAPIRequests(requestGroup) {
  try {
    const couponPages = await Promise.all(requestGroup);
    return this.WriteEachResultToCSV(couponPages);
  } catch (err) {
    console.log('Error executing requests: ', err);
  }
}

All of the requests will be sent at roughly the same time, and it doesn’t matter which order they finish in. When they’re all done, we can move forward with the results, which will be returned in an array (in the order the Promises were added to requestGroup) and passed to the couponPages variable.
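
If it helps to see that ordering guarantee in isolation, here’s a small illustration:

// Illustration only: Promise.all resolves with results in input order,
// even though the second promise settles first.
const slow = new Promise(resolve => setTimeout(() => resolve('page 1'), 200));
const fast = Promise.resolve('page 2');
Promise.all([slow, fast]).then(results => {
  console.log(results); // [ 'page 1', 'page 2' ]
});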

The WriteEachResultToCSV method actually takes the array of returned coupon pages and passes them along with a generator to another function that will cycle through each page and write each coupon to a row in the CSV.

WriteEachResultToCSV(pagesOfCoupons) {
  const total = pagesOfCoupons.length;
  const generator = this.SpawnGenerator(total);
  return this.CyclePages(generator, pagesOfCoupons);
}

We create a generator using the SpawnGenerator method, which returns a new generator to help us keep track of each page of coupons being written to the CSV.

SpawnGenerator(total) {
  function* generator(total) {
    for (let i = 0; i < total; i++) {
      yield i;
    }
  }
  return generator(total);
}

The CyclePages method will go through each page of 250 coupons we received from the API and write each coupon to a row in the CSV until we’ve gone through every page returned from our block of API requests.

CyclePages(generator, pagesOfCoupons) {
  const cycle = generator.next();
  if (!cycle.done) {
    const currentCouponPage = cycle.value;
    let counter = 0;
    pagesOfCoupons[currentCouponPage].forEach((coupon, _, page) => {
      this.WriteToCSV(coupon);
      counter++;
      if (counter == page.length) {
        this.CyclePages(generator, pagesOfCoupons);
      }
    });
  } else {
    this.SetupRequestBlock();
  }
}

Each page of coupons is an array of coupon objects, so in CyclePages we can call forEach and write each coupon value to a row in the CSV by passing the coupon to WriteToCSV.

WriteToCSV(coupon) {
  this.Streams.csvStream.write(this.FormatExportContent(coupon));
}

We pass each coupon to FormatExportContent to define how a coupon is written to the CSV. The value returned from this method is what’s actually streamed to the CSV.

FormatExportContent(coupon) {
  return {
    'Coupon ID': parseInt(coupon['id']),
    'Coupon Name': coupon['name'],
    'Discount Type': coupon['type'],
    'Min Purchase': coupon['min_purchase'],
    'Expires': coupon['expires'],
    'Enabled': coupon['enabled'],
    'Coupon Code': coupon['code'],
    'Applies To': JSON.stringify(coupon['applies_to']),
    'Number of Uses': coupon['num_uses'],
    'Max Uses': coupon['max_uses'],
    'Max Uses per Customer': coupon['max_uses_per_customer'],
    'Restricted to': coupon['restricted_to'],
    'Shipping Methods': coupon['shipping_methods'],
    'Date Created': coupon['date_created'],
  };
}

The fast-csv module takes the object keys and uses them as column headers, creating a new row with values organized correctly under each column in the CSV.

A generator conveniently returns true on its done property once it has finished iterating, and will continue to do so even if you keep calling next(). We call CyclePages recursively with the same generator, which yields the next index, letting us grab the next page of coupons to write to the CSV.
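
Here’s that done flag behavior in isolation:

// Illustration only: a finished generator keeps reporting done: true.
function* gen(total) {
  for (let i = 0; i < total; i++) {
    yield i;
  }
}
const g = gen(2);
console.log(g.next()); // { value: 0, done: false }
console.log(g.next()); // { value: 1, done: false }
console.log(g.next()); // { value: undefined, done: true }
console.log(g.next()); // { value: undefined, done: true }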

When we’ve finished writing the current set of data to the CSV, SetupRequestBlock is called again to splice another set of page numbers off our queue.

When there are no more values left in the queue, SetupRequestBlock will log the finish time, letting us know how long it took to get all the coupon values and write them to a CSV. The CSV file will be available in your app’s root directory.

Conclusion

You now have a simple coupon export app that can get data quickly with concurrent API requests, using a queue to manage how many requests are fired off simultaneously.

Try experimenting with the rate to see how quickly you can pull all the coupons off your store! Note that Enterprise stores are not strictly limited, but other plans will limit apps to roughly 5 requests per second.

You can also try adding your own features. Here are some suggestions:

  • Automatic rate limiting based on store plan
  • A retry feature for API requests that fail to resolve (see the sketch after this list for one possible starting point)
  • Export a different store resource to CSV, like gift certificates
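
As a starting point for that retry idea, here’s a rough sketch of a standalone helper. The getPageWithRetry function is hypothetical (it isn’t part of the completed project) and simply retries a failed page request a few times before giving up:

// Rough sketch only: retry a page request up to `attempts` times before surfacing the error.
async function getPageWithRetry(coupon, page, attempts = 3) {
  try {
    return await coupon.GetPage(page);
  } catch (err) {
    if (attempts <= 1) throw err; // out of retries
    console.log(`Request for page ${page} failed, retrying (${attempts - 1} left)`);
    return getPageWithRetry(coupon, page, attempts - 1);
  }
}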
