Make more efficient requests with DataLoader

Use Facebook’s DataLoader library to make more efficient requests without breaking a sweat.

4 min read · Feb 13, 2018


We all want our network requests to be faster and more efficient. I mean, who doesn’t want that? There are a lot of libraries and services out there that can help us with that, but what if we want something simple?

DataLoader can help us with just that.

We put it into practice here at GAPLabs, and it reduced the data fetching time of some endpoints from around 500 milliseconds before caching to around 10 milliseconds after caching, which is around a 50x improvement (measured with Apollo Tracing).

Caveats

A little caveat here from Facebook:

DataLoader caching does not replace Redis, Memcache, or any other shared application-level cache. DataLoader is first and foremost a data loading mechanism, and its cache only serves the purpose of not repeatedly loading the same data in the context of a single request to your Application.

Basically, DataLoader lets you avoid re-fetching data you already have by caching it, but if you need a cache for other purposes, you’ll still need other libraries and/or structures for that.

To use DataLoader, you also need to be familiar with Promises, so if you’re not familiar with them, check out the MDN docs on Promises. While you’re at it, check out my article on async and await for an easier way of working with Promises.

What does using DataLoader look like?

Once you’ve gotten your DataLoader instance initialized, you can fetch data like this:

Getting data via the `load()` method.
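That example boils down to something like this sketch (getUsers is just a hypothetical wrapper so we can use await):

async function getUsers() {
  // Ask the loader for the data associated with the '/users' key
  const users = await loader.load('/users')
  console.log(users)
}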

Here, assume that loader is an initialized DataLoader instance.

The following line:

const users = await loader.load('/users')

will get the response corresponding to the /users argument.

Then the result will be logged on the console.

The result of the loader.load() call is cached automatically, so when it is called again with the same key, it responds instantly without making another request.

Very easy to use, isn’t it?

We will see how to create a DataLoader instance later, but for now let’s take a look at what DataLoader really is.

What is DataLoader?

According to their GitHub repo:

DataLoader is a generic utility to be used as part of your application’s data fetching layer to provide a simplified and consistent API over various remote data sources such as databases or web services via batching and caching.

There are a lot of buzzwords in there, but essentially, DataLoader is a JavaScript library that allows you to:

  • Send requests by batch
  • Avoid making costly requests via caching
  • Get data from different sources
  • Get data using just the load() and loadMany() methods.

While it is a JavaScript library, there are also implementations in other languages.

Creating a DataLoader instance

Importing DataLoader

First of all, we need to get the DataLoader library with:

npm install dataloader --save
or
yarn add dataloader

then we import it into our project with:

const DataLoader = require('dataloader')
or
var DataLoader = require('dataloader')

Creating the instance

To create the instance we do:

const loader = new DataLoader(loaderFunction);

The DataLoader constructor accepts a loaderFunction and returns a loader instance.

The loaderFunction is a batch loading function: it accepts an array of keys and returns a Promise that resolves to an array of values in the same order (and of the same length) as the array of keys.

An example would be:

Here we define the loaderFunction and create an instance of DataLoader using that function.
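A minimal sketch of such a loaderFunction, assuming each key is an API path like '/users' and a hypothetical https://example.com base URL (fetch is used here purely for illustration):

const DataLoader = require('dataloader')

// Batch loading function: takes an array of keys and returns a Promise that
// resolves to an array of values in the same order as the keys
const loaderFunction = (keys) =>
  Promise.all(
    keys.map((key) =>
      fetch(`https://example.com/api${key}`).then((response) => response.json())
    )
  )

// Create the DataLoader instance using the batch loading function
const loader = new DataLoader(loaderFunction)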

Using the DataLoader instance

Fetching Data

Now that we have a loader instance, we can start fetching data. We can do so by running either the load() or the loadMany() method.

A load() method example.
A loadMany() method example.
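A rough sketch of both calls, reusing the loader from above (fetchData is just a hypothetical wrapper so we can use await):

async function fetchData() {
  // load() takes a single key and resolves to a single value
  const users = await loader.load('/users')

  // loadMany() takes an array of keys and resolves to an array of values,
  // in the same order as the keys
  const [allUsers, allPosts] = await loader.loadMany(['/users', '/posts'])

  console.log(users, allUsers, allPosts)
}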

As you can see, the load() method takes a single key while the loadMany() method takes an array of keys. Both methods return Promises that resolve to values.

Caching and Batching

You don’t have to do anything special to get caching and batching with DataLoader.

The load() and loadMany() methods cache their results, so the next time you request the same key, they give you the value immediately without making the request again. This saves you from making additional network requests, which are, by nature, expensive.

Requests are also automatically batched when using the load methods. Requests made within the same tick of the event loop are batched together to reduce the number of requests made.
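As a sketch (reusing the same loader), two load() calls made in the same tick are combined into a single call to the loaderFunction, and a repeated key is answered from the cache:

async function batchedAndCached() {
  // Both calls happen in the same tick of the event loop, so DataLoader
  // calls the loaderFunction only once, with the keys ['/users', '/posts']
  const [users, posts] = await Promise.all([
    loader.load('/users'),
    loader.load('/posts'),
  ])

  // Same key as before: resolved from the cache, no new request is made
  const cachedUsers = await loader.load('/users')

  console.log(users, posts, cachedUsers)
}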

Clearing the Cache

To clear the DataLoader cache, we just use the clearAll() method:

loader.clearAll();

Conclusion

As you can see, DataLoader makes your requests more efficient with very little effort, thanks to the caching and batching built into the load() and loadMany() methods.

Complete code

Complete DataLoader code example.
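Putting it all together, a minimal end-to-end sketch looks roughly like this (the base URL, endpoints, and use of fetch are assumptions for illustration):

const DataLoader = require('dataloader')

// Batch loading function: each key is treated as an API path
const loaderFunction = (keys) =>
  Promise.all(
    keys.map((key) =>
      fetch(`https://example.com/api${key}`).then((response) => response.json())
    )
  )

// Create the DataLoader instance
const loader = new DataLoader(loaderFunction)

async function main() {
  // Fetch a single resource; the result is cached under the key '/users'
  const users = await loader.load('/users')
  console.log(users)

  // Fetch several resources at once; calls in the same tick are batched
  const [allUsers, allPosts] = await loader.loadMany(['/users', '/posts'])
  console.log(allUsers, allPosts)

  // Clear the cache when the underlying data may have changed
  loader.clearAll()
}

main()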

If you want to know more about DataLoader, check out their GitHub repo.

Thank you for reading this article! If you liked it, hold the clap button below! Thank you! 👏
