Mowing Through API Request Limiting with Redis

Illustration by: Tara Jacoby (source)

Imagine you wrote a web app that also provides a REST API to users. Also imagine this app is just a hobby project, so between your day job, family life, yard work, and cryptocurrency trading you only have moments here and there to work on it. Over time the app has earned the privilege of having a decent number of users who leverage the API. Inevitably, some user’s script is going to go nuts and pound your poor web server at around 50 requests per second. What do you do? You implement a proper API gateway solution like Kong, which has plugins to handle API traffic control. That’s not what I did!

My app is called HoneyDB, and it’s a bit of a monster. A Frankenstein to be exact. In an effort to keep the cost of operating the app and APIs low, it is made up of free-tier services from multiple cloud providers. Like I said, it’s a Frankenstein. In short, it looks something like this…

To address the active pounding on the API, which was causing a noticeable performance impact on other queries, I took a look at the popular Kong solution — just like any hip API coder would. I installed it in a VM to try it out, and it was pretty easy to get up and running. But then I started to look at the plugins for rate limiting, and while on the surface it didn’t seem too complicated, I started to think about the changes I’d have to make to my existing code and stack to integrate everything properly. If I had the time to map it all out carefully, I’m sure it could be done. But what I really needed was to at least get the front yard mowed!


Mowing gives you time to think. Instead of mapping out how to forklift my existing app to Kong, I remembered I had signed up for a free tier account with Redis Labs months ago. At the time I had no idea what I would use it for, but Redis is an extremely performant key-value store, and it was free. I figured someday I’d think of a use for it, and it could become another limb for my Frankenstein. As I pushed my EGO 21-inch 56-volt lawn mower across the front yard, I realized: I can use Redis to track the number of API requests per user! Better yet, it could be done with just a few minor code changes.

Fortunately, there is already a great PHP library for Redis, so installing it was as simple as running apt-get. Next, I needed a way to identify a unique user, plus two new functions in the app: one to increment the count of requests per API user, and another to check whether the user has exceeded their limit. Since API users are required to submit an API_ID with each request, I had an easy way to uniquely identify users. After testing things out with a few test scripts, like connecting to Redis and setting and retrieving counter values, I was ready to put the pieces together in code. The only methods I needed from the PHP Redis library were:

  • connect(), to connect to the Redis database.
  • auth(), to authenticate to the Redis database.
  • incr(), to increment values in the Redis database.
  • get(), to retrieve values from the Redis database.
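These four calls cover the whole round trip. A minimal sketch with the phpredis extension might look like the following; the host, port, and password are placeholders, since the real values come from the Redis Labs account console:

```php
<?php
// Connect and authenticate to the hosted Redis instance.
// Host, port, and password here are placeholders.
$redis = new Redis();
$redis->connect('redis-12345.example.redislabs.com', 6379);
$redis->auth('your-redis-password');

// incr() creates the key (starting from 0) if it does not exist,
// then increments it; get() reads the current value back.
$redis->incr('test_api_id');
echo $redis->get('test_api_id');
```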

Simple enough. However, before I could start coding this up in the app, the backyard was calling. Mowing the backyard gave me time to think about how I would keep track of individual user limits, rather than having one static limit for all users. Also, since the limits need to be enforced on a monthly basis, I needed to work out how the app would reset the limit counters at the end of each month. The solutions for these two requirements were straightforward. Adding a limit column to the API user table, which is in a MySQL database, takes care of individual user limits. For the second requirement, a simple cron job that resets all request counters at the start of every month does the trick. With the backyard freshly mowed, I was ready to put all the pieces together!

The first function I integrated into the app was the api_request() function. This function is called every time a user makes a request to one of the API endpoints. Using the user’s API_ID as the key, it simply increments a counter value. When calling the incr() method, if the key does not already exist in the database, Redis automatically creates a new entry for it.

Example function api_request() for incrementing a counter based on the $id parameter.
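A minimal sketch of such a function, assuming $redis is an already connected and authenticated phpredis client:

```php
<?php
// Increment the request counter for an API user.
// $redis is assumed to be a connected, authenticated phpredis client;
// $id is the user's API_ID, used directly as the Redis key.
function api_request($redis, $id)
{
    // incr() creates the key with a value of 1 if it does not exist,
    // otherwise it increments the existing counter.
    return $redis->incr($id);
}
```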

The next function, api_limit_exceeded(), is called right after each request to an API endpoint is authenticated. It takes two parameters, the API_ID and the limit value, which defaults to 1500 if a limit value is not provided. The function retrieves the current request count value and compares it to the limit. It returns true if the count is over the limit value, and returns false if not. When the function returns true, the app will immediately respond with the HTTP status code 429 — Too Many Requests.

Example function api_limit_exceeded() to check request counts against request limits.
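A sketch of this function, again assuming $redis is a connected phpredis client, with the default limit of 1500:

```php
<?php
// Return true if the user's request count is over their limit.
// $redis is assumed to be a connected, authenticated phpredis client.
function api_limit_exceeded($redis, $id, $limit = 1500)
{
    // get() returns false for a missing key; casting to int yields 0,
    // so users with no requests yet are never over the limit.
    $count = (int) $redis->get($id);
    return $count > $limit;
}
```

In the endpoint handler, a true result would translate into the 429 response, for example by calling http_response_code(429) and exiting before any query work is done.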

With these two functions implemented, I had all the counting, checking, and enforcement I needed to implement API request limiting. The final piece was to establish a way to reset request counters on a monthly basis. For this task I needed two additional methods from the PHP Redis library:

  • keys(), to return keys in the Redis database based on a specified pattern.
  • set(), to set values in the Redis database.

The example script below calls keys() with the pattern “*”, e.g. keys(‘*’), which returns all of the keys in the database. It then loops through the keys, setting each value to zero. I configured this script to run as a cron job at the start of every month, and with that my solution was complete.

Example script to set all key values to 0.
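A sketch of the reset script, with the same placeholder connection details as before:

```php
<?php
// Monthly counter reset, run from cron at the start of each month.
// Connection details are placeholders.
$redis = new Redis();
$redis->connect('redis-12345.example.redislabs.com', 6379);
$redis->auth('your-redis-password');

// keys('*') returns every key in the database; at this scale that is
// fine, though SCAN would be the safer choice for large keyspaces.
foreach ($redis->keys('*') as $key) {
    $redis->set($key, 0);
}
```

A crontab entry along the lines of 0 0 1 * * php /path/to/reset_counters.php (path hypothetical) runs it at midnight on the first of each month.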

The End Result

Is this the most sophisticated API traffic control solution? No way! Was it relatively quick to build, and does it work well? Yes way! More importantly, is my lawn looking great? For the time being, oh yes! Could I have implemented Redis locally on my web server? Yes, but why bother? With the cloud service, I don’t have to worry about configuring and maintaining another service. Plus, it is fast enough for my needs. As a bonus, the Redis Labs console provides some awesome metrics.

Awesome metrics

However, the best end result is making out-of-control client scripts that are pounding on my API eat 429s:

Finally, if you want to test out this request limits implementation, and at the same time leverage some pretty good threat information to help defend your network or applications, give the HoneyDB API a try. You can find the details here. To easily get started with querying the API from the command line or via your own Python script, there is a Python package available here.

This was a great little weekend project that solves a potentially big problem, namely web app and API outages. So with this complete, and the lawn complete, I’m feeling a bit more relaxed.