Fast and simple rate limiting with Ruby on Rails and Redis
In one of my side projects, I needed user-specific rate limiting to protect the web service from over-use by individual members. By leveraging the EXPIRE feature of Redis, I was able to build a simple, per-user rate-limiting function with just a few lines of code.
My Requirements
My requirements for the rate limiting were:
- User specific (don’t stop the web service for everyone but just for the individual)
- Short “bursts” of rapidly received requests are OK
- Stop the request early in the call stack
- Let the user know when they are over the limit
- Keep the impact on overall performance as low as possible
The Concept
At the time I was looking into Redis for other things and learned about its built-in EXPIRE functionality, which lets you set a time to live (TTL) for stored data (see the short sketch after the list below). That concept sparked the idea for my rate-limiting function:
- Count the requests of individual users over a defined time-span (in my case, 60 seconds)
- Set the counter to zero after the time-span is over
- If the user exceeds the limit of 30 requests within the time-span, block any subsequent requests
- Accept requests again after a cool-down period of 5 minutes
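To illustrate the mechanism, here is a minimal sketch (the key name and values are only examples; it assumes the redis gem and a locally running Redis server) of how EXPIRE lets a counter clean up after itself:

require "redis"

redis = Redis.new   # assumes a Redis server on localhost:6379

# create a counter and give it a time to live of 60 seconds
redis.set("demo_counter", 1)
redis.expire("demo_counter", 60)

redis.incr("demo_counter")   # => 2
redis.ttl("demo_counter")    # => remaining seconds before Redis deletes the key

# once the 60 seconds are over, the key is gone, GET returns nil,
# and the next request simply starts a fresh counting window
redis.get("demo_counter")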
The Implementation
Thanks to Redis’ EXPIRE functionality, this concept can be implemented without handling any of the timing logic in your own code, so the implementation shrinks to a few lines.
Installing Redis is not complicated. If you run your app on Heroku, as I do, you can simply activate a Redis add-on.
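For completeness, one way to make the connection available to the app is a small initializer. This is only a sketch; the global $redis used in the snippets below and the REDIS_URL environment variable are assumptions (hosted Redis add-ons typically expose the connection string through such a variable):

# config/initializers/redis.rb
require "redis"

# use the URL provided by the hosting environment, fall back to a local server
$redis = Redis.new(url: ENV.fetch("REDIS_URL", "redis://localhost:6379"))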
I included the function “track_api_usage” described below in the application controller of my Rails app. You can also put it somewhere else; just make sure that it is called very early in the call stack.
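Hooked in as a before_action, the limiter runs before every controller action. A minimal sketch of that wiring (assuming a standard Rails controller setup) could look like this:

# app/controllers/application_controller.rb
class ApplicationController < ActionController::Base
  # run the rate limiter before every action
  before_action :track_api_usage

  private

  def track_api_usage
    # the snippets below go here
  end
end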
The code snippets below are simplified for better readability.
track_api_usage
1. Set configuration parameters
# time-span to count the requests (in seconds)
watching_timespan=60
# maximum request allowed within the time-span
allowed_requests=30
# "cool-down" period in seconds
blocking_timespan=300
2. Compute an individual key for each user (needed for identifying the right data in Redis)
# compose key for counting requests
user_key="counting_#{user.id}"
# compose key for identifying blocked users
blocked_user_key="locked_#{user.id}"
3. Stop the request if the user is currently blocked
# check if there is a "blocked" entry in Redis
if $redis.get(blocked_user_key)
  # block user
  render :status => 429, :json => {:message => "You have fired too many requests. Please wait for a couple of minutes."}
  return false
end
4. If the request is accepted, set or increment the individual counter
# check if the user already has a counter
if $redis.get(user_key)
  # main action: increment the counter
  number_of_requests=$redis.incr(user_key)
  # check limit
  if number_of_requests > allowed_requests
    # write something into the log file for alerting
    Rails.logger.warn "Overheat: User with id #{user.id} is over usage limit."
    # mark the user as "blocked"
    $redis.set(blocked_user_key,1)
    # make the block expire by itself after the defined cool-down period
    $redis.expire(blocked_user_key,blocking_timespan)
  end
else
  # no key for counting exists yet - so set a new one with a ttl
  $redis.set(user_key,1)
  $redis.expire(user_key,watching_timespan)
end
Statistics and Alerts
As an interesting byproduct, you can visualize the Redis activity (e.g. with Redsmin) and get simple statistics about the usage of your service:

Additionally, you can set up a log file analyzer (e.g. Papertrail) to watch for the entries created by the rate limiter and be notified about over-usage.
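If you want to compute such simple statistics yourself, a small sketch like the following (assuming redis-rb's scan_each helper) counts the active and blocked users directly from the keys the limiter writes:

# users that sent requests within the current watching time-span
active_users = $redis.scan_each(match: "counting_*").count

# users that are currently blocked
blocked_users = $redis.scan_each(match: "locked_*").count

Rails.logger.info "API usage: #{active_users} active users, #{blocked_users} blocked"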
Impact on Overall Performance
Lastly, an analysis of the app performance with New Relic shows that the impact of the Redis read/write times is minimal:

