When building web applications, optimizing performance is essential, especially for APIs that serve multiple requests. One way to enhance performance is by implementing caching mechanisms.
In our previous articles, we explored the use of FastAPI, covered basic CRUD operations in “Mastering CRUD Operations with FastAPI: A Step-by-Step Guide”, and demonstrated how to connect to a MySQL database in “FastAPI with SQLAlchemy: Building Scalable APIs with a Database Backend”. In this article, we’ll delve into building a REST API using FastAPI and leveraging Redis for caching, which will significantly improve your API’s performance.
Why Redis?
Redis is an in-memory key-value store known for its speed and versatility. By caching frequently requested data, it reduces the number of times an API has to query the database, which significantly cuts response times. It supports various data structures such as strings, hashes, lists, sets, and more.
Redis is known for its:
- High performance
- Flexibility
- Wide language support
- Persistence
- Replication and high availability
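As a quick illustration of the key-value model, here is a minimal redis-py snippet (assuming a Redis instance is running locally on the default port) that stores and reads back a string:
import redis

# Assumes a local Redis instance on localhost:6379
r = redis.Redis(host='localhost', port=6379, db=0)

r.set("greeting", "hello")   # store a string value under a key
print(r.get("greeting"))     # b'hello' -- values come back as bytes by default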
Setting Up the Environment
Before we begin coding, you need to install the required libraries:
pip install fastapi uvicorn redis
Ensure that you have a Redis instance running. You can either run Redis locally or use a cloud-based Redis service like AWS ElastiCache or Redis Labs.
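If you have Docker installed, one quick way to start a local instance is:
docker run --name redis-cache -p 6379:6379 -d redis
(The container name redis-cache is just an example; any name works.)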
Step 1: Creating the FastAPI App
We’ll start by creating a simple FastAPI application.
from fastapi import FastAPI
app = FastAPI()
@app.get("/items/{item_id}")
async def read_item(item_id: int):
    return {"item_id": item_id}
This is a basic FastAPI route that returns the requested item_id. Now, we’ll improve it by adding Redis caching.
Step 2: Connecting to Redis
Next, let’s connect our FastAPI app to a Redis instance. We’ll use the redis-py library, which provides a Python interface for interacting with Redis.
import redis
from fastapi import FastAPI
app = FastAPI()
# Connect to Redis
redis_client = redis.Redis(host='localhost', port=6379, db=0)
@app.get("/items/{item_id}")
async def read_item(item_id: int):
    cached_item = redis_client.get(f"item_{item_id}")
    if cached_item:
        return {"item_id": item_id, "cached": True, "data": cached_item.decode('utf-8')}

    # Simulate the data fetching process
    item_data = f"Item data for {item_id}"

    # Store the item in Redis for future requests
    redis_client.set(f"item_{item_id}", item_data)
    return {"item_id": item_id, "cached": False, "data": item_data}
- We attempt to fetch the item from Redis using redis_client.get(f"item_{item_id}").
- If it exists, we return the cached data.
- If not, we simulate fetching the data and cache it using redis_client.set (see the JSON sketch after this list for caching structured data).
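The example above caches a plain string. In a real application the fetched data is usually a dict or a database row, so a common approach (sketched below with hypothetical example data, not taken from the original code) is to serialize it to JSON before caching:
import json

import redis
from fastapi import FastAPI

app = FastAPI()
redis_client = redis.Redis(host='localhost', port=6379, db=0)

@app.get("/items/{item_id}")
async def read_item(item_id: int):
    cached_item = redis_client.get(f"item_{item_id}")
    if cached_item:
        # Deserialize the cached JSON string back into a dict
        return {"item_id": item_id, "cached": True, "data": json.loads(cached_item)}

    # Simulate fetching structured data (e.g. a row from a database)
    item_data = {"name": f"Item {item_id}", "price": 9.99}

    # Serialize to JSON before storing in Redis
    redis_client.set(f"item_{item_id}", json.dumps(item_data))
    return {"item_id": item_id, "cached": False, "data": item_data}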
Step 3: Adding Cache Expiration
Caching forever is not ideal because data can become stale. To manage this, we can add an expiration time to our cached items.
@app.get("/items/{item_id}")
async def read_item(item_id: int):
    cached_item = redis_client.get(f"item_{item_id}")
    if cached_item:
        return {"item_id": item_id, "cached": True, "data": cached_item.decode('utf-8')}

    item_data = f"Item data for {item_id}"

    # Store the item in Redis with an expiration time of 1 hour (3600 seconds)
    redis_client.setex(f"item_{item_id}", 3600, item_data)
    return {"item_id": item_id, "cached": False, "data": item_data}
Now, the cached data will expire after one hour, ensuring that the cache stays fresh.
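If you want to confirm that the expiration is in place, redis-py exposes the remaining time-to-live through the ttl method; using the redis_client from earlier:
remaining = redis_client.ttl("item_1")
print(remaining)  # e.g. 3598 shortly after caching; -1 means no expiry, -2 means the key does not exist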
Step 4: Testing the Application
To run the FastAPI app, use Uvicorn:
uvicorn main:app --reload
Visit http://127.0.0.1:8000/items/1 in your browser or use curl to test:
curl http://127.0.0.1:8000/items/1
You’ll notice that the first request fetches uncached data, while subsequent requests return the cached version.
Output from a subsequent (cached) request:
{"item_id":1,"cached":true,"data":"Item data for 1"}
Step 5: Clearing the Cache
In some scenarios, you may need to manually clear or update the cache, for example when the underlying data changes. This can be done using the redis_client.delete method.
@app.delete("/items/{item_id}")
async def delete_item(item_id: int):
    redis_client.delete(f"item_{item_id}")
    return {"status": "cache cleared"}
Now, calling DELETE /items/{item_id} will remove the item from the cache, forcing the API to fetch fresh data on the next request.
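A related pattern is to refresh the cache whenever the underlying data changes, so the next read is already up to date. The endpoint below is a hypothetical illustration (not part of the original example); in a real application the new value would be written to the database first:
@app.put("/items/{item_id}")
async def update_item(item_id: int, data: str):
    # Refresh the cached entry with the new value and the same 1-hour expiry
    redis_client.setex(f"item_{item_id}", 3600, data)
    return {"item_id": item_id, "data": data, "cache_refreshed": True}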
Conclusion
Implementing Redis caching in your FastAPI application can greatly improve your API’s performance, especially for endpoints that serve repetitive or computationally expensive data. Redis helps reduce database load and speeds up response times, leading to a more responsive user experience.
FastAPI’s async support and Redis’s high-speed caching make for a powerful combination, enabling you to build scalable and efficient APIs.
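If you want the Redis calls themselves to be non-blocking inside async endpoints, recent versions of redis-py also ship an asyncio client (redis.asyncio). Here is a minimal sketch of the same cached endpoint using it, assuming a redis-py version that includes the asyncio client:
import redis.asyncio as redis
from fastapi import FastAPI

app = FastAPI()
redis_client = redis.Redis(host='localhost', port=6379, db=0)

@app.get("/items/{item_id}")
async def read_item(item_id: int):
    # Await the Redis calls so they don't block the event loop
    cached_item = await redis_client.get(f"item_{item_id}")
    if cached_item:
        return {"item_id": item_id, "cached": True, "data": cached_item.decode('utf-8')}

    item_data = f"Item data for {item_id}"
    await redis_client.setex(f"item_{item_id}", 3600, item_data)
    return {"item_id": item_id, "cached": False, "data": item_data}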
Happy coding!