Building a Real-Time E-Commerce Platform with Python: A Comprehensive Guide to FastAPI, Kafka, and Docker
Repository
Tech Stack
- Python
- FastAPI
- Kafka
- PostgreSQL
Overview
The Setup: A digital marketplace where users shop to their heart’s content.
The Heroes:
- Order Service: The friendly cashier, taking orders and announcing them to the world. Equipped with FastAPI for its swift API responses, it communicates through the loudspeaker known as Kafka to make sure everyone (well, every service) hears about the new orders.
- Inventory Service: The ever-alert store manager, always on the lookout. With the precision of SQLAlchemy, it keeps an eye on stock levels, ensuring everything’s in order. And if you ever want to peek into the inventory, it’s got FastAPI’s OpenAPI docs ready for you to browse through.
For those eager beavers who want to jump straight into the code, here’s the GitHub repository. But if you’re keen on understanding the journey, read on!
Setting Up the Project with Poetry
In the vast universe of Python, there’s a tool that stands out for its simplicity and elegance: Poetry. But why all the buzz around it? Let’s dive in.
Why Poetry?
When it comes to managing Python projects, there’s no shortage of tools. But Poetry brings something different to the table. It’s not just about managing dependencies or packaging; it’s about creating a seamless environment where everything just… works. No more wrestling with pip, setuptools, or dependency conflicts. Poetry handles it all, making your development process smoother and more enjoyable.
Kickstarting the Order Service Setup
Create the main project directory:
mkdir realtime-order-processing
cd realtime-order-processing
Initialize a Git repository (optional but recommended):
git init
Create the basic directory structure:
mkdir -p order_service/src
mkdir -p inventory_service/src
Navigate to the Project Directory:
cd realtime-order-processing/order_service
Initialize a New Project: If you’re starting from scratch:
poetry new order_service
Setting the Python Version: Ensure you’re using the right Python version by setting it in pyproject.toml:
[tool.poetry.dependencies]
python = "^3.9"
Installing Dependencies: Add the necessary packages for our service:
poetry add fastapi[all] sqlalchemy kafka-python pydantic uvicorn psycopg2-binary
This will install FastAPI for our REST API, SQLAlchemy for database interactions, Kafka for event publishing, Pydantic for data validation, Uvicorn as our ASGI server, and psycopg2-binary for PostgreSQL support.
Managing Dependencies: Simplified
With Poetry, dependency management is straightforward. The pyproject.toml file is your go-to for all dependencies. Add, update, or remove packages as you need, and let Poetry handle the intricacies. It ensures your project's environment remains consistent, so you can focus on coding, not troubleshooting.
In essence, Poetry isn’t just a tool; it’s a companion for your Python journey, ensuring that the road is smooth and the journey enjoyable.
Diving into Docker: Simplifying Development and Deployment
In the modern software development landscape, Docker has emerged as a game-changer. But why is it so revered? Let’s embark on this journey to understand Docker’s significance and how it integrates seamlessly with our e-commerce platform.
Why Docker?
Imagine a world where you can wrap your application, with all its dependencies, into a single package and then run it consistently across various environments. That’s Docker for you! It ensures that “it works on my machine” is a phrase of the past. Docker containers encapsulate everything an application needs to run, ensuring consistency, scalability, and isolation.
Crafting the Dockerfile for order_service
The heart of Docker is the Dockerfile. It’s a script that contains all the commands needed to assemble an image. For our order_service, the Dockerfile looks like this:
FROM custom-python-flyway:latest
WORKDIR /app
COPY . /app
RUN pip install --upgrade pip \
    && pip install poetry \
    && poetry config virtualenvs.create false \
    && poetry install --no-dev
EXPOSE 80
CMD ["uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "80"]
This Dockerfile does a few things:
- Uses a custom Python image with Flyway included.
- Sets the working directory in the container to /app.
- Copies the current directory content into the container.
- Installs necessary Python packages and dependencies.
- Exposes port 80 for the application.
- Specifies the command to run when the container starts.
Crafting a Custom Docker Image: Integrating Flyway with Python
While Docker Hub offers a plethora of pre-built images for almost every use case, sometimes, the need arises to craft a custom image tailored to specific requirements. For our e-commerce platform, integrating database migrations into the service startup was one such requirement. Enter Flyway — a database migration tool.
Why a Custom Image?
When services start, it’s crucial that the database schema is in the expected state. Flyway ensures this by applying version-controlled scripts to the database, bringing it to the desired state. But how do we ensure Flyway runs before our service starts? The answer is a custom Docker image that has both Python (for our service) and Flyway (for migrations).
Building the Custom Image
In the realtime-order-processing directory, we have a Dockerfile dedicated to this task:
# Dockerfile for custom base image
FROM python:3.9-slim
RUN apt-get update \
    && apt-get install -y wget tar \
    && pip install --upgrade pip
RUN wget -O flyway.tar.gz https://repo1.maven.org/maven2/org/flywaydb/flyway-commandline/7.15.0/flyway-commandline-7.15.0-linux-x64.tar.gz \
    && tar -xzf flyway.tar.gz -C /opt \
    && ln -s /opt/flyway-7.15.0/flyway /usr/local/bin
Here’s a breakdown of what’s happening:
- We start with the python:3.9-slim image, a lightweight Python image.
- We update the package list and install wget (to download files) and tar (to extract archives).
- We download the Flyway command-line tool and extract it to /opt.
- A symbolic link is created so the flyway command is available globally in the container.
With this custom image, every time a service container starts, it has both Python and Flyway at its disposal, ensuring smooth service initialization and database migrations. Build and tag it once (e.g. docker build -t custom-python-flyway:latest .) so the service Dockerfiles can reference it as their base image.
docker-compose: Orchestrating Multi-container Applications
While Docker is fantastic for containerizing a single application, in the real world, applications often interact with other services. Enter docker-compose, a tool to define and run multi-container Docker applications. For our e-commerce platform, the docker-compose.yml in the realtime-order-processing directory defines the orchestration:
version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper:3.4.6
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:latest
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9093,OUTSIDE://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9093,OUTSIDE://0.0.0.0:9092
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    depends_on:
      - zookeeper
  order_postgres:
    image: postgres:latest
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: orderdb
    ports:
      - "5433:5432"
    volumes:
      - order_postgres_data:/var/lib/postgresql/data
  order_service:
    build: ./services/order_service
    environment:
      DATABASE_URL: postgresql://user:password@order_postgres:5432/orderdb
    ports:
      - "80:80"
    depends_on:
      - kafka
      - order_postgres
volumes:
  order_postgres_data:
This file describes:
- How services like Kafka, Zookeeper, and PostgreSQL are set up.
- The dependencies between these services.
- The environment variables and volumes required for persistence and configuration.
With docker-compose up, all these services spring to life, communicating and working together, just as described.
In essence, Docker and docker-compose ensure that our application and its services are not just isolated and consistent but also orchestrated in harmony, making deployment a breeze.
Building a RESTful API with FastAPI and Storing Data with SQLAlchemy
In the modern world of web development, crafting a RESTful API is more of a necessity than a luxury. For our e-commerce platform, we needed a robust, fast, and intuitive framework. Enter FastAPI — a modern, fast (high-performance) web framework for building APIs with Python based on standard Python type hints.
Why FastAPI?
FastAPI is not just another framework; it’s a game-changer. Here’s why:
- Type Hints: With Python’s type hints, FastAPI allows for intuitive function signatures, ensuring that the data you receive and send is of the expected type.
- Automatic API Docs: Forget manually updating Swagger or ReDoc. FastAPI generates it for you.
- Performance: Comparable to NodeJS and Go, it’s one of the fastest frameworks for Python, especially with I/O bound operations.
Storing Data with SQLAlchemy
While FastAPI handles the web part, we needed an ORM (Object Relational Mapper) to interact with our database. SQLAlchemy, a popular Python library, was our choice. It provides a set of high-level API to connect to relational databases. Using SQLAlchemy, we can model our tables as classes and query them without writing raw SQL.
Here’s a snippet from our Order model:
from sqlalchemy import Column, Integer, String, Float
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Order(Base):
    __tablename__ = 'orders'

    id = Column(Integer, primary_key=True)
    product_name = Column(String)
    quantity = Column(Integer)
    price = Column(Float)
    status = Column(String, default='Pending')
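As a quick, standalone illustration (not part of the service code), the model above can be exercised against an in-memory SQLite database. Note that declarative_base and Session are imported from sqlalchemy.orm here, which also works on SQLAlchemy 2.0:

```python
from sqlalchemy import create_engine, Column, Integer, String, Float
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Order(Base):
    __tablename__ = 'orders'

    id = Column(Integer, primary_key=True)
    product_name = Column(String)
    quantity = Column(Integer)
    price = Column(Float)
    status = Column(String, default='Pending')

# In-memory SQLite keeps the demo self-contained; the real service uses PostgreSQL.
engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Order(product_name='widget', quantity=3, price=9.99))
    session.commit()
    stored = session.query(Order).filter_by(product_name='widget').first()
    status = stored.status      # column default applied on insert
    quantity = stored.quantity
```

The same pattern carries over to PostgreSQL unchanged; only the engine URL differs.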
Pydantic and Data Validation
While SQLAlchemy models represent tables in our database, we needed a way to validate and serialize data for our API. Pydantic, a data validation and serialization library, came to the rescue. It uses Python type annotations to validate data and convert them into Python data types.
For instance, when creating an order, we use a Pydantic model to ensure the data is valid:
from pydantic import BaseModel

class OrderRequest(BaseModel):
    product_name: str
    quantity: int
    price: float
This ensures that whenever an order is placed, the data adheres to the structure and type we expect.
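To see that validation in action (a standalone sketch, not service code), well-formed data parses into typed attributes, while a non-numeric quantity raises a ValidationError before the request gets anywhere near the database:

```python
from pydantic import BaseModel, ValidationError

class OrderRequest(BaseModel):
    product_name: str
    quantity: int
    price: float

# Well-formed data parses cleanly.
order = OrderRequest(product_name='widget', quantity=3, price=9.99)

# Malformed data is rejected up front.
try:
    OrderRequest(product_name='widget', quantity='three', price=9.99)
    rejected = False
except ValidationError:
    rejected = True
```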
Bringing It All Together
With FastAPI, SQLAlchemy, and Pydantic, we crafted endpoints to create and retrieve orders. Here’s a glimpse:
from fastapi import APIRouter
from .schemas import OrderResponse, OrderRequest
from . import db_service

router = APIRouter()

@router.post("/order", response_model=OrderResponse)
def create_order(order: OrderRequest):
    new_order = db_service.create_order_in_db(order.dict())
    return new_order
In this section, we’ve seen how FastAPI, combined with SQLAlchemy and Pydantic, provides a seamless experience in crafting robust and efficient web services. The trio ensures that from data validation to storage, everything is smooth and type-safe.
Publishing Events to Kafka
In a microservices architecture, communication between services is crucial. While RESTful APIs are one way to achieve this, event-driven architectures offer a more scalable and decoupled approach. Apache Kafka, a distributed event streaming platform, is a perfect fit for such scenarios.
Why Kafka?
Kafka is not just a messaging system; it’s a full-fledged event streaming platform. Here’s why we chose Kafka:
- Scalability: Kafka can handle millions of events per second, making it suitable for high-throughput systems.
- Durability: Data in Kafka is persisted, ensuring no loss of events.
- Real-time Processing: Kafka allows for real-time processing of events, making it perfect for our real-time order processing system.
Setting Up Kafka with Docker
To avoid the hassle of setting up Kafka manually, we used Docker. Here’s a snippet from our docker-compose.yml:
services:
  kafka:
    image: wurstmeister/kafka:latest
    ports:
      - "9092:9092"
    environment:
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9093,OUTSIDE://0.0.0.0:9092
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper
Publishing Events to Kafka from FastAPI
With Kafka set up, the next step was to integrate it into our FastAPI application. We used the kafka-python library to produce events to Kafka.
Here’s how we set up our Kafka producer:
from kafka import KafkaProducer
import json

producer = KafkaProducer(
    bootstrap_servers='kafka:9093',
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)
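The value_serializer is the only transformation an event goes through before hitting the wire. In isolation (a stdlib-only sketch, no broker needed), it simply turns a dict into UTF-8 JSON bytes:

```python
import json

# The same serializer passed to KafkaProducer, written as a named function.
def serialize(value):
    return json.dumps(value).encode('utf-8')

payload = serialize({'product_name': 'widget', 'quantity': 3})
# payload is now a bytes object ready to be sent to a Kafka topic
```

The consumer side must apply the mirror-image decode, which is exactly what the Inventory Service will do later.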
And here’s how we published an order event after storing it in the database:
from . import kafka_service

@router.post("/order", response_model=OrderResponse)
def create_order(order: OrderRequest):
    new_order = db_service.create_order_in_db(order.dict())
    kafka_service.send_order_event(new_order.dict())
    return new_order
In the kafka_service module, the send_order_event function pushes the order data to a Kafka topic:
def send_order_event(order_data):
    producer.send('order_topic', order_data)
With Kafka integrated into our FastAPI application, we’ve set the stage for real-time order processing. As orders come in, they’re not only stored in the database but also broadcasted as events, ready to be consumed by any service interested in them, like our Inventory Service. This decoupled architecture ensures that our system is scalable and resilient.
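The decoupling can be sketched without any infrastructure at all. Below, a plain queue.Queue stands in for the Kafka topic, one function plays the Order Service's producer and another plays the Inventory Service's consumer (all names here are illustrative, not from the repository):

```python
import json
import queue

# Stand-in for the 'order_topic' Kafka topic.
order_topic = queue.Queue()

# Inventory snapshot the consumer will update.
inventory = {'widget': 10}

def publish_order_event(order):
    """Order Service side: serialize the event and put it on the topic."""
    order_topic.put(json.dumps(order).encode('utf-8'))

def consume_order_events():
    """Inventory Service side: drain the topic and decrement stock."""
    while not order_topic.empty():
        event = json.loads(order_topic.get().decode('utf-8'))
        inventory[event['product_name']] -= event['quantity']

publish_order_event({'product_name': 'widget', 'quantity': 3})
publish_order_event({'product_name': 'widget', 'quantity': 2})
consume_order_events()
```

Neither side knows about the other; both only know the topic and the event format, which is what lets us swap the queue for a real Kafka broker.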
Running the Order Service Locally with Docker Compose
With everything set up, running the service is as simple as:
docker-compose up order_service
This command will:
- Build the order_service image (if not already built).
- Start the required dependencies (kafka and order_postgres).
- Run the order_service.
You can now access the service API at http://localhost:80/.
With Docker Compose, we’ve streamlined the process of running our order_service and its dependencies. This ensures that developers can easily replicate the production environment on their local machines, leading to fewer "it works on my machine" issues.
OpenAPI Documentation for Order Service
One of the standout features of FastAPI is its automatic generation of OpenAPI documentation. This provides an interactive API documentation interface, allowing developers and users to understand and test the API endpoints without any additional tools.
To access the OpenAPI documentation for the order_service, simply navigate to:
http://localhost:80/docs
Here, you’ll find a detailed and interactive list of all the available endpoints, their expected input parameters, and the responses they return. This not only aids in understanding the service’s capabilities but also facilitates easy testing and debugging.
Inventory Service: Keeping Track of Stock Levels
The inventory_service plays a pivotal role in our e-commerce platform. As orders are placed, it's crucial that inventory levels are updated in real time. This service not only processes order events from Kafka to update the inventory but also exposes a REST API to check the current inventory records.
Processing Order Events and Updating Inventory
Once an order event is received from Kafka, the inventory_service processes it to update the inventory table in the database. Here's a simplified version of how this is achieved:
from kafka import KafkaConsumer
import json
from . import db_service

consumer = KafkaConsumer(
    'order_topic',
    bootstrap_servers='kafka:9093',
    value_deserializer=lambda m: json.loads(m.decode('utf-8'))  # events arrive as JSON bytes
)

for message in consumer:
    order_data = message.value
    db_service.update_inventory(order_data)
In the db_service, the update_inventory function might look something like this:
from .models import Inventory
from .database import Session  # assumes a session factory defined alongside the models

def update_inventory(order_data):
    session = Session()
    inventory_record = session.query(Inventory).filter_by(product_name=order_data['product_name']).first()
    if inventory_record:
        inventory_record.quantity -= order_data['quantity']
        session.commit()
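One thing the snippet above glosses over is overselling: nothing stops the quantity from going negative. A stdlib-only sketch of the same update with that guard (the dict stands in for the database table; the names are illustrative, not from the repository):

```python
# Toy inventory "table": product name -> record.
inventory = {'widget': {'quantity': 5}}

def update_inventory(order_data):
    """Decrement stock, refusing orders that would drive it negative."""
    record = inventory.get(order_data['product_name'])
    if record is None:
        return False  # unknown product
    if record['quantity'] < order_data['quantity']:
        return False  # insufficient stock: reject rather than go negative
    record['quantity'] -= order_data['quantity']
    return True

ok = update_inventory({'product_name': 'widget', 'quantity': 3})
oversell = update_inventory({'product_name': 'widget', 'quantity': 10})
```

In the real service, the equivalent check would live inside the database transaction so that concurrent orders can't both pass it.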
Exposing a REST API to Check Inventory
To allow users or other services to check the current inventory levels, a REST API endpoint is exposed:
from fastapi import FastAPI
from . import db_service

app = FastAPI()

@app.get("/inventory/{product_name}")
def get_inventory(product_name: str):
    return db_service.get_inventory(product_name)
In the db_service, the get_inventory function might look like:
def get_inventory(product_name):
    session = Session()
    inventory_record = session.query(Inventory).filter_by(product_name=product_name).first()
    if inventory_record:
        return {"product_name": inventory_record.product_name, "quantity": inventory_record.quantity}
    else:
        return {"error": "Product not found"}
Docker Compose Configuration for Inventory Service
To ensure that the inventory_service runs smoothly with its dependencies, here's the docker-compose configuration specific to it:
...
services:
  inventory_postgres:
    image: postgres:latest
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: inventorydb
    ports:
      - "5434:5432"
    volumes:
      - inventory_postgres_data:/var/lib/postgresql/data
  inventory_service:
    build: ./services/inventory_service
    environment:
      DATABASE_URL: postgresql://user:password@inventory_postgres:5432/inventorydb
    ports:
      - "81:80"
    depends_on:
      - kafka
      - inventory_postgres
volumes:
  ...
  inventory_postgres_data:
With this setup, the inventory_service is equipped to handle real-time inventory updates as orders come in, ensuring that stock levels are always accurate and up to date.
Running the Inventory Service with Docker Compose
To run the inventory_service locally:
docker-compose up inventory_service
This will build the inventory_service Docker image, start the necessary dependencies, and run the service. You can access the service API at http://localhost:81/.
OpenAPI Documentation for Inventory Service
FastAPI’s automatic OpenAPI documentation generation is also available for the inventory_service. This interactive documentation provides a clear overview of all the available API endpoints, making it easier for developers and users to understand and test the service.
To access the OpenAPI documentation for the inventory_service, navigate to:
http://localhost:81/docs
This page presents a comprehensive list of all the API endpoints, detailing their expected input parameters, and the responses they provide. It’s an invaluable tool for understanding, testing, and debugging the service in real-time.
Summary:
In this guide, we’ve taken a deep dive into the Python ecosystem, leveraging its diverse tools to build a dynamic e-commerce platform:
- Poetry: Simplified our project management, ensuring consistent dependencies and a smooth development experience.
- FastAPI: Emerged as the hero for crafting efficient and robust web services. Its automatic OpenAPI documentation feature was a boon, providing clarity and interactivity to our API endpoints.
- SQLAlchemy: Enabled seamless interactions with our database, allowing us to model, query, and manage our data with ease.
- Kafka: Integrated seamlessly into our Python services, ensuring real-time event streaming and facilitating communication between our order and inventory services.
- Docker: Ensured consistent development, testing, and deployment, encapsulating our services and their dependencies into isolated containers.
By intertwining these technologies, we’ve showcased the prowess of Python in building scalable, efficient, and real-time systems. Whether you’re a Python enthusiast or exploring its capabilities, this guide underscores Python’s potential in modern e-commerce solutions.
Further Reading:
For those eager to delve deeper into the technologies and concepts discussed in this blog, I’ve curated a list of resources to guide your journey:
- FastAPI:
  - Official FastAPI Documentation
  - FastAPI Tutorial: Building a RESTful API
- SQLAlchemy:
  - SQLAlchemy Documentation
  - A Comprehensive SQLAlchemy Tutorial
- Kafka:
  - Apache Kafka Official Documentation
  - Kafka: The Definitive Guide
- Docker & Docker Compose:
  - Docker’s Official “Getting Started” Guide
  - Docker Compose Overview
- Poetry:
  - Poetry Official Documentation
  - An Introduction to Packaging with Poetry
- Pydantic:
  - Pydantic Documentation
  - Data validation and settings management using Python type annotations with Pydantic
Embark on these explorations and expand your horizons. Happy coding! 🚀