Python With Docker | FastAPI [Part-1]

Simplify your Python web development with FastAPI and Docker

Ahmed Nafies
Oct 22, 2020

Intro

In any web development project you usually have many components: a database (or several), a cronjob, a backend, a frontend, and so on. Setting up databases takes time, and a change of plan, say moving from Postgres to MariaDB, would require uninstalling Postgres and installing MariaDB. Now suppose you are using macOS, your colleagues are on Windows, and the production environment is Linux based: you have to make sure your whole setup works on all three operating systems. This is where Docker comes into play and saves us all.

According to the Docker documentation

Docker is an open platform for developing, shipping, and running applications. Docker enables you to separate your applications from your infrastructure so you can deliver software quickly. With Docker, you can manage your infrastructure in the same ways you manage your applications. By taking advantage of Docker’s methodologies for shipping, testing, and deploying code quickly, you can significantly reduce the delay between writing code and running it in production.

Think of a container like a glass food container: it can go in the oven, the microwave, or the fridge, or you can put it on the table and serve right from it. That is exactly how Docker containers work. The food in this analogy is your Python code, which ships inside the container with all of its ingredients: packages, dependencies, requirements. As long as the system can run Docker, you do not need to install anything else, not even Python.

For more info about Docker containers, check the documentation. Before we learn how to make Docker images, we first need to install Docker.

To verify that Docker is installed, let’s check the Docker version

$ docker -v

A Docker image is a template for creating a container; it has all the properties and functions of the container. An image is like a class in Python, where a container would be an instance of that class.

class Image(object):
    name: str

container_1 = Image(name="container_1")
container_2 = Image(name="container_2")

Images can be pulled from any remote repository, but the default is Docker Hub. Think of Docker Hub as PyPI for Docker: when you use pip it fetches packages from PyPI, and when you use docker it fetches images from Docker Hub.
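For example, fetching the official Postgres image works much like installing a package with pip:

$ docker pull postgres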

Let’s say we need a PostgreSQL database; we can have it with just one command

$ docker run \
--rm \
--name postgres \
-p 5432:5432 \
-e POSTGRES_USER=postgres \
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_DB=postgres \
-d postgres

And just like that we have a Postgres database running in its own Docker container. Magic! In the command above, --rm removes the container once it stops, --name gives it a name, -p publishes port 5432 to the host, the -e flags configure the default user, password and database, and -d runs the container in the background.

Let’s test our new database

$ psql -h localhost -U postgres -d postgres -W

The user’s password is postgres
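If you would rather check the connection from Python than from psql, a minimal sketch (assuming the psycopg2-binary package is installed) could look like this:

import psycopg2

# Connect to the Postgres container published on localhost:5432
conn = psycopg2.connect(
    host="localhost",
    port=5432,
    user="postgres",
    password="postgres",
    dbname="postgres",
)
with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])  # prints the PostgreSQL server version
conn.close()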

Now let’s create our first containerised Python application. In this tutorial we will use FastAPI, but you are of course welcome to use the web framework of your choice.

To create a simple FastAPI app you can follow this tutorial or clone this repo, and you should see the following in your directory

  1. main.py
  2. Pipfile
  3. Pipfile.lock
  4. README.md

Your main.py should look like this

import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel


class User(BaseModel):
    first_name: str
    last_name: str = None
    age: int


app = FastAPI()


@app.post("/user/", response_model=User)
async def create_user(user: User):
    return user


if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)

To install the packages, you can just use pipenv

$ pipenv install

or, if you use pip, you can copy the following into your requirements.txt file

pyhumps
fastapi
uvicorn
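and install them with

$ pip install -r requirements.txt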

Activate your virtual environment and run

$ python main.py

and go to http://localhost:8000/docs
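Besides the interactive docs, you can exercise the endpoint directly; a quick check with curl (the payload values below are just illustrative) might look like this:

$ curl -X POST http://localhost:8000/user/ \
    -H "Content-Type: application/json" \
    -d '{"first_name": "John", "last_name": "Doe", "age": 30}'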

Now we can see that our FastAPI application is running correctly. Let’s add the Docker configuration.

Create a file called Dockerfile in your directory.

# Pull base image
FROM python:3.7

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

WORKDIR /code/

# Install dependencies
RUN pip install pipenv
COPY Pipfile Pipfile.lock /code/
RUN pipenv install --system --dev

COPY . /code/

EXPOSE 8000

CMD ["python", "main.py"]

This file tells Docker what to do when we build the Docker image, which we will then use to run the container. The beauty of Docker is that we can create images from other images.

There is already an image with Python 3.7 installed in it, and all we need to do is pull it.

# Pull base image
FROM python:3.7

Then we set some Python environment variables

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

We set our working directory inside the container

WORKDIR /code/

Then we copy our package manager files and install the dependencies

# Install dependencies
RUN pip install pipenv
COPY Pipfile Pipfile.lock /code/
RUN pipenv install --system --dev

We copy our directory from our machine to the container

COPY . /code/

Then we expose the HTTP port and run our application from within the container

EXPOSE 8000
CMD ["python", "main.py"]

Let’s build our image

$ docker build -t fastapi_example .

The above command uses the Dockerfile to create an image named fastapi_example; we will need this name in order to run the container later.

If you want to check the created image you can use

$ docker images

Let’s run the container

$ docker run --name app -p 8000:8000 fastapi_example

You can verify that everything works by going to http://localhost:8000/docs
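If the page does not load, you can inspect the container’s output (uvicorn’s startup log) with

$ docker logs app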

We can also check the created containers

$ docker container ls -a

We will now enhance our main.py to work with the database. Note that this version also uses SQLAlchemy, so sqlalchemy and a Postgres driver such as psycopg2-binary need to be installed as well.

import uvicorn
from fastapi import FastAPI, Depends
from pydantic import BaseModel
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, Session

SQLALCHEMY_DATABASE_URL = "postgresql://postgres:postgres@localhost/postgres"

engine = create_engine(SQLALCHEMY_DATABASE_URL)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()


# SQLAlchemy model for the users table
class UserModel(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True, index=True)
    first_name = Column(String, unique=True, index=True)
    last_name = Column(String)
    age = Column(Integer)


Base.metadata.create_all(bind=engine)


# Dependency that provides a database session per request
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


app = FastAPI()


# Pydantic schema used for request and response validation
class UserSchema(BaseModel):
    first_name: str
    last_name: str = None
    age: int

    class Config:
        orm_mode = True


@app.post("/user/", response_model=UserSchema)
async def create_user(user: UserSchema, db: Session = Depends(get_db)):
    _user = UserModel(
        first_name=user.first_name, last_name=user.last_name, age=user.age
    )
    db.add(_user)
    db.commit()
    db.refresh(_user)
    return _user


@app.get("/user/", response_model=UserSchema)
async def get_user(first_name: str, db: Session = Depends(get_db)):
    _user = db.query(UserModel).filter_by(first_name=first_name).first()
    return _user


if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)

Let’s run

$ python main.py

and test our connection to the database. We will create a user and then get the same user back by first_name.
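Repeating the POST request from before will now persist the user; we can then fetch it back by first_name:

$ curl "http://localhost:8000/user/?first_name=John"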

Right now our code is running on our machine while the database is running in a Docker container. In order to run both in containers and connect them, we need to create a network.

$ docker network create mynet
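You can confirm the network was created (and later see which containers have joined it) with

$ docker network ls
$ docker network inspect mynet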

Next we should run the containers in the mynet network, but before that we should stop the postgres container

$ docker container stop postgres

And then we run it in the mynet network

$  docker run \
--rm \
--name postgres \
--net=mynet \
-p 5432:5432 \
-e POSTGRES_USER=postgres \
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_DB=postgres \
-d postgres

Then we run the app in the same network

$  docker run --rm --name app --net=mynet -p 8000:8000 fastapi_example
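One caveat worth flagging: inside the mynet network, localhost in the app container points at the app container itself, not at Postgres. For the app to reach the database, the connection string in main.py needs to use the Postgres container’s name as the host, and the image has to be rebuilt afterwards. A minimal sketch of the change:

# Inside the Docker network, Postgres is reachable by its container name
SQLALCHEMY_DATABASE_URL = "postgresql://postgres:postgres@postgres/postgres"

Docker’s embedded DNS resolves container names on user-defined networks, which is why postgres works as a hostname here.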

In this tutorial we have covered how to pull images and containerise our own apps, and we have managed to run and connect multiple Docker containers. However, every time we have a new app we will have to build it, run each app separately, and make sure that all the apps are in the correct network. Moreover, some apps depend on others; for example, our FastAPI app depends on the PostgreSQL database, and if we try to run the app before the database is ready, it will fail. This is where docker-compose shines. We will cover docker-compose in Part 2 of this article.
