Dockerizing a Django Project With a PostgreSQL Database

Afraz Je
5 min read · Jun 14, 2024

Containers.

These are used in shipping, right? You know, the ones with MAERSK logos on them?

What do containers have to do with code?

Containerization is an important part of code orchestration: it ensures that every developer and every production environment gets the exact same build of the code with the exact same dependencies.

To be honest, I was FRIGHTENED by the thought of dockerizing my projects (I still am).

The reason is that Docker can run into soooo many errors. That's why, today, I am going to walk you through how to Dockerize your Django app using a PostgreSQL database.

Note: This walkthrough assumes you have installed PostgreSQL on your system and created a virtual environment in Python. If not, please refer to https://www.postgresql.org/download/ for the PostgreSQL download and https://code.visualstudio.com/docs/python/environments for virtual environment management.
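For the impatient, here is a minimal sketch of creating and activating a virtual environment (the venv directory name is just a convention):

python -m venv venv
source venv/bin/activate   # macOS/Linux
# on Windows (PowerShell): .\venv\Scripts\Activate.ps1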

Step 1: Creating a Boilerplate Django App

Let’s create a Django app first.

By this point, the virtual environment should be activated in your VS Code (or other favorite editor) terminal.

Hit up:

pip install django

Then, when it is done:

django-admin startproject yourappname

This creates a scaffold of the app in the working directory.

By now, your project should appear as follows:

Basic Boilerplate Django Project Structure.
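For reference, django-admin startproject yourappname produces the standard scaffold below:

yourappname/
├── manage.py
└── yourappname/
    ├── __init__.py
    ├── settings.py
    ├── urls.py
    ├── asgi.py
    └── wsgi.py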

Step 2: Securing Credentials

First things first: secure your credentials. Never, ever hardcode your credentials into the app!

First, let’s install the PostgreSQL adapter for the Django ORM. We are also going to need `python-dotenv`, which is a library for loading environment variables. (If psycopg2 fails to build on your machine, psycopg2-binary is a drop-in alternative for development.)

pip install psycopg2 python-dotenv

Then, we are going to open up the settings.py file in the project package. Scroll down to where this block of code exists:

DATABASES = {
    'default': {
        ......
    }
}

and replace it with this block of code:

from dotenv import load_dotenv
import os

load_dotenv()

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.getenv('POSTGRES_DB', 'test'),
        'USER': os.getenv('POSTGRES_USER', 'postgres'),
        'PASSWORD': os.getenv('POSTGRES_PWD'),
        'HOST': os.getenv('POSTGRES_HOST', 'localhost'),
        'PORT': os.getenv('POSTGRES_PORT', '5432'),
    }
}

I’ll explain what this means in just a minute.

Now, at the root of the project, at the same level as the manage.py file, create a .env file and populate it with these values. (I have purposefully left out the POSTGRES_HOST variable; I will add it later in the walkthrough.)

POSTGRES_USER=<your value>
POSTGRES_PWD=<your value>
POSTGRES_PORT=<your value>
POSTGRES_DB=<your value>

Phew! That was a lot of work. Let me explain what we just did here.

First, we removed the hardcoded sqlite3 configuration of our application and told it to use the psycopg2 adapter (and, through it, the PostgreSQL database).

Security-wise, we then told our application to read its credentials (POSTGRES_USER, POSTGRES_PWD, …) from the .env file.

(It goes without saying that the .env file should never be committed to version control!)
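If you are using Git, a one-line .gitignore entry keeps the secrets out of the repository:

# .gitignore
.env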

Go to the terminal at the manage.py level and hit up:

python manage.py runserver

You should see the default Django welcome page at localhost:8000 in the browser.

That’s it for the Django application setup.

Step 3: Dockerfile

Perhaps the most important part of container orchestration is writing the Dockerfile.

The Dockerfile tells the Docker Engine (https://www.docker.com/products/docker-desktop/) how to build a single container image: which base image to start from, which files to copy in, which ports to expose from its otherwise isolated environment, and many other details as well.
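One thing before we write it: the Dockerfile below copies a requirements.txt file into the image, and we have not created one yet. Generate it from your activated virtual environment:

pip freeze > requirements.txt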

At the manage.py level, create a ‘Dockerfile’ (no extension) and paste the following code into it:

FROM python:3.11.9

WORKDIR /app

COPY ./requirements.txt .

RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000

Location of Dockerfile

Also, as shown above, right beside the Dockerfile, create the docker-compose.yml file. We’ll explain what this file does later.

Basically, the Dockerfile told the Docker Engine to start from a Python 3.11 base image, copy the requirements.txt file into the WORKDIR, upgrade pip and install all the dependencies, then copy the whole Django project into the WORKDIR and expose port 8000.
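If you want to sanity-check the Dockerfile on its own before wiring up Compose, you can build the image directly (the yourappname tag here is just an illustrative name):

docker build -t yourappname .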

Step 4: docker-compose.yml

This is where we ‘compose’ our Django application and the associated PostgreSQL service into a single multi-container app.

Let me walk you through the terminology first:

Service:

A service is an isolated instance of an application. A service may be dependent on other services.

Build:

A build is the process of turning a Dockerfile and its context into a runnable, error-free image (runtime exceptions are, well, excepted).

Now, let us build the docker-compose.yml file.

version: '3.3'
services:
  db: # Service name
    image: postgres:16.3-bullseye # Base PostgreSQL image
    container_name: db # Container name
    restart: always
    env_file:
      - ./.env
    environment: # These are the credentials from the .env file
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PWD}
      - POSTGRES_DB=${POSTGRES_DB}
    ports: # HOST:CONTAINER mapping so the host can reach the Postgres instance
      - '5437:5432'

  backend: # Django application service
    build:
      context: .
      dockerfile: Dockerfile
    container_name: backend
    command: sh -c "python3 manage.py makemigrations && python3 manage.py migrate --noinput && python3 manage.py collectstatic --noinput && python3 manage.py runserver 0.0.0.0:8000"
    restart: always
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    env_file:
      - .env
    depends_on:
      - db

You can explore what these specific terms mean in tutorials and the official documentation online. The crux is that we have set up a PostgreSQL instance from a Docker Hub image as the db service, and pointed our Django application at it.

Now, do you remember that POSTGRES_HOST variable that I did not populate in the .env file? (Seriously guys, you need to work on your memory. Just kidding!)

It’s time we put it in the .env file and gave it a value:

POSTGRES_HOST=db
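Your complete .env file should now look like this (with your own values filled in):

POSTGRES_USER=<your value>
POSTGRES_PWD=<your value>
POSTGRES_PORT=<your value>
POSTGRES_DB=<your value>
POSTGRES_HOST=db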

This step is very, very important. Docker runs our services in an isolated environment, with its own networking structure and all. Inside that network, each Compose service is reachable by its service name, so from the Django container’s point of view the PostgreSQL database is not at localhost; it lives at the hostname db, which is exactly what we named our database service in docker-compose.yml.

Our Django application also declares depends_on: db, so Compose starts the database container first. (Note that depends_on only controls start order, not readiness; if the first migration races a still-booting Postgres, restart: always will retry the backend. Race conditions, fellas.)

So the solution is the environment variable we just set: POSTGRES_HOST=db makes the settings.py we wrote earlier resolve the database host through Docker’s internal DNS, and everything connects.

I just rambled through a lot of technical stuff! Once you guys do this, you’ll understand what I’m talking about.
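Now, from the directory containing docker-compose.yml, build and start both services (older Docker installs ship the standalone docker-compose binary instead of the compose plugin):

docker compose up --build
# or, on older installs:
# docker-compose up --build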

If you hit up localhost:8000, you should see the default Django welcome page again, this time served from inside the container.

And if you open up the pgAdmin 4 tool for Postgres and connect to localhost:5437 (the host port we mapped to the container’s 5432), we can see:

Migrations for Django Applied
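If you prefer the command line to pgAdmin, you can verify the same thing with psql (assuming it is installed on your host; substitute your own credentials):

psql -h localhost -p 5437 -U <your value> -d <your value>
# at the psql prompt, run \dt to list the tables Django created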

Ta da!

That marks the end of shipping containers! And if you’ve made it this far, consider following me on LinkedIn:

https://linkedin.com/in/afraz-butt/
