Running Django with PostgreSQL in Docker: A Step-by-Step Guide

Jonas Granlund
3 min read · Aug 26, 2023


In the previous post, we Dockerized a Django app and used a PostgreSQL image from Docker Hub. In this guide, we'll configure the Django app to connect to the PostgreSQL database. We'll explore two approaches: first, creating and configuring the Docker containers individually so they run on the same network; second, introducing docker-compose, a tool that makes defining and running multi-container environments much easier.

Check out the previous post if you need to create a Docker image for the Django app.

Let’s get started!

Step 1: Setting Up PostgreSQL in Django

First, update your Django application’s requirements.txt file to include the PostgreSQL driver for Python:

psycopg2-binary>=2.9

This package provides a stand-alone version of psycopg2 with pre-compiled binaries, which is convenient for development. For production use, the psycopg2 project recommends building the non-binary version from source instead.

Then, install the added requirement:

pip install -r requirements.txt

Adjust your Django settings. Open the myproject/settings.py file and modify the DATABASES section as follows:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'mypassword',
        'HOST': 'my-postgres',
        'PORT': '5432',
    }
}

Here, HOST points to my-postgres, the name of our PostgreSQL Docker container; Docker's embedded DNS resolves this name to the container's IP address inside our network.
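
If you'd rather not hard-code credentials in settings.py, one option (not part of the original setup) is to read them from environment variables, falling back to the values above. A minimal sketch; note that POSTGRES_HOST and POSTGRES_PORT are names I chose for illustration, while the first three match the official postgres image's variables:

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_DB', 'postgres'),
        'USER': os.environ.get('POSTGRES_USER', 'postgres'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', 'mypassword'),
        'HOST': os.environ.get('POSTGRES_HOST', 'my-postgres'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}

The values can then be passed with -e flags to docker run, or under environment: in docker-compose.yml.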

Approach 1: Docker Networking

Step 1: Creating a Custom Docker Network

First, let’s create a custom Docker network:

docker network create mynetwork

This command creates a new network named mynetwork using the bridge driver, Docker's default. Containers attached to the same user-defined bridge network can reach each other by container name.

To verify that the network was created, type:

docker network ls

Step 2: Start the PostgreSQL Container on the Custom Network

Start the PostgreSQL container and make sure it’s attached to the custom network:

docker run --name my-postgres -e POSTGRES_PASSWORD=mypassword -d --network=mynetwork postgres
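
To confirm that the container is attached to the network, you can inspect it; my-postgres should be listed under Containers:

docker network inspect mynetwork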

Step 3: Run the Django App on the Same Network

docker run -d -p 8000:8000 --network=mynetwork mydjangoapp
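
To verify that the app started correctly, check its logs (replace <container id> with the ID shown by docker ps):

docker logs <container id>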

Your Django app should now be able to reach the PostgreSQL database over the shared network.

Testing:

To run your Django migrations manually, find the Django app container's ID with docker ps, open a shell inside it, and run the migration commands:

docker ps
docker exec -it <container id> bash
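# the commands below run inside the container's shell, in the /app directory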
python manage.py makemigrations
python manage.py migrate
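
Alternatively, docker exec can run a command directly without an interactive shell, assuming the image's working directory is where manage.py lives:

docker exec -it <container id> python manage.py migrate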

You can easily test that everything is working by creating a superuser and navigating to the Django admin interface.

Create a superuser by running this command in the bash shell, from the /app directory inside the container:

python manage.py createsuperuser

Navigate to the Django admin interface at http://127.0.0.1:8000/admin and log in with the credentials of the superuser account you just created.

Approach 2: Using docker-compose

If you are using Ubuntu, install docker-compose by running:

sudo apt install docker-compose

Docker-compose Setup:

This method utilizes the docker-compose.yml file, simplifying the process of managing multi-container applications.

In your project directory, create a docker-compose.yml file. This will define and run our application’s services.

Here’s a basic example to get you started:

version: '3'

services:
  web:
    build: .
    command: ["python", "manage.py", "runserver", "0.0.0.0:8000"]
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    depends_on:
      - my-postgres

  my-postgres:
    image: postgres
    environment:
      POSTGRES_PASSWORD: mypassword
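
One caveat worth knowing: depends_on only controls start order; it does not wait until PostgreSQL is actually ready to accept connections, so the web service may fail to connect on the first attempt during a cold start. A common remedy (a sketch, not part of the original setup) is to give the database service a healthcheck:

  my-postgres:
    image: postgres
    environment:
      POSTGRES_PASSWORD: mypassword
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 5

Newer Compose versions also let the web service declare depends_on with condition: service_healthy, so it waits for the check to pass before starting.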

With docker-compose, you can start your services using a single command:

docker-compose up
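
To run the services in the background instead, add the -d flag, and stop and remove them again with down:

docker-compose up -d
docker-compose down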

Testing:

Just as before, go to http://127.0.0.1:8000/. The Django application should run and connect to PostgreSQL seamlessly.

You can easily verify that it's working by following the steps described under "Testing" in Approach 1.
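
With docker-compose, you can also address containers by service name instead of looking up container IDs, for example:

docker-compose exec web python manage.py migrate
docker-compose exec web python manage.py createsuperuser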

Conclusion:

In this guide, we explored two approaches to the same goal. The first introduced Docker networking: even with just two containers, it hints at the challenges of wiring up dozens of containers individually. The second approach, docker-compose, is designed exactly for that problem and makes multi-container setups far simpler to define and run.

Stay tuned for our next post where we’ll continue exploring tools for handling and orchestrating our Django and PostgreSQL containers.

I hope you learned something! Please leave a comment or reaction below; I'm excited to hear your thoughts or about any challenges with this kind of setup.

Check out the next part: Understanding Docker Volumes and Persistent Storage
