Dockerizing Django DRF Application

Abhijit Maity
Aug 19, 2023

Django is a very good Python framework for web development. It comes with all the required components when you bootstrap a project using its own CLI. In this article we will see how to write a Dockerfile to package a DRF (Django REST Framework) project, and also how to optimize the incremental build.

Suppose the project folder structure looks like the one below, with all the project source code inside src/.

src/
tests/
Dockerfile
requirements.txt

Below is an example Dockerfile for a DRF project.
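A minimal sketch along those lines is shown here; the label values and the ENTRYPOINT choice are illustrative (see the notes on ENTRYPOINT and CMD further down), and the statements match the walkthrough that follows.

FROM python:3.11-alpine

LABEL maintainer="Abhijit" email="email@abhijit.com"

# Do not buffer Python output; write it straight to the container logs
ENV PYTHONUNBUFFERED=1

# Install system dependencies with apk here if the project needs any
RUN echo "Install your system dependencies here if any!!!"

# Install Python dependencies first so this layer stays cached
# as long as requirements.txt does not change
COPY requirements.txt /requirements.txt
RUN pip install -r /requirements.txt

WORKDIR /app
COPY ./src /app

# Create a non-root user for runtime (alpine ships adduser, not useradd)
RUN adduser -D -s /bin/sh appuser
USER appuser

# Default to the python executable; CMD is left to container creation time
ENTRYPOINT ["python"]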

Now I will go through the Dockerfile statements and elaborate in detail.

  1. FROM python:3.11-alpine : The base docker image, which has Python preinstalled. The alpine variant is used because it is lightweight.
  2. LABEL maintainer="Abhijit" email="email@abhijit.com" : Labels you want to add to the image, in key=value form. You can add anything relevant to your project or organization here.
  3. ENV PYTHONUNBUFFERED=1 : This is necessary when running Python in a docker container, as it stops Python from buffering its output. Rather than buffering the output, it prints it directly to the container logs.
  4. RUN echo "Install your system dependencies here if any!!!" : If you have any system dependencies, install them here using the system package manager. On alpine you can use apk for this.
  5. COPY requirements.txt /requirements.txt : This statement copies the file requirements.txt into the image.
  6. RUN pip install -r /requirements.txt : This statement installs the Python dependencies while building the image.
  7. WORKDIR /app : This statement makes /app the default working directory.
  8. COPY ./src /app : This statement copies all the project-related files into the /app directory.
  9. RUN adduser -D -s /bin/sh appuser : This statement adds the user appuser to the image (alpine ships BusyBox's adduser rather than useradd), to be used at container runtime.
  10. USER appuser : This sets the user for container runtime.

You might be wondering why I added requirements.txt to the image before the project source code. This is the optimization I was talking about at the start of this write-up.

At docker build time, Docker checks each statement in the Dockerfile (and the files it copies) for changes. If something has changed, that layer and every layer after it are rebuilt by executing the statements again; otherwise they are taken from the cache created by the previous build. If you think about it, requirements.txt is not updated frequently, so there is no need to reinstall the dependencies when only the source code has changed.
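To make the cache boundary concrete, here is the relevant fragment of the Dockerfile above, annotated with what gets rebuilt when only the source code changes:

COPY requirements.txt /requirements.txt   # cached while requirements.txt is unchanged
RUN pip install -r /requirements.txt      # expensive step, reused from the cache

# a change in src/ only invalidates the layers from here onwards
WORKDIR /app
COPY ./src /app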

Now you can build the docker image by running the docker build command like below.

docker build -t mydjango-drf-project:1.0.0 .

Here is another specific thing I want to mention regarding the ENTRYPOINT and CMD statements in the Dockerfile. I have set the ENTRYPOINT to the python executable (the base image defaults to running python anyway), but you can override it, and I have left CMD out so that you can set it at container creation time.
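For example, with the ENTRYPOINT set to the python executable, whatever you pass at container creation time becomes the arguments to python. Assuming manage.py sits at the top of src/ (and therefore in /app), a quick local run could look like this; the port is just a placeholder:

docker run --rm -p 8000:8000 mydjango-drf-project:1.0.0 manage.py runserver 0.0.0.0:8000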

OK, so you have your docker image, which can be run locally or pushed to a docker repository for later use.
But do you think it is a good idea to run the application in production with the django runserver command, which watches for file changes and reloads? No, definitely not.

For production it is always recommended to use gunicorn or a similar WSGI server, which gives you more flexibility to configure for real-life scenarios. To add and use gunicorn, add the dependency to requirements.txt and run the gunicorn command in the CMD section or at container creation time, as in the sketch below.
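As a sketch, assuming your Django project module is called myproject (so the WSGI application lives at myproject.wsgi), it could look like this:

# requirements.txt
gunicorn

# at container creation time, swap the python entrypoint for gunicorn
docker run --rm -p 8000:8000 --entrypoint gunicorn mydjango-drf-project:1.0.0 \
    myproject.wsgi:application --bind 0.0.0.0:8000 --workers 3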

Well, now you have a setup and a docker image that can run on a production system as well, but there is one small drawback with gunicorn: it does not serve static files. You might notice that the DRF browsable API doc pages cannot load their CSS and JS files and do not render as expected. The workaround is simple, and you can achieve it with any HTTP server, such as the lightweight Python http.server.

Set STATIC_ROOT in settings.py, run collectstatic to gather the static files into the folder STATIC_ROOT points to, and then run the HTTP server against that folder (the port below is just an example).

# settings.py
STATIC_ROOT = BASE_DIR / "static"

# http.server ships with Python, so no extra install is needed
python manage.py collectstatic --noinput
python -m http.server 8080 --directory static/

In an upcoming write-up, I will describe a complete guide to writing a docker-compose file that includes all the usual dependencies, such as a database, a queue, and a cache, in the same config. You will just need to run docker compose up to start developing locally. Also, if possible, I will add some more details on using skaffold or garden, which let developers work directly against a Kubernetes platform.
