A Boilerplate for Self-Hosted Continuous Delivery of Django Apps (Part 2)
Django, Docker, GitHub Actions, Workflow, Self-Hosting, Custom Runner
Hi. In this article, I will write about a GitHub Actions workflow that I created to automate my deployment process on my own VPS. The workflow and project boilerplate we will use come from the boilerplate repo I created to handle these kinds of processes. I have also published an article about the repo; you can read part 1 of this story by following the link below.
GitHub Actions
We can automate our workflows by using GitHub Actions. We use them by creating a `.github/workflows` folder in our repositories. To save time, I will not deep-dive into essentials such as what a GitHub Action is or what a YAML file is.
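As a minimal sketch, setting up that folder looks like this (the file name `cd.yml` matches the one used later in this article):

```shell
# Workflow files live under .github/workflows at the repository root.
# Any *.yml file placed there is picked up by GitHub Actions.
mkdir -p .github/workflows
touch .github/workflows/cd.yml
ls .github/workflows
```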
Self-hosted CD Workflow
I generally use Hetzner to deploy my projects instead of GCP or AWS, because it is more wallet-friendly. In self-hosted workflows, the main idea is to create a runner that listens for the trigger events defined in the workflow.
cd.yml
In my boilerplate repo, there is a file named `cd.yml`. I defined my workflow in this file.
```yaml
name: Baysan Continuous Delivery

on:
  push:
    branches: [ "main" ]

jobs:
  deploy:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - name: Build Updated Docker Compose Environment
        run: docker-compose build
      - name: Stop Old Docker Compose Environment
        run: docker-compose down
      - name: Run Updated Docker Compose Environment
        run: docker-compose up -d
      - name: Remove Unused Docker Compose Environment
        run: |
          docker container prune -f
          docker image prune -f
```
Basically, this file does the steps below:
- Execute this workflow when something is pushed to the `main` branch.
- Define a job named `deploy`.
- Run the job on the `self-hosted` server.
- Get the latest version of the code by using `actions/checkout@v3`.
- On the server, execute `docker-compose build`.
- On the server, execute `docker-compose down`.
- On the server, execute `docker-compose up -d`.
- On the server, execute `docker container prune -f && docker image prune -f`.
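Condensed into one script, the server-side steps above look roughly like the sketch below. This is not part of the boilerplate; it defaults to a dry run that only prints the commands, so it can be inspected on a machine without Docker installed.

```shell
#!/bin/sh
# Sketch of the workflow's deploy steps as a single script.
# DRY_RUN defaults to 1 (print commands only); set DRY_RUN=0 on a
# real server to actually execute them.
DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"   # print the command instead of running it
  else
    "$@"          # execute the command as given
  fi
}

run docker-compose build       # rebuild images from the updated code
run docker-compose down        # stop the old environment
run docker-compose up -d       # start the new one in the background
run docker container prune -f  # clean up stopped containers
run docker image prune -f      # ...and dangling images
```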
The file is straightforward to understand. For `runs-on`, we could use one of the GitHub-hosted runner labels instead; however, I used `self-hosted` to run this workflow on my own server. Now everything is ready to use this workflow; we just need to set up a runner on our server to track it.
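For comparison, pointing the same job at a GitHub-hosted runner only changes the `runs-on` line (a sketch; the boilerplate itself keeps `self-hosted`):

```yaml
jobs:
  deploy:
    # GitHub-hosted alternative to "runs-on: self-hosted"
    runs-on: ubuntu-latest
```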
Creating a Runner
Under Settings, we find the Actions option. We use the Runners tab under Actions to create our runner. Then, by clicking the New self-hosted runner button, we will find everything we need.
When you follow the commands shown there, you will successfully create your own runner. You will probably get an error like `Must not be executed as ROOT` if you are the root user. You can use the command below to bypass this security check.
export RUNNER_ALLOW_RUNASROOT=1
However, there is a tricky point. You can start your runner by using the `run.sh` script, but when you close the terminal, the runner will stop. To deal with that, you should use the `svc.sh` script. You can get its usage instructions by executing `./svc.sh help`.
You can install the runner as a service on your server.
./svc.sh install
Then, you can start it by using the command below.
./svc.sh start
Also, you can check its health status.
./svc.sh status
You just need to use the `uninstall` command if you want to remove this runner.
./svc.sh uninstall
Now, you can see your workflow under the Actions section of your repo.
Finally
Hopefully, you enjoyed it. I had always wondered about this topic, and when I finally learned it, I was super happy, because I had dreamed of automating my projects by using just GitHub instead of AWS or GCP. You can visit the boilerplate repo I mentioned and used in this article by following the link below.
I also shared a video on my YouTube channel about how to use this repo and how to easily deploy projects on a real server. For now, it is only in Turkish. I believe that one day I will upload my videos in English :P
Kind regards