Continuous Integration for Python 3 in Gitlab

Conor Flynn · Published in Metro Platform · Jul 4, 2018

Gitlab is the new cool kid. It’s open source, self-hostable and, as a new unique selling point, not owned by Microsoft (on the off-chance you hadn’t heard yet). People have been switching from Github in droves recently; the tooling for importing a repository from Github has been perfected, and the tens of thousands of recent moves are a testament to that.

It’s easy to see why. Gitlab is a fully integrated agile development environment: you can manage a board and milestones, conduct code reviews and, most relevant to this article, easily run your continuous integration tests. When I recently set up continuous integration with coverage reporting for Metro on Gitlab, it initially seemed quite intuitive. However, as the setup progressed, I ran into some undocumented issues.

To save you from that hardship, I’m going to condense my experience of setting it up into a template here, so you can get your CI set up painlessly.

Our testing is always this successful. Promise…

Given that you have a Python 3 project and a command line command to run your tests, you are almost good to go. You will be creating and modifying the .gitlab-ci.yml file in the root of your repository to configure your testing environment. There are five main sections covered here, but you can find the full list of keys, with in-depth explanations, in Gitlab’s documentation for the .gitlab-ci.yml configuration file.

If you’re reading this, you are clearly technically capable and more than likely just looking for a file to copy, paste and modify, so let’s get into the details by stepping through a commented, stripped-down version of our .gitlab-ci.yml file:

# Register any environment variables you need here.
variables:
  ENV_VAR_1: "I'm an environment variable"
  ENV_VAR_2: "You get the picture"

  # These are magic variables. They will be used by Postgres/Docker to do our
  # initial database setup. The keys must be named as follows. You can change
  # the values as you see fit.
  POSTGRES_DB: 'database_name'
  POSTGRES_USER: 'postgres'
  POSTGRES_PASSWORD: 'postgres'

# This lets the Gitlab runner know we want a Docker image of the latest
# Postgres version to be installed.
services:
  - postgres:latest

# These paths will be cached in between test runs. Very useful for reducing
# the build time.
cache:
  paths:
    # This took some trial and error to figure out.
    - ~/.cache/pip/

# This runs before the unit_tests task and should do any miscellaneous setup
# that's needed.
before_script:
  - pip install -r requirements.txt

# The main task to run your tests.
unit_tests:
  # This imports the Docker image for Python. Useful so you don't have to do
  # any Python setup manually.
  image: python:3.6
  # These are environment variables you can specify local to the unit_tests task.
  variables:
    # If you changed the POSTGRES_* values in the global environment variables
    # section above, you will also need to modify this string. (A sketch of how
    # a settings module can consume it follows after the file.)
    DATABASE_URL: "postgresql://postgres:postgres@postgres:5432/database_name"
  # Command line commands to run your tests and generate coverage reports.
  # With Django these are run through the manage.py file. You should change
  # this section as if you're running the tests from the command line.
  script:
    - echo "Running unit tests…"
    - cd src/metro/
    - python manage.py migrate --settings=metro.settings.ci --noinput
    - python manage.py create_model_dummies --settings=metro.settings.ci
    - coverage run --source='.' manage.py test --settings=metro.settings.ci
    - coverage report
  # This regex will pull out the line coverage from the coverage report output
  # so it will be displayed in the job page on Gitlab. (A worked example of
  # what it matches follows below.)
  coverage: '/^TOTAL.*\s+(\d+\%)$/'
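To illustrate what that coverage regex captures, here is a small standalone Python snippet (purely for demonstration, not part of the CI setup) that runs the same pattern against a made-up coverage report output. Gitlab applies the regex line by line to the job log; re.MULTILINE emulates that here.

# Demonstration only: the same regex Gitlab applies to the job log, run
# against a made-up `coverage report` output.
import re

sample_output = """Name             Stmts   Miss  Cover
------------------------------------
metro/views.py     120     14    88%
------------------------------------
TOTAL              120     14    88%"""

match = re.search(r"^TOTAL.*\s+(\d+\%)$", sample_output, re.MULTILINE)
print(match.group(1))  # prints "88%", which Gitlab displays as the job's coverage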
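The DATABASE_URL variable above is only useful if your Django settings actually read it. The metro.settings.ci module isn’t shown in this post, but as a rough sketch (assuming the dj-database-url package, which your project may or may not use), the database part of such a settings module could look like this:

# A minimal sketch of a CI settings module (e.g. metro.settings.ci, which
# isn't shown in this post). Assumes the dj-database-url package; your
# project may parse the URL differently.
import dj_database_url

# In a real project this file would also extend the shared base settings;
# only the database piece is shown here.
DATABASES = {
    # Reads the DATABASE_URL environment variable set in .gitlab-ci.yml,
    # e.g. postgresql://postgres:postgres@postgres:5432/database_name
    "default": dj_database_url.config(),
}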
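Similarly, create_model_dummies is a project-specific management command that seeds the freshly migrated database with predictable test data. Its internals aren’t shown here, but if you want something equivalent, a custom Django management command generally looks along these lines (the myapp.models.Widget import is purely illustrative):

# A generic sketch of a data-seeding management command, not the real
# create_model_dummies from the Metro codebase.
from django.core.management.base import BaseCommand

from myapp.models import Widget  # hypothetical app and model


class Command(BaseCommand):
    help = "Create dummy model instances for the tests to run against."

    def handle(self, *args, **options):
        # Create a handful of predictable rows that the tests can rely on.
        for i in range(10):
            Widget.objects.get_or_create(name=f"widget-{i}")
        self.stdout.write(self.style.SUCCESS("Dummy data created."))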

Once this file is in the root of your repository, you are good to go. You will use shared runners by default, but you can use dedicated ones if your project is large enough to warrant it. So that’s it. This post should get you set up in half the time it initially took me. If you have any improvements that could be made to this template, let me know in the comments below.
