CI/CD for Bitbucket Cloud in Python and Rust, using caching

Thomas Simmer
3 min read · Feb 3, 2024

In this article, I present an easy way to set up CI/CD on the Bitbucket Cloud platform. I believe it can be adapted to other Git platforms.

Projects with a Python backend

Suppose you have a project that uses Docker Compose, with one container for the frontend, one for the database, and another for the backend.

Example of a structure for a project using docker compose
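For illustration, a minimal docker-compose.yml for this kind of setup could look like the sketch below; the service names, images, and paths are assumptions, not taken from the actual project.

# Hypothetical docker-compose.yml with one service per container.
services:
  frontend:
    build: ./frontend          # UI container
    ports:
      - "3000:3000"
  db:
    image: postgres:16         # database container; any engine works
    environment:
      POSTGRES_PASSWORD: example
  backend:
    build: ./backend           # Django backend, the part tested by the pipeline below
    depends_on:
      - db
    ports:
      - "8000:8000"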

Every time you push a commit, you would like all tests written in Python to be executed automatically by Bitbucket.

Simply create a bitbucket-pipelines.yml at the root of your project. Here is the content of mine:

# The pipeline simply consists in running tests.
#
# Cache build dependencies to speed up the pipeline. See:
# https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/

image: python:3.11

definitions:

  caches:
    apt-lists: /var/lib/apt/lists
    apt-cache: /var/cache/apt

  yaml-anchors:
    - &setup-script >-
      rm /etc/apt/apt.conf.d/docker-clean
      && apt-get update
      && apt-get install -y python3-dev

# Notice above I only kept python3-dev, but you should add your apt dependencies
# here.

pipelines:
  default:
    - step:
        name: Run tests
        caches:
          - pip
          - apt-lists
          - apt-cache
        script:
          - *setup-script
          - cd backend
          - pip install -r requirements.txt
          - export OPENAI_API_KEY='dummy'
          - python manage.py test

# Notice as well the "cd backend" command. That may not be necessary for you.

As you can see, I first declare the image I am going to use (Python 3.11). I then define cache locations so that Bitbucket does not download dependencies every time a new commit is pushed. I also define a small script that installs the apt dependencies and removes the docker-clean file, an apt hook that deletes downloaded packages after installation and would otherwise defeat the cache.

After that, I can declare my pipeline. There is one step called “Run tests” that runs five commands. The first is the setup-script defined above. I then change directory to my backend folder, because that’s where my Django project is located, and install the requirements. Next, I declare a fake OpenAI API key because my tests need one; I don’t make any API calls in my tests, I use mock objects instead (that will probably be the subject of another article), but the variable still needs to be set. If you need other environment variables, you can set them the same way, but avoid writing any secrets here. Finally, I run my tests.
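If you do need a real secret at some point, a safer option is to store it as a secured repository variable in the Bitbucket UI (Repository settings > Repository variables); Bitbucket injects those as environment variables into the step. Here is a sketch of the script section under that assumption, with a variable named OPENAI_API_KEY created in the UI:

        script:
          - *setup-script
          - cd backend
          - pip install -r requirements.txt
          # OPENAI_API_KEY comes from a secured repository variable, so the
          # value never appears in the repository or in the pipeline logs.
          - python manage.py test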

Let’s see how Bitbucket handles this. Here are the results the first time I pushed:

And here is the second time:

Notice the decrease in time thanks to caching.

Projects in Rust

For Rust, my pipeline is slightly simpler. Here it is:

# The pipeline simply consists in running tests.
#
# Cache build dependencies to speed up the pipeline. See:
# https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/
# See here to understand the choice of folders we cache:
# https://doc.rust-lang.org/cargo/guide/cargo-home.html#caching-the-cargo-home-in-ci

image: rust

pipelines:
  default:
    - step:
        name: Run tests
        caches:
          - cargo-registry-index
          - cargo-registry-cache
        script:
          - cargo test

definitions:
  caches:
    cargo-registry-index: /usr/local/cargo/registry/index/
    cargo-registry-cache: /usr/local/cargo/registry/cache/

I chose to cache only two directories: the registry index and the registry cache. However, you can see everything that can be cached here: https://doc.rust-lang.org/cargo/guide/cargo-home.html#caching-the-cargo-home-in-ci
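For instance, if your project also pulls dependencies directly from git repositories, that page suggests caching the git database too. A possible extension of the definitions above, using a hypothetical cargo-git-db cache name:

definitions:
  caches:
    cargo-registry-index: /usr/local/cargo/registry/index/
    cargo-registry-cache: /usr/local/cargo/registry/cache/
    cargo-git-db: /usr/local/cargo/git/db/   # bare clones of git dependencies

# Remember to add cargo-git-db to the step's caches list as well.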

I placed this file at the root of my project and here is the result:

Conclusion

These are basic pipelines, but I hope they will save you some time and encourage you to write tests. If you have any recommendations for improving these pipelines, please write to me. Or, even better, open a pull request here to improve them or add new ones for other stacks: https://github.com/thomassimmer/bitbucket-pipelines
