Handle multiple databases in Bitbucket Pipelines

Anoob Bava
3 min read · May 13, 2022


Before you proceed:

This document expects the reader to have a basic understanding of how Bitbucket Pipelines work, what Docker is, and how it is used.

Problem:

Imagine you are working on a project that loads data from an external database to which you have only read-only access. Suppose you need to write developer tests to verify this data. How can we implement that in Bitbucket Pipelines?

Current option:

Bitbucket provides Pipelines to run developer tests and other CI/CD functionality. However, it provides only a single database service connection, which is already used for the normal developer tests of our Rails application. We need a second connection, similar to the existing one, for the external database.
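In a Rails application, a second connection would typically mean a second entry in config/database.yml alongside the regular test database. A minimal sketch, assuming a hypothetical EXTERNAL_DATABASE_URL environment variable and an external_test connection name (neither is part of the original setup):

```yaml
# config/database.yml (sketch only; the connection name "external_test"
# and the EXTERNAL_DATABASE_URL variable are assumptions)
test:
  adapter: postgresql
  url: <%= ENV["DATABASE_URL"] %>

external_test:
  adapter: postgresql
  url: <%= ENV["EXTERNAL_DATABASE_URL"] %>
```

The rest of this article is about making a second Postgres server available inside the pipeline so that such a second URL actually has something to connect to.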

The existing workflow in bitbucket-pipelines.yml looks like this:

image: ruby:2.3.0

pipelines:
  branches:
    '*':
      - step:
          script:
            - export DATABASE_URL=postgresql://pg_user:pg_user_password@localhost/pipelines
            - bundle exec rails db:schema:load RAILS_ENV=test
            - bundle exec rspec
          services:
            - postgres

definitions:
  services:
    postgres:
      image: postgres
      environment:
        POSTGRES_DB: pipelines
        POSTGRES_USER: pg_user
        POSTGRES_PASSWORD: pg_user_password
The issue with this approach is that only one database connection is available. I searched through a lot of options in the community but did not find a solution that worked for me.

Proposed solution:

We already have the option of loading a custom Docker image into Bitbucket Pipelines to speed up execution. So I thought: why not install a Postgres database inside that Docker image? Once the image is loaded into the pipeline, we can access this second connection and use it for our external-database developer tests.

  1. Create a Dockerfile
  2. Use database connection in bitbucket pipeline.
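Putting the two steps together, the pipeline would end up looking roughly like this. This is a sketch only: the image name myuser/rails-pg-ci is a placeholder for whatever you push to Docker Hub, and the service-start line is explained in step 2 below.

```yaml
# bitbucket-pipelines.yml (sketch; "myuser/rails-pg-ci" is a placeholder)
image: myuser/rails-pg-ci:latest   # custom image with Postgres baked in on port 5433

pipelines:
  branches:
    '*':
      - step:
          script:
            - export DATABASE_URL=postgresql://pg_user:pg_user_password@localhost/pipelines
            - sudo service postgresql start   # bring up the image's own Postgres
            - bundle exec rails db:schema:load RAILS_ENV=test
            - bundle exec rspec
          services:
            - postgres                        # the regular pipeline service on 5432
```

The pipeline's own postgres service keeps serving the normal tests on 5432, while the image's bundled Postgres serves the external-database tests on 5433.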

1. Create a Dockerfile

FROM ruby:2.3.0

RUN apt-get update && apt-get install -y --no-install-recommends apt-utils

RUN apt-get update \
  && apt-get install -y -q postgresql postgresql-contrib \
  && apt-get install -y sudo \
  && apt-get clean \
  && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

# Move the bundled Postgres from the default 5432 to 5433 so it does not
# clash with the pipeline's postgres service.
RUN sudo sed -i 's/port = 5432/port = 5433/' /etc/postgresql/11/main/postgresql.conf
RUN sudo service postgresql restart 11
RUN pg_lsclusters

RUN apt-get update \
  && apt-get install -y \
     curl \
     libxrender1 \
     libjpeg62-turbo \
     fontconfig \
     libxtst6 \
     xfonts-75dpi \
     xfonts-base \
     xz-utils

Normally Postgres uses port 5432, but that port is taken by the database serving our regular Rails developer tests, so we change the port to 5433 for the new connection.
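With two servers on two ports, each connection gets its own URL. A sketch of what the environment could look like inside the pipeline script; the user names, passwords, database name, and the EXTERNAL_DATABASE_URL variable name are all assumptions, not part of the original setup:

```shell
# Regular test database: the pipeline's postgres service on the default port.
export DATABASE_URL=postgresql://pg_user:pg_user_password@localhost:5432/pipelines

# External database: the Postgres baked into the custom image, on port 5433.
# (Variable name and credentials are placeholders.)
export EXTERNAL_DATABASE_URL=postgresql://ext_user:ext_password@localhost:5433/external_db

echo "$EXTERNAL_DATABASE_URL"
```

Keeping the port in the URL is what routes each test suite to the right server.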

Once this is completed, we need to build a Docker image from this Dockerfile and push it to Docker Hub, ready to be pulled by Bitbucket Pipelines.
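The build-and-push step is the standard Docker workflow; a sketch, where myuser/rails-pg-ci is a placeholder repository name you would replace with your own Docker Hub repository:

```shell
# Build the image from the Dockerfile in the current directory.
# "myuser/rails-pg-ci" is a placeholder repository name.
docker build -t myuser/rails-pg-ci:latest .

# Authenticate against Docker Hub, then push the image.
docker login
docker push myuser/rails-pg-ci:latest
```

After the push, referencing the image at the top of bitbucket-pipelines.yml (image: myuser/rails-pg-ci:latest) makes Pipelines pull it for every step.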

2. Use database connection in bitbucket pipeline

Once the image is loaded into Bitbucket Pipelines, the new Postgres instance will be in a stopped state. We might need to jump-start Postgres using the restart command.
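In practice this is one extra line near the top of the step's script (a sketch; the exact service invocation can vary with the Postgres version installed in the image):

```yaml
          script:
            - sudo service postgresql start   # bring up the image's Postgres on 5433
            - pg_lsclusters                   # optional: confirm the cluster is online
```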

Now we can create the database, create a user with superuser permissions on it, and load the structure and data from the application's own files.
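A sketch of those commands, assuming the external database is called external_db, the user is ext_user, and the structure dump lives at db/external_structure.sql (all three names are placeholders, not from the original setup):

```shell
# Create a superuser and the database on the second Postgres (port 5433).
# Names and password are placeholders.
sudo -u postgres psql -p 5433 -c "CREATE USER ext_user WITH SUPERUSER PASSWORD 'ext_password';"
sudo -u postgres psql -p 5433 -c "CREATE DATABASE external_db OWNER ext_user;"

# Load the structure and seed data shipped with the application.
psql postgresql://ext_user:ext_password@localhost:5433/external_db -f db/external_structure.sql
```

These lines would run in the pipeline script right after the service is started, before the specs that exercise the external database.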

That's it. Thanks for reading! If you can think of a better way to handle this, please share your ideas.

If this story helped you learn anything, please feel free to buy me a coffee.
