Effective meta-repo CI/CD pipelines
- It’s not all about mono-repo or multi-repo: We’ll cover pros and cons of multi-repo and mono-repo and how a meta-repo can help.
- Setting up a meta-repo: We’ll learn how to get your meta-repo up and running.
- CI/CD pipelines: We’ll set up some simple continuous integration pipelines to leverage our meta-repo with Docker building and semantic release.
It’s not all about mono-repo or multi-repo
We have all been there. You start with a small idea in one repository, but after some time you get more ambitious, want to build more components, and start thinking about how to break your code into parts. You have two options:
Option 1: Managing multiple repositories
Sometimes referred to as the multi-repo, you simply create a repository per logical project under the umbrella of your organization.
Pros:
- Very clear separation, enforcing cleaner modules.
- Each repository can have its own CI/CD pipeline, which simplifies things enormously.
- Granular access permissions can be granted per repository or group.
Cons:
- Beyond a certain number of repositories, management becomes a burden.
- Setting up a new developer involves pulling multiple repositories.
- It is harder to reuse code between repositories.
Option 2: The mono-repo
It attempts to solve the issues with having multiple repositories by maintaining only one. This technique is used by many of the cool kids on the block like Google and Facebook.
Pros:
- A very simple developer setup with one pull.
- Easier to share code between projects.
- You can take advantage of tools like Bazel, Buck or Pants.
Cons:
- Undeniably, CI/CD is not as straightforward.
- Developers need to pull code they might not need.
- You grant access to the whole codebase at once.
Like any other technical decision, there is a compromise, so both approaches are perfectly valid in certain cases. I really like the simplicity of a mono-repo setup, but most of us aren’t Google and the tooling is just not there yet.
Option 1.5: Bridging the gap with a meta-repo
The meta-repo is the missing piece: a tool to group multiple backing repositories into one, making management less cumbersome.
Pros:
- You actually have multiple repositories, so all the multi-repo advantages still apply.
- Developers can pull all repositories of a meta-repo with one command (meta git clone followed by the meta-repo URL).
- You can have as many meta-repos as you want pointing to other backing repositories, e.g. meta.all, meta.backend, meta.frontend.
Cons:
- You actually have multiple repositories, so some multi-repo drawbacks still apply too.
Setting up a meta-repo
We’ll be using a node tool called meta, which helps manage your repositories following the meta-repo paradigm. It gives you a simple way of running scripts across multiple repositories defined in a .meta file, through an array of available plugins. I do recommend you check out their website, but in this article we’ll only be using three plugins:
- meta-init: Initializes a meta-repo by creating an empty .meta file.
- meta-project: Allows adding mappings between individual repositories and local folders.
- meta-git: Brings a way to run git commands across all the defined repositories.
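For reference, the .meta file these plugins operate on is plain JSON mapping local folders to repository URLs. A sketch, using the project paths this article sets up later:

```json
{
  "projects": {
    "tools/build/docker": "https://gitlab.com/you/tools.build.docker.git",
    "tools/build/semantic-release": "https://gitlab.com/you/tools.build.semantic-release.git"
  }
}
```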
Enough chit-chat, let’s get to the code. You’ll need to have node installed to follow along.
First, install meta globally:
npm i -g meta
Second, let’s create a meta-repo project:
mkdir meta.all && cd meta.all
meta init
git init
This will initialize your empty meta-repo, which you can now push to your git provider of choice:
git add .
git commit -m "feat(init): initialize monorepo"
git remote add origin https://gitlab.com/you/meta.all.git
git push -u origin master
Now we can start adding projects to our meta-repo which we’ll cover in the next section.
CI/CD with GitLab
Note that I chose GitLab because of its pipeline-extension support and built-in Docker registry, but the concepts should apply to other tools.
We will be adding two projects to our meta-repo:
- A Docker template which will provide all other repositories with a simple way to build and publish images from pipelines.
- A semantic release template which will create automated changelogs and tags for us.
First, let’s add our project:
meta project add tools/build/docker https://gitlab.com/you/tools.build.docker.git
Second, let’s write a pipeline template which:
- Builds the image from the Dockerfile.
- Tags it with the name of the branch.
- If the branch is master, also tags it as latest.
- Pushes the image(s) to the GitLab Docker registry.
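The template file itself isn’t reproduced here, but a minimal sketch of such a template, relying on GitLab’s predefined CI variables (the file and job names are placeholders of my own), could look like:

```yaml
# docker-build.yml — reusable template; consuming pipelines include
# this file and extend the hidden .docker-build job.
.docker-build:
  stage: build
  image: docker:stable
  services:
    - docker:dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # Build the image from the Dockerfile and tag it with the branch name.
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG"
    # On master, also tag and push latest.
    - |
      if [ "$CI_COMMIT_REF_NAME" = "master" ]; then
        docker tag "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG" "$CI_REGISTRY_IMAGE:latest"
        docker push "$CI_REGISTRY_IMAGE:latest"
      fi
```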
Semantic release template:
Semantic release automates releases based on commit messages. This is very handy for generating automated changelogs and tagging versions in git.
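To make that convention concrete, here is a tiny sketch (not semantic-release’s actual implementation) of how a conventional-commit subject line maps to a version bump:

```shell
#!/bin/sh
# Decide a semver bump from a conventional-commit subject line.
# Breaking changes win, then feat -> minor, fix/perf -> patch.
bump_for() {
  case "$1" in
    *"BREAKING CHANGE"*|*"!:"*) echo major ;;
    feat*) echo minor ;;
    fix*|perf*) echo patch ;;
    *) echo none ;;
  esac
}

bump_for "feat(api): add login endpoint"   # minor
bump_for "fix(ui): misaligned button"      # patch
bump_for "feat!: drop node 8 support"      # major
```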
In order to take advantage of those capabilities without limiting our projects to using node, we’ll wrap that functionality in a portable Docker image.
First, add the project:
meta project add tools/build/semantic-release https://gitlab.com/you/tools.build.semantic-release.git
Second, create a Dockerfile containing the semantic release installation:
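The original Dockerfile isn’t shown here; a minimal sketch that installs semantic-release and a set of common plugins on top of a node image (versions and plugin choices are illustrative) might look like:

```dockerfile
FROM node:12-alpine
# Install semantic-release globally, along with the plugins used
# as defaults by the release template.
RUN npm install -g semantic-release \
    @semantic-release/commit-analyzer \
    @semantic-release/release-notes-generator \
    @semantic-release/gitlab
```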
Third, create a pipeline template which:
- Only runs for the master branch.
- If it finds a .releaserc file, honors it; otherwise, uses a set of default plugins.
Consumers will need to have the GL_TOKEN environment variable set in order to allow semantic release to communicate with your GitLab repository. This can be done per project or per group under Settings > CI/CD > Variables.
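A sketch of such a release template follows; the image path and job name are placeholders, and the default-plugin fallback mirrors the behavior described above (the GitLab plugin reads GL_TOKEN to authenticate):

```yaml
# semantic-release.yml — reusable template; consuming pipelines
# include this file and extend the hidden .semantic-release job.
.semantic-release:
  stage: release
  image: registry.gitlab.com/you/tools.build.semantic-release:latest
  only:
    - master
  script:
    # semantic-release picks up .releaserc automatically if present;
    # otherwise fall back to a default plugin set via the CLI.
    - |
      if [ -f .releaserc ]; then
        semantic-release
      else
        semantic-release \
          --branches master \
          --plugins "@semantic-release/commit-analyzer,@semantic-release/release-notes-generator,@semantic-release/gitlab"
      fi
```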
Fourth, we create the pipeline using our previous Docker template to build the semantic release image.
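Assuming the Docker template from the earlier step lives in the tools.build.docker project, the semantic release project’s own .gitlab-ci.yml could include and extend it, roughly like this (the template file path and job names are placeholders):

```yaml
# .gitlab-ci.yml of tools.build.semantic-release: reuse the Docker
# template to build and publish this repository's own image.
include:
  - project: 'you/tools.build.docker'
    file: '/docker-build.yml'

stages:
  - build

build-image:
  extends: .docker-build
```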
Once all of this is ready, you can push all your repositories to GitLab, either by pushing each individually or by running:
meta git add -A
meta git commit -m "chore(meta): some cool name"
meta git push
Once the semantic release repository has been pushed, you should see both stages in the pipeline pass and the image appear in the project’s Docker registry.
In this article, we learned how to set up a meta-repo to help settle the mono-repo vs. multi-repo debate, and we set up two simple CI/CD pipelines using Docker and semantic release.
In a future article, we will build on top of this and set up a project which uses Docker and semantic release to deploy a real-world application.