
Stop running static jobs in GitLab, run dynamically created jobs instead

Robert-Jan Kuyper

--

To start with a confession: I, too, was guilty of running statically scoped jobs in GitLab.

The issue with statically scoped jobs

To clarify the issue I faced over and over again: let's say you have a nice app with three environments: develop, acceptance, and production. Personally, I would be tempted to think in a static solution along these lines.
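A minimal sketch of what that could look like, assuming a hypothetical deploy.sh script and one copy-pasted job per environment:

deploy:develop:
  stage: deploy
  script:
    - ./deploy.sh develop
  only:
    - develop

deploy:acceptance:
  stage: deploy
  script:
    - ./deploy.sh acceptance
  only:
    - acceptance

deploy:production:
  stage: deploy
  script:
    - ./deploy.sh production
  only:
    - main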

Multiple times I wondered whether there was an easier way than this copy-paste behaviour: define a job once and reuse it everywhere.

Fortunately, GitLab came with the extends keyword, which allows us to inherit the configuration from another job. At least this partially solves the issue. The above .gitlab-ci.yml could now be refactored.
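A sketch of that refactoring, assuming the same hypothetical deploy.sh, with the shared configuration pulled into a hidden .deploy template job:

.deploy:
  stage: deploy
  script:
    - ./deploy.sh $ENVIRONMENT

deploy:develop:
  extends: .deploy
  variables:
    ENVIRONMENT: develop
  only:
    - develop

deploy:acceptance:
  extends: .deploy
  variables:
    ENVIRONMENT: acceptance
  only:
    - acceptance

deploy:production:
  extends: .deploy
  variables:
    ENVIRONMENT: production
  only:
    - main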

Although the code doesn't become smaller in this example, in most real-world scenarios it would. However, it still forces us to create separate jobs for each environment, and lots of problems would not be tackled with this approach.

What we seek is a more flexible approach: one that supports an ever-growing list of jobs without the need to add jobs by hand over and over again.

GitLab dynamic child pipelines 🚀

According to GitLab, instead of running a child pipeline from a static YAML file, you can also define a job that runs your own script to generate a YAML file, which is then used to trigger a child pipeline.

In short, a dynamic child pipeline works as follows: a generator job in the parent pipeline writes a pipeline definition at runtime, and a trigger job then runs that definition as a child pipeline.

A simple use case

Imagine you want to push a simple Docker image to a registry, passing only one build argument. This build argument, however, should vary per branch. So all we need is:

  • A Dockerfile
  • and a .gitlab-ci.yml

Let’s create a simple Dockerfile that echoes the given environment.
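A minimal sketch, assuming an Alpine base image; the build argument is captured into an environment variable so it is still available when the container runs:

FROM alpine:3.18

# Capture the build argument and persist it as an
# environment variable so it survives into the container.
ARG ENVIRONMENT
ENV ENVIRONMENT=${ENVIRONMENT}

# Echo the environment the image was built for.
CMD ["sh", "-c", "echo $ENVIRONMENT"]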

We can test it with the following line:

docker build --build-arg ENVIRONMENT=acceptance -t test . && \
docker run test

The output should be acceptance.

Now that we have a Dockerfile, let’s create the .gitlab-ci.yml configuration file, where we build the image and push it to GitLab’s private registry.
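A sketch of the parent pipeline, matching the walkthrough below; the trigger:include:artifact syntax is GitLab's mechanism for running a generated file as a child pipeline:

default:
  image: node:16

stages:
  - build
  - trigger

# Generates dynamic-gitlab-ci.yml at runtime and stores it as an artifact.
build:
  stage: build
  script:
    - node create-pipeline.js
  artifacts:
    paths:
      - dynamic-gitlab-ci.yml

# Waits for the build job, then runs the generated file as a child pipeline.
trigger:deploy:
  stage: trigger
  needs:
    - build
  trigger:
    include:
      - artifact: dynamic-gitlab-ci.yml
        job: build
    strategy: depend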

I hear you asking where the docker build and docker push part is. This is where dynamic pipelines come in. To clarify how it works from top to bottom:

  • We use a default image of node:16 for all jobs.
  • We define two stages: build and trigger.
  • In the build stage we run a Node script called create-pipeline.js that writes a new pipeline configuration file called dynamic-gitlab-ci.yml and stores it as an artifact.
  • In the trigger:deploy job we explicitly wait for the build job and trigger the dynamic-gitlab-ci.yml configuration file we generated in the previous job.

The dynamic-gitlab-ci.yml configuration file is generated at runtime in the build job, so it can be triggered in a later stage. To clarify the missing docker build and docker push part, let’s dive into create-pipeline.js.
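A sketch of what such a script could look like; the branch-to-environment mapping, the Docker image versions, and the fallback environment are assumptions, while the $CI_* variables are GitLab's predefined CI/CD variables:

const fs = require('fs');

// Maps branches to our environments.
const branchMap = new Map([
  ['develop', 'develop'],
  ['acceptance', 'acceptance'],
  ['main', 'production'],
]);

// Takes one argument, named environment, and returns a GitLab job
// as valid YAML holding the docker build and docker push configuration.
const createJob = (environment) => `
docker:build-push:
  image: docker:20.10
  services:
    - docker:20.10-dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build --build-arg ENVIRONMENT=${environment} -t $CI_REGISTRY_IMAGE:${environment} .
    - docker push $CI_REGISTRY_IMAGE:${environment}
`;

// Determines the current branch, creates the job and
// writes it to dynamic-gitlab-ci.yml.
const createDynamicGitLabFile = () => {
  const branch = process.env.CI_COMMIT_BRANCH;
  const environment = branchMap.get(branch) || 'develop';
  fs.writeFileSync('dynamic-gitlab-ci.yml', createJob(environment));
};

createDynamicGitLabFile();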

As you can see, the docker build and docker push commands are dynamically generated here. To clarify the script from top to bottom:

  • branchMap is a JavaScript Map that maps branches to our environments.
  • the createJob function takes one argument, named environment, and returns a GitLab job. This job contains valid YAML and holds the configuration for the docker build and docker push part.
  • Finally, createDynamicGitLabFile determines the current branch, creates the job, and writes it to dynamic-gitlab-ci.yml.

That’s it. We now have a fully functional dynamic child pipeline running in GitLab.

You can view the example project in GitLab.

Conclusion

GitLab dynamic child pipelines come in very handy when static pipelines do not meet your requirements. They are a powerful way to solve the many common problems that arise from duplicating parts of pipelines.



Robert-Jan Kuyper

Senior Backend Engineer specialised in NodeJS, NestJS, Docker and CI/CD | https://datails.nl/