How we seamlessly integrated JFrog CLI into our GitLab CI pipelines

Ron Izraeli
4 min read · Mar 5, 2023


GitLab in the swamp (created with Midjourney)

In today’s fast-paced and ever-changing technological landscape, software development teams are under constant pressure to deliver high-quality code quickly and efficiently. To address that challenge, many developers and companies incorporate CI/CD pipelines into their SDLC. GitLab consolidates version control and the CI/CD pipeline into a single interface, which makes it a popular choice. However, one of the biggest challenges the SDLC faces revolves around the supply chain security posture: building, testing, packaging, and deploying securely and reliably. This is where the integration of JFrog CLI into GitLab CI/CD pipelines comes into play. By leveraging the power of JFrog’s industry-leading binary repository manager and the flexibility of GitLab’s continuous integration and deployment platform, teams can streamline their software delivery process and achieve true DevSecOps success. So, if you’re ready to take your software development to the next level, buckle up and join us on this journey as we explore how we integrated JFrog CLI into our GitLab CI pipelines.

We were seeking to improve our dependencies’ security posture and conduct a detailed composition analysis. We already had other tools integrated into our pipelines; however, consolidating the tooling and gaining visibility across our supply chain offered clear advantages. Since we were already using JFrog Artifactory and Xray, it was easy for us to address the challenge with JFrog Build Integration using JFrog CLI.

While JFrog provided best practice examples and templates, we quickly realized that implementing them across hundreds of projects was not a straightforward task.

Here is an example of building an npm project according to JFrog’s best-practice template →

default:
  image: node:16

include:
  - remote: "https://releases.jfrog.io/artifactory/jfrog-cli/gitlab/.setup-jfrog.yml"

jfrog-npm-build:
  script:
    - !reference [.setup_jfrog, script]

    # Configure JFrog Artifactory repositories
    - jf npmc --repo-resolve $ARTIFACTORY_VIRTUAL_REPO --repo-deploy $ARTIFACTORY_LOCAL_REPO

    # Resolve dependencies from JFrog Artifactory
    - jf npm install

    # Deploy *.tar.gz to JFrog Artifactory
    - jf npm publish

    # Publish build-info to JFrog Artifactory
    - jf rt build-publish

  after_script:
    # Cleanup
    - !reference [.cleanup_jfrog, script]

The challenges we came across are:

#1 Effort

Adding the ‘jf’ prefix to the .gitlab-ci.yml files of tens or hundreds of projects is a time-consuming effort, so we were unlikely to get the cooperation of the Dev and DevOps teams to implement it (a before/after sketch follows below).

#2 Tight Coupling

Adding the ‘jf’ prefix also creates a coupling that could cause problems if we ever decided to remove or change it.
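
To make both points concrete, here is a rough before/after sketch of a single project job. The job name and commands are illustrative, borrowed from the best-practice template above, and every project would need a similar manual edit →

# Before: a typical project job
build:
  script:
    - npm install
    - npm publish

# After: every command gains the 'jf' prefix, plus repository configuration
build:
  script:
    - jf npmc --repo-resolve $ARTIFACTORY_VIRTUAL_REPO --repo-deploy $ARTIFACTORY_LOCAL_REPO
    - jf npm install
    - jf npm publish
    - jf rt build-publish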

“First make the change easy, then make the easy change.” (Kent Beck)

To address these challenges, we came up with a more sophisticated solution to reduce friction. We used GitLab CI’s template feature to create a generic template that included the JFrog CLI commands. This allowed us to avoid duplicating the same commands in every project’s pipeline, significantly reducing the effort required to integrate JFrog CLI.

We created a template, common/templates/.ci-cd.yml, so other project pipelines can easily consume it →

variables:
  ARTIFACTORY_BASE: https://artifactory.mycompany.com
  JFROG_CLI_BUILD_NAME: $CI_PROJECT_NAME.$CI_BUILD_REF_NAME
  JFROG_CLI_BUILD_NUMBER: $CI_PIPELINE_ID
  JFROG_CLI_BUILD_URL: $CI_JOB_URL
  JFROG_CLI_TEMP_DIR: $CI_PROJECT_DIR/build-info

.artifactory-base:

  before_script:

    # Install JFrog CLI if it is not already present (prefer curl, fall back to wget)
    - command -v curl >/dev/null 2>&1 || wget -q -O - $ARTIFACTORY_BASE/artifactory/generic/jfrog-cli/install-cli.sh | sh
    - jf -v >/dev/null 2>&1 || curl -sS $ARTIFACTORY_BASE/artifactory/generic/jfrog-cli/install-cli.sh | sh

    # Create server ID
    - jf config add myartifactory --url $ARTIFACTORY_BASE --access-token=$JF_ACCESS_TOKEN

    # Configure JFrog Artifactory repositories.
    # Set 'ARTIFACTORY_VIRTUAL_REPO' and 'ARTIFACTORY_LOCAL_REPO' to the names of
    # the npm repositories in JFrog Artifactory.
    - jf npm-config --repo-resolve=$ARTIFACTORY_VIRTUAL_REPO --repo-deploy=$ARTIFACTORY_LOCAL_REPO

    # Replace the npm and docker commands with jf wrappers.
    # Each wrapper pins PATH to its current (pre-wrapper) value so it never calls itself recursively.
    - mkdir -p /wrapper
    - echo '#!/bin/sh' > /wrapper/npm && echo "PATH=\"${PATH}\"" >> /wrapper/npm && echo 'jf npm "$@"' >> /wrapper/npm && chmod +x /wrapper/npm
    - echo '#!/bin/sh' > /wrapper/docker && echo "PATH=\"${PATH}\"" >> /wrapper/docker && echo 'jf docker "$@"' >> /wrapper/docker && chmod +x /wrapper/docker
    - export PATH="/wrapper:${PATH}"

  after_script:

    # Add environment variables to the build-info
    - jf rt bce $JFROG_CLI_BUILD_NAME $JFROG_CLI_BUILD_NUMBER

    # Add git information to the build-info
    - jf rt bag $JFROG_CLI_BUILD_NAME $JFROG_CLI_BUILD_NUMBER

    # Publish the build-info to JFrog Artifactory
    - jf rt bp $JFROG_CLI_BUILD_NAME $JFROG_CLI_BUILD_NUMBER

The template wraps the job’s script with before and after operations: before the job’s script runs, we provision the JFrog CLI and its configuration; after it runs, we upload the collected build-info.
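
For clarity, this is roughly what the generated /wrapper/npm ends up looking like. It is a sketch reconstructed from the echo commands above; the PATH value shown is only illustrative, the real one is whatever the shell had before /wrapper was prepended, which is what keeps the wrapper from calling itself →

#!/bin/sh
# /wrapper/npm, as generated by the template's before_script (illustrative)
# PATH is pinned to its pre-wrapper value, so 'jf npm' resolves the real npm
# binary instead of recursing into this wrapper.
PATH="/usr/local/bin:/usr/bin:/bin"
jf npm "$@"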

Now, developers can include it in their .gitlab-ci.yml (read the comments)

# make .ci-cd.yml template jobs available to use in your pipeline
include:
  - project: common/templates
    file: '.ci-cd.yml'
    ref: latest

variables:
  ARTIFACTORY_DOCKER_REPO: artifactory.mycompany.com/docker
  ...

build:
  image: $ARTIFACTORY_DOCKER_REPO/node:16
  stage: build
  extends: .artifactory-base # instruct your job to apply the template wrappers (before and after script)
  script:
    - npm ci
    - npm publish

Let’s continue to a more complex scenario, where you want to share the build-info across multiple stages (read the comments) →

include:
  - project: common/templates
    file: '.ci-cd.yml'
    ref: latest

variables:
  ARTIFACTORY_DOCKER_REPO: artifactory.mycompany.com/docker
  DIND: $ARTIFACTORY_DOCKER_REPO/docker:dind
  DOCKER_IMAGE_NAME: $ARTIFACTORY_DOCKER_REPO/$CI_PROJECT_PATH:$CI_BUILD_ID-$CI_BUILD_REF_NAME

stages:
  - build
  - package

build:
  image: $ARTIFACTORY_DOCKER_REPO/node:16
  stage: build
  extends: .artifactory-base
  after_script: [] # Override the template's after_script so the build-info is not published yet; the next job aggregates and publishes it
  script:
    - npm ci
    - npm run build
  artifacts:
    paths:
      - build
      - build-info # Share the build-info folder with the next job

package:
  image: $DIND
  stage: package
  extends: .artifactory-base
  services:
    - name: $DIND
      alias: docker
  script:
    - docker build --tag $DOCKER_IMAGE_NAME .
    - docker push $DOCKER_IMAGE_NAME

As you can see in the example above, the npm and docker commands remain untouched, but under the hood they execute the jf wrappers.
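
If you ever want to confirm this from inside a pipeline, a quick hypothetical debug job (not part of our actual setup) can show which npm the shell resolves and what the wrapper contains →

debug-wrapper:
  image: $ARTIFACTORY_DOCKER_REPO/node:16
  extends: .artifactory-base
  script:
    # /wrapper is first on PATH, so this prints /wrapper/npm
    - command -v npm
    # Print the generated wrapper script itself
    - cat /wrapper/npm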

Mission accomplished

We minimized the friction as much as possible and only added decoration; the core logic of each job remains untouched.

If we ever wish to revert or enhance the integration, we simply change the template.
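
For instance, if we ever had to switch the integration off temporarily, one option (a sketch, not something we have actually needed) would be to turn the template’s hooks into no-ops, so every consuming pipeline falls back to the plain npm and docker commands without touching a single project →

# common/templates/.ci-cd.yml (hypothetical "off switch")
.artifactory-base:
  before_script: []   # no JFrog CLI install, no wrappers
  after_script: []    # no build-info collection or publish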

I would like to thank Adam Richter and Assaf Mendelson for their collaboration in implementing the above solution.
