Integrating code inspection in your OpenShift Pipelines using SonarQube

Grzegorz Smolko
AI+ Enterprise Engineering
7 min read · Jul 30, 2021

How to integrate code inspection into your build pipelines

In my work in the Cloud Engagement Hub (CEH), we are often asked by customers what the software development lifecycle for cloud applications should look like, what its key parts are, and how to keep it consistent. Today I’ll talk about integrating code inspection.

Code inspections and code reviews are important developer practices. They keep your code cleaner, more secure and aligned with your coding guidelines. But a code review requires a senior developer, who might not always be available to do a peer review.

What if you could capture that knowledge and run a review at any given time? Or even better, what if you could integrate the review as a step in your continuous integration pipeline?

In this story you will learn how to install SonarQube, one of several tools that perform code inspection, in the Red Hat OpenShift Container Platform (OpenShift) and use it in your OpenShift Pipeline.

Install SonarQube

The most convenient way to install SonarQube in your OpenShift environment is via a Helm chart. You can get the latest version of the chart from here:

git clone https://github.com/SonarSource/helm-chart-sonarqube.git
cd helm-chart-sonarqube/charts/sonarqube
helm dependency update

This chart is prepared for plain Kubernetes; for an OpenShift deployment you need to make the following changes in the helm-chart-sonarqube/charts/sonarqube/values.yaml file:

OpenShift:
  enabled: true

securityContext:
  # For standard Kubernetes deployment, set enabled=true
  # If using OpenShift, enabled=false for restricted SCC and enabled=true for anyuid/nonroot SCC
  enabled: false

volumePermissions:
  # For standard Kubernetes deployment, set enabled=false
  # For OpenShift, set enabled=true and ensure to set volumePermissions.securityContext.runAsUser below.
  enabled: true
  # If using restricted SCC, set runAsUser: "auto"; if running under anyuid/nonroot SCC, runAsUser needs to match runAsUser above
  securityContext:
    runAsUser: "auto"

# For OpenShift set create=true to ensure the service account is created.
serviceAccount:
  create: true
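If you prefer not to edit values.yaml, the same overrides can be passed on the helm command line. A sketch, where the flag paths mirror the chart keys shown above; the command is only assembled and printed here so you can review it before running it against your cluster:

```shell
# Equivalent overrides passed as --set flags instead of editing values.yaml.
# The command is assembled into a variable and printed, not executed.
HELM_CMD="helm upgrade --install sonarqube ./ -n sonarqube \
  --set OpenShift.enabled=true \
  --set securityContext.enabled=false \
  --set volumePermissions.enabled=true \
  --set volumePermissions.securityContext.runAsUser=auto \
  --set serviceAccount.create=true"
echo "$HELM_CMD"
```

Command-line flags take precedence over values.yaml, which is handy for scripted installs where you don’t want to keep a modified copy of the chart.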

A customized version of this chart is already available here: https://github.com/stocktrader-ops/helm-chart-sonarqube — it additionally creates an OpenShift Route automatically to access the deployed SonarQube application.

Install the chart with the following commands (log in to your OCP cluster first):

oc new-project sonarqube
cd helm-chart-sonarqube/charts/sonarqube
helm upgrade --install -f values.yaml -n sonarqube sonarqube ./

If you see the following messages, the chart has installed successfully:

I0706 12:37:25.645523    3900 request.go:621] Throttling request took 1.1159359s, request: GET:https://yourcluster/apis/route.openshift.io/v1?timeout=32s
Release "sonarqube" has been upgraded. Happy Helming!
NAME: sonarqube
LAST DEPLOYED: Tue Jul 6 12:37:26 2021
NAMESPACE: sonarqube
STATUS: deployed
REVISION: 2
NOTES:
1. Get the application URL by running these commands:
export ROUTE_HOST=$(kubectl get route sonarqube --namespace sonarqube -o jsonpath="{.spec.host}")
export PROTOCOL="http"
export PROTOCOL="https"
echo $PROTOCOL://$ROUTE_HOST

Create Route

If you are NOT using my customized version of the chart, you need to create a route to access SonarQube.
Either create the route via the web UI in your sonarqube project under Networking > Routes with the following settings:

  • Name: sonarqube
  • Service: sonarqube-sonarqube
  • Target Port: 9000 -> http (TCP)
  • Secure route: checked
  • TLS termination: edge

Or apply the following yaml:

kind: Route
apiVersion: route.openshift.io/v1
metadata:
  annotations:
    meta.helm.sh/release-name: sonarqube
    meta.helm.sh/release-namespace: sonarqube
    openshift.io/host.generated: 'true'
  name: sonarqube
  namespace: sonarqube
  labels:
    app: sonarqube
    app.kubernetes.io/managed-by: Helm
    chart: sonarqube-1.0.15
    heritage: Helm
    release: sonarqube
spec:
  to:
    kind: Service
    name: sonarqube-sonarqube
  port:
    targetPort: http
  tls:
    termination: edge

Access and configure SonarQube

Access your SonarQube instance via the created route. You should see the following login screen:

SonarQube login screen

Log in with the default admin/admin credentials. On first login you will be asked to set a new admin password.

It is not recommended to use the admin account for general purposes. Create a new, dedicated user that will be used by your pipeline. Select Administration > Security > Users and create the user:

Create user

SonarQube allows you to use tokens for integration with tools, which is more secure than using a username and password pair. Log in again as your newly created user and generate a token that will be used by your build pipeline. Select your user icon > My Account > Security, provide a token name and click Generate:

Generate access token
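Before wiring the token into a pipeline, it is worth checking it against SonarQube’s web API. A sketch with a placeholder URL and token: the api/authentication/validate endpoint answers {"valid":true} for accepted credentials, and a token is passed as the basic-auth user name with an empty password. The call is only assembled and printed here; run it once your token exists.

```shell
# Placeholders -- substitute your route URL and the generated token.
SONARQUBE_URL="https://sonarqube.example.com"
SONARQUBE_TOKEN="squ_example_token"

# The token goes in the user-name position of basic auth, password empty.
VALIDATE="curl -s -u ${SONARQUBE_TOKEN}: ${SONARQUBE_URL}/api/authentication/validate"
echo "$VALIDATE"
```

If the response is {"valid":false}, regenerate the token before debugging the pipeline itself — an invalid token is by far the most common cause of scanner authentication failures.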

You can additionally configure custom quality profiles and quality gates for your projects, but this is out of scope for this article.

Pipeline integration

Now that SonarQube is installed and configured, you can integrate it with your pipeline. For this article it is assumed that you already have OpenShift Pipelines installed in your cluster.

OpenShift Pipelines is based on Tekton. To integrate an external tool like SonarQube into your pipeline, it is recommended to create a separate Task, as this allows better reuse across different pipelines.

Depending on your pipeline definition, this task may contain just the SonarQube integration step, or also some additional steps like cloning your source code from git or compiling the application. In this case the task has multiple steps (git-clone, build, sonar-scan), which makes it more self-contained.

Here, I’m only showing the code related to the sonar-scan step (you can find the full task definition in this repo: ibm-sonar-scan.yaml):

- name: sonar-scan
  image: $(params.sonarqube-cli)
  env:
    - name: SONAR_USER_HOME
      value: $(params.source-dir)
    - name: SONARQUBE_URL
      valueFrom:
        secretKeyRef:
          key: SONARQUBE_URL
          name: sonarqube-access
          optional: true
    - name: SONARQUBE_TOKEN
      valueFrom:
        secretKeyRef:
          key: SONARQUBE_TOKEN
          name: sonarqube-access
          optional: true
  workingDir: $(params.source-dir)
  script: |
    APP_NAME=$(params.app-name)
    SONARQUBE_JAVA_BINARIES_PATH="$(params.sonarqube-java-bin-path)"
    sonar-scanner \
      -Dsonar.login=${SONARQUBE_TOKEN} \
      -Dsonar.host.url=${SONARQUBE_URL} \
      -Dsonar.projectKey=${APP_NAME} \
      -Dsonar.qualitygate.wait=$(params.gate-wait) \
      -Dsonar.qualitygate.timeout=$(params.gate-wait-timeout) \
      -Dsonar.java.binaries=${SONARQUBE_JAVA_BINARIES_PATH}

Key parameters for this task are:

  • image — a Docker image with the SonarScanner CLI (sonarsource/sonar-scanner-cli)
  • SONAR_USER_HOME — location of your application source files
  • SONARQUBE_URL — URL of your SonarQube instance
  • SONARQUBE_TOKEN — your SonarQube user token
  • SONARQUBE_JAVA_BINARIES_PATH — location of your compiled classes (required by SonarQube when analyzing a Java application)
  • gate-wait — defines whether a successful quality gate evaluation is required for the task to complete.
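For the step above to resolve its $(params.…) references, the task’s spec must declare the corresponding parameters. A minimal sketch — the parameter names match the step shown above, but the descriptions and default values here are my assumptions, not necessarily what the repo’s full task definition uses:

```yaml
spec:
  params:
    - name: sonarqube-cli
      description: Image with the SonarScanner CLI
      default: docker.io/sonarsource/sonar-scanner-cli
    - name: app-name
      description: Project key under which results appear in SonarQube
    - name: source-dir
      description: Directory containing the application sources
    - name: sonarqube-java-bin-path
      description: Path to compiled Java classes, relative to source-dir
      default: target/classes
    - name: gate-wait
      description: Whether the scanner waits for (and fails on) the quality gate
      default: "true"
    - name: gate-wait-timeout
      description: Seconds to wait for the quality gate result
      default: "300"
```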

As you can see, some of the SonarQube access variables are defined via a K8s secret, which should be created in the project where your pipeline is located. Here is a sample secret:

kind: Secret
apiVersion: v1
metadata:
  name: sonarqube-access
data:
  SONARQUBE_TOKEN: base64encodedToken==
  SONARQUBE_URL: base64encodedURL==
type: Opaque
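The values in the data: section must be base64-encoded. A sketch of producing them — the URL and token below are placeholders, not real credentials:

```shell
# Encode the secret values for the data: section. printf (not echo) avoids
# a trailing newline sneaking into the encoded value.
SONARQUBE_URL="https://sonarqube.example.com"
SONARQUBE_TOKEN="squ_example_token"

URL_B64=$(printf '%s' "$SONARQUBE_URL" | base64 | tr -d '\n')
TOKEN_B64=$(printf '%s' "$SONARQUBE_TOKEN" | base64 | tr -d '\n')

echo "SONARQUBE_URL: $URL_B64"
echo "SONARQUBE_TOKEN: $TOKEN_B64"
```

Alternatively, oc create secret generic sonarqube-access --from-literal=SONARQUBE_URL=<url> --from-literal=SONARQUBE_TOKEN=<token> creates the same secret without manual encoding.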

Once you have the task, you can add it to the pipeline. For this article, I use a very simple pipeline that just runs the sonar-scan task:

Running the pipeline

You can find source of the pipeline here — sonar-trader-pipeline.yaml. This part of the pipeline integrates with the task:

- name: sonar-scan
  params:
    - name: git-url
      value: $(tasks.setup.results.git-url)
    - name: git-revision
      value: $(tasks.setup.results.git-revision)
    - name: source-dir
      value: $(tasks.setup.results.source-dir)
    - name: app-name
      value: $(tasks.setup.results.app-name)
  runAfter:
    - setup
  taskRef:
    kind: Task
    name: ibm-sonar-scan

If you want to install this sample pipeline in your project, execute the following:

git clone https://github.com/stocktrader-ops/stocktrader-argocd-gitops.git
cd stocktrader-argocd-gitops/pipeline
oc apply -f ibm-sonar-scan.yaml
oc apply -f ibm-setup-v2-1-26.yaml
oc apply -f sonar-trader-pipeline.yaml

You can invoke the pipeline from the OpenShift console using the Start action:

Once the pipeline has executed, you can open the SonarQube interface to see the application scan results:

Analysis results

The analysis shows various information about your code, like bugs, vulnerabilities, code smells and a rough time estimate to fix these issues. You can easily drill down to see the specific detected problems:

“Code smells” found in the application

Such information allows you to make better decisions about application development and deployment.

GitHub integration

Until now, you were invoking the pipeline manually via the OpenShift interface. You can also integrate the pipeline with your GitHub repository via a webhook.

First, apply the following yamls, which define a listener in OpenShift for your pipeline:

oc apply -f sonar-trader-pipeline-binding.yaml
oc apply -f sonar-trader-pipeline-listener-route.yaml
oc apply -f sonar-trader-pipeline-listener.yaml
oc apply -f sonar-trader-pipeline-template.yaml

To get the listener route URL, issue the command:

oc get routes
NAME                     HOST/PORT                                              PATH   SERVICES                 PORT            TERMINATION   WILDCARD
el-sonar-demo-pipeline   el-sonar-demo-pipeline-project-namespace.cluster.com          el-sonar-demo-pipeline   http-listener   edge          None

In this case it is el-sonar-demo-pipeline-project-namespace.cluster.com.

In your GitHub application repository, create a webhook with the noted URL:

Webhook settings

Now your application repo is integrated with the pipeline. Anytime you check in new code, the pipeline will be triggered and the code will be automatically reviewed by SonarQube for bugs and vulnerabilities.

While doing such reviews you may find issues in the code that you hadn’t even thought about. At least that was the case for me when I reviewed some older, legacy code ;-)
