Code Quality Control with SonarQube: Implementation and Adoption
Introduction
Context
At SSENSE, we have set ourselves a goal to share all our source code internally by providing access to all Git repositories to all teams within the tech department. This is one of several recent structural changes that have maximized the room for collaboration between teams. By encouraging cross-team initiatives and standardizing our technological practices, we are moving in a direction where every engineer can feel like a stakeholder in every technological initiative. In this climate of collaboration, it is necessary to equip ourselves with the tools to navigate the tides of change and progress.
Observations
Given the aforementioned context, and the never-ending pressures of an agile ecosystem, we noted the following areas for improvement:
- On a department-wide scale, our overall consideration of code quality was lacking.
- Code quality standards were not homogenized across all teams, and were largely dictated by initiatives within certain projects.
- We needed a standardized policy for code improvement.
- We did not have a way to provide visibility on code quality levels for our various code-bases.
- Most code quality improvements were human-driven rather than automated, relying on our pull request code review process.
While these observations were not alarming or extraordinary by themselves, they definitely presented avenues for improvement that were well worth considering.
Approach
Given the challenges presented above, we had to adopt a policy of continuous improvement for code quality. This context raised many new and important questions about the possible use-cases for such an initiative, especially with regards to its impact on cybersecurity. We decided to limit our initial scope to setting up a platform for automated and continuous code quality analysis.
Introducing SonarQube
SonarQube is an industry-leading platform for continuous code quality control, backed by a very large community of users. Its broad set of features has made it a tool used and recognized by many enterprises.
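If you want to experiment with SonarQube before committing to a full rollout, a disposable instance can be started locally with Docker. This is only a sketch for evaluation purposes, not the way the service is deployed at SSENSE:
# Start a local, throwaway SonarQube server; the web UI listens on port 9000 by default
docker run -d --name sonarqube -p 9000:9000 sonarqube:lts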
Multi-Language Support
SonarQube is a largely language-agnostic platform which supports the vast majority of mainstream languages such as C++, HTML, Java, JavaScript, etc. Each language analyzer comes with language-specific quality rules, allowing the user to define a quality standard. This is an important feature when you consider the tradeoffs of stricter quality control: the stricter the standard, the higher the quality of the product, but standards that are too strict can also frustrate users and act as a barrier to adoption.
Continuous Quality
SonarQube is easy to pair with a Continuous Integration and Deployment (CICD) platform. It introduces the notion of Continuous Quality, which is easy to digest in the context of CICD pipelines. Such a pipeline would pass the code through SonarQube in an automated fashion to ensure Continuous Quality. Qualitative inspections provide not only insights into the health of the source code, but also the ability to highlight potential new risks. SonarQube also detects vulnerabilities that extend beyond the domain of code design.
Bugs, Code Smell, & Security
By analyzing source code, SonarQube is able to extract many metrics such as:
- Reliability: Covered by bug detection.
- Security: Covered by the detection of vulnerabilities and weak points related specifically to the security of the code.
- Maintainability: Inferred based on the following two factors:
- Code Smell: Determined by the code’s conformity to best practices.
- Technical Debt: An estimate of the time required to fix all maintainability issues (code smells) in the code-base.
- Coverage: The percentage of code covered by tests.
- Duplication: The percentage of code that is duplicated across the code-base.
- Size: A set of statistics about the code-base such as: number of files, functions, classes etc.
- Complexity: A measure of the cyclomatic complexity of control flow in the code.
All these metrics can be found in the SonarQube dashboard.
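They are also exposed through SonarQube’s Web API, which is convenient for reporting scripts or custom dashboards. Below is a minimal sketch assuming a user token stored in a SONAR_TOKEN environment variable, plus the placeholder host and project key used later in this article; exact metric keys and parameters can vary between SonarQube versions:
# Fetch a few headline metrics for a project (token, host, and project key are placeholders)
curl -s -u "${SONAR_TOKEN}:" \
  "https://url-to-sonarqube.com/api/measures/component?component=project-key&metricKeys=bugs,vulnerabilities,code_smells,coverage,duplicated_lines_density"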
Adapting Quality to Your Repository
SonarQube offers two major ways to adapt the standards and requirement levels for each project.
- Quality Profiles: This feature allows you to define the standards and best practices for each programming language. While there are several preset industry standards such as PSR-2 for PHP users, SonarQube’s community has also contributed various other quality standards. It is quite possible to extend Quality Profiles by adding additional rules to define custom standards.
- Quality Gates: Quality Gates define a set of conditions to be met for code quality to be considered sufficient. They can be applied universally or on a case-by-case basis. For example, a Quality Gate could mandate that all new code must include at least 80% test coverage, or that there should be no diagnosed security issues.
The combination of Quality Profiles and Quality Gates allows you to define the high-level expectations of code quality within an organization. In general, more rules in profiles and more conditions in gates indicate a higher expectation of quality.
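Quality Gates can be managed from the SonarQube UI, and they can also be scripted through the Web API, which helps when the same gate must be applied to many projects. The following is a rough sketch of the 80% coverage example above, assuming an administrator token in SONAR_ADMIN_TOKEN; the endpoint parameters differ between SonarQube versions (older releases reference the gate by gateId rather than gateName), so treat this as illustrative only:
# Create a gate, then require at least 80% coverage on new code (gate name and token are placeholders)
curl -s -u "${SONAR_ADMIN_TOKEN}:" -X POST \
  "https://url-to-sonarqube.com/api/qualitygates/create?name=default-gate"
curl -s -u "${SONAR_ADMIN_TOKEN}:" -X POST \
  "https://url-to-sonarqube.com/api/qualitygates/create_condition?gateName=default-gate&metric=new_coverage&op=LT&error=80"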
An Aid for Decision Making
SonarQube’s key metrics, together with its customizable Quality Profiles and Quality Gates, are essential assets for decision-making. Developers, tech leads, and managers can all benefit from them when making both technical and product-related decisions. For example, a high-visibility application carrying some technical debt can justify a sprint dedicated to refactoring in order to reduce that debt.
Launching the Service
Having presented the context for this article and a general overview of SonarQube, we will now outline the main phases of the launch of this service:
- Preparation
- Configuration
- Communication
With projects of this scale, it’s always important to be well prepared before deploying any solutions.
Preparation
The implementation of a quality analysis system such as SonarQube is a relatively large undertaking which inevitably induces major changes within the organization.
At SSENSE, our two primary tech-stacks are as follows:
- JavaScript, TypeScript, NodeJS
- Python
While these two stacks represent 75% of all tech projects at SSENSE, there are other stacks with smaller project volumes that consist primarily of:
- PHP
- Java
Fortunately for us, SonarQube is able to handle all of these languages, making the integration straightforward to manage. Having identified the technologies, we decided to integrate at least one project per language. These pilot integrations would later be used to create the documentation and a tutorial.
This brings us to our next point: the configuration.
Configuration
As seen earlier, the best way to achieve continuous quality is to run the code analysis as part of CICD. For the purposes of this article, we assume that your projects mostly use Docker for containerized development and deployment, and Jenkins for continuous integration.
Adding Dependencies (SonarScanner)
SonarScanner is SonarQube’s client-side scanner: it performs the code analysis, generates reports, and sends everything to the SonarQube server. SonarScanner relies on the configuration file defined in the later section labelled ‘SonarScanner Configuration’.
To add the binaries, there are two options:
- Option 1: Add the binaries directly to the Jenkins server. It will be necessary to configure Jenkins to use the local binary and execute the Sonar analysis.
- Option 2: The option currently in use at SSENSE is to add the binaries to the application’s Docker container. For some context, our Dockerfiles are compartmentalized into several build stages, such as release for production, develop for development, etc.
To implement the second option, we must add the following block to the Dockerfile:
FROM alpine:latest as base

…

## Development Image including SonarQube Dependencies ##
FROM base as develop

# Install SonarScanner
RUN apk update && \
    apk add openjdk8-jre curl unzip nss && \
    curl -s --insecure -o ./sonarscanner.zip -L https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-3.3.0.1492-linux.zip && \
    unzip -q sonarscanner.zip && \
    rm sonarscanner.zip && \
    mv sonar-scanner-3.3.0.1492-linux /root/sonar-scanner && \
    ln -s /root/sonar-scanner/bin/sonar-scanner /usr/bin/sonar-scanner && \
    sed -i 's/use_embedded_jre=true/use_embedded_jre=false/g' /root/sonar-scanner/bin/sonar-scanner
At SSENSE, we made the above block a dedicated image that we integrate into the images of our applications. This binary addition will be important for the next phase, as Jenkins uses it to generate reports and send them to SonarQube.
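As an illustration, assuming a multi-stage Dockerfile like the one above, the development image containing SonarScanner can be built on its own with Docker’s --target flag (the image and tag names below are placeholders):
# Build only the develop stage, which includes the SonarScanner binary
docker build --target develop -t my-app:develop .
# Sanity check: confirm that the scanner is available inside the image
docker run --rm my-app:develop sonar-scanner -v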
Jenkins Adaptation
If your organization uses continuous integration, it is likely that you already have some code quality validators such as unit tests and code coverage checks. The Jenkins adaptation can therefore be seen as a re-design of the unit testing and code coverage layer, so that reports are generated and sent to SonarQube. Since we use Docker to deploy our applications, transmitting reports between the various Jenkins stages needed some tweaking to create a bridge between the Jenkins file system and the container’s file system.
The example below demonstrates a Jenkins stage for a NodeJS project, which calls an inner-sourced Jenkins shared library project:
dockerRun(
"${CONTAINER_ID}_cover_run",
IMAGE_TAG,
"npm run cover",
[
'volume': "${env.WORKSPACE}/tests/coverage:${APP_PATH}/tests/coverage"
]
)
When executed, the code above translates into the following command:
docker run --volume /var/lib/jenkins/workspace/some_project_branch/tests/coverage:/code/tests/coverage --name some_project_cover_run --rm some_image:some_tag npm run cover
Having redefined the way unit tests are executed, we must now send the reports to SonarQube. The following is a second stage in the Jenkinsfile that performs this operation:
withSonarQubeEnv('DefaultSonarServer') {
withCredentials([string(credentialsId: 'sonarqube-credentials', variable: 'key')]) {
dockerRun(
"${CONTAINER_ID}_sonarqube",
"${IMAGE_TAG}",
String.format("sonar-scanner -Dsonar.host.url=%s -Dsonar.login=%s -Dsonar.projectVersion=%s -Dsonar.core.serverBaseURL=%s", "${env.SONAR_QUBE_URI}", "$key", "${COMMIT}", "https://url-to-sonarqube.com"),
[
'volume': "${env.WORKSPACE}/tests:${APP_PATH}/tests --volume ${env.WORKSPACE}/dependency-check-report:${APP_PATH}/dependency-check-report --volume ${env.WORKSPACE}/.scannerwork:${APP_PATH}/.scannerwork/"
]
)
}
}
Here too, we use disk mount points to pass reports generated in previous steps to the Docker container. The SonarScanner binary (installed in the earlier section titled ‘Adding Dependencies’) transmits all reports based on the sonar-project.properties configuration file.
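To push automation one step further, the analysis result can be used to break the build when the Quality Gate fails. The sketch below assumes the placeholder host and project key used elsewhere in this article, a user token in SONAR_TOKEN, and that jq is available; note that analyses are processed asynchronously, so in practice you may need to wait for the background task to complete (or rely on a webhook) before querying:
# Fail the build if the project’s Quality Gate is not passing (token and project key are placeholders)
curl -s -u "${SONAR_TOKEN}:" \
  "https://url-to-sonarqube.com/api/qualitygates/project_status?projectKey=project-key" \
  | jq -e '.projectStatus.status == "OK"' || { echo "Quality Gate failed"; exit 1; }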
SonarScanner Configuration
The sonar-project.properties file is a simple configuration file in the Java properties format. It contains at least three types of information:
- Information about the project itself, such as its name.
- The location of sources, test files, and report files, along with any exclusions.
- The links associated with the project.
#
## GLOBAL PROPERTIES
#
# must be unique in a given SonarQube instance
sonar.projectKey=project-key
# this is the name and version displayed in the SonarQube UI. Was mandatory prior to SonarQube 6.1.
sonar.projectName=My Super Project
# Path is relative to the sonar-project.properties file. Replace "\" by "/" on Windows.
# This property is optional if sonar.modules is set.
sonar.sources=app
sonar.sourceEncoding=UTF-8
#
## SPECIFIC COVERAGE
#
# This part is specific to all languages and should be adapted to your situation.
sonar.tests=tests
sonar.test.inclusions=tests/**/*.py
sonar.python.coverage.reportPaths=tests/coverage/xcoverage-*.xml
sonar.python.xunit.reportPath=tests/coverage/xunit-result-*.xml
#
## LINKS
#
# Continuous integration link
sonar.links.ci=https://jenkins.com/
# Source Code Manager link
sonar.links.scm=https://github.com/
# Issues link
sonar.links.issue=https://atlassian.net
# Project link
sonar.links.homepage=https://atlassian.net/wiki/
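The same properties can also be supplied or overridden at invocation time with -D flags, as done in the Jenkins stage earlier. For a NodeJS project, a sketch could look like the following; the lcov report path is an assumption and depends on how your coverage tooling is configured:
# Run the analysis while overriding a few properties on the command line
sonar-scanner \
  -Dsonar.projectKey=project-key \
  -Dsonar.sources=app \
  -Dsonar.tests=tests \
  -Dsonar.javascript.lcov.reportPaths=tests/coverage/lcov.info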
Communication and Adoption
Once the SonarQube service is in place, the preparations made, and the pilot projects are set up and functional, the last step to complete the implementation of continuous code quality control is to properly communicate the developments within the organization. The aim of the initial communication is to complete the service launch by informing all stakeholders of its existence, its nature, and the problems it can solve. Like any other project of this scale, proper communication is key to driving adoption across the organization.
To help ensure adoption, we found the following strategies to be useful:
- Broadcasting regular notices with information about the project’s evolution, highlights, lowlights, etc.
- Information sessions about SonarQube and how it might help developers in their day-to-day work.
- Technical meetings aimed at facilitating project integrations.
Conclusion
Technical Recap
As we have seen, the implementation of continuous quality control in a CICD pipeline can be done in three main stages:
- Add binaries to the location of your choice.
- Redesign unit tests and report generation to send all reports to SonarQube.
- Add and configure the properties file to outline how SonarQube should interact with the project.
The complexity of this implementation depends on the current state of your project. Admittedly, reworking how unit tests are run and reported can be time-consuming and may have repercussions depending on the specific case. Younger projects will usually have little to no problem integrating a continuous quality system, since changes can be made quickly with very few side effects. On the other hand, more mature applications with larger liabilities and complex organizational structures will require more time, resources, and planning. If you are considering SonarQube for your organization, it’s important to weigh all of these factors and devise a plan that works for you.
Outcomes at SSENSE
When we started writing this article several months ago, we had about 14 out of 80 eligible projects integrated with SonarQube, representing about an 18% rate of adoption. Today, Tech at SSENSE has about 90 projects eligible for our quality automation system, of which 39 have already been integrated, representing a 43% rate of adoption. Given that this endeavor is not even a year old at the moment, our growing rate of adoption can be considered a positive sign. Our greatest learning has been that defining a feasible plan is key to ensuring success in a project of such scale. The initial plan should depend on your starting point in terms of your technical ecosystem and organizational structure. It should outline the high-level technical roadmap, and a well researched strategy for communication and adoption. A special thanks to all those who helped set up and improve this project, and drive its adoption.
Editorial reviews by Hussein Danish, Deanna Chow, Liela Touré & Prateek Sanyal.
Want to work with us? Click here to see all open positions at SSENSE!