Mitigate Vulnerabilities and Improve Go Code Quality

Jhon Baron
Published in Globant
9 min read · Sep 10, 2024
Binary gopher

Normally, we pull all sorts of libraries, components, and new frameworks into our projects because we found them on some internet forum and they solve the issue or deliver the feature we want to implement. We do this without first validating the version we are installing or the risks that a specific tag carries today. This advice is repeated in talks, mandatory courses, and training sessions we receive as part of our day-to-day work in our organizations; however, the current development flow, which demands constant speed and continuous delivery, often pushes it aside.

This is also why we tend to focus on the necessary and relevant barriers that protect our services from outside attacks, such as SQL injection, DDoS, or man-in-the-middle attacks. However, we must always keep the following premise in mind:

Our enemy is not necessarily outside our home; sometimes, it lives in our pocket.

This article delves into the strategies and tools available to Go developers for avoiding vulnerable dependencies and preventing data leaks. We will explore native Go functionality and show how to integrate and create custom pipelines that strengthen our development process. By the end of this article, you will have a comprehensive understanding of how to leverage tools like govulncheck for vulnerability detection, SonarQube for code quality analysis, and Jenkins for automating these checks, thereby maintaining a secure and efficient codebase.

Govulncheck Tool

The architecture for extracting and processing vulnerabilities found in dependencies written in Go is one of the most interesting projects run by the Go team. To keep govulncheck working correctly, they initially gather all of this information from the following resources:

Golang vulnerabilities pipeline

All this information is persisted in an internal database of reports, which are reviewed by the Go security team and then exposed through their public API. This whole process is what enables govulncheck, a native Go tool that detects vulnerabilities in the service we are working on, or in any Go component that supports it. The tool can be configured in several ways.
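Reports served by that public API follow the OSV schema. The sketch below decodes a minimal, made-up entry to show the shape of the data; `GO-0000-0000` and `example.com/somepkg` are hypothetical identifiers invented for illustration, and only a small subset of the real schema's fields is modeled.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Report models a small subset of the OSV schema used by the Go
// vulnerability database -- only the fields needed for this sketch.
type Report struct {
	ID       string `json:"id"`
	Summary  string `json:"summary"`
	Affected []struct {
		Package struct {
			Name      string `json:"name"`
			Ecosystem string `json:"ecosystem"`
		} `json:"package"`
	} `json:"affected"`
}

// parseReport decodes one OSV JSON document into a Report.
func parseReport(raw string) (Report, error) {
	var r Report
	err := json.Unmarshal([]byte(raw), &r)
	return r, err
}

func main() {
	// Trimmed, illustrative entry -- NOT a real report from the database.
	raw := `{
		"id": "GO-0000-0000",
		"summary": "example: unbounded memory growth in a parser",
		"affected": [{"package": {"name": "example.com/somepkg", "ecosystem": "Go"}}]
	}`

	r, err := parseReport(raw)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s affects %s: %s\n", r.ID, r.Affected[0].Package.Name, r.Summary)
}
```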

Installing govulncheck

In the terminal, run the following command:

go install golang.org/x/vuln/cmd/govulncheck@latest

This will download and install the govulncheck tool, and we will get the following output:

govulncheck output

Govulncheck in VScode

We add the following properties to our settings.json file in VSCode:

"go.diagnostic.vulncheck": "Imports", // Enable the imports-based analysis by default.
"gopls": {
"ui.codelenses": {
"run_govulncheck": true // "Run govulncheck" code lens on go.mod file.
}
}

This will enable a new option in our code editor when we are working with a go.mod file.

Govulncheck in VScode demo

It is important to note that, according to Go's own documentation, no information is sent to any external server or agent outside the environment where the check is executed. Running these commands downloads the vulnerability database and validates each dependency locally, in the environment where the validation takes place.

The extension does not scan private packages nor send any information on private modules. All the analysis is done by pulling a list of known vulnerable modules from the Go vulnerability database and then computing the intersection locally.
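That "computing the intersection locally" step can be pictured as a plain set intersection. The sketch below is only an illustration of the idea with made-up module paths: real govulncheck matches affected symbols and version ranges, not bare module paths.

```go
package main

import "fmt"

// intersect returns the local modules that also appear in the list of
// known-vulnerable modules. This imitates the idea of pulling the
// vulnerable-module list from the database and intersecting it with the
// local build list -- the real tool is far more precise.
func intersect(local, vulnerable []string) []string {
	vulnSet := make(map[string]bool, len(vulnerable))
	for _, m := range vulnerable {
		vulnSet[m] = true
	}
	var hits []string
	for _, m := range local {
		if vulnSet[m] {
			hits = append(hits, m)
		}
	}
	return hits
}

func main() {
	// Hypothetical module paths for illustration only.
	local := []string{"example.com/a", "example.com/b", "example.com/c"}
	vulnerable := []string{"example.com/b", "example.com/z"}
	fmt.Println(intersect(local, vulnerable)) // [example.com/b]
}
```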

Process Automation

Concepts such as CI/CD and DevOps ensure that every step of the software development lifecycle lives up to the title of this section: by monitoring our applications, they help us avoid all sorts of aggravations and surprises on a typical day. Many members of the development ecosystem will remember events such as the zero-day vulnerability in one of the most widely used libraries in web services and applications: Log4j.

Log4shell gopher

Considering its slogan, "Write once, run anywhere" (WORA, sometimes "Write once, run everywhere", WORE), Java became the primary focus of that incident as the most affected language. Java was (and still is) found on almost every functional device, which caused panic among the teams maintaining those systems. The episode highlights the importance of regularly checking our services so we are not caught unaware or left exposed in such situations. Below, we present some simple yet effective strategies to monitor our code and enhance its quality.

Jenkins Pipelines

Jenkins pipelines are always very helpful and are among the most widely used open-source tools for building production applications. Jenkins can practically be defined as follows (while we install it using the script below):

Jenkins is an open-source automation server used to build, test, and deploy software projects. A Pipeline in Jenkins is a predefined script (Jenkinsfile) that automates these processes through a series of steps and stages, enabling continuous integration and continuous deployment (CI/CD).

brew install jenkins && /usr/local/opt/jenkins/bin/jenkins --httpListenAddress=127.0.0.1 --httpPort=9090

After configuring Jenkins in a basic way (installing the Go plugin by following the documentation), we create a Pipeline job and apply the following configuration:

  • Enable the option Do not allow concurrent builds.
  • Enable the option Run periodically.
  • Specify H * * * * so that it runs once every hour.
  • In the Pipeline section, enter the following pipeline script, in which each stage corresponds directly to one phase of the process. Apply and save the configuration.
pipeline {
    agent any

    environment {
        GOPATH = "${env.WORKSPACE}/go"
    }

    stages {
        stage('Clone Repository') {
            steps {
                git 'https://github.com/Jhooomn/bidirectional-stream-comunication-servr.git'
            }
        }

        stage('Setup Go Environment') {
            steps {
                script {
                    // Use the configured Go tool from Jenkins
                    def goInstallation = tool(name: 'go-1.22.1', type: 'go')
                    env.GOROOT = "${goInstallation}"
                    env.PATH = "${env.GOROOT}/bin:${GOPATH}/bin:${env.PATH}"
                }
            }
        }

        stage('Install govulncheck') {
            steps {
                sh 'go install golang.org/x/vuln/cmd/govulncheck@latest'
            }
        }

        stage('Run govulncheck') {
            steps {
                sh 'govulncheck ./...'
            }
        }
    }

    post {
        always {
            cleanWs()
        }
    }
}

Pipeline Explanation

  • The agent any directive tells Jenkins to run the pipeline on any available agent.
  • The environment instruction sets the GOPATH environment variable to a directory within the Jenkins workspace.
  • Clone Repository: This stage clones the specified Git repository into the Jenkins workspace.
  • Setup Go Environment: This stage configures the Go environment by setting GOROOT and updating the PATH to include the Go binaries.
  • Install govulncheck: This stage installs the govulncheck tool, which is used for detecting vulnerabilities in the Go code.
  • Run govulncheck: This stage runs the govulncheck tool on the entire codebase to identify any vulnerabilities.
  • Post Actions: The post section ensures that the workspace is cleaned up after the pipeline execution, regardless of the build result.

After completing the configuration, click the Build Now button to execute the pipeline we just defined. We can verify it as follows: it clones our repository, sets the Go version we want to use, installs govulncheck, and then finishes. Interrupting the tool with Ctrl + C generates an error, but that is intentional here; the important part is the text displayed in the console. (This workflow could be improved by handling and suppressing the error, but it is left in on purpose to aid the explanation.)

Jenkins pipeline output

Code Quality

It is our responsibility to keep our code up to the best possible specifications every day. This is usually supported by review processes in the development workflow and by editor extensions that prevent errors, yet sometimes we only realize our implementations were poorly designed after the fact. Fortunately, some tools can validate globally standardized rules for our code, so we can keep adhering to best practices, such as detecting overly complex functions, excessive use of variables, and poorly documented methods, among others.
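As a concrete example of one such rule, high cyclomatic complexity is typically tamed with guard clauses. The sketch below (a hypothetical `shippingLabel` function invented for illustration) keeps every branch flat with early returns instead of a deeply nested if/else chain, the kind of structure quality tools score poorly.

```go
package main

import "fmt"

// shippingLabel picks a label from a parcel's weight and destination.
// Early returns (guard clauses) keep each decision on its own flat
// path instead of nesting conditions, which lowers the function's
// cyclomatic complexity.
func shippingLabel(weightKg float64, international bool) string {
	if weightKg <= 0 {
		return "invalid"
	}
	if international {
		return "international"
	}
	if weightKg > 20 {
		return "freight"
	}
	return "standard"
}

func main() {
	fmt.Println(shippingLabel(5, false))  // standard
	fmt.Println(shippingLabel(25, false)) // freight
	fmt.Println(shippingLabel(5, true))   // international
}
```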

For this, we will use the well-known tool SonarQube. Below are the instructions to install and deploy everything necessary to work with it (after which, we will be able to access http://localhost:9000/ to view its UI):

brew install sonarqube && brew install sonar-scanner && brew services start sonarqube

SonarQube offers a way to configure a repository or project from our local environment. Choose the Create a local project option, name the project, and select Next. Enable Use the global setting so that the new code we implement is always taken into account, and then click the Create project button.

At this point, we can choose the Jenkins option to continue with the pipeline workflow we have been building. The documentation specifies that we need to install the SonarQube Scanner plugin for Jenkins, version 2.11 or later, to use these features.

We create a new file called sonar-project.properties in the root of our repo, and inside it we specify the name of the project we configured along with other properties, as shown below. We also update our pipeline by adding the following stage before the govulncheck stages:


stage('SonarQube Analysis') {
    steps {
        withSonarQubeEnv('SonarQube') {
            sh 'sonar-scanner -Dsonar.token=<user_token>' // generate a new user and token to auth
        }
    }
}

In the sonar-project.properties file located at the root of our repository, we will specify the following configurations:

sonar.projectKey=quality-project
sonar.sources=.
sonar.exclusions=**/*_test.go
sonar.tests=.
sonar.test.inclusions=**/*_test.go

Following this, we can execute our pipeline and obtain the following results:

  • Authentication of our user via a token.
  • Auto-generated Sonar logs in our Jenkins console, listing each file that was processed or excluded by our configuration.
  • Validation of plugins.
  • Coverage using the JaCoCo tool.
  • Final status of our results and the time it took to complete.
  • Completion of the parallel threads generated to perform this analysis.

Also, after this process completes, a new icon with the Sonar logo appears in the list of executed builds; clicking it redirects us to our dashboard, where we can review the relevant information.

SonarQube main dashboard

By default, 80% of the lines of code must be covered to meet the repository's minimum standard. High code coverage helps catch bugs early, improves code quality, and provides confidence that the code behaves as expected. Unit tests, integration tests, and code coverage analysis (as we are doing here) help achieve and monitor this standard.

SonarQube coverage
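The coverage number above ultimately comes from tests like the sketch below. The `isValidPort` function and its case table are hypothetical examples; in a real project the table lives in a `_test.go` file and runs via `go test -cover`, but it is inlined here so the sketch is self-contained.

```go
package main

import "fmt"

// isValidPort reports whether p is a usable TCP port number.
func isValidPort(p int) bool {
	return p > 0 && p <= 65535
}

func main() {
	// Table-driven cases covering the boundaries of isValidPort.
	// In practice this loop would sit inside a _test.go test function.
	cases := []struct {
		in   int
		want bool
	}{
		{0, false},
		{80, true},
		{65535, true},
		{65536, false},
	}
	for _, c := range cases {
		if got := isValidPort(c.in); got != c.want {
			fmt.Printf("isValidPort(%d) = %v, want %v\n", c.in, got, c.want)
			return
		}
	}
	fmt.Println("all cases pass")
}
```

Table-driven tests like this make it cheap to cover every branch, which is exactly what pushes the coverage metric over the quality gate's threshold.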

We will also be able to observe the amount of duplicated code in our project, whose default threshold is 3%. This is worth watching because duplication occurs when identical or similar code appears in multiple places within a project, which increases maintenance effort and the risk of bugs, since changes must be replicated across every instance. It also makes the codebase larger and harder to navigate. How can we reduce it? By refactoring the code into functions, methods, or reusable components.

SonarQube duplicated code lines
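The refactoring mentioned above can be as simple as extracting a shared helper. In the hypothetical sketch below, the same trim-and-lowercase sequence that might be copy-pasted wherever an email or username arrives is pulled into one `normalize` function, removing the duplicated lines a scanner would flag.

```go
package main

import (
	"fmt"
	"strings"
)

// normalize centralizes the trim + lowercase cleanup that would
// otherwise be duplicated at every input-handling site. Fixes (say,
// also trimming tabs) now happen in exactly one place.
func normalize(s string) string {
	return strings.ToLower(strings.TrimSpace(s))
}

func main() {
	fmt.Println(normalize("  Alice@Example.COM ")) // alice@example.com
	fmt.Println(normalize("Bob"))                  // bob
}
```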

Issues (also known as code smells) directly affect the quality of the code. They are problems categorized by the tool itself, as you can see in the following image, and they indicate changes we should address promptly. We must always keep in mind that SonarQube draws on an extensive knowledge base covering most of the recognized best practices for our projects.

SonarQube issues list

Conclusion

In conclusion, effectively mitigating vulnerabilities and improving the quality of Go code comes down to adopting robust tools and practices throughout the development lifecycle. By incorporating govulncheck for vulnerability detection and SonarQube for code quality analysis, we ensure our projects adhere to high standards of security and maintainability. Configuring Jenkins pipelines to automate these checks not only enhances our workflow but also proactively addresses potential issues, such as code duplication and code smells, that matter for maintaining a clean and efficient codebase. Adopting these practices and tools empowers development teams to advance their projects with confidence, guard against common pitfalls, and align with coding best practices.
