A Secure Software Supply Chain to Deal With Cyber Threats, Technical Debt and License Violations
— by Tim Riffer, supervised by Dirk Kolb.
The continuing adoption of the DevOps concept and the associated automation of software creation and delivery have made the software supply chain a business-critical unit in companies. What many publications and blog posts on DevOps rarely mention is that traditional software delivery pipelines do not address security issues, license violations, or technical debt. Considering how often complex software systems are used in critical infrastructures, it would be negligent to ignore these issues in the DevOps environment.
We tasked ourselves with building a secure software supply chain for our Data Fusion Platform. As an advocate of innovative IT solutions, we are committed to delivering innovative, state-of-the-art, and secure software to our customers and partners.
NIST’s Secure Software Development Framework white paper describes four groups of secure software development practices:
- Prepare the Organization (PO): Ensure the organization’s people, processes, and technology are prepared to perform secure software development.
- Protect the Software (PS): Protect all components of the software from tampering and unauthorized access.
- Produce Well-Secured Software (PW): Produce well-secured software that has minimal security vulnerabilities in its releases.
- Respond to Vulnerability Reports (RV): Identify vulnerabilities in software releases and respond appropriately to address those vulnerabilities and prevent similar vulnerabilities from occurring in the future.
We assume that every software-developing company has a minimum set of security measures in place to fulfill the practices PO and PS from the white paper. We will therefore focus on PW and RV and present a concept that deals with security, technical debt, and license violations. We will not cover the hardening of operating systems or IT network security, as that would go beyond the scope of this article.
Why is it necessary?
At a time when there is daily news of security breaches and the resulting data leaks, companies should do everything they can to ensure a high level of security for their own software. This becomes even more important when software artifacts are deployed and used in the context of a software development kit or in critical infrastructures. Customers rely on everything being done to ensure the security and reliability of your software. Security and reliability are often vague requirements in contracts, which can end with a customer’s security team knocking on the door to run an audit. To avoid that stress in the first place, your software should be checked for possible vulnerabilities on every release as part of the DevOps workflow.
You could say, “We have our own security team,” but that team is usually smaller than necessary and not as involved in software development and testing as it should be. Movements such as DevSecOps are all the more reason for us to automate as much of the security work as possible. Obviously, this does not mean that manual penetration tests can be dispensed with, but they are not part of this article.
In the following, we explain the automated security controls we use in our continuous delivery pipeline. We will mainly talk about Static Application Security Testing (SAST) and Dynamic Application Security Testing (DAST), which is what we have in our pipeline at the moment. We are also investigating tools from the Interactive Application Security Testing (IAST) and Runtime Application Security Protection (RASP) categories to determine which best meet our requirements.
Finding the Right Tools for the Job
The search for the right tools is determined primarily by the question of what we want to achieve. Obviously, problems do not arise from our own code alone: dependencies on third-party code are a frequent source of security holes, technical faults, and license violations.
When it comes to testing software security, there are many different products on the market. We have taken a closer look at some services and tools and filtered out the ones that best fit our needs. In the following, we explain which tools we use to secure the individual areas and how we integrate them.
Static Code Analysis
Dependency Vulnerability and Update Analysis
As mentioned at the beginning, our supply chain not only serves to find bugs or vulnerabilities in the code, but is also intended to help us continuously work off technical debt. Part of that technical debt is outdated or insecure dependencies, which occur very frequently in complex software projects. To work them off, we use Dependabot. This service is now available as a free app for GitHub customers and checks all detected dependencies for outdated or insecure versions once a day. Apart from source code dependencies, Dependabot can also keep the base images used in Dockerfiles up to date. Each hit results in an individual pull request created by Dependabot. Our build system contains an extra pipeline for such pull requests, which analyzes and tests them and, if the result is positive, merges them into the master branch.
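A minimal Dependabot configuration covering both of these cases might look as follows. This is a sketch, not our actual file; the project layout (dependencies in the repository root) is an assumption.

```yaml
# .github/dependabot.yml
version: 2
updates:
  # Check Gradle dependencies daily; each outdated or insecure
  # dependency results in an individual pull request.
  - package-ecosystem: "gradle"
    directory: "/"
    schedule:
      interval: "daily"

  # Also keep the base images referenced in Dockerfiles up to date.
  - package-ecosystem: "docker"
    directory: "/"
    schedule:
      interval: "daily"
```

With this in place, Dependabot opens one pull request per finding, which our dedicated pipeline can then analyze, test, and merge.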
Since Dependabot is not a direct part of our build process and therefore does not run a dependency check on every commit, there is an additional job for this in our pipeline. We use the Gradle plugin of OWASP Dependency Check (element 3 in the block diagram). We’ve found that this is sometimes more accurate than some paid services we’ve tried.
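A sketch of how the OWASP Dependency Check Gradle plugin can be wired into a build follows; the plugin version and the CVSS threshold are illustrative choices, not values from our pipeline.

```groovy
// build.gradle — plugin version is illustrative
plugins {
    id 'org.owasp.dependencycheck' version '8.4.0'
}

dependencyCheck {
    // Fail the build when a dependency has a known CVE with a
    // CVSS score of 7.0 or higher (i.e. "high" severity or worse).
    failBuildOnCVSS = 7
}
```

Running `./gradlew dependencyCheckAnalyze` as a pipeline job then scans every commit and breaks the build if a sufficiently severe vulnerability is found.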
In addition to checking for insecure dependencies, we also use the Gradle Versions Plugin to check for outdated versions (element 1 in the block diagram). If something is found, the development team is informed about which dependencies are obsolete.
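Applying the Gradle Versions Plugin could look like the sketch below; the version number and the rejection rule for pre-release candidates are assumptions for illustration.

```groovy
// build.gradle — plugin version is illustrative
plugins {
    id 'com.github.ben-manes.versions' version '0.51.0'
}

tasks.named('dependencyUpdates').configure {
    // Only suggest stable releases as upgrades; ignore
    // alpha/beta/RC/milestone versions of dependencies.
    rejectVersionIf {
        candidate.version =~ /(?i)(alpha|beta|rc|m\d)/
    }
}
```

`./gradlew dependencyUpdates` then produces a report of dependencies with newer versions, which can be forwarded to the development team.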
Docker Image Vulnerability Scan
We currently use Anchore to check our Docker images for security vulnerabilities and compliance issues (elements 6 and 7 in the block diagram). The main reason we use Anchore is that we can easily integrate it into our build pipeline. We have just started using this tool and are still exploring and integrating all of its possibilities. In addition to scanning the Docker images we create, we continuously scan the Docker images used within our infrastructure and development.
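As an illustration of such a pipeline step, the following sketch submits an image to a self-hosted Anchore Engine via `anchore-cli`; the engine URL, credentials, and image name are placeholders, and the exact policy is site-specific.

```shell
# Placeholders: engine URL, credentials, and image reference
export ANCHORE_CLI_URL=http://anchore-engine.internal:8228/v1
export ANCHORE_CLI_USER=admin
export ANCHORE_CLI_PASS=...

IMAGE=registry.example.com/data-fusion/service:1.2.3

# Submit the freshly built image and wait for analysis to finish
anchore-cli image add "$IMAGE"
anchore-cli image wait "$IMAGE"

# List known vulnerabilities, then evaluate the compliance policy;
# 'evaluate check' exits non-zero on failure, which fails the build
anchore-cli image vuln "$IMAGE" all
anchore-cli evaluate check "$IMAGE"
```

Hooking the policy evaluation into the pipeline exit code is what turns the scan from a report into a gate.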
Dependency License Analysis
Developers tend to use supposedly free libraries and ship them to customers. Unfortunately, developers are neither lawyers nor specialists in open source licensing. To avoid license violations and problems for us and our customers, we take a proactive approach and perform an aggressive license analysis using FOSSA (element 2 in the block diagram). FOSSA is designed for license and vulnerability management of dependencies. In general, FOSSA is a paid service, but you can try it with up to 5 repositories for free with limited functionality. We mainly use FOSSA for license management and use the generated reports, which provide information about possible license problems. We integrated the FOSSA CLI into our continuous delivery pipeline in order to trigger a dependency check with every source code commit.
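In a pipeline, that integration boils down to two CLI calls, sketched below; the API key is supplied by the CI environment, and the license policy itself is configured in the FOSSA service.

```shell
# FOSSA_API_KEY must be provided as a CI secret
export FOSSA_API_KEY=...

# Analyze the project's dependency graph and upload it to FOSSA
fossa analyze

# 'fossa test' waits for the scan results and exits non-zero if
# any license policy issues were found, failing the pipeline
fossa test
```

The non-zero exit code of `fossa test` is what blocks a commit with a license problem from progressing through the pipeline.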
Web Application Vulnerability Scan
For scanning our web applications for vulnerabilities, we decided to use OWASP ZAP (element 8 of the block diagram). It is an open source tool developed by a large and active community. There are several ways to run it; since we are using Gradle, integrating it as a plugin into our pipeline was the easiest option. In addition to the automated tests, you can also use ZAP for manual penetration tests.
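The details of a Gradle plugin configuration depend on the plugin chosen, so as a plugin-agnostic illustration, here is how ZAP's packaged baseline scan can be run from a pipeline via Docker; the target URL is a placeholder.

```shell
# Run ZAP's baseline scan against a staging deployment:
# it spiders the target for a short time, passively scans all
# responses, and writes an HTML report of the findings.
docker run --rm -t owasp/zap2docker-stable zap-baseline.py \
    -t https://staging.example.com \
    -r zap-report.html
```

The baseline scan is non-intrusive and therefore safe to run on every build; deeper active scans are better reserved for dedicated test environments.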
Because of the integration into our automated build process, every change to the software undergoes all of the above checks. In addition, the correctness of the artifacts is checked again before a new version is released to our customers. This ensures that we only release software that fully meets our security requirements and provides the highest possible level of security for our customers.
So we can sit back because our software is secure — right?
The current state of our tools and services is only a snapshot. We are aware that we must continually re-evaluate whether new tools perform a given security task even better, or whether the tools we currently use no longer fit.