Deploying Dependency Track as a Docker Container in Azure and Building a Pipeline with Azure DevOps

Leandro B.
Published in DevRoot
14 min read · Aug 1, 2020

In this article I will show how to deploy OWASP Dependency Track into an Azure container and use it as an application.
I will also show how to configure a pipeline in Azure DevOps that creates a BOM file (based on the .csproj file in your repository) and sends it to the Azure app we previously created, where it will be analysed and stored.
To follow along, you'll only need a free Azure DevOps account and a free Azure Portal account.

In Azure Portal we will:
• Create and configure our SQL Server database;
• Create and configure the Docker container and registry;
• Create and configure the application to run our container image.

In Azure Devops we will:
• Create and configure our pipeline to send the needed BOM File to be analysed.

NOTE: The repository I am using to build my pipeline contains a .NET Core Web API project, and that is the project I'll be analysing with Dependency Track.

If you want to set up Dependency Track with Azure AD OpenID Connect, check out this article that I wrote: https://lyny-leandro.medium.com/dependency-track-with-azure-ad-openid-connect-b2d13861c4f5

So, before we begin…

…let's look at the main technology we are going to work with: Dependency Track!

OWASP Dependency Track: “Dependency-Track is an intelligent Supply Chain Component Analysis platform that allows organizations to identify and reduce risk from the use of third-party and open source components.”

src: https://owasp.org/www-project-dependency-track/

You can find much more information and documentation on their website, but in short: Dependency Track analyses the dependencies and libraries in your application and searches for any known vulnerabilities affecting them. It will also tell you if those libraries and dependencies are outdated or deprecated.

Their Website with their documentation: https://docs.dependencytrack.org/

Dependency Track Dashboard

Container Requirements

As stated in the documentation, to run Dependency Track in a container you need the following:

Minimum:
4.5GB RAM
2 CPU cores

Recommended:
16GB RAM
4 CPU cores

You can check more information about this subject in:
https://docs.dependencytrack.org/getting-started/deploy-docker/

With all the relevant information out of the way…

…Let’s begin!

Firstly let’s take a look at the yml file that we’ll use to set up our container in Azure.

As you can see, our file contains some very important things: the database connection, the driver we will need, the image we will use (the Dependency Track one), the volume (which is very important) and the path to our driver.

You can download the SQL Server Driver here: https://www.microsoft.com/en-us/download/details.aspx?id=100855

For this article, we'll be using the Java 8 SQL Server driver.

Let’s start with our volume:
Since we are deploying to Azure, we need to use the environment variable ${WEBAPP_STORAGE_HOME}, followed by the website path where we stored our driver.

The /data/ segment in the volume is the default path created in our container, so we are basically linking our Azure website directory with our container volume so that it can find the driver. (This will be easier to understand once we create everything in Azure.)

Driver path:
The line ALPINE_DATABASE_DRIVER_PATH=~/driver.jar in our file translates to /data/driver.jar which, as explained above, is linked with our application's wwwroot path.

And finally…

The Database Connection:
It's really self-explanatory: it's the database connection used to store all the information our Dependency Track application needs to run.
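To make those pieces concrete, here is a minimal sketch of what such a docker-compose file can look like. The server name, database name, credentials and ports are placeholders, and the volume path must line up with wherever you upload the driver over FTP later on; the ALPINE_* variables are Dependency Track's standard configuration properties:

```yaml
version: '3.4'

services:
  dtrack:
    image: owasp/dependency-track
    environment:
      # Use an external SQL Server database instead of the embedded one
      - ALPINE_DATABASE_MODE=external
      - ALPINE_DATABASE_URL=jdbc:sqlserver://<your-server>.database.windows.net:1433;databaseName=<your-database>
      - ALPINE_DATABASE_DRIVER=com.microsoft.sqlserver.jdbc.SQLServerDriver
      # Path inside the container where the JDBC driver jar will be found
      - ALPINE_DATABASE_DRIVER_PATH=/data/extlib/driver.jar
      - ALPINE_DATABASE_USERNAME=<your-user>
      - ALPINE_DATABASE_PASSWORD=<your-password>
    ports:
      - '8080:8080'
    volumes:
      # Link the App Service file system to the container's /data volume
      - '${WEBAPP_STORAGE_HOME}/site/wwwroot/data/:/data/'
    restart: unless-stopped
```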

Let’s jump into Azure and let’s Create and Configure our Database.

Go to the “Create Resource” tab and select Database. In the Databases section select “SQL Database”.

You will be met with a panel to name your database and set all of its configuration. In this picture, I am using the same names as in the YML file; you can of course change them to your taste!
You can leave the default storage, as it is more than sufficient for this demonstration.

You can jump straight to the “Review + Create” tab, since this is all you need to set up your database.
After you press the button to create your database, wait a little for it to finish. When it is done, go to the database resource you created. (It will be either in the resource group you assigned it to or in your default panel.)
You’ll be met with the following overview panel:

You'll need to click the “Set Server Firewall” setting.
Right now, neither you nor any other IP address has access to your database; for the purposes of this article we will change that and allow every IP to access it.
Normally, when you try to connect to an Azure database you'll be met with an interface to add your IP to a whitelist. Since we want our application to have access and we don't want any trouble with database access problems, we will configure this now.
Note that, for security reasons, it is always preferable to grant access only to RESTRICTED IP addresses.

Once you set the “Start IP” to 0.0.0.0 and the “End IP” to 255.255.255.255, click Save and that's all.
You now have a database created and configured to run Dependency Track.
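If you prefer the command line, the same firewall rule can be created with the Azure CLI; the resource group and server names below are placeholders for whatever you chose earlier:

```shell
# Equivalent of the portal steps: open the SQL server firewall to all IPs
# (for demonstration only — restrict this in a real deployment)
az sql server firewall-rule create \
  --resource-group <your-resource-group> \
  --server <your-sql-server> \
  --name AllowAll \
  --start-ip-address 0.0.0.0 \
  --end-ip-address 255.255.255.255
```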

You can test your connection to the Database that you created in your Local Machine.

As you can see, I can connect to the Database that I created which is good!

Now that we have our Database…

…Let’s jump to our Container Creation and Configuration!

Let’s create our Container Registry

Go to “Create A Resource,” then look under Containers > Container Registry.

Same as before, you'll be asked to fill in some information. It is really straightforward: you only need to name your registry and create or select a resource group for it.

When you finish, navigate to the resource so you can enable the access keys option, which will give you a user and two passwords.

Once you click the “Access Keys” tab, just “Enable” the “Admin User” option and it’s all set and done. We have our Container Registry Created and Configured.

Now, let’s create and configure our web app for our container…

…This is actually the trickiest part of the whole article. It's not hard per se, but there are some configurations we have to make, and it's worth understanding why we are making them.

We start by going to “Create A Resource,” then look under Containers > Web App for Containers.

You will need to fill in some relevant information like the name, and create a resource group or add it to an existing one (does this procedure look familiar?).

But there is a slight change you need to make in the “Basics” tab: the “Sku and Size”. We need to change it to meet the container requirements for Dependency Track.
I increased it to a little above the minimum requirements mentioned earlier in this article, but you can choose whatever you like; just make sure it meets or exceeds the minimum requirements so your container app runs smoothly.

Let’s jump to the “Docker Tab”.

Here, we choose “Docker Compose” as our “Options” setting and “Azure Container Registry” as our “Image Source”. Then all you have to do is select the registry you created (in my case, “DependencyRegistry”) and choose the configuration file, which is the YML file we discussed previously in this article.

You can go to “Review + Create” tab and create your Web App Container.

Once it’s created, go to the Web App location and stop it. We will need to do some configurations before we move further.

Now that our Web App is stopped, let's hop into the “Deployment Credentials” tab and configure the password for the FTP connection, which we will use to create our folders and upload our driver.

Save it and go back to the Overview Tab. You will find something called “FTP Hostname”. Copy that link and hop to your FTP Client so we can use it to access our Application File System. I will be using FileZilla in this example and it will look something like this.

Remember the volume we talked about previously in the YML file? We had this path: “${WEBAPP_STORAGE_HOME}/site/wwwroot/data/extlib/”. We only need to create “/data/extlib/” inside /wwwroot and upload our driver to /extlib/.
In the image below you can see that done.
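If you'd rather not use a GUI client like FileZilla, curl can perform the same upload over FTPS. The hostname, username and password below are placeholders for the values from the Overview and Deployment Credentials tabs, and the jar file name is whatever your downloaded driver is called:

```shell
# Upload the JDBC driver into the App Service file system over FTPS;
# --ftp-create-dirs creates /data/extlib/ if it does not exist yet
curl -T <your-driver>.jar \
  "ftps://<your-ftp-hostname>/site/wwwroot/data/extlib/driver.jar" \
  --user '<your-app>\<ftp-user>:<ftp-password>' \
  --ftp-create-dirs
```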

NOTE: You need to set the file path the same way you set it in the YML file. In my case I set it this way, but you are free to do as you like!

With this out of the way, we still need a few more configurations for this to work; as it stands, it isn't enough.

You'll need to go to the “Configuration” tab, change one variable, and add a new one.

Configuration Name: WEBSITES_ENABLE_APP_SERVICE_STORAGE

For our container to know that we are using volume storage, and to link it between our App Service and the container, you need to change this value to “true”, since the default is “false”.

Configuration Name: WEBSITES_CONTAINER_START_TIME_LIMIT

We need to create this setting so our application does not time out while the container is starting. Since Dependency Track is slow to start (it takes around 20-25 minutes), we set its value to “1800”, which gives our container 30 minutes to start!
Save your configurations and you can start the application!

Your configuration will look like the image above.
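If you prefer the CLI over the portal, the same two settings can be applied in one command; the resource group and app names are placeholders:

```shell
# Enable the App Service storage volume and extend the container start timeout
az webapp config appsettings set \
  --resource-group <your-resource-group> \
  --name <your-web-app> \
  --settings WEBSITES_ENABLE_APP_SERVICE_STORAGE=true \
             WEBSITES_CONTAINER_START_TIME_LIMIT=1800
```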

Wait for about 20 minutes until it’s all set and done.
If we go to our Database we will see some tables that were created so that our application could run.

And if we copy and paste our Application URL to the browser and we hit enter we will have this page showing up, which is great, since this is our Dependency Track Login page!

The Default login is:
Username: admin
Password: admin

After you log in, it will prompt you to change the admin password. Once you've done that and logged in with the new credentials, you'll be met with the Dependency Track Dashboard.

Before we go to Azure DevOps to configure our pipeline to work with our application, we need to create a project and get the project ID and the API key.
So, let's go to the “Projects” tab and create a project.

A modal pop-up will appear for you to enter some information about your project. Fill in the form as you wish and when you are satisfied with what you have, hit the “Create” button.

A new project will appear in the grid in your Projects Dashboard. Click it and you will navigate into the project itself. It is empty for now, and it will stay that way until we build our pipeline to upload our BOM file.

Once you are inside, look at your URL and you'll see a hash-like string. That's the project ID; copy and paste it into a notepad, you'll need it later on.

Now, for the final step, we need to get the API key. Navigate to the “Administration” tab, go to “Access Management”, select “Teams”, and select the account that has the API key (the row with the “1” value in the grid). Once you click it, you'll be presented with the following screen.

Once again, copy and paste the API key into a notepad, since we will need it later on.
So: we have created our project and obtained the project ID and the API key. We are ready to go to our Azure DevOps account and create our pipeline, so let's go!
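As an aside, these two values are exactly what the upload step will use; you can exercise Dependency Track's BOM upload endpoint manually with curl to check them before building the pipeline (the hostname, key and project ID are placeholders for your own values):

```shell
# Manually upload a BOM to Dependency Track's REST API
curl -X POST "https://<your-app>.azurewebsites.net/api/v1/bom" \
  -H "X-Api-Key: <your-api-key>" \
  -F "project=<your-project-id>" \
  -F "bom=@bom.xml"
```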

Azure DevOps Pipeline

Now that we have everything set and done, let’s build our Pipeline.
You could do this with the free Dependency Track extension from the marketplace alone, but then the result won't be exported to your Dependency Track application automatically, and it takes up to 3-4 minutes just to analyse the csproj file.

What’s the problem about the 3/4+ minutes of time?

Well, in Azure DevOps you pay for the time your pipeline takes to finish. Imagine you have a group of pipelines in your CI/CD plan and you ALWAYS have to analyse your dependencies and libraries: this adds up, and at the end of the month you'll feel it.

So, what are the advantages of doing it this way?

To begin with, you get your own personalised Dependency Track dashboard. I have it in a container in Azure for this example, but you can run it on your own server, in a container or not.
The pipeline we are going to build takes less than a minute to do all the work, and on top of that it exports the BOM file automatically to your project. Of course, it all depends on the size of your project and how many dependencies and libraries you have, but in general this is far more efficient.

So, let’s begin and analyse the YML file that we are going to use!

In this YML we have three DotNet tasks plus one upload task:
• The first downloads the .NET SDK into the Ubuntu image.
• The second downloads CycloneDX, the package that will analyse our csproj file and export it to a BOM file.
• The third runs the CycloneDX command to generate our BOM file.
• The last one takes our BOM file and uploads it to Dependency Track.

Note that you'll need to change the path to your repository; in my case it is /Commander/Commander.csproj. You'll also need to add your project ID, API key and Dependency Track URL in the last section of the YML file.
On top of all that, you need to install a Dependency Track extension from the marketplace to run the last task.

https://marketplace.visualstudio.com/items?itemName=GSoft.dependency-track-vsts
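Putting those four steps together, a sketch of such a pipeline YML might look like the following. The csproj path, SDK version and placeholder values are assumptions, and the name and inputs of the final upload task come from the GSoft extension above, so check its documentation for the exact spelling:

```yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
# 1. Download the .NET SDK into the Ubuntu image
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '3.1.x'

# 2. Install CycloneDX, the tool that turns a csproj into a BOM file
- task: DotNetCoreCLI@2
  inputs:
    command: 'custom'
    custom: 'tool'
    arguments: 'install --global CycloneDX'

# 3. Run CycloneDX to generate the BOM
- script: dotnet CycloneDX Commander/Commander.csproj -o $(Build.ArtifactStagingDirectory)
  displayName: 'Generate BOM'

# 4. Upload the BOM to Dependency Track (task from the marketplace extension)
- task: upload-bom-dtrack-task@1
  inputs:
    bomFilePath: '$(Build.ArtifactStagingDirectory)/bom.xml'
    dtrackProjId: '<your-project-id>'
    dtrackAPIKey: '<your-api-key>'
    dtrackURI: 'https://<your-app>.azurewebsites.net'
```

In practice, you'd store the API key in a secret pipeline variable rather than in the YML itself.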

With all this out of the way…

…Let’s begin!

Log in to Azure DevOps. If you don't have a project yet, create a new one and link a repository containing a project you'd like to analyse.
In my example, I'll be using a .NET Core Web API.

Now we need to create a pipeline. Go to the Pipelines tab and click “Create Pipeline”. You will be presented with an interface with many options; choose the one that says “Use the classic editor”.

Select your Repository and hit continue. In my case I’ll be using Azure Repos Git.

Once you hit continue you’ll go to an interface that will show you all the templates that you can use to build your pipeline. We will choose the YAML one.

Import the YML file into your repository and then select the path in which you put it in.

Once that's done, save your pipeline. Now go to your Pipelines tab and you should see the pipeline you just saved! Click it, run it, and wait for an agent to become available.

You should have the following screen when you go to the Pipeline Job interface.

As you can see, for this project it took only 33 seconds to analyse everything and send it to my Dependency Track application.
If the job failed, you most likely configured something wrong. Once again:

• Don’t forget to change the YML file to target your .csproj;
• Don’t forget to change the API Key, the Project key and the Application URL;
• Don’t forget to download the library in Azure Devops market to the account that is running the Pipeline!

Library to Download: https://marketplace.visualstudio.com/items?itemName=GSoft.dependency-track-vsts

So, Let’s check our Dependency Track Application!

When I go to my Projects Dashboard I can see that something was uploaded at the time that the pipeline ended!

When I enter the project, I can see dependencies that were analysed, meaning IT WORKED!
My pipeline successfully sent the generated BOM file to my application, it was automatically analysed, and I can see that I don't have any vulnerabilities in my dependencies and libraries!

Conclusion

We touched on a lot of topics in this article: Azure Portal, Azure DevOps, YML files, containers and Dependency Track.
I hope you found this article useful and that you found Dependency Track to be a must-have addition to your collection of applications for scanning application vulnerabilities. Remember that security is very important, and it's always nice to double-check things we sometimes take for granted, like our dependencies and libraries being secure!
Sometimes performance and security don't go well together, and I tried my best to optimise a custom pipeline to be somewhat faster than the others I've seen.

I hope you guys learned something new from this article, I sure did and it was a pleasure writing and making this possible.
