Building an Azure Cloud Detection Tool — Part 2

Los-Merengue
9 min read · Jun 30, 2023


Welcome back. As discussed in the previous part of this blog post, we will be walking through the process of building a detection tool in the Azure cloud.

Prerequisite:

  • Azure account: You can create an account using this link
  • Git knowledge: You may need to understand a little bit of Git
  • Terraform: You may need to understand the basic commands in Terraform. If you don’t, you can check this link on my blog for Terraform
  • PowerShell: You will need to understand some PowerShell commands so the walkthrough is easier to follow.

The diagram below gives a general overview of what the steps involve. The approach is a continuous loop, and it is the foundation for how we can think about the project.

Figure 1: General Overview of Detection and Response Tool

To have a sense of direction, let’s highlight step by step what we will be doing

  • Deploying the Resources in Azure Cloud
  • Configuring the Logging
  • Attacking the Cloud Environment in order to Create a True-Positive Event
  • Creating a Detection Rule and Automation
  • Testing the Detection Rule and Automation
  • Clearing the Resources to Avoid Over-Billing

Task 1: Deploying the Resources to the Azure Cloud

Objectives:

The objective of this task is to deploy the resources we will use to set up the infrastructure. We will do this in the following steps:

  • Log into your Azure account and launch a Cloud shell Session
  • Download this source code from GitHub

This task will be carried out entirely in your web browser. No additional tooling, such as setting up an IDE (Visual Studio Code) or using an SSH client, is required; we will use the Cloud Shell provided by Microsoft. To learn about setting up an environment locally, you can read this blog, where I discussed how to set up Terraform in an IDE.

After creating your Microsoft Azure account, log in and you will see this interface:

Figure 2: Interface of Microsoft Azure

Just as seen in the screenshot above, you can launch a Cloud Shell session by clicking the command prompt icon at the top of the page. If you are using Cloud Shell for the first time, you will be asked to create a storage account to persist the files you work with across sessions.

Figure 3: Cloud Shell with PowerShell Prompt

If you are prompted to choose between PowerShell and Bash, select PowerShell as shown in the screenshot above.
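
If you want a quick sanity check that both the Az PowerShell modules and the Azure CLI (we will use both later in this walkthrough) are available in your session, something like the following should work in Cloud Shell; it only lists versions and changes nothing:

* Get-Module -ListAvailable Az.Accounts | Select-Object Name, Version
* az version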

Next, we will clone the repository using the following command:

* git clone https://github.com/Los-merengue/Azure-Detection.git
Figure 4: Cloned Infrastructure Repo

Next, we need to locate the Terraform scripts in the repository directory, which can be done by running the following commands:

* cd ~/Azure-Detection/terraform
* Get-ChildItem

The commands above change into the directory that contains the Terraform scripts and list its contents for confirmation, as shown in the screenshot below.

Figure 5: Terraform Scripts

We now have all the components needed to deploy the resources into our Azure account. Let's deploy them using the following Terraform commands:

* terraform init
* terraform plan
* terraform apply

The init command first pulls down everything required to make the deployment possible. The plan command previews the set of resources that will be deployed, and finally the apply command provisions them.

Note that when prompted ‘Do you want to perform these actions?’, type yes and press Enter.
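
If you prefer a non-interactive run (purely optional, the workshop does not require it), a common Terraform pattern is to save the reviewed plan to a file and then apply exactly that plan, which skips the confirmation prompt:

* terraform plan -out=tfplan
* terraform apply tfplan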

Once the deployment completes, you should see output like the screenshot below.

Figure 6: Output of Completely Deployed Resources

Now, let’s verify that each of the required resources was actually deployed, using the following commands:

* Get-AzResourceGroup | Select-Object ResourceGroupName
* Connect-AzureAD
* Get-AzureADApplication
* Get-AzOperationalInsightsWorkspace -ResourceGroupName DetectionWorkshop | Select-Object Name
* Get-AzStorageAccount -ResourceGroupName DetectionWorkshop

Each of these commands is fairly intuitive: the verb at the start of the cmdlet name tells you what it does. Let me explain:

  • Get-AzResourceGroup | Select-Object ResourceGroupName: This command outputs information about the resource groups, hence Get. The vertical bar pipes (feeds) that information to the next command, which selects only the resource group names; think of this as filtering the results down to a more specific view (see the short example after the screenshot below).
  • Connect-AzureAD: This command connects the PowerShell session to Azure Active Directory (a directory service that stores information about each component in a network and makes it easy for an administrator or user to find and use them). Azure Active Directory is similar to the IAM services of other cloud providers such as GCP and AWS.
  • Get-AzureADApplication: This outputs the applications registered in Azure Active Directory.
  • Get-AzOperationalInsightsWorkspace -ResourceGroupName DetectionWorkshop | Select-Object Name: This retrieves the Log Analytics workspace in our DetectionWorkshop resource group and outputs only its name.
  • Get-AzStorageAccount -ResourceGroupName DetectionWorkshop: This gets the storage account attributed to a resource group name, in this case our resource group.
Figure 7: Output of the Commands Above
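
As a small illustration of that piping idea, you can chain an extra filter before selecting columns. The cmdlets and properties below are standard Az ones, but the 'Detection*' filter value is just an example:

* Get-AzResourceGroup | Where-Object { $_.ResourceGroupName -like 'Detection*' } | Select-Object ResourceGroupName, Location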

We have now deployed all the resources; the next thing is to set up logging in the following task.

Task 2: Configuring the Logging

Objective:

The objective of this task is to configure logging (summarized records of specific activity within the resources) and ship those logs to a Log Analytics workspace, which we will use to build the detection. We will achieve this by:

  • Researching ATT&CK technique detections and discovering an approach we can use in the Azure tenant
  • Configuring logging on the blob storage (blob stands for binary large object; blobs are used to store different types of files in Azure, in unstructured, scalable, high-performance storage that optimizes cost for long-term data storage) and shipping these logs to the Log Analytics workspace

Let’s review this MITRE ATT&CK technique. You may wonder what that link is about. Think of it as a reference that helps you study the techniques an attacker would use in a scenario like this; that is essentially what MITRE ATT&CK is, an encyclopedia of cyber-adversary behaviour. Study the page to understand things from the adversary's perspective.

By opening that link you can see there is a particularly juicy part, shown in the screenshot below.

Figure 8: Detection Technique of MITRE

This is my favorite part; it gives a picture of how we could detect the malicious activity behind this technique. In our case we will think outside the box a bit and leverage a honey file (a decoy file designed to attract attackers). This file will be kept in a key location in our organization, and if it is accessed we detect and immediately respond, since the attacker has made their presence known.
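
The workshop's Terraform code already seeds its own honey files, but if you wanted to drop an extra decoy of your own into the storage account, a minimal sketch could look like this (the file name and its content are purely hypothetical):

* Set-Content -Path ./passwords-backup.txt -Value 'decoy credentials - not real'
* $sa = Get-AzStorageAccount -ResourceGroupName DetectionWorkshop
* Set-AzStorageBlobContent -Container 'secretdata' -File ./passwords-backup.txt -Blob 'passwords-backup.txt' -Context $sa.Context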

To be able to track usage of a honey file, we must monitor when it is accessed. This can be done by creating a diagnostic setting on our blob storage resource. Think of this setting as a configuration that specifies which information you want to keep track of and where you want to stream it.

In our case, the log category we want to keep track of is StorageRead. We will collect all the logs and metrics of this resource so we can see everything that is happening, and we will push those logs to the Log Analytics workspace to make them easier to read. It would certainly be easier to use the GUI for this configuration, but I will be using the command line instead:

* $blobServicesId = (Get-AzStorageAccount -ResourceGroupName DetectionWorkshop).id + "/blobServices/default"
* $logAnalyticsWorkspaceId = (Get-AzOperationalInsightsWorkspace -ResourceGroupName DetectionWorkshop).ResourceId
* $DiagnosticSettingName = "AllEvents-LogAnalytics"
* $metric = @()
* $log = @()
* $metric += New-AzDiagnosticSettingMetricSettingsObject -Enabled $true -Category 'Transaction'
* $log += New-AzDiagnosticSettingLogSettingsObject -Enabled $true -Category 'StorageRead'
* $log += New-AzDiagnosticSettingLogSettingsObject -Enabled $true -Category 'StorageWrite'
* $log += New-AzDiagnosticSettingLogSettingsObject -Enabled $true -Category 'StorageDelete'
* New-AzDiagnosticSetting -Name $DiagnosticSettingName -ResourceId $blobServicesId -WorkspaceId $logAnalyticsWorkspaceId -Log $log -Metric $metric -Verbose

To understand what each of these commands is doing, I suggest you read about them in this link.

Figure 9: Expected Result of the Configuration

If your result matches the one above, then our configuration is intact. We have now configured logging and forwarding of all events for this particular blob storage.
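
If you would rather confirm the setting from the shell than from the portal, Get-AzDiagnosticSetting should list what we just created (this reuses the $blobServicesId variable from the block above):

* Get-AzDiagnosticSetting -ResourceId $blobServicesId | Select-Object Name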

Task 3: Attacking the Cloud Environment in Order to Create a True-Positive Event

Objective:

In this task we will attack the blob storage to generate true-positive log entries, which will help us build a detection and automation. We will do this by:

  • Performing an enumeration of the blob storage resources — this can be found in this link
  • Downloading the honey file from the storage using this technique

This is where it gets interesting and a little obscure (for security reasons). Using the Azure Cloud Shell, we will perform reconnaissance of the storage account and the contents of the blob containers. The reason for this is to check whether we can find any files that would be interesting from an attacker's perspective.

We will run the following commands to enumerate the storage and then download a honey file from it.

* az account show | jq .user
* az storage account list --resource-group 'DetectionWorkshop' | jq .[].name
* Write-Output ($storageAccount = az storage account list --resource-group 'DetectionWorkshop' |jq -r '.[] | select(.name | startswith("proddata")) | .name')
* az storage container list --account-name $storageAccount --auth-mode login | jq .
* az storage blob list --account-name $storageAccount --container 'hr-documents' --auth-mode login | jq .[].name
* az storage blob list --account-name $storageAccount --container 'secretdata' --auth-mode login | jq .[].name

After running all the commands successfully, the output should look like the following:

Figure 10: Output of the Last command above

The interesting thing about this file is that it sits in a directory that looks juicy to an attacker. Let's download it, together with the HR documents, to the Azure Cloud Shell session using the commands below.

* az storage blob download --account-name $storageAccount --container-name 'hr-documents' --name 'job-posting-personalassistent-draft.txt' --file '~/ex3-hr-data-job-posting-personalassistent-draft.txt' --auth-mode login | jq .
* az storage blob download --account-name $storageAccount --container-name 'hr-documents' --name 'job-posting-secops-azure-draft.txt' --file '~/ex3-hr-documents-job-posting-secops-azure-draft.txt' --auth-mode login | jq .
* az storage blob download --account-name $storageAccount --container-name 'secretdata' --name 'final-instructions.txt' --file '~/ex3-secretdata-final-instructions.txt' --auth-mode login | jq .
* Get-Content ~/ex3-secretdata-final-instructions.txt

The above command produces some encoded output, which you could analyze further and decode using tools like CyberChef. I would have loved to do that too, but that is another rabbit hole of its own.
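
If the content does turn out to be Base64-encoded (an assumption on my part, since the exact encoding is left as an exercise), a quick decode straight from PowerShell could look like this:

* $encoded = Get-Content ~/ex3-secretdata-final-instructions.txt -Raw
* [System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($encoded.Trim()))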

Figure 11: Output of Final Command
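
Before wrapping up, you can also confirm that the reads we just performed actually landed in the Log Analytics workspace. Give the logs a few minutes to arrive; the query below assumes the storage diagnostics end up in the resource-specific StorageBlobLogs table, which is the usual destination for this kind of setup:

* $workspace = Get-AzOperationalInsightsWorkspace -ResourceGroupName DetectionWorkshop
* $query = 'StorageBlobLogs | where TimeGenerated > ago(1h) and OperationName == "GetBlob" | project TimeGenerated, ObjectKey, CallerIpAddress'
* (Invoke-AzOperationalInsightsQuery -WorkspaceId $workspace.CustomerId -Query $query).Results | Format-Table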

In Conclusion

In this part of the series, we deployed infrastructure in the Azure cloud, configured logging for that deployed infrastructure, and successfully downloaded a file that looks suspicious, assuming we are looking at it from an attacker's perspective.

In the next part, we will create a detection rule in an automated fashion and also test this rule.

Clap and follow this page

In case you have contributions, you can reach me on LinkedIn; it would be immensely appreciated.
Thank you.
