Mordor Labs 😈 — Part 2: Executing ATT&CK APT29 Evaluations Emulation Plan 📕 — Day 1

Building the environment for scenario one is very easy and takes around 30–45 mins. Once the environment is set up, you will still have to set up your computer to authenticate via certificates with point-to-site VPN.

This post is part of a three-part series where I share my experience deploying the ATT&CK APT29 evaluation environment via Azure Resource Manager (ARM) templates and collecting free telemetry produced after executing the emulation plans for each scenario.

In this post, I share a few steps to connect to the environment via a point-to-site VPN and a quick video showing every single step taken while following the day 1 emulation plan.

The other two parts can be found in the following links:

Pre-Requirements

I highly recommend reading the first part of this series. It should help you understand the infrastructure in more detail and the reason why we need to create self-signed certificates to authenticate to the environment.
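For context, a self-signed certificate/key pair is just a standard openssl artifact. The one-liner below is a generic illustration only (the CN value is a placeholder I made up); the actual Azure P2S procedure in Part 1 uses a self-signed root certificate plus a client certificate issued from it:

```shell
# Generic self-signed cert/key pair (illustrative only; Part 1 has the
# exact Azure P2S steps). The CN value here is a placeholder.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout client.key -out client.crt \
  -days 365 -subj "/CN=P2SClientCert"
```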

Connect to the Environment

This was already explained in the first post, but I believe it would be good to go through this again. If you already did this part, feel free to skip to the “Preparing for Emulation Plan (Day 1)” section of this post.

Once the environment deploys successfully, you will have to download a VPN config file from Azure, update it with your client certificate and key, import it to your OpenVPN client, and connect!

Download VPN Client config from Azure

Go to your Azure Portal > Resource Group Name > Virtual Network Gateway > Point-to-site-configuration and click on Download VPN Client.

Update VPN Client Config

Once it downloads, you will have a compressed file with a few config files in it. The environment I put together uses OpenVPN as the VPN client protocol. Therefore, we are going to update the OpenVPN\vpnconfig.ovpn file and insert the client certificate and private key into it. Open it with your favorite editor. I use Visual Studio Code.

I first comment out lines 20–21 because Tunnelblick handles that for you.

#log openvpn.log
#verb 3

Then, I modified lines 71–76 (these are official steps, by the way). You need to copy the contents of your self-signed client certificate and paste them between <cert></cert> as shown below:

# P2S client certificate
# Please fill this field with a PEM formatted client certificate
# Alternatively, configure 'cert PATH_TO_CLIENT_CERT' to use input from a PEM certificate file.
<cert>
-----BEGIN CERTIFICATE-----
MJVADC....
-----END CERTIFICATE-----
</cert>

Next, you have to do something similar, but with your client private key. Open your client private key file, copy its contents, and paste them between <key></key> as shown below:

# P2S client certificate private key
# Please fill this field with a PEM formatted private key of the client certificate.
# Alternatively, configure 'key PATH_TO_CLIENT_KEY' to use input from a PEM key file.
<key>
-----BEGIN RSA PRIVATE KEY-----
M...
-----END RSA PRIVATE KEY-----
</key>
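If you would rather not paste the PEM blocks by hand, the same edit can be scripted. This is a minimal sketch, assuming the cert and key files sit next to the profile; the file names are illustrative, and the demo fixtures only exist so the snippet runs as-is — replace them with your real files:

```shell
# Sketch: splice the client cert and key into the OpenVPN profile with
# awk instead of pasting by hand. File names are illustrative.
CONF=vpnconfig.ovpn
CERT=client.crt
KEY=client.key

# --- demo fixtures so the sketch runs standalone; use your real files ---
printf '%s\n' '<cert>' '</cert>' '<key>' '</key>' > "$CONF"
printf '%s\n' '-----BEGIN CERTIFICATE-----' 'MJVADC...' '-----END CERTIFICATE-----' > "$CERT"
printf '%s\n' '-----BEGIN RSA PRIVATE KEY-----' 'M...' '-----END RSA PRIVATE KEY-----' > "$KEY"

# Print every line of the profile; right after the opening <cert>/<key>
# tag, stream in the corresponding PEM file.
awk -v cert="$CERT" -v key="$KEY" '
  { print }
  /<cert>/ { while ((getline l < cert) > 0) print l }
  /<key>/  { while ((getline l < key)  > 0) print l }
' "$CONF" > vpnconfig.patched.ovpn
```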

That’s it! You are ready to connect to the environment. Open your OpenVPN client and drag and drop the client VPN config we just edited, or double-click on it, depending on which VPN client app you are using. Finally, connect!

Preparing for Emulation Plan (Day 1)

I recommend connecting to every endpoint that you are going to use, disabling Windows Defender, and getting ready to execute the emulation plan. Familiarize yourself with the following resources to know how to prepare for it:

I also put together a document from those two resources and added some context related to the environment that I built to share with the community. You can use it as an online document or download it from here. The document also comes with instructions to perform additional post-deployment steps.

Launch Pupy C2

SSH to the TEAMSERVER box (192.168.0.4):

LOCAL@LAPTOP ~ % 
LOCAL@LAPTOP ~ % ssh wardog@192.168.0.4
wardog@192.168.0.4's password:
wardog@TEAMSERVER:~$

Launch Pupy with the following command:

sudo docker run --rm -it -p 1234:1234 -v "/opt/attack-platform:/tmp/attack-platform" docker-pupy python pupysh.py

Enable the ec4 listener:

listen -a ec4

RDP to SCRANTON (10.0.1.4) and NASHUA (10.0.1.6)

RDP as dmevals\pbeesly to both endpoints with the password Fl0nk3rt0n!T0by. If you want more information about the available domain users and their passwords, it is here.

Switch to SCRANTON RDP session and get ready for the initial access steps.

🚨 Wait, before all that! 😉 🛑

Get Ready to Collect Security Events

One of the main goals for this deployment is to collect all the data generated at the host and network layer to be able to share it with the community.

Collect Endpoint Events from Azure Event Hub

Install kafkacat on your local computer (version 1.4.0+) or VM.

Create a kafkacat config on your local computer or VM:

  • I created a kafkacat config template here.
  • Add the values required from the Azure Event Hub. I got the following values and pasted them in the config file: the Event Hub namespace (get it from the Event Hub resource) and the Event Hub connection string (you can get it by following these steps).
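For reference, the filled-in config generally looks like the sketch below — the Kafka interface of Event Hubs authenticates over SASL_SSL with the literal username $ConnectionString, and the angle-bracket values are placeholders for your own namespace and connection string:

```
metadata.broker.list=<EVENTHUB-NAMESPACE>.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=$ConnectionString
sasl.password=<EVENT-HUB-CONNECTION-STRING>
```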

Next, wherever kafkacat is installed, run kafkacat in consumer mode as shown below:

kafkacat -b <EVENTHUB-NAMESPACE>.servicebus.windows.net:9093 -t evh-apt29 -F kafkacat.conf -C -o end > evals_apt29_day1_manual_$(date +%F%H%M%S).json
  • -b : Bootstrap broker(s) (host[:port]). Your Event Hub namespace.
  • -t : Topic to consume events from. The name of your Event Hub.
  • -F : Read configuration properties from the kafkacat.conf file.
  • -C : Consumer mode. Consume/collect events.
  • -o : Offset to start consuming from (i.e., end).

I usually run it first for a few seconds and then stop it to see if it can connect to the Azure Event Hub and if I get data from it.

Capture PCAPs leveraging Azure Network Watcher Extension

As I mentioned in the previous post, I put together a script to start capturing PCAPs leveraging the Azure network watcher extension installed on every Windows endpoint. It is in the scripts folder of the mordor-labs project.

Usage: Start-Packet-Capture.sh [option...]
  -r    Resource Group Name
  -s    Storage Account Name
  -c    Computer Names (e.g. VM01,VM02)

Examples:
Start-Packet-Capture.sh -r resourcegroup01 -s storageaccount01 -c VM01,VM02

Since the victims are SCRANTON and NASHUA, those are the only computers I want PCAPs from.

Locally, on your computer with the Azure CLI installed and set up, run:

bash Start-Packet-Capture.sh -r <RESOURCE-GROUP-NAME> -s <AZURE-STORAGE-ACCOUNT-NAME> -c SCRANTON,NASHUA

If you go to Azure Portal > Network Watcher > Packet Capture, you will see both endpoints’ PCAP sessions running (we can only collect 1 GB at a time).

You are now ready to execute APT29 — Day 1 emulation plan 🍻. I took a lot of screenshots for each step, but I thought it would be better to make a video of me running the emulation plan and share it with you 😆. I can share some images of how it all starts, but the rest is a video I uploaded to the Open Threat Research YouTube channel that I manage.

Initial Access (Day 1)

You will get a new callback via the Pupy console.

Interact with the new session by running:

shell

and follow the steps in the online document, or the offline one from here.

Stopping Security Event Collection

Stopping Azure PCAPs

Go to Azure Portal > Network Watcher > Packet Capture, and stop one by one.

Once you do that, you can go to the Azure storage account that you selected while starting the PCAP sessions, and you will see your PCAP files there.

Stopping Kafkacat Collection

All you have to do is press CTRL+C in the terminal where you are running the collection. If you check your Azure Event Hub, you will see that events were collected as they were being ingested 😉. Collection started around 10:55 PM; I did some tests before that. I love seeing how practical and flexible this is 😱
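As a quick sanity check on the collected file: kafkacat writes one JSON event per line, so a line count approximates the number of events captured. The demo file below is a stand-in for the real evals_apt29_day1_manual_*.json output:

```shell
# Each kafkacat record is one line of JSON, so counting lines gives a
# rough event count. demo_events.json stands in for the real capture.
printf '%s\n' '{"Hostname":"SCRANTON"}' '{"Hostname":"NASHUA"}' > demo_events.json
wc -l < demo_events.json
```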

Roberto Rodriguez
