PCAP data is valuable, providing insight into all network communications across a link and allowing for a deep dive into activity on the wire. Anything that detailed is also verbose, and in many cases the storage requirements make retaining PCAP data infeasible. Our RDP honeypot is not immune to this: in Part 1 of this series we limited collection to 500 MB per hour, and even then we are prone to filling our storage capacity in a short time. We have mitigated that by collecting only 3389/TCP traffic, though over time we will have to consider how long we wish to retain full PCAP data.
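A quick back-of-the-envelope calculation shows why retention matters. Using the 500 MB/hour cap from Part 1 and, as an assumption, a 20 GB disk (the minimum recommended later in this guide):

```shell
#!/bin/bash
# How long until PCAP capture fills the disk?
# Figures: 500 MB/hour cap (Part 1); 20 GB free disk is an assumption.
rate_mb_per_hour=500
disk_mb=$(( 20 * 1024 ))                     # 20 GB expressed in MB
hours=$(( disk_mb / rate_mb_per_hour ))
echo "A 20 GB disk fills in roughly ${hours} hours at 500 MB/hour"
```

At the capped rate, that disk fills in under two days, which is why indexing and pruning become necessary.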
One way to mitigate this, while also automating the extraction of key artifacts, is to process and index our PCAPs with Moloch. Moloch is a free, open-source tool developed by the AOL team that lets us feed it raw PCAPs and builds an index of the most important features for searching and analysis.
This tool provides a lot of value — and has a lot more power than we will leverage in this series.
In short, Moloch will read our PCAP data, index the key features (such as IP addresses, usernames, traffic protocols, and more) into an Elasticsearch instance, and provide a user-friendly interface to explore our volume of PCAP data.
Since our honeypot is exposed to the internet and welcoming threat actors to attempt to access it, we should handle our processing and further analysis on a separate host, out of an abundance of caution. This second host will run Moloch and Elasticsearch. For those familiar with Elasticsearch, you are welcome to spin up a cluster, though a single node will work for this application. The below can be accomplished through Docker as well, but that is not a requirement, and we will walk through setup outside Docker.
- Linux machine to host processing services. We could use Windows, but this guide will leverage Linux to eliminate barriers to participation.
- This Linux machine should have at least 2 processing cores and 4 GB of RAM to handle this load, though more is recommended.
- For storage we will want at least 20 GB of available disk space, though more will be needed depending on the amount of PCAP data retention desired.
The Elasticsearch documentation is the best guide to install and configure this service. We will want to use version 7 or the latest version supported by Moloch. The documentation for Elasticsearch 7 setup is available here:
Installing Elasticsearch | Elasticsearch Reference [7.x] | Elastic
A quick summary of the steps:
- Download the installer package and dependencies (primarily Java)
- Edit elasticsearch.yml to reflect any settings you would like to apply (no changes are required)
- Start Elasticsearch!
If you want, you can edit the configuration file to prepare the instance for use in a cluster and set up 2 or more additional nodes to balance the index data.
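If you do go the cluster route, a minimal sketch of the relevant elasticsearch.yml settings for one node might look like the following; the cluster name, node names, and peer addresses are illustrative assumptions:

```yaml
# elasticsearch.yml, hypothetical values for one node of a three-node cluster
cluster.name: honeypot-pcap
node.name: es-node-1
network.host: 0.0.0.0
discovery.seed_hosts: ["10.0.0.11", "10.0.0.12", "10.0.0.13"]
cluster.initial_master_nodes: ["es-node-1", "es-node-2", "es-node-3"]
```

Each node gets its own node.name; the cluster name and discovery settings are shared across all nodes.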
We will follow the usual instructions for installing Moloch. The best source for this is on the Moloch website, https://molo.ch/, and their GitHub:
Moloch is a large scale, open source, indexed packet capture and search system.
The current installation steps include:
- Download and install the respective Moloch package
- Edit the Moloch configuration file
- Configure MaxMind integration for GeoIP and ASN resolution, https://molo.ch/faq#maxmind
- Run the configuration script
- Initialize Elasticsearch for Moloch: /data/moloch/db/db.pl http://ESHOST:9200 init, where ESHOST is the IP address or hostname of your Elasticsearch service
- Add an administrative account: /data/moloch/bin/moloch_add_user.sh admin "Admin User" DONTUSETHISPASSWORD --admin
- Start the Moloch service!
At this point, you should be able to access Moloch on port 8005 in your browser.
Connecting the Moving Parts
With Moloch online, we can start ingesting our PCAP data from the honeypot. To limit our risk, we will want to pull data from our honeypot instance (instead of pushing to our analysis box). With that, we can use rsync to recursively pull new PCAP data down to our host, then call moloch-capture on the new PCAPs we downloaded.
This can be automated with this shell script. Just generate an SSH key to use for connecting to the honeypot and update the <USER> and <IP> values. You may also need to update the location of the PCAP data based on your honeypot tcpdump configuration:
#!/bin/bash
## Please insert the appropriate User and IP address values
## You may also need to edit the path to where your PCAPs
## exist on the remote system
echo ===== Pulling remote pcap data
rsync -a <USER>@<IP>:/data/rdp.*.pcap* /data/pcap/
echo rsync exit code: $?
echo ===== Processing new pcaps
echo indexer exit code: $?
This short script also calls another script to handle the parsing of PCAPs in the output folder; a full copy of that script is available below:
piesecurity/docker-moloch: A Docker container for Moloch based on Ubuntu.
We will want to set up our sync script to run with some frequency to pull PCAP data down and process it with moloch-capture. Since we are capturing new PCAPs on an hourly basis, we could schedule this sync hourly.
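For instance, a crontab entry along these lines (added via crontab -e) would pull and index new PCAPs a few minutes past each hour; the script path and log file are assumptions based on wherever you saved the sync script:

```
# Run at five minutes past every hour; paths are illustrative assumptions
5 * * * * /data/scripts/pcap_sync.sh >> /var/log/pcap_sync.log 2>&1
```

Offsetting by a few minutes gives the honeypot's hourly tcpdump rotation time to close out the previous capture file before we pull it.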
Nice work! We are through the difficult parts of this configuration, having set up both our honeypot service with full PCAP capture and our Moloch PCAP processing service. That leaves us with the final step of operationalizing this data through a Twitter & Pastebin bot.