Security Onion + LimaCharlie
A request was recently posted on https://www.reddit.com/r/securityonion for advice on integrating logs from LimaCharlie into Security Onion. Hopefully, this article helps to explain the steps involved.
While OSSEC and Wazuh are both great options for integrating host-based detection and response with Security Onion (OSSEC is currently bundled with Security Onion, and there are plans to move to Wazuh soon), some folks may want to try LimaCharlie, a newer low-cost EDR solution led by Maxime Lamothe-Brassard (@_maximelb). With LimaCharlie, the first two agents are free, and any additional agents cost only $0.50 USD per agent per month.
Since LimaCharlie is provided as a cloud service, we can download sensors, deploy them to our endpoints, then query and relay info from the cloud instance to an Amazon S3 bucket, send syslog to either a local or cloud-based Security Onion instance, or use other output options, such as Slack or a webhook. This walkthrough explains what is required to send LimaCharlie data to an S3 bucket and grab that data via Security Onion and the Elastic Stack. Instructions on how to set up Security Onion for secure receipt of syslog can be found here.
Please keep in mind that these steps were performed with a test Security Onion install and may need to be adjusted for a production version (PLEASE make sure to test before deploying to production!).
First, install Security Onion as a standalone (single server + sensor machine running the Elastic Stack, fully configured). The ISO image and setup instructions can be found here:
Next, create an account at https://app.limacharlie.io/.
After creating an organization, create an installation key by navigating to the “Installation Keys” section and specify a group or scope for your key.
Once that is complete, you can deploy your first sensor by navigating to the “Sensor Downloads” section and downloading the sensor for your operating system.
Once downloaded, copy the key you created previously and install the sensor:
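The exact install command was shown in a screenshot; as a rough sketch for the Linux sensor, it boils down to making the downloaded binary executable and running it with your installation key (the binary name matches the one used in the service file below, and the key is a placeholder — check the flags in LimaCharlie's current docs for your platform):

```
chmod +x ./lc_linux_64
sudo ./lc_linux_64 -d YOUR_INSTALLATION_KEY
```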
As mentioned in the screenshot above, you will need to create a service or persistence mechanism for a Linux agent install.
We can do this by creating a service at the following location: /etc/systemd/system/limacharlie.service.
Inside of this file, add the following (I’m using /home/test as the path for my installer — change this to wherever you would like yours to exist):
ExecStart=/home/test/lc_linux_64 -d INSTALLATION_KEY
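For reference, a minimal complete unit file might look like the following. Only the ExecStart line comes from the original walkthrough; the [Unit]/[Install] sections and restart policy are generic systemd boilerplate I've added as a sketch — adjust the path and key to your install:

```ini
[Unit]
Description=LimaCharlie sensor
After=network.target

[Service]
# Path and key are placeholders -- use your own install location and key
ExecStart=/home/test/lc_linux_64 -d INSTALLATION_KEY
Restart=always

[Install]
WantedBy=multi-user.target
```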
Enable the service and start it:
sudo systemctl enable limacharlie.service
sudo systemctl start limacharlie.service
At this point, we should see our sensor as “Online” in the LimaCharlie web console under “Sensors”:
Now that our sensor is up and running, we can create an output. This guide assumes you already have an S3 bucket set up and ready to go. To set up an Amazon S3 output, we’ll navigate to the “Outputs” section of the web console and click the “+” sign.
From here, we will provide a name for the output, select the output type (S3), and provide various other details, such as the stream type (Detections), the bucket name, key_id, secret_key, etc.
We can obtain the S3 key information from the AWS Management Console by navigating to Account Name → My Security Credentials → Access Keys.
You’ll also want to ensure that the permissions are correctly set for the bucket — PLEASE do not make it publicly available.
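If you manage the bucket with the AWS CLI, one way to keep it locked down is to enable the public access block for the bucket. This is a sketch, not the only way to secure a bucket, and the bucket name is a placeholder:

```
aws s3api put-public-access-block \
  --bucket my-limacharlie-bucket \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```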
Detections can help us to identify certain processes/files/etc. we would like to look for on our hosts. The syntax is very simple, yet very powerful.
Documentation about how to write detections and responses can be found here:
Navigate to the “D&R Rules” section in the LimaCharlie web console. Click to create a new rule, and edit it as follows:
case sensitive: false
- action: report
For the detection component, we’ll specify a rule to match on a new process of /usr/bin/watch.
We’ll respond to this detection with an alert with a category/name of “take_another_look”.
Yes, I know this rule is very simple, but it shows how easily we can create detections for items of interest.
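Put together, a rule along those lines might look like the following. The field names follow the LimaCharlie D&R rule syntax as I understand it — verify against the current documentation before relying on this sketch:

```yaml
# Detection: fire on any NEW_PROCESS event whose file path is /usr/bin/watch
detect:
  event: NEW_PROCESS
  op: is
  path: event/FILE_PATH
  value: /usr/bin/watch
  case sensitive: false

# Response: report the match under the category/name "take_another_look"
respond:
  - action: report
    name: take_another_look
```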
A slew of sample rules for Windows hosts can be found here, and should help to get you started:
Let’s save the rule, then attempt to trigger it.
From the machine on which we installed the sensor, we’ll test the detection by executing the following command:
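The original command was shown in a screenshot; since the rule matches a new process of /usr/bin/watch, anything that launches watch should trigger it. For example (timeout is only there so the command exits on its own):

```shell
# Launch watch briefly; the NEW_PROCESS event for /usr/bin/watch should
# trip the D&R rule. '|| true' keeps the exit status clean after timeout
# kills watch (or if watch can't open the terminal).
timeout 3 watch -n 1 date 2>/dev/null || true
echo "watch launched and exited"
```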
If everything was set up correctly, we should now see an object uploaded to our S3 bucket:
We can download the object and view it to see its contents (redacted):
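As an illustration only — the record below is a hypothetical, heavily trimmed guess at the shape of a detection report, and real objects carry far more data, including the full triggering event — you could pretty-print a downloaded object for review like so:

```shell
# Hypothetical, minimal detection record for illustration; real reports
# include the complete event payload and routing metadata.
cat > detection_sample.json <<'EOF'
{"cat": "take_another_look", "routing": {"hostname": "sensor-host"}}
EOF
# Pretty-print the JSON with Python's stdlib json.tool.
python3 -m json.tool detection_sample.json
```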
That’s all fine and dandy, but the goal here is to get our detections to our Security Onion instance and analyze them there.
Therefore, we’ll need to configure Security Onion to pull from our Amazon S3 bucket. To do so, we’ll add some Logstash configuration to reach out to our S3 bucket, pull in our detections, then parse them and send them to Elasticsearch.
Specifically, the files we’ll need are:
- An input file to tell Logstash how to pull from our S3 bucket and process those logs.
- An output file to tell Logstash where the data will be routed (ES index; logstash-limacharlie-$DATE).
- Template file(s) to tell Elasticsearch how to format/interpret the data (modified default template and a specialized template).
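As a rough sketch of the input and output halves (the bucket name, region, and credentials are placeholders, and the actual files in the repository are authoritative), the Logstash configuration looks something like this:

```
input {
  s3 {
    bucket            => "my-limacharlie-bucket"
    region            => "us-east-1"
    access_key_id     => "ACCESS_KEY_ID"
    secret_access_key => "SECRET_ACCESS_KEY"
    codec             => "json"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-limacharlie-%{+YYYY.MM.dd}"
  }
}
```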
The Github repository below should provide the necessary files:
securityonion-limacharlie - Send logs from LimaCharlie to Security Onion (github.com)
*Please note that the configuration files provided pertain to detection logs and not necessarily logs for all events. The files may need to be adjusted slightly if you wish to relay all logs instead of just detections.*
First, we’ll need to copy all of the files from the s3 folder to /etc/logstash/custom on our Security Onion box, with the exception of the securityonion.conf file.
Next, on our Security Onion box, we’ll modify LOGSTASH_OPTIONS in /etc/nsm/securityonion.conf to read as follows:
(watch out for line-wrapping)
After everything is in place, we’ll need to clear out the Logstash templates and make sure everything is synced:
curl -XDELETE 'localhost:9200/_template/logstash*'
sudo so-logstash-restart && sudo tail -f /var/log/logstash/logstash.log
(again, watch out for line-wrapping)
Wait until the Logstash log shows “Pipelines running…”.
Then, attempt to trigger the detection rule again on the host with the sensor installed (or check to see if Logstash picked up the previous one) and wait for the event to flow through Logstash to Kibana, where you can filter for it.
There you have it — our LimaCharlie detections have been shipped to and parsed by Security Onion, and can be acted upon and combined with other types of data to enhance our investigations and provide greater context around events happening on our hosts.
If you would like to integrate other components or log types into Security Onion, try taking a look at the Security Onion Wiki for more information:
security-onion - Linux distro for intrusion detection, enterprise security monitoring, and log management (github.com)
Commercial support is also available at https://securityonionsolutions.com.
Additionally, you can follow me, Security Onion, and LimaCharlie on Twitter via the following handles: @therealwlambert @securityonion @limacharlieio