How to Quickly Set Up an ELK Stack and Elastic Agent to Monitor macOS Event Data

Leo Pitt
Dec 3, 2021


Introduction

This article serves as a simple walkthrough on setting up a lab to view macOS event data. This lab consists of the following components:

  • ELK (Elasticsearch, Logstash, Kibana) stack
  • Elastic Agent

Component Overview

“ELK” is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a “stash” like Elasticsearch. Kibana lets users visualize Elasticsearch data with charts and graphs.

The Elastic Agent is a single, unified agent that you can deploy to hosts or containers to collect data and send it to the Elastic Stack.

The Elastic Agent provides a reliable method to ingest macOS event data and leverages some of Apple’s Endpoint Security Framework (ESF) events.

Antonio Piazza recently published an article about setting up a lab environment leveraging HELK, Appmon, and Filebeat. Cedric Owens also did a post that builds upon Antonio’s to automate that lab deployment. Please refer to those posts if they suit your test case.

Walkthrough

1. Pull down my fork of the docker-elk repo.

The items I changed from the parent repo are:

  • Adding X-Pack security to the Elasticsearch config:
xpack.security.authc.api_key.enabled: true
  • Adding X-Pack security to the Kibana config:
xpack.security.encryptionKey: "something_at_least_32_characters"
xpack.encryptedSavedObjects.encryptionKey: "something_at_least_32_characters"
  • Adding X-Pack security and the Elasticsearch IP to the Logstash config:
xpack.monitoring.elasticsearch.hosts: [ "http://127.0.0.1:9200" ]
xpack.monitoring.elasticsearch.url: http://127.0.0.1:9200

Note: This build uses X-Pack paid features, which kicks off a 30-day trial period. You can view the trial status under Stack Management -> License Management in Kibana.

2. Go into the docker-elk folder (on an Ubuntu Server for this walkthrough) and start the ELK stack:

cd docker-elk/
sudo docker-compose up
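Before moving on, it can help to confirm Elasticsearch is actually answering. A minimal sketch, assuming docker-elk's default elastic/changeme credentials; the hostname is a placeholder if you check from another machine:

```shell
# Build the cluster-health URL; elastic/changeme are docker-elk's default credentials.
ES_HOST="localhost"   # swap in the ELK server IP when checking remotely
HEALTH_URL="http://${ES_HOST}:9200/_cluster/health?pretty"
echo "$HEALTH_URL"
# curl -s -u elastic:changeme "$HEALTH_URL"   # a "green" or "yellow" status means the stack is up
```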

3. Get the Elastic Agent (on a Big Sur host for this walkthrough):

Download the Elastic Agent. I’ve copied the unzipped folder to the ~/Documents directory in this walkthrough.
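If you prefer the command line, the macOS tarball can be fetched directly from Elastic's artifact server. The version below is an assumption; match it to the stack version you deployed:

```shell
# Construct the download URL for the macOS Elastic Agent tarball.
# VERSION is an assumption -- match it to your deployed stack version.
VERSION="7.15.2"
PKG="elastic-agent-${VERSION}-darwin-x86_64.tar.gz"
URL="https://artifacts.elastic.co/downloads/beats/elastic-agent/${PKG}"
echo "$URL"
# curl -L -O "$URL" && tar xzvf "$PKG" -C ~/Documents   # fetch and unpack
```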

4. Set up the ELK server:

Go back to the ELK server and log into Kibana at http://ELKserverIP:5601/

Kibana Login Page

The default credentials are elastic/changeme. Click Explore on my own. Click the hamburger menu on the left and go to Management -> Fleet.

Fleet Menu under Management

Note: The first time you visit this page, it may take a few minutes to load.

Keep the Default Fleet Server policy and select Quick start under Step #3.

Quick start option on ELK Server

Add the Fleet Server Host (i.e., the ELK server IP Address). In my case, this is http://172.16.113.10:8220.

Adding the Fleet Server Host

Hit the Add host button.

Confirmation Fleet Server host added

Next, go to Step #5 and hit the Generate service token button.

Generate Service Token button

On Step #6, start the Fleet Server. Keep the Platform option on Linux/macOS, copy the generated command, and take it to your macOS endpoint.

Command to start the Elastic Agent on macOS endpoint

5. Set up the Elastic Agent on the macOS endpoint:

Go to the directory in which you saved the Elastic Agent; in this walkthrough, that is ~/Documents. Paste the copied command into the Terminal. Make sure the --fleet-server-es flag has the correct value. Note: change localhost to the ELK server IP.

Elastic Agent start command
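The command Kibana generates looks roughly like the sketch below once localhost is swapped for the ELK server IP. The service token and policy ID are placeholders for the values the Fleet UI produced:

```shell
# Assemble the install command with the ELK server IP in place of localhost.
# TOKEN and the policy ID are placeholders -- use the values from the Kibana Fleet UI.
ELK_IP="172.16.113.10"
TOKEN="<service-token-from-kibana>"
CMD="sudo ./elastic-agent install --fleet-server-es=http://${ELK_IP}:9200 --fleet-server-service-token=${TOKEN} --fleet-server-policy=<policy-id>"
echo "$CMD"
# eval "$CMD"   # run from the unpacked agent directory on the endpoint
```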

The following message will appear, indicating that you correctly installed the Elastic Agent.

Elastic Agent Success message

Go back to Kibana, and it should state Fleet Server connected.

Fleet Server Successfully Connected

Next, we need to check that data is correctly ingested into Elasticsearch from our Elastic Agent.

Fleet Overview Page

You can do this by navigating to the Data Streams tab. You should see this area populated with endpoint data.

No endpoint data under Data Streams

If there is no data here, check your fleet settings by clicking the settings cog in the top right corner.

Fleet Settings cog

Ensure that the Elasticsearch hosts setting points to the correct IP and not to localhost.

Elasticsearch hosts field incorrectly shows localhost
Confirm the IP change

6. Add Endpoint Security:

In Kibana, go to Security -> Endpoints.

Endpoints Menu

Click the Add Endpoint Security button.

Start Adding Endpoint Security

Start configuring the Endpoint Security Integration. Name the Integration whatever you want (e.g., secure). Select Default Fleet Server policy -> Save Integration -> Save and Deploy Changes.

Endpoint Security Integration Name and Agent Policy

Next, we enroll the Elastic Agent with Endpoint Security.

Enroll Agent with created Integration

7. Allow extensions on Big Sur host:

After enrolling the Elastic Agent, you should see the System Extension Blocked prompt. Click Open Security Preferences, or go to System Preferences -> Security & Privacy -> General.

Extension Security Prompt

Under the General tab, allow the ElasticEndpoint.

General tab under Security & Privacy

After clicking Allow, you will likely get a network content filtering prompt; allow this as well.

Network Security Prompt

Next, we must provide Full Disk Access to the co.elastic.systemextension extension and elastic-endpoint.

Grant Full Disk Access to Elastic Endpoint and System Extension

8. View the event data and test alerts:

Go back to Kibana (Management -> Fleet -> Data Streams); you should now see the endpoint.events.process and endpoint.events.file datasets.

Data Streams Correctly Ingesting

Go to the Analytics -> Discover tab; under logs-*, you should see events.

Note: Some key fields to pay attention to: process.command_line, file.name, event.type, event.category, event.action.

Logs from the Big Sur host
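Those same fields can also be queried outside Kibana by sending a Query DSL body straight to Elasticsearch. A sketch, assuming the lab's default elastic/changeme credentials (the hostname is a placeholder):

```shell
# A simple Query DSL body filtering for process events from the agent.
QUERY='{"query":{"term":{"event.category":"process"}},"size":5}'
echo "$QUERY"
# curl -s -u elastic:changeme -H 'Content-Type: application/json' \
#   "http://ELKserverIP:9200/logs-*/_search" -d "$QUERY"
```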

Add prebuilt alerts: Security -> Rules -> Load Elastic prebuilt rules and timeline templates.

Load Prebuilt Rules

Next, filter on the macOS tag and select which alerts to turn on. In this walkthrough, I turn on Sublime Plugin or Application Script Modification.

Activating Sublime Persistence Alert

To trigger the alert, I modify the persistence script by adding SublimeTextAppScriptPersistence('echo "hellosublime" >> /tmp/hellosublime') to the end and execute it.

Installing Sublime Persistence

Go to Security->Alerts tab to see the alert populate.

Alerts Overview

On this page, you can analyze the event by clicking the ellipsis button under the Actions column and selecting Analyze event.

Analyzing the alert

The graphical process execution flow is a helpful feature.

Processes leading to the osascript execution

Now that everything is coming in, you can use the Discover tab to start making custom queries and then go to Security->Rules to make your own alerts. David French has a thread that demonstrates leveraging custom alerts.

I hope this guide helps!
