Enable Splunk as a backend for Zentral

Zentral Pro Services · Aug 12, 2020

UPDATE NOTE: Parts of this blog post are outdated; please consult our official docs here when setting up Splunk as a backend for Zentral.

Splunk is a security information and event management (SIEM) solution that allows organizations to collect, index, report, and present data from multiple sources.

Zentral is a centralized service that gathers event data from multiple sources and deploys configurations to multiple services. It provides orchestration and event management for macOS security components such as Osquery for endpoint visibility and Google’s Santa binary authorization system, and it links events with an inventory source (Jamf Pro, Munki, PuppetDB, et al.).

Zentral is built on top of the popular Elastic Stack; by default, Elasticsearch is the primary data store for searching historical events. But Zentral can leverage other event stores as well, so you can send event data from supported agents via Zentral to Splunk, Azure Log Analytics, and more.

Setting up a different event store is something we actively apply in certain custom deployments, where the aim is to forward endpoint events (specifically macOS endpoint events) from Zentral to a SIEM system that is already in place (see also this talk at MacSysAdmin 2019 here, with a brief demo of event shipping to the Azure Sentinel SIEM).

To illustrate this topic in a practical, follow-along way, we’ll walk through a basic scenario showing how to quickly activate and set up a Splunk backend integration with Zentral in just a few minutes.

In our example, a fleet of endpoints with open-source agents installed (Santa, Osquery, Filebeat) already streams event and log data to Zentral. The different kinds of events are aggregated, normalized, and processed; with a few extra steps, all event data is then automatically forwarded to Splunk once the SIEM solution is enabled as an additional data store in Zentral. The tasks of event search, investigation, and correlation can now all happen in Splunk.

TL;DR — Zentral can forward all events to Splunk, used as an additional data store.

Get started — set up a Splunk Cloud instance

There are multiple ways to get data into Splunk — in our setup we’re going to use the HTTP Event Collector (HEC) to forward all endpoint event data from Zentral directly to Splunk and use it as a data store.
You’ll find some more details on the Splunk HTTP Event Collector here.

In our example customer scenario, Zentral continues to deliver its essential core features, orchestrating open-source security agents (Santa, Osquery, Filebeat, etc.) across a fleet of endpoints. The aggregated, normalized, and processed event data is forwarded to Splunk Cloud as an additional data store, where the main event search and correlation tasks are performed.

First you’ll need a Splunk instance with proper TLS — the fastest way here is to start a trial version of Splunk Cloud.

  1. Go to the Splunk website and start a “Cloud Trial” here
  2. Create an account and provide all details
  3. Validate your email address
  4. Click the “Start Trial” button
  5. Wait for the “Welcome to Splunk Cloud!” email, which provides your individual Splunk Cloud access credentials
  6. Make a note of your Splunk Cloud instance URL

The next step is to enable data input.

[Screenshots: start the Splunk Cloud trial; provide credentials, validate email, log in]

Set up the Splunk HTTP Event Collector

Data input to Splunk is quite straightforward; we set up the HTTP Event Collector (HEC) for this purpose.

  1. Log in to your Splunk Cloud instance
  2. Go to Settings > Data > Data Inputs
  3. Under Local inputs > HTTP Event Collector, click the “Add new” button
  4. Provide a name and description
  5. For the input settings, click “Select” and choose “log2metrics_json” as the source type
  6. Review the setup
  7. Copy the token value (you’ll need it along with the URL later; you can also verify it right away, as shown below)
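
Before moving on, you can quickly check the new token. HEC is simply authenticated HTTPS POSTs: you send data to the collector endpoint with an “Authorization: Splunk <token>” header. Here is a minimal sketch in Python (the requests library is assumed; the URL and token are placeholders for your own instance and the token from step 7):

import requests

# Placeholders: replace with your own Splunk Cloud URL and HEC token
HEC_URL = "https://prd-p-xxxx.splunkcloud.com:8088/services/collector/raw"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

# HEC authenticates via an "Authorization: Splunk <token>" header;
# the /raw endpoint indexes the request body as-is
response = requests.post(
    HEC_URL,
    headers={"Authorization": "Splunk {}".format(HEC_TOKEN)},
    data='{"msg": "zentral HEC connectivity test"}',
    verify=False,  # only acceptable for a quick trial-instance test
)
print(response.status_code, response.text)

A successful request returns HTTP 200; anything else points to a wrong URL, port, or token.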

With Splunk prepared, our next activity is to configure Zentral. For this, a short splunk.py file (just 26 lines of code) and a few additional entries in the base.json config file are all that’s necessary.

[Screenshots: Data Inputs; add new local input — HTTP Event Collector; provide name and description; source type: log2metrics_json; review and copy the token value]

Set up and enable the Splunk data store in Zentral

We’re almost ready to see the first events in Splunk — in the next steps, you have to SSH into your Zentral instance (i.e. Zentral-all-in-one) and ensure the splunk.py backend file is present.

If it is not present, manually copy the code for splunk.py from GitHub here, and save it as a file named splunk.py on your instance (in the path below).

/home/zentral/app/releases/current/zentral/core/stores/backends/

Alternatively, you could run a full code update and deploy the latest master branch from the GitHub repository, as described here.

[Screenshot: the splunk.py file that connects the data store to Splunk]
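
For orientation, here is a simplified sketch of what such a store backend does. This is not the exact file from the repository (use the linked GitHub version for deployments); it just illustrates the idea: take each Zentral event, serialize it, and POST it to the HEC endpoint configured in base.json. The base-class import path mirrors Zentral’s other store backends, but treat the details as assumptions:

import requests
from zentral.core.stores.backends.base import BaseEventStore


class EventStore(BaseEventStore):
    def __init__(self, config_d):
        super().__init__(config_d)
        # values from the "splunk" dict in base.json (see below)
        self.base_url = config_d["base_url"]
        self.api_token = config_d["api_token"]
        self.verify_tls = config_d.get("verify_tls", True)

    def store(self, event):
        # serialize the Zentral event and ship it to the HEC raw endpoint
        if not isinstance(event, dict):
            event = event.serialize()
        r = requests.post(
            self.base_url,
            headers={"Authorization": "Splunk {}".format(self.api_token)},
            json=event,
            verify=self.verify_tls,
        )
        r.raise_for_status()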

To activate the new data store, you’ll need to add some config details to the Zentral base.json config file. Before doing any edits in base.json, ensure you create a safety copy first.

  1. Ensure the splunk.py file is present
ls -l /home/zentral/app/releases/current/zentral/core/stores/backends/

2. Edit the base.json file in a text editor (nano or vim).

sudo nano /home/zentral/conf/base.json

3. Multiple value statements in JSON are separated by commas. In the base.json file, scroll to the “stores” section. Insert a new JSON dict for “splunk” following the existing “elasticsearch” entry, as shown below. In your newly created “splunk” dict, replace the base_url value with your own Splunk instance URL. Ensure you correctly append the port :8088 followed by /services/collector/raw — just as presented in the example dict below. Next, insert your Splunk HEC access token as the value for api_token.

"splunk": {
"frontend": false,
"backend": "zentral.core.stores.backends.splunk",
"base_url": "https://prd-p-xxxx.splunkcloud.com:8088/services/collector/raw",
"api_token": "c1l9beef-41bf-4d88-8838-00c64c3e36c3",
"verify_tls": false
}

See the example below for how an end result could look:

[Screenshot: the stores config in base.json with the new Splunk details]

4. Validate that your new addition did not malform the JSON file. Check for errors with this command and correct the formatting if needed.

python3 -m json.tool /home/zentral/conf/base.json

5. When the config is free from errors, stop and then start the Zentral workers again. Events will now start shipping to the new data store.

sudo systemctl stop zentral_workers
sudo systemctl start zentral_workers

With a proper setup as shown above, you should immediately see new events forwarded from Zentral to Splunk Cloud. With endpoints sending Santa or Osquery events, you can start searching for them in Splunk.

Search Events in Splunk

In Splunk Cloud, navigate to the Apps > Search & Reporting section. In this view you can search for historical events, save queries, build reports, and do much more.

Search & Reporting
  • Start with some basic search terms first.
  • Try out more complex ones — e.g. look for known file hashes (this works if Santa logs/events are collected in Zentral).
  • Refine searches with Splunk’s drill-in capabilities (shown below).
* | spath file_sha256 | search file_sha256=a0e6e7997a818d7093dbf9678d0e96fbc5420a9f9a4a9c1d3eab441404153acf
[Screenshots: Splunk search results — select a value to add to or exclude from the search; a refined search query]

In the next section you’ll see some additional examples of how different endpoint events look in Splunk, together with simple queries to start with.

  • Osquery — a distributed query
  • Google Santa logs
  • Google Santa events
  • Jamf client side logs

The current setup shows what basic raw JSON shipping of all event data from Zentral looks like — with some extra effort, the data formatting can be adjusted. A pre-defined drop filter for some (uncritical) event types could also help reduce the volume of data shipped to Splunk (and in turn save on Splunk costs).
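
To sketch what such a drop filter could look like in the backend itself, the store from above could skip configured event types before shipping them. The excluded_event_types config key and the _zentral metadata lookup here are illustrative assumptions; check the official docs for the options Zentral actually supports:

from zentral.core.stores.backends.base import BaseEventStore


class EventStore(BaseEventStore):
    def __init__(self, config_d):
        super().__init__(config_d)
        # hypothetical key in the "splunk" dict of base.json,
        # e.g. "excluded_event_types": ["<some uncritical event type>"]
        self.excluded_event_types = set(config_d.get("excluded_event_types", []))

    def store(self, event):
        event_d = event if isinstance(event, dict) else event.serialize()
        # serialized Zentral events carry their type in the _zentral metadata
        event_type = event_d.get("_zentral", {}).get("type")
        if event_type in self.excluded_event_types:
            return  # drop the event instead of shipping it
        # ... POST event_d to HEC as in the sketch above ...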

Find more details about Zentral here and here.

Osquery

[Screenshot: Osquery — a distributed query result]

Google Santa

[Screenshots: Google Santa — santa_log details; santa_event details]

Jamf — client side logs


  • the local jamf log file on a macOS client: /var/log/jamf.log
  • look into recurring events, Self Service, and policy executions
[Screenshots: the jamf client log as seen locally on macOS; the /var/log/jamf.log file aggregated, normalized, and forwarded to Splunk]

Wrap Up

In this example we’ve seen how to set up an additional event data store in Zentral. We enabled Splunk with HEC and let events stream from Zentral into Splunk. In Splunk, you can search for endpoint events from Santa, Osquery, and the jamf client logs.

In a future post we will look into some more Splunk use cases — stay tuned.

For more details about Zentral, look here. For custom integration and development, professional services, and consulting, contact us here.

Zentral Pro Services

We’re the developers behind Zentral. We operate a consultancy business and provide expertise and services all around Mac management. Contact: https://zentral.com