Application-Level SIEM Monitoring

Swapnil Patil
Published in DevOps Dudes
Feb 22, 2022 · 5 min read

What is SIEM?

SIEM stands for Security Information and Event Management. SIEM technology aggregates log data, security alerts, and events into a centralized platform to provide real-time analysis for security monitoring. There are various ways and tools to implement SIEM for your application.

Use case for SIEM

In one of our recent application deployments, which has a C.I.A. (confidentiality, integrity, availability) rating of 4,3,3, we had to implement SIEM so that the business team stays in control of who is using and logging in to the application. The business team should get instant alerts for:

  • Who is being added to the application
  • Who is being removed from the application
  • Changes to first name, last name, email, mobile, or address in a user profile
  • Changes in user groups, such as from administrators to system administrators

The alerts should include details such as which user changed which field, at what time, and what the new value is.
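For example, a single change event in the CSV we ship to Splunk could look like this (the column names here are illustrative, not the real schema):

ChangedBy,FieldName,OldValue,NewValue,ChangeTime
jsmith,Email,j.smith@old.example.com,j.smith@new.example.com,2022-02-22 10:15:00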

Implementation

We implemented this solution using Tidal (as a scheduler), Splunk, and a PowerShell script with an embedded SQL query. Tidal runs the PowerShell script, which connects to the application database; the SQL query pulls the latest events and writes them to a CSV file. Splunk then reads the events from that CSV file. Once the events are in Splunk, we can create alerts on top of them.

Architecture diagram for SIEM.

Splunk Configuration

As this is C4 (confidentiality rating 4) data, we have to create a private index in Splunk, with the configuration below in the indexes.conf file.

# Private index for our application
[priv_yourappname]
repFactor = auto
homePath = volume:hotwarm/priv_yourappname/db
coldPath = volume:cold/priv_yourappname/colddb
thawedPath = /data/splunk/thawed/priv_yourappname/thaweddb
tstatsHomePath = volume:hotwarm/priv_yourappname/datamodel_summary
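Note that volume:hotwarm and volume:cold must already be defined elsewhere in indexes.conf. If your environment does not define them yet, a minimal sketch would be (paths are placeholders):

[volume:hotwarm]
path = /data/splunk/hotwarm

[volume:cold]
path = /data/splunk/cold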

Then, in the inputs.conf file, we specify the file name and path from which Splunk should read the data. This file can be stored on the application server or the database server.

# File path for Splunk to read.
# crcSalt = <SOURCE> salts the file's CRC with its full path, so files with
# identical beginnings at different paths are still indexed separately.
[monitor://D:\Your\Path\Your_file.csv]
index = priv_yourappname
sourcetype = csv
crcSalt = <SOURCE>

The server needs to be whitelisted so that Splunk can access it. To do this, add the details below to serverclass/local/serverclass.conf.

[serverClass:TEAM_yourapp]
filterType = whitelist
whitelist.0 = YourServerName
restartSplunkd = true

[serverClass:TEAM_yourapp:app:TEAM_yourapp]
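After editing serverclass.conf, the deployment server needs to pick up the change. Assuming a standard deployment-server setup, this can be done with:

splunk reload deploy-server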

SQL Query and PowerShell Configuration

For the SQL query, we fill in the required fields in the script below: the SQL Server name, the database name, the username and password for the database login, and the SQL query itself.

Now we create a PowerShell script from the code below. The SQL query pulls data from the table that holds user information, selecting only the rows that have changed, along with which field was changed and by which user. A sketch of such a query follows.
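As a minimal sketch, assuming a hypothetical audit table dbo.UserProfileAudit with ModifiedBy, FieldName, OldValue, NewValue, and ModifiedAt columns (these names are assumptions, not the real schema), the query assigned to $SqlQuery could look like:

# Hypothetical example only: table and column names are assumed
$SqlQuery = @"
SELECT ModifiedBy, FieldName, OldValue, NewValue, ModifiedAt
FROM dbo.UserProfileAudit
WHERE ModifiedAt > DATEADD(MINUTE, -15, GETDATE())
"@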

# Connection variables (fill in your values)
$SQLServer = ""
$SQLDBName = ""
$uid = ""
$pwd = ""
$connString = "Data Source=$SQLServer;Database=$SQLDBName;User ID=$uid;Password=$pwd"
$delimiter = ','

# SQL query that pulls the changed user rows
# (if needed, put SET QUOTED_IDENTIFIER OFF inside the query text itself)
$SqlQuery = ""

# Set up the connection, command, and data adapter
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $connString
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd

# Fill a DataSet with the query results (Fill opens and closes the connection)
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet) | Out-Null

# Write the results to CSV (skipping the header row) for Splunk to pick up
$DataSet.Tables[0] | ConvertTo-Csv -NoTypeInformation -Delimiter $delimiter |
    Select-Object -Skip 1 | Set-Content "D:\Your\Path\Your_file.csv" -Encoding UTF8
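Before wiring up Tidal and Splunk, you can run the script manually and sanity-check the output file, for example:

# Show the first few exported rows (the header row was skipped above)
Get-Content "D:\Your\Path\Your_file.csv" -TotalCount 5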

Tidal Configuration

In the Tidal command field, we specify the PowerShell executable along with the file path of our PowerShell script.
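For example, the command configured in Tidal could look like this (the script path and execution-policy flag are placeholders for your environment):

powershell.exe -ExecutionPolicy Bypass -File "D:\Your\Path\Your_Script.ps1"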

Tidal Job Configuration.

Now we schedule the Tidal job to run every 15 minutes. That means every 15 minutes the PowerShell script runs, connects to the database, and pulls the latest activity.

This schedule can be adjusted according to your application's resource usage, since the query consumes resources on the database server every time it runs.

To see the events of the Tidal job, we can run the query below in Splunk.

index=Your_Tidal_Index env=tst | rex field=_raw "JobName:\s(?<T_JOBNAME>.*)" | search T_JOBNAME="Your_Tidal_Job_Name" | rex field=_raw "Output:\s(?<T_OUTPUT>.*)"

If we can see the events of our Tidal job in Splunk, the job is working correctly. Next, we make sure we can see the private index events in Splunk with the query below.

| tstats count where index=priv_yourappname

Once this is set, we move on to the Splunk alert configuration.

Splunk Alert Configuration

Now we configure the alert under the Settings option. Here we specify the search query, which is our private index, optionally followed by the host name.
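As a minimal example, with the optional host filter included, the alert search could be:

index=priv_yourappname host=YourServerName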

We can set a cron schedule according to our requirements. Here we set a cron schedule of every 15 minutes, meaning that every 15 minutes the alert checks whether new events have arrived in Splunk within the last 15 minutes.
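In cron syntax, an every-15-minutes schedule looks like this:

*/15 * * * *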

Under the trigger condition, we configure the alert to fire when the number of results rises above 0 (i.e., at least one result), and to trigger for every result, because every event is important to the business team.

Alert configuration in Splunk — 1.

Under the trigger conditions we can also add actions, which range from posting an alert to a Microsoft Teams channel to running a script or even creating a ServiceNow ticket.

But since our event data is C4, we send emails to the required business team email addresses. We can specify what the email should contain by ticking the required options.

Alert configuration in Splunk — 2

Now we test the implementation by making test entries in the application. Once the emails start arriving, the implementation is successful.

In this way we can implement SIEM and stay in control of who is using the application, who is being added or removed, and what changes users make to their personal data in the application.
