Snowflake Cloud Data Warehouse — Data Driven Security Analytics with SnowAlert

This is Worth Trying Out — An Open Source Project for Security Analytics with Snowflake


by Ed Fron, Enterprise Architect

Recently, a North American specialty retailer was migrating their large traditional data warehouse appliance solution to Snowflake Cloud Data Warehouse. Toward the end of the project, the IT Security team expressed interest in reviewing Security Information and Event Management (SIEM) style solutions for their new Snowflake account.

SIEM Defined

As expert Karen Scarfone points out in TechTarget, “SIEM systems and products serve two purposes: providing centralized security logging and reporting for an organization and aiding in the detection, analysis and mitigation of security incidents.” Many customers require these security capabilities for overall compliance within their organization as well as enabling proactive response to a security incident.

A Different Kind of “Pow”

One of the great benefits of Hashmap’s Preferred SI Partner status with Snowflake is being able to collaborate with the broader Snowflake product and engineering team to help meet Snowflake customer needs. As we were reviewing options with Snowflake for the retail customer mentioned above, some of the Snowflake Security team members we were working with (Omer Singer, Andrey Fedorov, and Gregory Butzi) introduced us to SnowAlert.

Now, living in Colorado, when I hear the words SnowAlert, the actionable insight that immediately comes to mind is beating the I-70 traffic to get into the mountains for some first tracks in the light, fluffy, deep powder snow.

Snapping out of my powder day delusion, it turns out that SnowAlert is a Snowflake Computing open source project that provides a security analytics framework using Snowflake Cloud Data Warehouse to detect security incidents and policy violations.

So Let’s Check it Out

In this post, I’ll walk you through deploying and configuring SnowAlert, explore how the sample alerts work, and then create our own alert for Snowflake account usage. Hopefully by the end, you will be comfortable and really “ripping” and “shredding” with SnowAlert.

SnowAlert Installation and Configuration

Following the Getting Started guide is pretty straightforward if you are familiar with Docker, so I will only cover the basics I followed to get SnowAlert running in my test environment.

Log in to your Snowflake account as a user with ACCOUNTADMIN access and run the following script:

CREATE USER snowalert
  PASSWORD = '**************'
  LOGIN_NAME = 'snowalert'
  DISPLAY_NAME = 'snowalert'
  FIRST_NAME = 'Snow'
  LAST_NAME = 'Alert'
  DEFAULT_ROLE = "ACCOUNTADMIN"
  MUST_CHANGE_PASSWORD = FALSE;
GRANT ROLE "ACCOUNTADMIN" TO USER snowalert;
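If you’d like to sanity-check that the user exists before moving on, a couple of SHOW commands will do it (purely optional; the grants you see will depend on your account):

-- optional: confirm the snowalert user and its grants
SHOW USERS LIKE 'snowalert';
SHOW GRANTS TO USER snowalert;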

Next, on the Docker server, run the following command to download SnowAlert:

$ docker pull snowsec/snowalert

Then, for the simplest installation, which skips the KMS encryption layer, run the install script using the following command:

$ docker run -it snowsec/snowalert ./install

Once you enter the information requested by the prompts, the installer reports that it created a warehouse, user, role, database, schemas, tables, and sample alerts and violations in your Snowflake account.

Starting installer for SnowAlert.
Snowflake account where SnowAlert can store data, rules, and results (URL or account name): https://<SF_ACCOUNT>.snowflakecomputing.com
Next, authenticate installer with user who has 'accountadmin' role in your Snowflake account
Snowflake username: <USERNAME>
Password [leave blank for SSO for authentication]:
Authenticating to Snowflake.. ✓
Use role accountadmin... ✓
Creating and setting default warehouse... ✓
Creating database... ✓
Creating schemas..... ✓
Creating alerts & violations tables...... ✓
Creating standard UDTFs...... ✓
Creating data view.... ✓
Creating sample alert.... ✓
Creating sample violation.... ✓
Creating role and user..... ✓
Granting role to user... ✓
Granting privileges to role............ ✓

Next, the installer prompts for information to integrate SnowAlert with Jira:

Would you like to integrate Jira with SnowAlert (y/N)? y
Please enter the username for the SnowAlert user in Jira: <JIRA_ACCOUNT>
Please enter the password for the SnowAlert user in Jira:
Please enter the URL for the Jira integration: https://<ACCOUNT>.atlassian.net
Please enter the project tag for the alerts...
Note that this should be the text that will prepend the ticket id; if the project is SnowAlert and the tickets will be SA-XXXX, then you should enter 'SA' for this prompt.
Please enter the project tag for the alerts from SnowAlert: SB
--- DB setup complete! Now, let's prep the runners... ---
The access key for SnowAlert's Snowflake account can have a passphrase, if you wish.
RSA key passphrase [blank for none, '.' for random]:
Setting auth key on snowalert user... ✓
--- ...all done! Next, run... ---
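At this point the database-side setup is complete. If you want to confirm the objects the installer just created, a few SHOW commands against the new database are enough (an optional sanity check; the object names match the installer output above):

-- optional: list the objects the installer created
SHOW SCHEMAS IN DATABASE snowalert;
SHOW TABLES IN SCHEMA snowalert.results;
SHOW VIEWS IN SCHEMA snowalert.rules;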

Finally, execute the cat command provided in the install output to generate the .envs file, and then run the SnowAlert runner to see the sample alert and violation with the following command:

$ docker run --env-file snowalert-<SF_ACCOUNT>.envs snowsec/snowalert ./run all
running in docker container-like environment
Loaded 6 views, 3 were 'alert_query' rules.
RULE_NUMBER_5_ALERT_QUERY processing...
Recording metadata.
RULE_NUMBER_5_ALERT_QUERY done.
Recording alerts.
SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_QUERY processing...
Recording metadata.
SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_QUERY done.
Recording alerts.
SNOWFLAKE_LONG_RUNNING_ALERT_QUERY processing...
Recording metadata.
SNOWFLAKE_LONG_RUNNING_ALERT_QUERY done.
No alerts to log.
Recording metadata.
Loaded 6 views, 1 were 'alert_suppression' rules.
Received suppression SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_SUPPRESSION
Recording metadata.
Suppression query SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_SUPPRESSION executed.
Recording metadata.
Found 1 new alerts.
Found 1 new alerts to handle.
Creating ticket for alert {'ACTION': 'login', 'ACTOR': 'SNOWALERT', 'ALERT_ID': 'e45d0f2a8dc040b194ee668402b8c68c', 'ALERT_TIME': '2019-01-21 23:55:05.900 -0800', 'DESCRIPTION': 'PYTHON_DRIVER logging in as SNOWALERT from 10.0.0.1', 'DETECTOR': 'SnowAlert', 'ENVIRONMENT': 'SnowAlert', 'EVENT_DATA': {'CLIENT_IP': '10.0.0.1', 'EVENT_ID': 506024969, 'EVENT_TIMESTAMP': '2019-01-21 23:55:03.695 -0800', 'EVENT_TYPE': 'LOGIN', 'FIRST_AUTHENTICATION_FACTOR': 'RSA_KEYPAIR', 'IS_SUCCESS': 'YES', 'REPORTED_CLIENT_TYPE': 'PYTHON_DRIVER', 'REPORTED_CLIENT_VERSION': '1.7.3', 'USER_NAME': 'SNOWALERT'}, 'EVENT_TIME': '2019-01-21 23:55:03.695 -0800', 'OBJECT': 'SNOWALERT', 'QUERY_ID': 'f4424e8bd69c4ff4beb9ade2f3979920', 'QUERY_NAME': 'snowflake_login_without_mfa', 'SEVERITY': 'low', 'SOURCES': ['successful_snowflake_logins'], 'TITLE': 'Successful Snowflake login without MFA'}
Creating new JIRA ticket for Successful Snowflake login without MFA in project SB
Updating alert table: UPDATE snowalert.results.alerts SET ticket='SB-1' WHERE alert:ALERT_ID='e45d0f2a8dc040b194ee668402b8c68c'

Take a close look at the output and you’ll see that it found one new alert to handle. The reason is that the SnowAlert user authenticated to Snowflake during installation without Multi-Factor Authentication (MFA) configured, so an alert landed in the alerts table.
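To see the alert for yourself, you can query the alerts table directly (a minimal check; the ALERT and TICKET columns are the ones referenced in the runner output, and other columns may vary by SnowAlert version):

-- view the raw alert rows written by the runner
SELECT alert, ticket
FROM snowalert.results.alerts;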

Opening the ALERT column of type VARIANT shows the complete details of the alert:

{
  "ACTION": "login",
  "ACTOR": "SNOWALERT",
  "ALERT_ID": "e45d0f2a8dc040b194ee668402b8c68c",
  "ALERT_TIME": "2019-01-21 23:55:05.900 -0800",
  "DESCRIPTION": "PYTHON_DRIVER logging in as SNOWALERT from 10.0.0.1",
  "DETECTOR": "SnowAlert",
  "ENVIRONMENT": "SnowAlert",
  "EVENT_DATA": {
    "CLIENT_IP": "10.0.0.1",
    "EVENT_ID": 506024969,
    "EVENT_TIMESTAMP": "2019-01-21 23:55:03.695 -0800",
    "EVENT_TYPE": "LOGIN",
    "FIRST_AUTHENTICATION_FACTOR": "RSA_KEYPAIR",
    "IS_SUCCESS": "YES",
    "REPORTED_CLIENT_TYPE": "PYTHON_DRIVER",
    "REPORTED_CLIENT_VERSION": "1.7.3",
    "USER_NAME": "SNOWALERT"
  },
  "EVENT_TIME": "2019-01-21 23:55:03.695 -0800",
  "OBJECT": "SNOWALERT",
  "QUERY_ID": "f4424e8bd69c4ff4beb9ade2f3979920",
  "QUERY_NAME": "snowflake_login_without_mfa",
  "SEVERITY": "low",
  "SOURCES": [
    "successful_snowflake_logins"
  ],
  "TITLE": "Successful Snowflake login without MFA"
}

Since Jira is configured, the Jira alert handler ran and created a ticket ‘SB-1’ in the Jira project for the alert.
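Back in Snowflake, you can also confirm the ticket association by pulling individual fields out of the ALERT column with VARIANT path notation (a hedged example; the TICKET column and the alert:ALERT_ID syntax come straight from the UPDATE statement in the runner output above):

-- pull selected fields out of the ALERT variant and show the Jira ticket
SELECT alert:TITLE::string    AS title
     , alert:SEVERITY::string AS severity
     , ticket
FROM snowalert.results.alerts
WHERE ticket IS NOT NULL;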

Now that SnowAlert is installed and configured, and the sample alert was received after executing the SnowAlert runner, let’s explore how the successful login without MFA sample alert works as a final step in understanding how to create your own alerts.

SnowAlert Database Objects

Let’s dig a little deeper into the SnowAlert database objects shown below.

Before an alert can be generated for logins without MFA, we need a list of users who have logged in. The SNOWALERT.DATA schema contains the views that generate the query results we are interested in monitoring and alerting on.

Examining the SQL in the view SNOWALERT.DATA.SUCCESSFUL_SNOWFLAKE_LOGINS_V shows a select from information_schema.login_history() (a system-defined table function) with the condition that the login is successful.

CREATE VIEW IF NOT EXISTS snowalert.data.successful_snowflake_logins_v AS
SELECT *
FROM TABLE(snowflake_sample_data.information_schema.login_history())
WHERE is_success = 'YES';
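To see what this data view returns in your own account, a quick query like the one below works (the column names are those returned by the LOGIN_HISTORY table function):

-- preview recent successful logins and their authentication factors
SELECT event_timestamp
     , user_name
     , client_ip
     , first_authentication_factor
     , second_authentication_factor
FROM snowalert.data.successful_snowflake_logins_v
ORDER BY event_timestamp DESC
LIMIT 10;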

Now that we have a view that provides a list of users who have successfully logged in, we need to define the condition where MFA was not used for each login.

The SNOWALERT.RULES schema contains the views that provide the alert information and the conditions to generate alerts on. Examining the SQL in the view SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_QUERY shows a select from snowalert.data.successful_snowflake_logins_v with the condition that second_authentication_factor IS NULL; it also formats the results with the additional fields and values to be written into the SNOWALERT.RESULTS.ALERTS table.

CREATE OR REPLACE VIEW snowalert.rules.SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_QUERY COPY GRANTS AS
SELECT 'Successful Snowflake login without MFA' AS title
, array_construct('successful_snowflake_logins') AS sources
, user_name AS object
, 'SnowAlert' AS environment
, event_timestamp AS event_time
, CURRENT_TIMESTAMP() AS alert_time
, reported_client_type || ' logging in as ' || user_name || ' from ' || client_ip AS description
, user_name AS actor
, 'login' AS action
, 'SnowAlert' AS detector
, OBJECT_CONSTRUCT(*) AS event_data
, 'low' AS severity
, 'snowflake_login_without_mfa' AS query_name
, 'd9236c99d6954276b11b74c428a6ff0a' AS query_id
FROM snowalert.data.successful_snowflake_logins_v
WHERE second_authentication_factor IS NULL
AND DATEDIFF(MINUTE, event_timestamp, CURRENT_TIMESTAMP()) < 60;
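Because a rule is just a view, you can preview exactly what the runner will pick up by selecting from it directly (a handy way to test a rule; zero rows simply means there were no MFA-less logins in the last hour):

-- preview the alerts this rule would generate right now
SELECT title
     , actor
     , description
     , event_time
     , severity
FROM snowalert.rules.snowflake_login_without_mfa_alert_query;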

Creating a New Alert

Now that we understand how SnowAlert works, let’s add a new alert for a long-running query.

Snowflake provides data dictionary object metadata, as well as historical usage data, for your account via a shared database named SNOWFLAKE. The QUERY_HISTORY view in the Account Usage schema returns the total_elapsed_time for every query run within the last 365 days.
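Before writing the rule, it can help to explore that view and see what a long running query looks like in your account (an exploratory query; TOTAL_ELAPSED_TIME is reported in milliseconds, so 5000 means 5 seconds):

-- total_elapsed_time is in milliseconds; find queries over 5 seconds
SELECT query_id
     , user_name
     , total_elapsed_time
     , start_time
FROM snowflake.account_usage.query_history
WHERE total_elapsed_time > 5000
ORDER BY total_elapsed_time DESC
LIMIT 10;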

Execute the following to create a new view in the SNOWALERT.RULES schema called snowalert.rules.snowflake_long_running_alert_query that will generate an alert for every select query that runs longer than 5 seconds.

create or replace view snowalert.rules.snowflake_long_running_alert_query as
select
object_construct('cloud', 'Snowflake', 'account', current_account()) as environment
, array_construct('snowflake') as sources
, 'Snowflake Query' as object
, 'Snowflake Long Running Query' as title
, START_TIME as event_time
, current_timestamp() as alert_time
, 'User ' || USER_NAME || ' select ran for ' || TOTAL_ELAPSED_TIME as description
, 'SnowAlert' as detector
, TOTAL_ELAPSED_TIME as event_data
, USER_NAME as actor
, 'submitted long running query' as action
, 'Medium' as severity
, '2e7b45c89d9b4f5ba6ebc51124408c29' as query_id
, 'snowflake_long_running_alert_query' as query_name
from snowflake.account_usage.query_history
where query_text like 'select%'                                -- only select statements
and total_elapsed_time > 5000                                  -- elapsed time is in milliseconds, so > 5 seconds
and datediff(minute, end_time, current_timestamp()) <= 14400   -- queries that completed within the last 10 days
;
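As with the MFA rule, you can preview the alerts this new rule would generate by selecting from the view before kicking off the runner (if nothing comes back, try lowering the 5000 ms threshold or widening the time window):

-- preview the alerts the new rule would generate
SELECT title
     , actor
     , description
     , event_time
     , severity
FROM snowalert.rules.snowflake_long_running_alert_query;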

Next, execute the SnowAlert runner:

$ docker run --env-file snowalert-<SF_ACCOUNT>.envs snowsec/snowalert ./run all
running in docker container-like environment
Loaded 6 views, 2 were 'alert_query' rules.
SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_QUERY processing...
Recording metadata.
SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_QUERY done.
Recording alerts.
SNOWFLAKE_LONG_RUNNING_ALERT_QUERY processing...
Recording metadata.
SNOWFLAKE_LONG_RUNNING_ALERT_QUERY done.
Recording alerts.
Recording metadata.
Loaded 6 views, 1 were 'alert_suppression' rules.
Received suppression SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_SUPPRESSION
Recording metadata.
Suppression query SNOWFLAKE_LOGIN_WITHOUT_MFA_ALERT_SUPPRESSION executed.
Recording metadata.
Found 1 new alerts.
Found 1 new alerts to handle.
Updating alert table: UPDATE snowalert.results.alerts SET ticket='SB-5' WHERE alert:ALERT_ID='157a3a5c7fc848b6afddc050ffc4ed65'
Creating ticket for alert {'ACTION': 'submitted long running query', 'ACTOR': 'HASHMAPUSER', 'ALERT_ID': 'c18a69d61d7748268ee6f869f2ab9570', 'ALERT_TIME': '2019-01-25 00:01:55.345 -0800', 'DESCRIPTION': 'User HASHMAPUSER select ran for 5349', 'DETECTOR': 'SnowAlert', 'ENVIRONMENT': {'account': '<ACCOUNTNAME>', 'cloud': 'Snowflake'}, 'EVENT_DATA': 5349, 'EVENT_TIME': '2019-01-24 23:52:53.812 -0800', 'OBJECT': 'Snowflake Query', 'QUERY_ID': '2e7b45c89d9b4f5ba6ebc51124408c29', 'QUERY_NAME': 'snowflake_long_running_alert_query', 'SEVERITY': 'Medium', 'SOURCES': ['snowflake'], 'TITLE': 'Snowflake Long Running Query'}
Creating new JIRA ticket for Snowflake Long Running Query in project SB

If you take a close look at the output, you’ll see that it found one new alert to handle for the long running query and a ticket ‘SB-5’ was created in the Jira project for the alert.

SnowAlert UI

It’s also worth mentioning that a SnowAlert UI is actively being developed in a branch at this time. The SnowAlert UI provides a clean interface for managing alerts using a form or SQL editor. To fetch and run the latest version, use the following commands:

$ docker pull snowsec/samui
$ docker run -it -d -p 8000:8000 --env-file snowalert-<ACCOUNTNAME>.envs snowsec/samui

From here, open a browser, navigate to http://<server-running-docker>:8000, and the UI will appear. Try it out. At the time of this writing the UI is still a work in progress, but that means there is plenty of opportunity to contribute to the open source development. Have fun!

Give SnowAlert a Try!

Many of the existing SIEM tools can be frustrating to use, produce noisy detections, and are pretty expensive to boot. SnowAlert is an open source solution worth checking out if you are looking to create rules that generate high fidelity detections quickly and easily.

I was particularly impressed that you don’t have to be a security engineer to use SnowAlert. With a basic understanding of SQL and knowledge of the Snowflake Information Schema and Account Usage system-defined tables and views, anyone can define rules to alert on event conditions involving user authentication and system usage.

I have another post planned that will illustrate how to ingest semi-structured data and structured asset data from Azure or AWS into Snowflake and create SnowAlert rules to enable data-driven security analytics use cases with Snowflake Cloud Data Warehouse.

Need Snowflake Cloud Data Warehousing and Migration Assistance?

If you’d like additional assistance in this area, Hashmap offers a range of enablement workshops and consulting service packages as part of our consulting service offerings, and would be glad to work through your specifics for you.

Feel free to share on other channels, and be sure to keep up with all new content from Hashmap by following our Engineering and Technology Blog, including other Snowflake stories.

Ed Fron is an Enterprise Architect with Hashmap providing Data, Cloud, IoT, and AI/ML solutions and consulting expertise across industries with a group of innovative technologists and domain experts accelerating high value business outcomes for our customers.

Connect with Ed on LinkedIn and be sure to catch Hashmap’s Weekly IoT on Tap Podcast for a casual conversation about IoT from a developer’s perspective.
