Sending Email Notifications from Snowflake Using External Functions

Making Sure the Right People Know When There is a Data Quality Issue

Jalindar Karande
Hashmap, an NTT DATA Company
6 min read · Jul 12, 2021


“We want to send an email notification when something fails in a data load, transformation, etc. Is it possible to send an email alert in such cases from Snowflake without the use of ETL tools?”

This question comes up all the time with Snowflake customers. The demand motivated me to build an email alerting capability through the use of an external function in Snowflake that seamlessly integrates with SQL and helps send email alerts without the need for any ETL tools.

This article explains the process through which I developed this external function as well as sample use cases. I hope that engineers and architects will find this information useful and can adopt the external function approach as a part of their data pipeline design.

More details about deploying external functions with Snowflake are presented by Venkatesh S. here. Chris Hastie's step-by-step guide, presented here, is a nice demonstration of deploying Snowflake external functions.

In addition to the steps provided in the demonstrations above, you will need to perform the following to build this email alerting capability:

  • Add permissions to use SES to the IAM role that you want to use for the Lambda function execution.
  • Verify your Amazon SES identity.
  • Create a Lambda function with logic for reading parameters from Snowflake, sending an email, and returning the status.
  • Create the Snowflake external function.
  • Move out of the Amazon SES sandbox.

Let’s step through each of these in detail.

Add permissions to use SES

1. Create an IAM policy using the JSON policy editor. Paste this JSON policy document into the policy editor:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ses:SendEmail",
                "ses:SendRawEmail"
            ],
            "Resource": "*"
        }
    ]
}

2. Attach the created policy to the IAM role that you want to use for the Lambda function execution.

Verify your Amazon SES identity

  1. Open the Amazon SES console at https://console.aws.amazon.com/ses/
  2. Use the region selector to choose the AWS Region where you want to verify the email address.
  3. Under Identity Management, choose Email Addresses.
  4. Click on Verify a New Email Address.
  5. In the Verify a New Email Address dialog box, type the email address in the Email Address field and then click on Verify This Email Address.
  6. Check the inbox for the email address that you’re verifying; you will find a verification link.
  7. Click on the verification link to complete the identity verification process.

Create a Lambda function

The following code snippet shows the Lambda function for reading input from Snowflake, sending an email, and sending a response to Snowflake. Snowflake puts all inputs in the data section of the JSON request body with each input row as a separate array element. In every row, the first element indicates the row number of the input. The response from the Lambda function to Snowflake is also expected to be in a similar format.

import json
import boto3

def lambda_handler(event, context):
    AWS_REGION = "<AWS REGION OF VERIFIED EMAIL ID>"
    SENDER = "<VERIFIED EMAIL ID>"
    client = boto3.client('ses', region_name=AWS_REGION)

    # Snowflake puts all input rows in the "data" array of the request body.
    payload = json.loads(event["body"])
    rows = payload["data"]

    results = []
    for row in rows:
        # The first element is the row number; the remaining elements are
        # the function arguments: RECIPIENT, SUBJECT, BODY_TEXT.
        RECIPIENT = row[1]
        SUBJECT = row[2]
        BODY_TEXT = row[3]
        # Provide the contents of the email.
        client.send_email(
            Destination={
                'ToAddresses': [RECIPIENT],
            },
            Message={
                'Body': {
                    'Text': {'Data': BODY_TEXT},
                },
                'Subject': {'Data': SUBJECT},
            },
            Source=SENDER,
        )
        results.append([row[0], "Notification Sent!"])

    # Snowflake expects one response row per input row, in the same format.
    return {
        'statusCode': 200,
        'body': json.dumps({"data": results})
    }
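The request/response contract can be exercised locally without AWS by building the per-row response Snowflake expects; a minimal sketch (the sample addresses and the `build_response` helper are hypothetical, not part of the Lambda above):

```python
import json

def build_response(rows):
    # Echo each input row number back with a status string, wrapped in a
    # "data" array, mirroring the format Snowflake expects from the Lambda.
    return {
        "statusCode": 200,
        "body": json.dumps({"data": [[r[0], "Notification Sent!"] for r in rows]}),
    }

# A request body shaped like Snowflake's: row number first, then the
# RECIPIENT, SUBJECT, and BODY_TEXT arguments.
event = {"body": json.dumps({"data": [
    [0, "alice@example.com", "Test Subject", "Test Description"],
    [1, "bob@example.com", "Another Subject", "Another Description"],
]})}

rows = json.loads(event["body"])["data"]
response = build_response(rows)
print(json.loads(response["body"]))
# {'data': [[0, 'Notification Sent!'], [1, 'Notification Sent!']]}
```

Note that the row numbers in the response line up with the input rows; Snowflake uses them to match results back to the calling query.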

Test case for the Lambda Function:

Every new Amazon SES account is placed in the sandbox, which restricts us to sending emails only to verified email addresses. So, every test case should use a verified email address as the recipient. The following code snippet shows a sample test case:

{
  "body": "{ \"data\": [[ 0, \"verified email id\", \"Test Subject\", \"Test Description\" ]]}"
}
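Because the "body" value is itself a JSON string, hand-writing the backslash escapes is error-prone. A small sketch that generates an equivalent test event with `json.dumps` (the placeholder recipient is illustrative):

```python
import json

# Serialize the inner payload first, then embed it as a string so that
# json.dumps handles the escaping of the nested quotes.
inner = json.dumps({"data": [[0, "verified email id", "Test Subject", "Test Description"]]})
test_event = {"body": inner}
print(json.dumps(test_event, indent=2))
```

Pasting the printed output into the Lambda console gives a correctly escaped test event.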

Create the Snowflake external function

The following code snippet shows the creation of an external function with three input parameters: RECIPIENT, SUBJECT, and BODY_TEXT.

create or replace external function sendNotification
(RECIPIENT varchar, SUBJECT varchar, BODY_TEXT varchar)
returns variant
api_integration = <API INTEGRATION CREATED>
as '<API_GATEWAY_ENDPOINT HERE>';

Moving out of the Amazon SES sandbox

When an SES account is out of the sandbox, you can use that account to send an email to any recipient, regardless of whether the recipient’s address or domain is verified. However, you still need to verify all identities that you use as “From”, “Source”, “Sender”, or “Return-Path” addresses.

The complete procedure to remove your account from the sandbox is presented here.

Some Sample Use Cases

  1. Notifying stakeholders if critical data is out of the normal range.

Example 1 — Notify stakeholders about payment_failures beyond the threshold

Select 
(case when count(payment_failure) > 10000 then
sendNotification('stakeholders email group',
'Increase in Payment Failure',
'More details here')
else null end)
from payment_status_table;

Example 2 — Notify stakeholders about total_sales below the threshold

Select 
(case when sum(sales_amount) < 100000 then
sendNotification('stakeholders email group',
'Drop in sales',
'More details here')
else null end)
from sales_table;

2. Notifying errors/exceptions in a data pipeline to developers/stakeholders.

create or replace procedure sample_stored_proc(par1 int)
returns string
language javascript
strict
execute as owner
as
$$
<some code here>
try
{
<some code here>
}
catch (err)
{
// A JavaScript string literal cannot span lines, so build the SQL text
// with concatenation.
snowflake.execute({
    sqlText: "select sendNotification('email ID', " +
             "'Failure in Stored Procedure', '" + err + "')"
});
<some code here>
$$
;

3. Sending data pipeline failures from SnowSQL

snowsql -o exit_on_error=True -f somefile.sql
if [ $? != 0 ]; then
snowsql -q "select sendNotification('stakeholders email group',
'Failure in Data Pipeline',
'More details here')"
fi

Final Thoughts

Knowing that there are issues, knowing what those issues are, and knowing when they happen can be a game-changer. I hope you find this method of leveraging Snowflake external functions useful and that you’ll consider implementing these use cases in your data pipeline design. If you have any recommendations or needs for other external functions, reach out. We enjoy finding solutions to improve your Snowflake environment.

Ready to Accelerate Your Digital Transformation?

At Hashmap, we work with our clients to build better, together.

If you are considering moving data and analytics products and applications to the cloud or if you would like help and guidance and a few best practices in delivering higher value outcomes in your existing cloud program, then please contact us.

Hashmap, an NTT Data Company, offers a range of enablement workshops and assessment services, cloud modernization and migration services, and consulting service packages as part of our Cloud service offerings. We would be glad to work through your specific requirements.

Feel free to share on other channels, and be sure and keep up with all new content from Hashmap here. To listen in on a casual conversation about all things data engineering and the cloud, check out Hashmap’s podcast Hashmap on Tap as well on Spotify, Apple, Google, and other popular streaming apps.


Jalindar Karande is a Lead Data Cloud Engineer with Hashmap, an NTT DATA Company, providing Data, Cloud, IoT, and AI/ML solutions and consulting expertise across industries with a group of innovative technologists and domain experts accelerating high-value business outcomes for our customers.

Be sure and connect with Jalindar on LinkedIn and reach out for more perspectives and insight into accelerating your data-driven business outcomes.
