AWS Lambda Invocation from RDS PostgreSQL with DLQ Integration

Shubham
Simform Engineering
9 min read · Oct 30, 2023

Efficiently handle RDS PostgreSQL events using AWS Lambda with DLQ

Have you thought about integrating AWS services to manage inventory and respond to critical events? Imagine a scenario where your inventory is running low and you need to send immediate email notifications to restock. The magic happens when an AWS Lambda function is triggered directly from an RDS PostgreSQL database to perform such follow-up actions.

This seamless serverless workflow not only helps you streamline inventory management but also ensures that no critical messages go unnoticed, thanks to the Dead Letter Queue (DLQ) integration.

The flowchart below shows the order in which all of these events take place.

It’s worth noting that, as of now, Lambda can be invoked directly from Amazon RDS for PostgreSQL and Amazon Aurora databases.

To embark on this learning spree, you’ll need a few prerequisites to ensure smooth sailing:

  • An AWS account with sufficient permissions to create resources.
  • DBeaver or a similar database client to connect to your Amazon RDS instance.
  • Basic knowledge of AWS services used in this blog: RDS, Lambda, and SQS.

Setting Up Amazon RDS to Connect with Database Client

Step 1: Create an Amazon RDS Instance

Let’s assume you already know how to create your own RDS PostgreSQL instance. If you want to connect your RDS instance to a local database client, follow steps 2 and 3; otherwise, continue from step 4.

Step 2: Allow Connection Requests from Database Client

To establish a connection between your RDS instance and the database client, ensure that “Public access” is set to “Yes” in the “Connectivity” section.

If you’re working with an existing RDS instance, you can easily modify the settings to grant public access and enable seamless connectivity. Update the security group of your RDS instance to allow inbound connections from your database client running on your PC.

Follow the steps below to allow the connection:

  1. Open the VPC security group under the “Connectivity & security” section.
  2. Open the security group.
  3. Edit the inbound rules.
  4. Click “Add rule” and add your IP address to allow access on port 5432, on which your PostgreSQL RDS instance is running.
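
If you prefer to script this step instead of using the console, here is a minimal boto3 sketch; the security group ID and IP address are placeholders you would replace with your own:

import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

# Placeholder values: replace with your RDS security group ID and your public IP
ec2.authorize_security_group_ingress(
    GroupId='sg-0123456789abcdef0',
    IpPermissions=[{
        'IpProtocol': 'tcp',
        'FromPort': 5432,
        'ToPort': 5432,
        'IpRanges': [{'CidrIp': '203.0.113.10/32',
                      'Description': 'Allow PostgreSQL from my PC'}],
    }],
)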

Step 3: Connect with the DBeaver Database Client

Connecting to your RDS instance from your local development environment is straightforward using the DBeaver database client:

  1. Open DBeaver and navigate to “Database” >> “New Database Connection”, or press Ctrl + Shift + N.
  2. Select your database type (in this case, PostgreSQL) and provide the necessary connection details, including the host, port, database name, username, and password. You can find the host address and port under the “Connectivity & security” section.
  3. Test the connection to ensure it is successful, and click “Finish”.

You’re now connected to your RDS database and can manage it conveniently using DBeaver.
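
If you’d also like to verify connectivity from code, here is a minimal sketch using psycopg2; the host and credentials below are placeholders:

import psycopg2

# Placeholder connection details: copy the host and port from the
# "Connectivity & security" section of your RDS instance.
conn = psycopg2.connect(
    host='your-instance.xxxxxxxx.us-east-1.rds.amazonaws.com',
    port=5432,
    dbname='postgres',
    user='postgres',
    password='your-password',
)

with conn.cursor() as cur:
    cur.execute('SELECT version();')
    print(cur.fetchone()[0])  # e.g., "PostgreSQL 15.x ..."
conn.close()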

Setting Up an AWS Lambda Function to Be Triggered from RDS

Step 4: Create an AWS Lambda Function

AWS Lambda enables you to run code without the need for server provisioning or management. Create a Lambda function to handle actions when inventory quantities reach zero and send email notifications.

For this blog, I have created a Lambda function with the name inventory-alert.

Follow the steps below to create your Lambda function:

  1. Navigate to AWS Lambda.
  2. Create a function with your desired runtime; we will use Python 3.10.

Step 5: Add AWS Lambda Invocation Role to RDS Instance

To invoke Lambda from an RDS for PostgreSQL DB instance, specific permissions are needed.

One way is to create an IAM policy that allows invoking Lambda functions. This policy can then be associated with a role that is subsequently linked to your DB instance.

Using this method grants your DB instance the necessary privileges to trigger the designated Lambda function.

The following steps will help you achieve this:

  1. Create a policy using the policy editor with the following permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAccessToInventoryAlertFunction",
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:921358061875:function:inventory-alert"
    }
  ]
}

  2. Give your policy a name. In this case, it is lambda-invocation.
  3. Create a role with a custom trust policy that allows the RDS service to assume it at runtime.
  4. Attach the lambda-invocation policy to that role.
  5. Add that role to the RDS instance. You can add it under the “Manage IAM roles” section. (A scripted sketch of these steps follows below.)
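
For reference, the same steps can also be scripted. A minimal boto3 sketch, in which the role name rds-lambda-invocation-role and the DB instance identifier are placeholders; the trust policy lets the RDS service assume the role:

import json
import boto3

iam = boto3.client('iam')
rds = boto3.client('rds', region_name='us-east-1')

# Custom trust policy allowing the RDS service to assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "rds.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

role = iam.create_role(
    RoleName='rds-lambda-invocation-role',          # placeholder name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Inline equivalent of the lambda-invocation policy from step 1
iam.put_role_policy(
    RoleName='rds-lambda-invocation-role',
    PolicyName='lambda-invocation',
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "lambda:InvokeFunction",
            "Resource": "arn:aws:lambda:us-east-1:921358061875:function:inventory-alert",
        }],
    }),
)

# Associate the role with the DB instance for the Lambda feature
rds.add_role_to_db_instance(
    DBInstanceIdentifier='my-postgres-instance',    # placeholder identifier
    RoleArn=role['Role']['Arn'],
    FeatureName='Lambda',
)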

Step 6: Lambda Invocation

To enable AWS Lambda integration with your RDS PostgreSQL DB instance, install the “aws_lambda” PostgreSQL extension. This extension gives your RDS for PostgreSQL DB instance the capability to invoke Lambda functions directly from SQL.

Follow these steps to test Lambda invocation from your DB instance:

  1. Install the aws_lambda extension.
CREATE EXTENSION IF NOT EXISTS aws_lambda CASCADE;

2. Test Lambda invocation with the following SELECT statement:

SELECT * FROM aws_lambda.invoke(
  aws_commons.create_lambda_function_arn('arn:aws:lambda:us-east-1:921358061875:function:inventory-alert'),
  '{"body": "Hello from Postgres!"}'::JSON
);

Here, use your own Lambda function ARN inside the aws_commons.create_lambda_function_arn() function. On success, the query returns a row containing the invocation’s status code (200) and payload.

Dead Letter Queue (DLQ) Configuration

Step 7: Create a Dead Letter Queue (DLQ)

A Dead Letter Queue (DLQ) is essential for handling errors and failed Lambda executions. To create a DLQ:

  1. Navigate to the AWS SQS (Simple Queue Service) service.
  2. Click “Create Queue” and select “Standard Queue.”
  3. Configure the queue settings, including the queue name (we will be using inventory-alert-dlq) and other relevant attributes, or you can keep the default.
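
The same queue can be created programmatically; a minimal boto3 sketch with default attributes:

import boto3

sqs = boto3.client('sqs', region_name='us-east-1')

# Create a standard queue with default attributes
response = sqs.create_queue(QueueName='inventory-alert-dlq')
print(response['QueueUrl'])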

Step 8: Set a DLQ Policy on the Existing Lambda Function Role

To allow the Lambda function to write messages to the Dead Letter Queue (DLQ), add the necessary permissions to its existing IAM role’s policy. This can be achieved with the steps below:

  1. In the function’s configuration section, scroll down to “Permissions” and select your role.
  2. Select “Add permissions” > “Create inline policy” on the existing role.
  3. Add the SendMessage policy using the policy editor:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowLambdaToSendMessageToDLQ",
      "Effect": "Allow",
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:us-east-1:921358061875:inventory-alert-dlq"
    }
  ]
}

  4. Name the policy. Here, we have named it sqs-send-message.
  5. The final result is the sqs-send-message policy attached to the function’s execution role.

Step 9: Set Up a Dead Letter Queue (DLQ) for AWS Lambda

AWS Lambda automatically retries failed function invocations with settings for maximum retries and retry intervals.

If all retries are exhausted, or if the function fails, the event can be sent to a Dead Letter Queue (DLQ) for further analysis or manual handling. It helps diagnose and address failure causes in serverless applications.

Note that DLQs work only with asynchronous invocations; custom error handling is needed for synchronous invocations.

To set the DLQ, follow the steps below:

  1. In the function’s configuration section, scroll down to “Asynchronous invocation” and click “Edit”.
  2. Set the retry attempts (between 0 and 2), select Amazon SQS as your DLQ service, and choose your DLQ.
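
The same configuration can be applied with boto3: DeadLetterConfig attaches the DLQ used for failed asynchronous invocations, and the event invoke config caps the retry attempts. A minimal sketch:

import boto3

lambda_client = boto3.client('lambda', region_name='us-east-1')

# Point failed asynchronous invocations at the DLQ
lambda_client.update_function_configuration(
    FunctionName='inventory-alert',
    DeadLetterConfig={
        'TargetArn': 'arn:aws:sqs:us-east-1:921358061875:inventory-alert-dlq'
    },
)

# Limit automatic retries for asynchronous invocations (0-2)
lambda_client.put_function_event_invoke_config(
    FunctionName='inventory-alert',
    MaximumRetryAttempts=2,
)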

Let’s put everything into practice and integrate all services. Suppose you have an inventory management system in your RDS database, and you want to send email notifications when the inventory quantity of a product reaches zero:

For this scenario, let’s use an inventory table with the following table schema and records:

CREATE TABLE inventory (
  product_id SERIAL PRIMARY KEY,
  product_name VARCHAR(255) NOT NULL,
  quantity INTEGER,
  price DECIMAL(10, 2)
);

INSERT INTO inventory (product_name, quantity, price) VALUES
('Product A', 100, 19.99),
('Product B', 50, 29.99),
('Product C', 75, 14.95),
('Product D', 200, 9.99);

Now, create an after-update trigger that invokes the inventory-alert lambda function.

CREATE OR REPLACE FUNCTION invoke_lambda_on_zero_quantity()
RETURNS TRIGGER AS $$
BEGIN
  IF NEW.quantity = 0 THEN
    -- Replace the ARN below with the actual ARN of your Lambda function
    PERFORM aws_lambda.invoke(
      aws_commons.create_lambda_function_arn('arn:aws:lambda:us-east-1:921358061875:function:inventory-alert'),
      CONCAT('{"product_name": "', NEW.product_name, '"}')::json,
      'Event'
    );
  END IF;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;



CREATE TRIGGER check_quantity_zero
AFTER UPDATE OF quantity ON inventory
FOR EACH ROW
EXECUTE FUNCTION invoke_lambda_on_zero_quantity();

For this scenario, let’s use the inventory-alert Lambda function with a custom raise exception statement to simulate a failure. You can add your own logic to send emails (this can be achieved with the AWS SES service using boto3; a sketch follows the handler below).

import json

def lambda_handler(event, context):
    # Add your logic to send email here; for this demo we raise an
    # exception so the invocation fails and the event lands in the DLQ.
    raise Exception('email service is unavailable')

    # Unreachable while the exception above is raised; kept to show
    # what a successful response would look like.
    return {
        'statusCode': 200,
        'body': json.dumps('Email sent successfully')
    }
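
As mentioned above, the actual email logic could use SES through boto3. A minimal sketch, assuming both addresses are verified in SES (the addresses and wording are placeholders):

import boto3

ses = boto3.client('ses', region_name='us-east-1')

def send_restock_email(product_name):
    # Both sender and recipient must be verified in SES (placeholder addresses)
    ses.send_email(
        Source='alerts@example.com',
        Destination={'ToAddresses': ['inventory-team@example.com']},
        Message={
            'Subject': {'Data': f'Restock needed: {product_name}'},
            'Body': {'Text': {'Data': f'{product_name} is out of stock.'}},
        },
    )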

We have already configured inventory-alert-dlq so that all failure events are sent to the DLQ.

To invoke the Lambda function, update the “quantity” column in your RDS inventory table using SQL statements. For example:

UPDATE inventory 
SET quantity = 0
WHERE product_id = 3;

The after-update trigger configured on the “inventory” table will detect the zero quantity and asynchronously invoke the Lambda function.

If the Lambda function encounters any errors, it will send failure events to the configured DLQ for further analysis and troubleshooting.

In our case, we are raising a custom exception for each Lambda call, so after all retry attempts are exhausted, the invocation is sent to inventory-alert-dlq, and we can check the message count on the SQS dashboard.
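
You can also check the count from code; a minimal boto3 sketch (note that the reported count is approximate):

import boto3

sqs = boto3.client('sqs', region_name='us-east-1')

queue_url = sqs.get_queue_url(QueueName='inventory-alert-dlq')['QueueUrl']
attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=['ApproximateNumberOfMessages'],
)
print(attrs['Attributes']['ApproximateNumberOfMessages'])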

In practice, DLQ events are handled manually or routed back to the same Lambda, or to some other Lambda, based on business logic.

Here, let’s route those messages to another Lambda function named handling-inventory-alert-dlq, which will print the event message so we can see how a DLQ message is structured.

This can be achieved as follows (a sketch of the consumer and trigger setup appears after these steps):

  1. Create a Lambda function named handling-inventory-alert-dlq with a print(event) statement in its lambda_handler function.
  2. Update the execution role policy with the following SQS policy so the function can receive messages from SQS and delete them once processed (in this case, the consumer is the handling-inventory-alert-dlq Lambda function):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowLambdaToConsumeMessageFromDLQ",
      "Effect": "Allow",
      "Action": [
        "sqs:DeleteMessage",
        "sqs:ReceiveMessage",
        "sqs:GetQueueAttributes"
      ],
      "Resource": "arn:aws:sqs:us-east-1:921358061875:inventory-alert-dlq"
    }
  ]
}

  3. Configure an SQS Lambda trigger to forward messages to the consumer (in our case, the handling-inventory-alert-dlq Lambda function).
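
A minimal sketch of the handling-inventory-alert-dlq handler, along with the trigger setup from step 3 done via boto3 (the batch size is an illustrative choice). Each SQS record’s body carries the original event, and Lambda typically places error details such as ErrorMessage into the message attributes:

import json

def lambda_handler(event, context):
    # Each record wraps one message taken off the DLQ
    for record in event['Records']:
        original_event = json.loads(record['body'])   # the payload Lambda failed on
        print('Failed event:', original_event)
        # Lambda-added error details (e.g., ErrorMessage) usually appear here
        print('Message attributes:', record.get('messageAttributes', {}))

# One-time trigger setup (step 3), run outside the handler:
# import boto3
# boto3.client('lambda').create_event_source_mapping(
#     EventSourceArn='arn:aws:sqs:us-east-1:921358061875:inventory-alert-dlq',
#     FunctionName='handling-inventory-alert-dlq',
#     BatchSize=1,   # illustrative choice
# )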

Note: Ensure that the execution time of your Lambda function is set to a duration shorter than the SQS visibility timeout. This timeout is the window during which a message retrieved from an SQS queue remains “invisible” to other consumers.

To prevent the message from being processed by another Lambda instance or consumer, it’s crucial that your Lambda function processes and deletes the message from the queue within this visibility timeout period.
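
If you need to raise the queue’s visibility timeout to satisfy this, a minimal sketch (120 seconds is an illustrative value):

import boto3

sqs = boto3.client('sqs', region_name='us-east-1')

queue_url = sqs.get_queue_url(QueueName='inventory-alert-dlq')['QueueUrl']
sqs.set_queue_attributes(
    QueueUrl=queue_url,
    Attributes={'VisibilityTimeout': '120'},  # seconds, illustrative value
)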

So, now whenever the DLQ receives a message, it will redirect it to the desired consumer based on the configured trigger. And since we already had one message in our DLQ, it will be consumed by the handling-inventory-alert-dlq function, which prints the event to its CloudWatch logs.

By following these comprehensive steps, you can build a resilient serverless application on AWS, effectively managing inventory and gracefully handling errors.

Whether you’re a seasoned AWS developer or just getting started, these techniques will help you create robust and reliable serverless applications.

Stay tuned for more AWS and serverless tutorials with Simform Engineering!
