Asynchronous API with DynamoDB Streams

Best of Speed and Resilience

Vikas Solegaonkar
May 13 · 5 min read

Responsiveness is one of the most important parameters for the success of any web application, and asynchronous processing is the key to attaining it. A server request from the browser should return immediately, without waiting for completion. The data flow should be designed so that it does not depend on an immediate response from the server.

There are several architecture patterns for this. But a major problem with asynchronous processing is error handling. How would the client know if the request failed? We should not lose data in the process. In fact, a fire-and-forget service cannot afford to fail: even if the processing fails for some reason, the data has to reach the DB.

DynamoDB Streams provides a neat solution to this problem. Let’s check it out.

What is DynamoDB Streams?

Most traditional databases have a concept of triggers: events generated when data changes in the DB. DynamoDB Streams is quite similar, with one difference. Instead of generating a distinct trigger per data change, it generates a stream of events that flows into a target such as Lambda or Kinesis.

We can have a Lambda function triggered by such events, which can process this data. The incoming API call can directly dump the data into DynamoDB, using API Gateway service integration. This ensures a very low response time for the API. The DynamoDB stream can then be configured to invoke a Lambda function that processes each change asynchronously.

We have an advantage here — the data is already available in our DB before we start processing it. Even if the downstream processing fails, the data is already available in the DB. And in the unlikely event of DB insert/update failing, the API itself will return an error and the client will be able to handle it.

Thus, we have the best of both worlds: low response time as well as error resilience. Let us implement this in our AWS account.

Lambda Function

Start by creating a Lambda function. This is simple: go to the Lambda Console and create a new function. Call it StreamProcessor (or any other name you like). Add the below code to it:
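A minimal handler sketch in Python (the runtime, function name, and item attribute names here are assumptions, not from the original article). The stream delivers change records in DynamoDB’s attribute-value format, which the handler flattens before processing:

```python
import json


def lambda_handler(event, context):
    """Triggered by DynamoDB Streams: process each change record."""
    processed = []
    for record in event.get("Records", []):
        # Skip REMOVE events (e.g. items expired by TTL)
        if record.get("eventName") not in ("INSERT", "MODIFY"):
            continue
        # NewImage holds the item in DynamoDB attribute-value format,
        # e.g. {"id": {"S": "abc-123"}, "payload": {"S": "..."}}
        new_image = record["dynamodb"]["NewImage"]
        # Flatten {"S": "value"} wrappers into plain values
        item = {name: next(iter(value.values())) for name, value in new_image.items()}
        print("Processing item:", json.dumps(item))
        # ... real downstream processing goes here ...
        processed.append(item)
    return {"processed": len(processed)}
```

The return value is only for logging; a stream-triggered Lambda has no caller waiting on its response.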

IAM Role for Lambda

This Lambda function needs permissions to read from the DynamoDB stream, along with permissions to update the table. To grant these, create a new IAM Role.

Along with the usual Lambda permissions, include the below policy for the DB permissions:
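A policy sketch along these lines should work (the wildcard resources are for illustration; in practice, scope them down to your table’s and stream’s ARNs):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeStream",
        "dynamodb:ListStreams"
      ],
      "Resource": "arn:aws:dynamodb:*:*:table/*/stream/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem",
        "dynamodb:UpdateItem",
        "dynamodb:GetItem"
      ],
      "Resource": "arn:aws:dynamodb:*:*:table/*"
    }
  ]
}
```

The first statement is what lets Lambda poll the stream; the second covers the table updates our processing may need.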

DynamoDB Table

Next, we create the table in DynamoDB. This table should have a primary key: id. On the Exports & Streams tab, click Enable DynamoDB Streams (not Kinesis Data Streams).

Along with that, add the Lambda trigger to the stream. Click Create Trigger and choose the Lambda function we created above. That sets up the stream and the trigger.

Then, go to the Additional Settings tab. There, we can enable TTL (Time to Live), so that processed items expire automatically.

That will setup the DynamoDB table for us.
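The console steps above can be sketched as the parameters you would pass to boto3’s create_table and update_time_to_live calls (the table name, billing mode, and TTL attribute name are assumptions for illustration):

```python
# Keyword arguments for boto3's client("dynamodb").create_table(**table_spec)
table_spec = {
    "TableName": "AsyncRequests",  # assumed name; use your own
    "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
    "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],
    "BillingMode": "PAY_PER_REQUEST",
    # DynamoDB Streams (not Kinesis): emit the full new item after each change
    "StreamSpecification": {
        "StreamEnabled": True,
        "StreamViewType": "NEW_IMAGE",
    },
}

# Keyword arguments for update_time_to_live(); the chosen attribute ("ttl")
# must hold the expiry time as a Unix epoch number on each item
ttl_spec = {
    "TableName": "AsyncRequests",
    "TimeToLiveSpecification": {"Enabled": True, "AttributeName": "ttl"},
}
```

NEW_IMAGE is enough here because our Lambda only needs the item as it looks after the insert or update.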

API Gateway

Finally, we come to API Gateway, where we create the API that the client invokes. Here, we configure API Gateway to add the request directly into the database table. This is achieved using the AWS Service integration of API Gateway. Let’s work on that.

Create a new REST API in API Gateway and add a PUT method to it. Integrate the request with DynamoDB as below:
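The integration request settings would look roughly like this (the execution role is a separate role that API Gateway assumes; its ARN is your own):

```
Integration type : AWS Service
AWS Region       : (your region)
AWS Service      : DynamoDB
HTTP method      : POST        (service integrations always use POST)
Action           : PutItem
Execution role   : ARN of a role that allows dynamodb:PutItem on the table
```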

And add this to the JSON Mapping Template.
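A template along these lines should do it; the table name and the payload attribute name are assumptions, and $context.requestId is what supplies the unique key:

```json
{
  "TableName": "AsyncRequests",
  "Item": {
    "id": { "S": "$context.requestId" },
    "payload": { "S": "$util.escapeJavaScript($input.body)" }
  }
}
```

$util.escapeJavaScript guards against quotes in the request body breaking the generated JSON.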

For every API call, this will add a new item to the table, with the request ID as the key. The insert will trigger the Lambda function and carry the processing forward.

Benefits

One might ask: what is the big deal? Why not invoke the Lambda directly from API Gateway? There are two aspects to this. Foremost, response time: adding data to DynamoDB directly from API Gateway is much faster than invoking a Lambda, so the client application sees a very fast response.

Second, resilience and error handling get complex when we invoke the Lambda directly from API Gateway. What do we do if the Lambda fails halfway? Retry the same API? Invoke another API that heals the damage? That is too much intelligence to push into the client. It is best that the server manages this for itself.

The client just dumps data into DynamoDB and is assured that the data is accepted and will be processed by the server. It will not be lost, because it is already in the DB. Processing and error handling are then localized on the server, leading to a cleaner architecture.

Nerd For Tech

From Confusion to Clarification

NFT is an Educational Media House. Our mission is to bring the invaluable knowledge and experiences of experts from all over the world to the novice. To know more about us, visit https://www.nerdfortech.org/. Don’t forget to check out Ask-NFT, a mentorship ecosystem we’ve started

Written by Vikas Solegaonkar (https://blog.thewiz.net)
