Exporting DynamoDB Data to Excel using Lambda with s3 on AWS
Working with DynamoDB on AWS is relatively straightforward, especially if you’re familiar with other cloud database services (like Firebase). However, one of the first issues that I ran into with DynamoDB happened when I tried to preview more than 100 records at a time…
Unfortunately, you can’t view more than 100 records at once in the console. This limit keeps console queries cheap and the UI responsive, but for previewing larger sets of data within our DynamoDB Table we’ll need a different system.
Simple enough… We could implement a variety of solutions to address this minor drawback (a custom API, a third-party tool, etc.). However, for this article we’ll focus instead on a basic approach: use AWS Lambda to read our DynamoDB Table data and then save it as an Excel spreadsheet to an s3 bucket.
Build Lambda
Let’s first create the IAM Role for our Lambda and attach the Policies it needs: read access to our DynamoDB Table and write access to our s3 bucket (for example, the AmazonDynamoDBReadOnlyAccess and AmazonS3FullAccess managed policies):
Now we can create our Lambda and select our IAM Role:
After that we can write our Lambda function, which reads in our DynamoDB data, converts it to an XLSX-formatted buffer, and then saves a timestamped file to our s3 bucket:
Now let’s install our packages and zip the project:
> npm ci && zip -r ../backupUsers.zip .
Next we’ll upload the zip file:
Let’s adjust the Timeout to 1 minute (the default 3 seconds may not be enough for a full table scan):
Then click Save and let’s test it out…
If all went according to plan we should see a success message (above) and have a new file saved to our s3 bucket:
Boom! 💥
Automation » CRON + NodeMailer
We can automate this process by adding a CRON schedule and some mailer code to our Lambda function:
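The mailer portion could look something like this sketch, which emails the workbook buffer as an attachment through SES via Nodemailer. The sender and recipient addresses are placeholders (SES requires verified identities), and it assumes the `nodemailer` package alongside `aws-sdk` v2:

```javascript
// Hypothetical addition to the Lambda: after the XLSX buffer is built,
// email it as an attachment using Nodemailer's SES transport.
// The from/to addresses and region below are placeholders.
const sendBackupEmail = async (buffer, filename) => {
  // Required lazily so this sketch loads without the packages installed.
  const AWS = require('aws-sdk');
  const nodemailer = require('nodemailer');

  const transporter = nodemailer.createTransport({
    SES: new AWS.SES({ region: 'us-west-2' }),
  });

  await transporter.sendMail({
    from: 'backups@example.com',
    to: 'team@example.com',
    subject: `Users Table backup: ${filename}`,
    text: 'Latest DynamoDB Users export attached.',
    attachments: [{ filename, content: buffer }],
  });
};
```

Calling `sendBackupEmail(buffer, key)` right after the `putObject` in our handler would deliver the same file that lands in the bucket.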
Now let’s create our CRON schedule (an EventBridge rule; to fire Mondays at 12:01 AM PST, the UTC-based expression is cron(1 8 ? * MON *)):
Also, don’t forget to attach the AmazonSESFullAccess Policy to our Lambda’s IAM Role, and update the region key in the code sample.
Conclusion
Our Lambda will now send an email every Monday at 12:01 AM PST with the latest Excel spreadsheet of our DynamoDB Users Table attached.
Liked the article? Questions? Ideas? Feel free to reach out!