Using AWS Lambda Scheduled Tasks to Scale DynamoDB

Quodlibet
Oct 11, 2015

At AWS re:Invent 2015, Amazon announced scheduled execution of Lambda functions.

This functionality is very helpful for scheduling changes to the provisioned throughput of DynamoDB tables, in scenarios where your load is predictable based on the time of day.

This is a tutorial on how to set up a scheduled Lambda function that automatically scales your DynamoDB tables' throughput based on a configurable schedule.

The schedule itself lives in a properties file on S3. For each table that you want to scale, you maintain a schedule of when the throughput should change.

Here is an example of the configuration file to scale two tables:
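The original embedded file is no longer available, so the exact format is an assumption. A hypothetical reconstruction, using an INI-style layout with one section per table, hour ranges as keys, and `read/write` capacity as values:

```ini
[example_scaler]
0-7 = 1/1
8-19 = 10/10
20-23 = 1/1

[example_scaler2]
0-4 = 1/1
5-23 = 10/10
```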

Download the file here.

As an example, I created these two tables with a read and write throughput of 1.

The code (GitHub):

The configuration file can be read from S3:
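The original snippet is gone; here is a minimal sketch in Python with boto3, assuming an INI-style schedule file with one section per table (the function names `parse_config` and `load_config` are mine, not from the original code):

```python
import configparser


def parse_config(text):
    """Parse an INI-style schedule: one section per table,
    keys are hour ranges ("0-7"), values are "read/write" capacity."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    return parser


def load_config(bucket, key):
    """Fetch the schedule file from S3 and parse it."""
    import boto3  # imported here so parse_config works without AWS deps
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    return parse_config(body.decode("utf-8"))
```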

Depending on the current hour, get the desired read and write throughput from the configuration:
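A sketch of that lookup, assuming the hour-range/`read/write` schedule format described above (the helper name `throughput_for_hour` is hypothetical):

```python
def throughput_for_hour(schedule, hour):
    """Given a table's schedule ({"0-7": "1/1", ...}) and the current
    hour (0-23), return the desired (read, write) capacity as a tuple,
    or None if no range covers that hour."""
    for hours, capacity in schedule.items():
        start, end = (int(h) for h in hours.split("-"))
        if start <= hour <= end:
            read, write = (int(c) for c in capacity.split("/"))
            return read, write
    return None
```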

The actual scaling method:
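The original method is on GitHub; a boto3 sketch of the same idea follows. It skips the call when the table is already at the requested capacity (DynamoDB rejects a no-op throughput update), matching the "Requested throughput equals current throughput" line in the execution log. The function names are mine:

```python
def needs_update(current, read, write):
    """Compare a table's current ProvisionedThroughput to the target."""
    return (current["ReadCapacityUnits"] != read
            or current["WriteCapacityUnits"] != write)


def scale_table(table_name, read, write):
    """Bring one table's provisioned throughput to the target values."""
    import boto3
    dynamodb = boto3.client("dynamodb")
    table = dynamodb.describe_table(TableName=table_name)["Table"]
    current = table["ProvisionedThroughput"]
    if not needs_update(current, read, write):
        print("Requested throughput equals current throughput")
        return
    print("Requested read/write : %d/%d" % (read, write))
    print("Current read/write : %d/%d" % (current["ReadCapacityUnits"],
                                          current["WriteCapacityUnits"]))
    response = dynamodb.update_table(
        TableName=table_name,
        ProvisionedThroughput={
            "ReadCapacityUnits": read,
            "WriteCapacityUnits": write,
        },
    )
    print("Status :", response["TableDescription"]["TableStatus"])
```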

Create the Lambda function:
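The original handler is in the GitHub repo; here is a self-contained sketch of what it plausibly looks like. The bucket and key names are placeholders, the INI-style schedule format is an assumption, and the schedule hours are interpreted as UTC:

```python
import configparser
from datetime import datetime, timezone

# Location of the schedule file on S3 -- placeholder names
CONFIG_BUCKET = "my-config-bucket"
CONFIG_KEY = "dynamodb-scaling.properties"


def handler(event, context):
    """Scheduled Lambda entry point: read the schedule from S3 and bring
    every listed table to its desired capacity for the current hour."""
    import boto3
    hour = datetime.now(timezone.utc).hour
    print("Scaling for hour : %02d" % hour)

    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=CONFIG_BUCKET, Key=CONFIG_KEY)["Body"].read()
    config = configparser.ConfigParser()
    config.read_string(body.decode("utf-8"))

    dynamodb = boto3.client("dynamodb")
    for table in config.sections():
        print(table)
        for hours, capacity in config[table].items():
            start, end = (int(h) for h in hours.split("-"))
            if not (start <= hour <= end):
                continue
            read, write = (int(c) for c in capacity.split("/"))
            current = dynamodb.describe_table(
                TableName=table)["Table"]["ProvisionedThroughput"]
            if (current["ReadCapacityUnits"],
                    current["WriteCapacityUnits"]) == (read, write):
                # DynamoDB rejects no-op throughput updates
                print("Requested throughput equals current throughput")
            else:
                response = dynamodb.update_table(
                    TableName=table,
                    ProvisionedThroughput={"ReadCapacityUnits": read,
                                           "WriteCapacityUnits": write},
                )
                print("Status :",
                      response["TableDescription"]["TableStatus"])
            break
```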

The results of the first execution:

Scaling for hour : 05
example_scaler
Requested throughput equals current throughput
example_scaler2
Requested read/write : 10/10
Current read/write :1/1
Status : UPDATING

Effect on the DynamoDB table:

Results of test execution of lambda function

To schedule the Lambda function to run every hour, create an event source with a schedule:
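The screenshots below show this being done in the Lambda console. For reference, a rough AWS CLI equivalent via CloudWatch Events is sketched here; the rule name, function name, account ID, and region in the ARNs are all placeholders:

```shell
# Hourly schedule rule
aws events put-rule \
    --name hourly-dynamodb-scaler \
    --schedule-expression "rate(1 hour)"

# Allow CloudWatch Events to invoke the function
aws lambda add-permission \
    --function-name dynamodb-scaler \
    --statement-id hourly-dynamodb-scaler \
    --action lambda:InvokeFunction \
    --principal events.amazonaws.com \
    --source-arn arn:aws:events:us-east-1:123456789012:rule/hourly-dynamodb-scaler

# Point the rule at the function
aws events put-targets \
    --rule hourly-dynamodb-scaler \
    --targets Id=1,Arn=arn:aws:lambda:us-east-1:123456789012:function:dynamodb-scaler
```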

Create a new event source
Add a scheduled event source to the lambda function

Get the complete code on GitHub.

If you find this interesting, or have any suggestions or questions, let me know by writing a response below!
