AWS Lambda: Auto-Backup Toggl Time Sheets to S3

AWS Lambda is a powerful tool in the Amazon Web Services (AWS) suite that lets you execute Node.js/Python/Java functions without having to think about the underlying servers. You only pay for what you use — this can make AWS Lambda much cheaper than a normal EC2 server, which runs continuously (and continuously costs money). Additionally, AWS Lambda comes with quite a generous monthly free tier. The pricing overview can be found here: https://aws.amazon.com/lambda/pricing

Lambda functions are executed via certain triggers — in this post, I will focus on the CloudWatch Scheduled Events type, probably the simplest one. It triggers a function on a specific schedule, e.g. every week, every day, every minute, … This trigger type makes AWS Lambda very appealing as a backup solution — every day, do:

  1. Download the data from the service you want to back up
  2. Save the data to S3 (Amazon Simple Storage Service)

I chose Toggl, a free time tracking tool I currently use for my freelancing activities, as the first service to back up regularly. I think this is a valid use case, as losing one’s time sheets would result in a loss of money. Luckily, Toggl provides a nice API (we use the Reports API) to export its data programmatically.

Step 1: Writing the Backup Script (Node.js)

AWS Lambda offers a rudimentary web-based code editor to create new functions — but in this editor, you cannot use any npm package except the AWS SDK and ImageMagick (see Lambda Execution Environment and Available Libraries).

Therefore, we will develop the function on our local machine, install the required npm packages, zip our code together with the node_modules folder and upload this ZIP file. As AWS Lambda uses the Node.js runtime v4.3.2, I suggest using the same version on your local machine. I recommend nvm (Node Version Manager) to install this version alongside the version you normally use for other projects.

Backup Script Code

Fill in your own information in config.demo.js and rename it to config.js, then run npm install to fetch the required packages. Zip these two files together with node_modules and you are ready to go!
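For reference, config.js could look roughly like this — a minimal sketch; the field names here are my own choices, not prescribed by Toggl or AWS:

```javascript
// Sketch of config.js — adjust field names to whatever your script expects.
module.exports = {
  togglApiToken: 'YOUR_TOGGL_API_TOKEN', // found on your Toggl profile page
  workspaceId: 123456,                   // the Toggl workspace to export
  bucket: 'YOUR_BUCKET'                  // the S3 bucket created below
};
```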

Notice that we did not need to specify an S3 API key: our Lambda function will get access to S3 automatically via the AWS role policies we will configure later.
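Here is a minimal sketch of that upload step (uploadReport and backupKey are hypothetical helper names): on Lambda, the AWS SDK automatically picks up temporary credentials from the function’s execution role, so the S3 client is constructed without any keys:

```javascript
// Key under which today's backup is stored, e.g. "2016-08-15/details.json"
function backupKey(date) {
  return date.toISOString().slice(0, 10) + '/details.json';
}

// Sketch: uploading the downloaded report to S3 without explicit credentials.
function uploadReport(bucket, report, callback) {
  const AWS = require('aws-sdk'); // preinstalled in the Lambda runtime
  const s3 = new AWS.S3();        // no keys — role credentials are used
  s3.putObject({
    Bucket: bucket,
    Key: backupKey(new Date()),
    Body: JSON.stringify(report)
  }, callback);
}
```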

Possible Feature Enhancements

  • Handle pagination: Toggl’s Detailed Report only returns 50 time entries at once and references the next page with the next 50 entries. Right now, we only download the first page — our script should download subsequent pages and concatenate them into one file.
  • Export PDF reports, too (just add a .pdf to the API endpoint)
  • Only save a backup if entries were added or changed since the last one
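The pagination point can be sketched out. Toggl’s Detailed Report responses carry total_count and per_page fields alongside the data array, so a loop can keep requesting pages until everything is collected. The names fetchAllPages and fetchPage are hypothetical; fetchPage stands in for the actual HTTP call:

```javascript
// Sketch: collecting all pages of the Detailed Report. fetchPage(page, cb)
// must yield a response with `data` (entries), `per_page` and `total_count`.
function fetchAllPages(fetchPage, callback) {
  const entries = [];
  function next(page) {
    fetchPage(page, function (err, res) {
      if (err) return callback(err);
      entries.push.apply(entries, res.data);   // append this page's entries
      if (page * res.per_page < res.total_count) {
        next(page + 1);                        // more entries remain
      } else {
        callback(null, entries);               // all pages concatenated
      }
    });
  }
  next(1);
}
```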

Step 2: Create an S3 Bucket

Go to the S3 Management Console, click Create Bucket and choose an appropriate Bucket Name and Region. This will be the bucket where your backups are going to be stored.

Step 3: Creating the AWS Lambda Function

Now it is time to create our Lambda function! Go to the Lambda Management Console and click Create a Lambda function. Skip the Select blueprint section, select CloudWatch Events — Schedule as Trigger and configure how often your function should be run (I set “rate(1 day)”). Do not enable this trigger yet. Next, at the Configure function section, select Node.js 4.3 as runtime, Upload a .ZIP file as code entry type and upload your ZIP file.

Step 4: Setting the Correct Role Policies

As I am not that familiar with AWS configuration yet, I found the role configuration the hardest part. It is important to set the role and its policies correctly to give our function write access to our S3 bucket.

In the Lambda function handler and role -> Role setting, you should choose Create a custom role. A popup will open where you need to choose Create a new IAM Role as IAM Role. Click Allow. Then select this newly created role from Choose an existing role in your main window.

You are not finished with roles yet! Right now, our function only has the rights to create log events, not to save anything to S3. To fix that, open the IAM Management Console in a new window, click Roles and select the newly created role. We have to create a new Inline Policy for S3 access. Click Create Role Policy, select Custom Policy and enter the following policy configuration (as the policy name, I chose “S3-toggl-backups-access”):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::YOUR_BUCKET"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::YOUR_BUCKET/*"
      ]
    }
  ]
}

Don’t forget to replace “YOUR_BUCKET” with your bucket’s name.

Step 5: Finalizing our Lambda Function

As for the Advanced Settings, we only need to increase the Timeout, which limits our execution time (the function gets aborted if it has not finished by then). The backup likely won’t take much longer than 2–5 seconds — to be safe, I selected “1 min”.

That’s all! After reviewing your final function, you can test whether everything is working as expected (in your S3 bucket, a new folder with today’s date and a file details.json should be created). If this is the case, you should enable the CloudWatch Events — Schedule trigger. From now on, your Lambda function will get called automatically and create backups — you do not need to worry about data loss anymore! :-)

I hope you liked my AWS Lambda tutorial! This is just a glimpse of what you can achieve with it — much more automation is possible. And at the same time, you keep your budget low, without having to run a dedicated server for this small function. :-)

You can find the full repository here: