AWS Lambda with Fetch, Lodash, and an S3 JSON File

Rick Delpo
7 min read · Dec 11, 2021


My Lambda Adventure, by Rick Delpo, 12-11-21

This year (2021) I went serverless using AWS Lambda with S3 as my data table. OMG, I ditched Java and jQuery in favor of Lambda!

I love Lambda, but I don't especially love the learning curve to get a full app going. Spoiler alert: you can test some simple Lambda code right inside the Lambda console to drastically reduce the learning curve, or you can write a full-stack app, which is what I am doing.

Recently (2021) I was playing around with React. I needed a change because I am an old dinosaur and have been using jQuery for years. Soon in, I noticed this serverless craze out there in Dev Land. Since I already had an Apache24 instance on AWS EC2, I started getting curious about how I could use ONLY Lambda and S3, so I could eliminate EC2 and my clunker database.

I dug up lots of Lambda tutorials and immediately found out how to manipulate a Dynamo NoSQL database on AWS. This NoSQL idea started nagging me because I am very old school and have been using MySQL for eons. I was also playing around with Axios on the front end as my HTTP client.

Then it dawned on me that data is always rendered in JSON and that NoSQL is also JSON. But using Dynamo was getting to me, so I thought I would cut to the chase and do NoSQL with JSON directly in S3. Why not make S3 my database? I thought. There are more reasons not to do this than to do it, but I am experimenting here.

I also ran into rendering problems in React. Array.reduce was taking too long on large tables. Hellooo… introducing latency and cold starts. At first I got really discouraged with Lambda because of latency, but I found a way around it all, and my SQL background helped immensely here. I discovered that Lodash was a more user-friendly way to write reduce code. The objective here is to have a nice reduced view of my data, a dashboard view so to speak, a much condensed view of the data. But to do all this with Dynamo became unwieldy and complex, not to mention pricey. So I thought of a down-and-dirty S3 approach where I insert new data, save, fetch, reduce, and save again, all in the background with a small Lambda program.
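Since this Lodash reduce is the heart of the idea, here is a minimal standalone sketch of the kind of condensing I mean (the sample data and field names here are just for illustration; the real version appears in the Lambda below):

//condense raw hit records into a per-country dashboard view
const lodash = require('lodash');

const hits = [
  { country: 'US', ip: '1.1.1.1' },
  { country: 'US', ip: '2.2.2.2' },
  { country: 'US', ip: '1.1.1.1' },
  { country: 'FR', ip: '3.3.3.3' },
];

const view = lodash(hits)
  .groupBy('country') //{ US: [...], FR: [...] }
  .map((rows, country) => ({
    country: country,
    unique_users: lodash.uniq(lodash.map(rows, 'ip')).length, //distinct IPs per country
    hits: rows.length //total records per country
  }))
  .value(); //unwrap the lodash chain into a plain array

console.log(view); //[ { country: 'US', unique_users: 2, hits: 3 }, { country: 'FR', unique_users: 1, hits: 1 } ]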

Since I was new to Lambda, this project seemed very daunting at first. But I was not new to ad hoc requests from my teammates at work. When they request a certain view of the data, they simply want it on a dashboard, and they never seem to care how I make this magic work. They just want it, and they want it now. Since these ad hoc requests almost always involve just one table that constantly gets updated and re-rendered, I figured S3 and Lambda would do the trick, and it turns out that I was right about this.

At the same time, though, I was finding React kind of complicated, so I opted for just a plain vanilla JavaScript front end. Excuse me, all of you out there, but sometimes, based on the requirement, we just need to break some rules. Also, PS: I was using React without Node anyway, via the Babel library. There was another insidious thing going on too: Babel was setting a cookie, and I had just spent the last 2 years trying to be cookie free.

By now, folks, I am sure you just want to see some code, so here is my Lambda right now.

Update: it is now 2023, and the below code does not work in Node 18 unless you include the aws-sdk in a deployment package, which is a major pain in the butt. The below code does work in Node 16, where the aws-sdk library is native (it was originally written in Node 14). PS: passing the JavaScript variables using Python in Lambda still uses the SDK (the boto3 library) natively, without a deployment package, unlike Node 18. See the end of this page for the Lambda Python version, but if you go the Python route we cannot use Lodash, which is a Node.js library. It is now 2024, and upon review of this article my recommendation is for a total rewrite, but use Python in AWS Lambda because it is much easier. At this point it is best to start searching for newer Lambda tutorials. Sorry about having to say this, but things just get outdated.
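For what it's worth, the Node 18 runtime bundles the v3 SDK (@aws-sdk/client-s3) and a global fetch, so a rough, untested sketch of the same read-and-write in v3 style would look something like this (same bucket and key as the code below):

//rough sketch for the Node 18+ runtime: SDK v3 is bundled and fetch is global, so no deployment package
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({});

export const handler = async (event) => {
  //read the current working json file over https, as in the Node 16 version below
  const res = await fetch('https://rickd.s3.us-east-2.amazonaws.com/tracker2.json');
  const array = await res.json();
  array.push({ country: event.country2, page_name: event.page }); //abbreviated; the full field list is below

  //write it back with the v3 command-style API
  await s3.send(new PutObjectCommand({
    Bucket: 'rickd',
    Key: 'tracker2.json',
    Body: JSON.stringify(array),
    ContentType: 'application/json',
  }));
};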

//demo of passing javascript params into an s3 json file, acting as my database
//this is a Node.js Lambda
//we need to upload 2 modules, Lodash and node-fetch
const AWS = require('aws-sdk');
const fetch = require('node-fetch');
const lodash = require('lodash');
const s3 = new AWS.S3();
//note: the 2 node modules were zipped into the same folder, called nodes2.zip, using 7zip, then imported

exports.handler = async (event) => {
  //get current working json file in s3
  const res = await fetch('https://rickd.s3.us-east-2.amazonaws.com/tracker2.json');
  const array = await res.json();

  //pass user geo javascript variables in from the event
  array.push({
    country: event.country2,
    session: event.ses,
    page_name: event.page,
    hit: event.hit2,
    ip: event.ip2,
    time_in: event.time2,
    time_out: event.time3
  });

  //S3 bucket params: pass the fetch result, with the pushed update, into the body
  var params = {
    Bucket: 'rickd',
    Key: 'tracker2.json',
    Body: JSON.stringify(array),
    ContentType: 'application/json',
  };
  //save the above data back to the s3 json file
  await s3.upload(params).promise();

  //reduce the above array into a new view using lodash
  var result = lodash(array)
    .groupBy('country')
    .map((rows, country) => ({
      country: country,
      unique_users: lodash.uniq(lodash.map(rows, 'ip')).length,
      hits: lodash.size(rows) //using size reduces each group to one value
    }))
    .value(); //unwrap the lodash chain into a plain array

  //pass result into a sort-descending function
  let sortedInput = result.slice().sort((a, b) => b.unique_users - a.unique_users);

  //this time the bucket params pass the lodash result into a new json file called view.json
  var params2 = {
    Bucket: 'rickd',
    Key: 'view.json',
    Body: JSON.stringify(sortedInput),
    ContentType: 'application/json',
  };
  //save the reduced data back to the s3 view.json file
  await s3.upload(params2).promise();
  //return sortedInput; //only return this if testing, otherwise not needed
};
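To try this right inside the Lambda console, a test event just needs the fields the handler reads; something like this (all values made up):

{
  "country2": "US",
  "ses": "abc123",
  "page": "home",
  "hit2": "12-11-2021",
  "ip2": "1.2.3.4",
  "time2": "10:00:00",
  "time3": "10:05:00"
}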

Then there are all the little caveats and obstacles you need to know about to actually get this all up and running in AWS.

Examples of gotchas and caveats:

1 In Lambda we need to import 2 Node libraries (node-fetch and Lodash)

2 we also need an API Gateway in AWS to provide a link to Lambda

3 we need a JSON file in S3, and we need to know how to write to this file

4 beware of AWS IAM permissions, as they are always a problem for beginners (see the minimal policy sketch after this list)

5 Newsflash!! BTW we also need AWS-style SSL and a CloudFront distribution

6 and DNS pointed to AWS Route 53… had enough yet?
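On point 4, here is a minimal sketch of the kind of IAM policy the Lambda's execution role needs so it can read and write the bucket objects (the bucket name comes from the code above; the exact policy you need may differ):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::rickd/*"
    }
  ]
}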

Hold on there, pardner!!! Way too much information… there is no way I can teach all of this without going on a massive tangent, and it is overwhelming at first, but we gain much satisfaction after crossing this hurdle because it catapults us right into modern-day 2020s serverless knowhow.

Remember, this is all just to get some Lambda going. But don't you as a developer want to be cutting edge? I do, and for me this is all in the rear view mirror now. Take a deep breath and convince yourself that this endeavor is a must do.

BTW, all the above is on a Windows platform. I don't have a clue about doing this in Linux or whatever else is out there.

Now for my front-end code:

//Frontend, this is just a snippet of my code... at some point I can provide the whole tutorial
passVars(); //run passVars, then on each page click run passVars again to capture the page name

function passVars() {
  //pass the objects (ses, hit2, etc.) to Lambda and call event.ses inside Lambda... note: the object must be called article
  //passing my geo vars from jsonp
  const article = { ses: session, city2: city, hit2: date, page: pg_name, ip2: ip_address, country2: country, time2: now, time3: out };

  //this is an API Gateway frontend link
  axios.post("https://xx8pchcrt4.execute-api.us-east-2.amazonaws.com/default/lodash2", article)
    .then(res => {
      //console.log(res.data);
    });
  //this axios is for user time in... see onpopstate for user time out... this is all part of my Geo Tracking app
}
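The time-out half is not shown above. Purely as a hypothetical sketch (this handler and its payload are my invention, not the original code), an onpopstate hook could post the exit time with sendBeacon, which survives page navigation better than a normal axios call:

//hypothetical sketch only: capture user time out when the visitor navigates away
window.onpopstate = function () {
  const out = new Date().toISOString(); //assumed timestamp format
  navigator.sendBeacon(
    "https://xx8pchcrt4.execute-api.us-east-2.amazonaws.com/default/lodash2",
    JSON.stringify({ time3: out })
  );
};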

More to come soon!

Original content can be found at https://javasqlweb.org

or Google Rick Delpo and find me out there on the Web

Happy Coding!

Below is Python code to accomplish the same thing in Lambda, added November 2023.

import json
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    #get the current working json file from s3
    resp = s3.get_object(Bucket='rickd', Key='tracker.json')
    content = resp['Body']
    records = json.loads(content.read())  #named records to avoid shadowing the built-in list
    #append the fields passed in from the frontend event
    records.append({"date": event['hit2'], "country": event['country2'], "page_name": event['page']})
    #save the updated list back to the same s3 json file
    s3.put_object(Bucket='rickd', Key='tracker.json', ContentType='application/json', Body=json.dumps(records).encode())

Note: the above frontend JavaScript code needs to execute for this to work, and a minimal list object must first be created and stored as a .json file in S3 in order for this append Python code to work.
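For example, the seed can be as small as an empty JSON array, or one starter record matching the fields above; save something like this as tracker.json in the bucket:

[
  {"date": "2023-11-01", "country": "US", "page_name": "home"}
]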

In Lambda, the above Python code will work without uploading anything, but if you still want to use Fetch in Node.js, follow the steps below:

below added Nov 29, 2023

Assuming we have Node installed globally in Windows, open a cmd prompt and type the below steps (might need to tweak a bit if they are not 100% accurate... not sure):

1 mkdir nodejs
2 cd into this nodejs folder
3 npm init -y
4 npm install node-fetch
(if you want another module, do a second entry)
5 npm install lodash
in the nodejs folder you will now have 3 items: 1. node_modules 2. package.json 3. package-lock.json
6 go to your favorite zip tool and select these 3 items we created in the nodejs folder
7 then zip them into 1 file called nodejs.zip
8 while in Lambda, press 'upload from' while in the Code tab
9 then press 'zip file', locate the nodejs.zip we just created, and press enter
note: this part needs to be done before creating the index.js file, otherwise the current contents will be erased
10 after uploading, open the node_modules folder in the Lambda file directory and you will see the lodash and node-fetch modules there
11 next, create your index.js or index.mjs depending on which Node version you are using
also note that to upload a zip it must be less than 10 MB in size or you will need a separate deployment package
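Once the modules are uploaded, here is a minimal sketch of an index.mjs that just proves both imports resolve (my sketch, not the original code; recent node-fetch versions are ESM-only, which is one reason to use .mjs and import):

//minimal sketch: confirm the uploaded node-fetch and lodash modules load
import fetch from 'node-fetch';
import lodash from 'lodash';

export const handler = async () => {
  //read the same working json file used earlier in this article
  const res = await fetch('https://rickd.s3.us-east-2.amazonaws.com/tracker2.json');
  const array = await res.json();
  //return a single number just to prove lodash is usable
  return { rows: lodash.size(array) };
};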

Thanks again for reading!!

For more about Rick Delpo, click here.
