Day 2: Capture Segment events

Gijs Nelissen
Solving Marketing Attribution
4 min read · Apr 22, 2020

Today I only want to deploy some simple code to capture and store track() and identify() calls.

To keep it as simple as possible I will be using an AWS Lambda function and DynamoDB. I can probably get started in less than an hour saving these raw events in a data store for analysis a little later.

Setting up the project

This project will be using JavaScript and Lambda, so I assume you know the basics around npm and git. Make sure you have Node 10.19.0 (I use nvm for that); the project might work on other versions, but use 10.19.0 just to be sure.

mkdir solving-marketing-attribution && cd solving-marketing-attribution
git init
npm init
nvm use 10.19.0
npm install -g serverless
npm install body-parser aws-sdk serverless-http express --save
npm install serverless-offline --save-dev

To simplify deployment and speed up the overall development we’ll be using serverless (which is awesome) and Amazon Web Services. Check out the getting started guide.

Log into your AWS account and find (or create) your access keys; there is another guide for that. After that, tell Serverless to use them:

sls config credentials --provider aws --key YOURKEY --secret YOURSECRET

All done. Let’s get started.

Testing Serverless

Add this basic boilerplate to serverless.yml

serverless.yml
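The embedded gist doesn't render here, but a minimal serverless.yml consistent with the deploy output further down (service name, region, and the POST /hello endpoint) might look like this. The handler name `index.handler` is an assumption:

```yaml
service: solving-marketing-attribution

provider:
  name: aws
  runtime: nodejs10.x
  region: eu-west-1

functions:
  hello:
    handler: index.handler
    events:
      - http:
          path: hello
          method: post

plugins:
  - serverless-offline
```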

And this very basic code to index.js

index.js
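The index.js embed is missing here as well. A minimal sketch, assuming the Express + serverless-http setup from the installed packages and a handler export named `handler`, could be:

```javascript
// index.js — an Express app wrapped with serverless-http so Lambda can run it.
const serverless = require('serverless-http');
const express = require('express');
const bodyParser = require('body-parser');

const app = express();
app.use(bodyParser.json());

// Matches the POST /hello endpoint from the deploy output
app.post('/hello', (req, res) => {
  res.json({ message: 'Hello from Lambda!' });
});

// serverless-http translates API Gateway events into Express requests
module.exports.handler = serverless(app);
```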

Now let’s try it

sls deploy

If all is good you will get some output with your endpoint:

Serverless: Packaging service...
Serverless: Excluding development dependencies...
Serverless: Service files not changed. Skipping deployment...
Service Information
service: solving-marketing-attribution
stage: dev
region: eu-west-1
stack: solving-marketing-attribution-dev
resources: 11
api keys:
None
endpoints:
POST - https://[YOURURL]/dev/hello
functions:
hello: solving-marketing-attribution-dev-hello
layers:
None

After which you can try to hit that endpoint using a POST request

curl -X POST https://[FILLINURL].com/dev/hello

Voilà! We have a working boilerplate for the project. I use HTTPie myself rather than curl, so for me that would be:

http POST https://[FILLINURL].com/dev/hello

Let’s keep going!

The database — Dynamo

Honestly, I started with Postgres (using Prisma). It worked, but it was a lot harder to explain: you need to model the database schema, set up an RDS instance and manage migrations.

For the sake of learning something new (I have not done anything with DynamoDB yet) and keeping this guide simple, I am going with AWS DynamoDB.

For today we need two tables to store track() and identify() calls. Update your serverless.yml file to this:

serverless.yml

Nothing special here: create two tables, add some permissions and set some environment variables to use later in the storing code.
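Since the config embed is missing here, this is roughly what those additions to serverless.yml could look like. The table names, key attribute, and the environment variable names (`TRACKS_TABLE`, `IDENTIFIES_TABLE`) are my assumptions, not necessarily the exact config:

```yaml
provider:
  name: aws
  runtime: nodejs10.x
  region: eu-west-1
  environment:
    TRACKS_TABLE: sma-tracks
    IDENTIFIES_TABLE: sma-identifies
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:PutItem
        - dynamodb:Query
      Resource:
        - Fn::GetAtt: [TracksTable, Arn]
        - Fn::GetAtt: [IdentifiesTable, Arn]

resources:
  Resources:
    TracksTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:provider.environment.TRACKS_TABLE}
        BillingMode: PAY_PER_REQUEST
        AttributeDefinitions:
          - AttributeName: messageId
            AttributeType: S
        KeySchema:
          - AttributeName: messageId
            KeyType: HASH
    IdentifiesTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:provider.environment.IDENTIFIES_TABLE}
        BillingMode: PAY_PER_REQUEST
        AttributeDefinitions:
          - AttributeName: messageId
            AttributeType: S
        KeySchema:
          - AttributeName: messageId
            KeyType: HASH
```

Segment's `messageId` works nicely as a partition key because it is unique per event.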

Hit sls deploy and watch Serverless create the resources for you. Under the hood it uses CloudFormation, so serverless.yml doubles as documentation of the resources you need. It's pretty awesome.

If you log in to your AWS account and go to DynamoDB you should now see something like this:

Created tables. SMA = Solving Marketing Attribution

Storing Events (using lambda)

Now the only thing we need to do is create a controller that accepts the body of a Segment webhook and stores it in the newly created tables.
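The controller code embed is missing here; a sketch of what it could look like, assuming the Express setup from earlier, an `/events` route (matching the HTTPie calls below), and `TRACKS_TABLE`/`IDENTIFIES_TABLE` environment variables:

```javascript
// Stores incoming Segment webhook payloads in DynamoDB.
const AWS = require('aws-sdk');
const express = require('express');
const bodyParser = require('body-parser');
const serverless = require('serverless-http');

const dynamoDb = new AWS.DynamoDB.DocumentClient();
const app = express();
app.use(bodyParser.json());

// Segment sends the call type in the payload: "identify", "track", "page", ...
const tableFor = (event) =>
  event.type === 'identify'
    ? process.env.IDENTIFIES_TABLE
    : process.env.TRACKS_TABLE;

app.post('/events', async (req, res) => {
  const event = req.body;
  try {
    await dynamoDb.put({ TableName: tableFor(event), Item: event }).promise();
    // Play the event back so we can see what was stored
    res.status(200).json(event);
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: 'Could not store event' });
  }
});

module.exports.handler = serverless(app);
```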

Hit deploy again. You can do it faster by deploying only the function, which skips the CloudFormation setup.

sls deploy function --function=hello

Mucho faster!

Deploying and testing

To test if everything is working, create some fake JSON files with identify() and track() calls. I grouped them in a folder /events.

The easiest way to get those is to log into your Segment account, click a source and go to the debugger.

Click `RAW` and copy the event into a json file

If you’re too lazy you can borrow some of mine (I took out the IP address and other sensitive info).

events/identify.json
events/page.json
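The embedded files don't render here; a trimmed Segment identify() payload looks roughly like this (all values are made up, and real payloads carry more context fields):

```json
{
  "type": "identify",
  "messageId": "ajs-a1b2c3d4e5",
  "anonymousId": "5f2c9a7e-1234-4f6a-9c0d-aabbccddeeff",
  "userId": "user_123",
  "traits": {
    "email": "jane@example.com",
    "name": "Jane Doe"
  },
  "context": {
    "page": {
      "url": "https://example.com/pricing",
      "referrer": "https://www.google.com/"
    }
  },
  "timestamp": "2020-04-22T10:00:00.000Z"
}
```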

Now to test this using HTTPie use this:

http POST https://YOURURL/dev/events < events/identify.json

It should play back the event for you like this:

Output of the command

A 200 status code means that it’s now stored in DynamoDB. Some more tricks:

I use a variable to store the endpoint host:

export BASE_DOMAIN=https://YOURURL/dev
echo $BASE_DOMAIN
http POST $BASE_DOMAIN/events < events/identify.json
http POST $BASE_DOMAIN/events < events/track.json

From now on I will use this shorter syntax.

If you want to add console.log statements to your code and see their output, use the following command:

sls logs --function=hello -t | tr '\r' '\n'

The part after the pipe works around a known bug with line endings in Node 10.

See you tomorrow!

Posts in this series:

Day 0: Why am I doing this?
Day 1: The Plan

Day 2: Capturing and Storing segment events
Day 3: Analysing the Referrer (upcoming)
Day 4: Running in production + API
Day 5: Importing Historic Data
Day 6: Feeding attribution back to segment
Day 7: Trigger webhooks + Visitor Source Reporting (mixpanel)

