Day 6: Feeding source attribution data back to Segment.com

Gijs Nelissen
Solving Marketing Attribution
5 min read · May 2, 2020

So, in the first five days we created and deployed some Lambda functions that:

  1. store incoming Segment events in a DynamoDB table (webhook)
  2. store the mapping between anonymousId and userId in a DynamoDB table (webhook)
  3. process the URL and HTTP referrer to detect where the traffic comes from, using this GitHub library
  4. allow us to query that data (for debugging purposes) by userId or anonymousId

Additionally, I have imported a 3-year archive of page() and identify() calls so I don't have to start from scratch.

Next up: Feeding this data back to segment.com.

feeding information back to segment

Triggering the right segment events

Over time we want to trigger those events in real time, but I will start by providing an endpoint to trigger them manually so we can check in Segment, Mixpanel and Customer.io whether everything is coming in.

In the first version of the code below I was using the Segment analytics-node SDK (https://github.com/segmentio/analytics-node). It worked fine when triggered from the command line (node script.js), but with the Lambda/async architecture I found that some calls were never sent. Then I found this bug, so I switched to the HTTP Tracking API, specifically the batch endpoint, to feed my events.
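The gist with that change isn't embedded here, but a minimal sketch of calling the batch endpoint looks roughly like this. I'm assuming axios and the helper name sendBatch; the write key goes in as the Basic auth username with an empty password:

// lib/segment.js — minimal sketch, not the exact file from the gist
const axios = require('axios');

// Send an array of Segment messages in one request via the HTTP Tracking API.
async function sendBatch(messages) {
  return axios.post(
    'https://api.segment.io/v1/batch',
    { batch: messages },
    // The write key is the Basic auth username; the password stays empty.
    { auth: { username: process.env.ANALYTICS_WRITE_KEY, password: '' } }
  );
}

module.exports = { sendBatch };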

For now we'll start with 2 attribution models: first touch and last touch. To allow the different reporting tools to use that information in their reports, we will write user traits using identify().

The information coming out of the visitor source detection (see Day 3 — Identify User Source) looks like this
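The embed with the raw output isn't included here; as an illustration (the exact keys depend on the detection library from Day 3), it's an object along these lines:

// Illustrative only — field names are my guess, not the exact output from Day 3
{
  medium: 'search',
  source: 'google',
  referrer: 'https://www.google.com/',
  campaign: null
}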

I'm not sure if Segment supports nested properties, but I tried it in the past and didn't get it to work. A quick Google search taught me that it's better to flatten the properties. As we're sending these properties for both first and last touch, the end result will look something like this:
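The original embed isn't reproduced here, but with the source_first_ / source_last_ prefixes used later in the post, the flattened traits would look something like this (values illustrative):

// Illustrative flattened traits — the same keys, prefixed per attribution model
{
  source_first_medium: 'search',
  source_first_source: 'google',
  source_first_referrer: 'https://www.google.com/',
  source_last_medium: 'email',
  source_last_source: 'newsletter',
  source_last_referrer: null
}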

This is the file that does most of the heavy lifting. I'm not too proud of this file, but I told you I'm not the best JS programmer. It works though 😊
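That file isn't embedded here either; a rough sketch of the idea (helper names like flatten and identifyWithAttribution are mine, not from the original file) combines the flattening with the batch helper sketched above:

// lib/attribution.js — rough sketch, not the original file
const { sendBatch } = require('./segment');

// Turn { medium: 'search', source: 'google' } into { source_first_medium: 'search', ... }
function flatten(prefix, source) {
  return Object.entries(source || {}).reduce(
    (traits, [key, value]) => ({ ...traits, [`${prefix}_${key}`]: value }),
    {}
  );
}

// Write the first/last touch data as user traits via identify()
async function identifyWithAttribution(userId, firstTouch, lastTouch) {
  const traits = {
    ...flatten('source_first', firstTouch),
    ...flatten('source_last', lastTouch),
  };
  return sendBatch([{ type: 'identify', userId, traits }]);
}

module.exports = { flatten, identifyWithAttribution };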

I have refactored the codebase a bit to use dotenv in combination with serverless. Here are some of the changes:

npm install serverless-dotenv-plugin --save

Create an .env file

STAGE=dev
REGION=eu-west-1
SERVICE_NAME=sma

IDENTIFY_TABLE=sma-dev-identify-event
PAGE_TABLE=sma-dev-page-event
ATTRIBUTION_TABLE=sma-dev-source-attribution
USER_MAP_TABLE=sma-dev-user-anonymous-map

ANALYTICS_WRITE_KEY=[YOUR_ANALYTICS_WRITE_KEY]
ANALYTICS_SOURCE_IDENTIFICATION_EVENT="Source Identified"

And update the serverless.yml file like this:

serverless.yml
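The full file isn't shown here, but the relevant part is along these lines (the runtime and exact keys are assumptions on my part); this is also what produces the "DOTENV: Loading environment variables" lines in the deploy output further down:

service: ${env:SERVICE_NAME}

plugins:
  - serverless-dotenv-plugin

provider:
  name: aws
  runtime: nodejs12.x
  stage: ${env:STAGE}
  region: ${env:REGION}
  environment:
    IDENTIFY_TABLE: ${env:IDENTIFY_TABLE}
    PAGE_TABLE: ${env:PAGE_TABLE}
    ATTRIBUTION_TABLE: ${env:ATTRIBUTION_TABLE}
    USER_MAP_TABLE: ${env:USER_MAP_TABLE}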

And another serverless.yml file for the segment handlers:

segment.serverless.yml
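Again the embed itself isn't reproduced; the functions section roughly matches the endpoints listed below (the handler paths are my assumption):

functions:
  identifyUser:
    handler: segment/handler.identifyUser
    events:
      - http:
          path: segment/identify/user/{id}
          method: get
  trackUser:
    handler: segment/handler.trackUser
    events:
      - http:
          path: segment/track/user/{id}
          method: get
  trackAnonymous:
    handler: segment/handler.trackAnonymous
    events:
      - http:
          path: segment/track/anonymous/{id}
          method: get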

Notice how the handlers are now in different files. To make the segment part work, create the following handlers (all in one gist file).
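That gist isn't embedded here; a stripped-down sketch of one of those handlers (the lookup helper getAttribution is hypothetical) could look like this:

// segment/handler.js — stripped-down sketch, not the original gist
const { identifyWithAttribution } = require('../lib/attribution');
const { getAttribution } = require('../lib/store'); // hypothetical DynamoDB lookup

module.exports.identifyUser = async (event) => {
  const userId = event.pathParameters.id;
  const { firstTouch, lastTouch } = await getAttribution({ userId });
  await identifyWithAttribution(userId, firstTouch, lastTouch);
  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
};

// trackUser and trackAnonymous look very similar, but send a
// track('Source Identified') message instead of an identify().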

A quick sls deploy will result in a few new endpoints:

Serverless: DOTENV: Loading environment variables from .env:
Serverless: - STAGE
Serverless: - REGION
Serverless: - SERVICE_NAME
Serverless: - IDENTIFY_TABLE
Serverless: - PAGE_TABLE
Serverless: - ATTRIBUTION_TABLE
Serverless: - USER_MAP_TABLE
Serverless: - ANALYTICS_WRITE_KEY
Serverless: - ANALYTICS_SOURCE_IDENTIFICATION_EVENT
Service Information
service: sma
stage: prod
region: eu-west-1
stack: sma-prod
resources: 62
api keys:
None
endpoints:
GET - $ENDPOINT_PROD/api/anonymous/{id}
GET - $ENDPOINT_PROD/api/user/{id}
POST - $ENDPOINT_PROD/events
GET - $ENDPOINT_PROD/segment/identify/user/{id}
GET - $ENDPOINT_PROD/segment/track/anonymous/{id}
GET - $ENDPOINT_PROD/segment/track/user/{id}
functions:
getAnonymous: sma-prod-getAnonymous
getUser: sma-prod-getUser
processPage: sma-prod-processPage
processIdentify: sma-prod-processIdentify
storeEvent: sma-prod-storeEvent
identifyUser: sma-prod-identifyUser
trackAnonymous: sma-prod-trackAnonymous
trackUser: sma-prod-trackUser
layers:
None

Let me explain the endpoints:

  • POST — $ENDPOINT_PROD/events → Segment sends every event payload here and we store it in DynamoDB
  • GET — $ENDPOINT_PROD/segment/identify/user/{id} → Fire an identify() call with source_first_* and source_last_* properties
  • GET — $ENDPOINT_PROD/segment/track/anonymous/{id} → Fire track('Source Identified') calls with flattened visitor detection properties for anonymous users
  • GET — $ENDPOINT_PROD/segment/track/user/{id} → Fire track('Source Identified') calls with flattened visitor detection properties for identified visitors

Here is a quick command to update your endpoint variable so you can run curl/HTTPie commands after deploying or destroying the serverless application:

export ENDPOINT_PROD=$(sls info --verbose --stage=prod | grep ServiceEndpoint | sed s/ServiceEndpoint\:\ //g | awk '{print $1}')

Note: I changed the variable name from BASE_DOMAIN to ENDPOINT_PROD.

Testing User Properties (with attribution data)

Let’s trigger the identify for the user with the new attribution data:

http GET $ENDPOINT_PROD/segment/identify/user/[YOUR_USER_ID]

Watching the Segment event debugger, you'll see identify() calls coming in:

identify() payload in segment

If all is well, you will see that information appear in any integrations you have linked up, in our case Customer.io and Mixpanel:

Profile in Customer.io

Source Identified Events

For other reports (funnels, flow diagrams, …) it will be useful to emit events whenever a visitor's source is detected.

Let’s trigger those calls for the user with the new attribution data:

http GET $ENDPOINT_PROD/segment/track/user/[YOUR_USER_ID]

Watch the Segment debugger and you'll see the new events coming in (you can set the event name in the .env file):

Source Identified events

This is possible because I have imported all of the old events. Every event is sent with a historic timestamp property, which makes Segment process it as an old event. More information in Segment's Importing Historic Events documentation.
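For example, a replayed track message can carry a timestamp like this (values illustrative), which tells Segment to treat it as a historical event rather than one happening now:

// Illustrative batch message with a historical timestamp
{
  type: 'track',
  userId: '1234',
  event: 'Source Identified',
  properties: {
    source_last_medium: 'search',
    source_last_source: 'google'
  },
  timestamp: '2018-07-14T09:30:00.000Z' // when the visit actually happened
}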

This will make a profile in Mixpanel look like this. Notice both the event information on the left, and the user trait information on the right.

Profile in mixpanel

That's it. Tomorrow we'll explore how to fire those track and identify calls for the last months of visits, and how to report on this newly available information.

Previous posts:

Day 0: Why am I doing this?
Day 1: The Plan

Day 2: Capturing and Storing segment events
Day 3: Analysing the Referrer (upcoming)
Day 4: Running in production + API
Day 5: Importing Historic Data
Day 6: Feeding attribution back to segment
Day 7: Trigger webhooks + Visitor Source Reporting (mixpanel)

Gijs Nelissen
Solving Marketing Attribution

Belgian Techie. Builder. Bootstrapper. Dad x3. Entrepreneur. Smarty pants. Passionate about the web & technology. Founder of @prezly