Build a personalized newsletter with AWS cloud services and ElasticSearch

Pierre Cavalet
Oct 15, 2019 · 4 min read

For a French version of this article, click here.

I am part of a team that develops a content platform which is reached mainly through our newsletter. The platform is built on a number of cloud services, some of which are used to generate the newsletter.

We’ll go through the newsletter’s requirements and then dive into its implementation.

Requirements

  1. Our users are interested in different topics, and our content is very specific. We need to be able to send each user content that is relevant to them, based on the information we have. We don’t want to send a generic newsletter that would be of little interest to a user looking for specific content in their field.

Architecture

We use multiple services. The following diagram shows how they interact with each other.

  1. We trigger an AWS Lambda from our back-office through API Gateway.
  2. This Lambda pushes messages to an SQS queue, one per newsletter batch.
  3. An SQS-listening Lambda consumes each message, asks ElasticSearch for the most relevant articles for each user, and sends the personalized email.
  4. The result of each batch is written to DynamoDB so we can track what happened.

That’s a lot of things to do, but thankfully it’s also just a chain of small tasks to perform. What is great about this architecture is that everything is broken down into simple steps, so you can easily change one part without breaking everything else.
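To make the chain concrete, here is a minimal TypeScript sketch of what the SQS-listening Lambda could look like. The message shape and the helper functions are hypothetical, introduced only for illustration, not our actual code.

```typescript
import type { SQSEvent } from 'aws-lambda';

interface Article {
  id: string;
  title: string;
  url: string;
}

// Hypothetical message shape: one SQS message per newsletter batch.
interface NewsletterBatchMessage {
  batchId: string;
  userIds: string[];
}

// Hypothetical helpers, assumed to exist elsewhere in the project.
declare function getRecommendedArticles(userId: string): Promise<Article[]>;
declare function sendNewsletterEmail(userId: string, articles: Article[]): Promise<void>;
declare function logBatchResult(batchId: string, sentCount: number): Promise<void>;

export const handler = async (event: SQSEvent): Promise<void> => {
  for (const record of event.Records) {
    const batch: NewsletterBatchMessage = JSON.parse(record.body);

    for (const userId of batch.userIds) {
      // Ask ElasticSearch for the articles that best match this user.
      const articles = await getRecommendedArticles(userId);

      // Build and send the personalized email.
      await sendNewsletterEmail(userId, articles);
    }

    // Record what happened for this batch (stored in DynamoDB).
    await logBatchResult(batch.batchId, batch.userIds.length);
  }
};
```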

Initial trigger

The architecture uses a back-office that triggers a Lambda through API Gateway to push the SQS messages. We chose this because it was simple for us, but you can use anything you want to push the messages (a script, webhooks, etc.).
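For illustration, a minimal sketch of such a trigger Lambda could look like the following. The queue URL, batch size, and request payload are assumptions made for the example.

```typescript
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { SQS } from 'aws-sdk';

const sqs = new SQS();
// Hypothetical queue URL, read from the environment in this sketch.
const QUEUE_URL = process.env.NEWSLETTER_QUEUE_URL as string;
const BATCH_SIZE = 50; // arbitrary batch size for illustration

export const handler = async (event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> => {
  // Assumption: the back-office sends the list of user ids to contact.
  const { userIds } = JSON.parse(event.body || '{}') as { userIds: string[] };

  // Split users into batches and push one SQS message per batch.
  for (let i = 0; i < userIds.length; i += BATCH_SIZE) {
    const batch = userIds.slice(i, i + BATCH_SIZE);
    await sqs
      .sendMessage({
        QueueUrl: QUEUE_URL,
        MessageBody: JSON.stringify({ batchId: `batch-${i / BATCH_SIZE}`, userIds: batch }),
      })
      .promise();
  }

  return {
    statusCode: 200,
    body: JSON.stringify({ batches: Math.ceil(userIds.length / BATCH_SIZE) }),
  };
};
```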

Handling personalization

This is a brief explanation of how we use ElasticSearch as a recommendation system. If you want the full details, check out my article on How we built a reversible recommendation system using ElasticSearch.

The idea is that we define a list of criteria and we order them. For example:

  1. the user has not read the article
  2. the article matches one of the user’s topics of interest

We then use the ElasticSearch function_score query to give a weight to each criterion.

When we send a query to ElasticSearch, the response contains the articles ordered by score, along with each article’s score. This score can be reversed to recover the criteria that matched.
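Here is a rough sketch of such a query with the official ElasticSearch JavaScript client. The index name, field names, criteria, and weights are invented for the example; the linked article describes the real setup.

```typescript
import { Client } from '@elastic/elasticsearch';

// Hypothetical connection: adjust to your own cluster.
const client = new Client({ node: 'http://localhost:9200' });

// Each criterion gets its own filter and weight. The weights here are
// distinct powers of two, chosen for the example.
async function searchArticlesForUser(userTopics: string[], readArticleIds: string[]) {
  const { body } = await client.search({
    index: 'articles', // hypothetical index name
    body: {
      query: {
        function_score: {
          query: { match_all: {} },
          functions: [
            // Criterion 1: the user has not read the article yet.
            { filter: { bool: { must_not: { ids: { values: readArticleIds } } } }, weight: 2 },
            // Criterion 2: the article matches one of the user's topics.
            { filter: { terms: { topics: userTopics } }, weight: 1 },
          ],
          score_mode: 'sum',     // add the weights of the criteria that match
          boost_mode: 'replace', // ignore the base relevance score
        },
      },
    },
  });
  return body.hits.hits;
}
```

Because the weights in this sketch are distinct powers of two, a score of 3 can only mean that both criteria matched, which is the idea behind reversing the score back into criteria.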

Final thoughts

Using Lambda and an SQS queue to handle our newsletter process worked out well for us. Writing to DynamoDB allows us to know what happened for each newsletter batch.
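For example, the batch-tracking write could be as simple as the following sketch, where the table name and attributes are assumptions:

```typescript
import { DynamoDB } from 'aws-sdk';

const dynamo = new DynamoDB.DocumentClient();

// Hypothetical table and attributes: one item per newsletter batch.
async function logBatchResult(batchId: string, sentCount: number): Promise<void> {
  await dynamo
    .put({
      TableName: 'newsletter-batches', // hypothetical table name
      Item: {
        batchId,
        sentCount,
        processedAt: new Date().toISOString(),
      },
    })
    .promise();
}
```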

We used the Serverless framework to deploy both the “trigger Lambda” and the “SQS-listening Lambda”, which simplified the deployment process. We also used serverless-offline and serverless-offline-sqs to develop the newsletter locally, allowing us to simulate everything that would happen on AWS without deploying.

Overall, we are satisfied with the solution and will continue to use these AWS services.


Thanks to Samuel Bouic and Antonin Savoie

Written by Pierre Cavalet, a web developer passionate about education. Technical Expert @ Kaliop.
