Rapid and Highly Scalable Development Using Scala and Lagom

Never written an application in Lagom? Me neither. I decided to challenge myself to come up with a real-world, modern progressive web application in under a week. The task was to create the start of a Project/Task Management tool that was modern and super fast compared to some of the tools out there (I am looking at you, Jira). As everyone knows, the first step to creating a project is coming up with a cool name. For this task management tool that simply seeks to add and tick off tasks, I call it “TaskTick”. Isn’t that fantastic? … Umm, anyways, let’s continue.

The following blog posts will outline the thought process and the problems solved along the way. I also include the full project source code on GitHub, which should be enough to jump-start any similar development endeavour. Here are some of the things I tackle:

  • React front end (PWA) that connects to the Lagom backend using a WebSocket connection.
  • JWT authentication and auth management, including:
  • Login and registration.
  • Restricting access to routes based on the JWT auth token.
  • Publishing events to a Kafka stream (for a future Notification service or others).
  • Handling the OAuth flow to allow users to connect their GitHub accounts.
  • Event-sourced Project and User Entities.

Here is a video of the final product after just one week.

Of course this is far from done, but that was never the point. The point was to see just how far I could push development using the “Opinionated Lagom Framework” in one week.

What is Lagom?

Lagom (pronounced [²lɑːɡɔm]) is a Swedish word meaning “just the right amount”.

Lagom is a framework created by Lightbend that facilitates development of scalable microservices.

From the site:

“Develop distributed systems faster and easier than ever before with Lagom, an open source microservices framework.
Most microservices frameworks focus on helping you build fragile, single instance microservices — which, by definition, aren’t scalable or resilient. Lagom helps you build microservices as systems — Reactive systems, to be precise — so that your microservices are elastic and resilient from within.”


Hello Lagom

My very first step was in fact downloading the example “Hello” project from here:

and spending a few hours (4ish) going through the documentation. If you are like me and have never worked with Lagom, you’ll want to spend a similar amount of time on this step before continuing.

The rest of my document assumes some familiarity with Lagom project structure and terminology.

Getting Started

I begin with a rough outline for the structure of my project. At a very high level I am going to need at least the following:

  • Gateway Service: This will handle auth and JWT. We will also negotiate websocket connections here, and OAuth routes will be included. User state and data will be stored in a UserEntity.
  • ProjectManager Service: Store projects, store tasks for each project, store notes on tasks, publish events to Kafka.
  • Front End PWA (progressive web application): Gulp and a transpiler for TypeScript, MobX for reactive state storage, React + MaterialUI.

This will serve as my minimal grouping in what “Domain Driven Design” refers to as a “Bounded Context”. Since our application is an extremely simple task manager, the bounded context for the actual project will remain in one service, namely the ProjectManager Service. It is worth mentioning that a larger project would further flesh out the internals of this domain, providing a larger set of bounded contexts and deepening the domain.

Project Manager Domain

Our simple project manager needs to be able to do a minimal set of operations:

  • Create a new project and add a description to it.
  • Add new tasks to a project:

Tasks have a name and a description, an optional date started and date finished, can be assigned to a user, can be in a “done” state, and can have any number of Notes:

  • Notes are further instructions or clarifications that anyone who is part of the project can add to a task.
  • Notes simply contain text and the date that they were created.

When we identify the “aggregate roots” in our domain, Lagom maps those to an “Entity”. Here it is clear that “Project” is a root-level Entity. What is maybe not as clear is whether or not to make “Task” an Entity. In my case, having seriously time-boxed the project, I decided not to; however, there are plenty of good arguments that could be made for making Task its own Entity.
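Before wiring anything into Lagom, the domain above can be captured in plain Scala case classes. This is only a sketch of the shape described; the field names and the `addTask` helper are my own illustration, not the project’s final code:

```scala
import java.time.Instant
import java.util.UUID

// Notes simply contain text and the date they were created.
final case class Note(text: String, created: Instant)

// Tasks have a name and description, optional start/finish dates,
// an optional assignee, a "done" flag, and any number of Notes.
final case class Task(
  id: UUID,
  name: String,
  description: String,
  started: Option[Instant] = None,
  finished: Option[Instant] = None,
  assignedTo: Option[UUID] = None,
  done: Boolean = false,
  notes: Seq[Note] = Seq.empty
)

// Project is the aggregate root: Tasks live inside it rather than
// being their own Entity.
final case class Project(
  id: UUID,
  name: String,
  description: String,
  tasks: Seq[Task] = Seq.empty
) {
  def addTask(task: Task): Project = copy(tasks = tasks :+ task)
}
```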

Behaviour Driven Design

Now that we have our basic domain fleshed out, we can start to think about what kind of behaviours our system will have. Behaviours in our system should map directly to test cases that we can write using ScalaTest.

My behaviour design started out with a pen and paper and all had the form of:

<optional precondition> <target> should <behaviour>

A “Project Entity” should “create a new project when it receives a CreateProject Action”

A “Project Entity” should “add a new task to the list of tasks when it receives an AddTask Action”

So how does all of this map to code? One of the best features of Lagom is how dependency injection and testing is built right into the framework.

Project Setup

It is time to write some code and get our project setup. I generated a project for my Project Service with the Lagom project generator that is located here:


I used “io.surfkit” for my organization name and “Project Manager” for the project name.

After getting the zip file I copied the files over into my fresh “tasktick” folder. Then I removed the “projectmanager-streams-api” and “projectmanager-streams-impl” projects.

My next step was to go back to the Lagom project generator and generate another project for my Gateway Service, this time using the same “io.surfkit” for my organization but using “Gateway” as the name for my project.

Again unzipping the project and moving the directories over to the “tasktick” folder, I deleted the projects that contain “streams”, namely “gateway-streams-api” and “gateway-streams-impl”. I should point out that we are using a websocket and thus a streamed endpoint, but I opted to include it in the other service rather than breaking it out into its own project. Feel free to keep your “streamed” projects if you wish to separate your streaming endpoints from your REST service.

Now that we have all the project files together in one directory, it is time to modify our build.sbt so that we get the proper dependencies and structure we are looking for. Here is my build.sbt:
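A build.sbt for this layout might look roughly like the following. The module list mirrors the four projects described above, but the versions and individual settings are assumptions, not the project’s actual file:

```scala
// Sketch of a multi-project build.sbt for the tasktick layout.
organization in ThisBuild := "io.surfkit"
version in ThisBuild := "1.0-SNAPSHOT"
scalaVersion in ThisBuild := "2.12.8"

lazy val `tasktick` = (project in file("."))
  .aggregate(`projectmanager-api`, `projectmanager-impl`,
             `gateway-api`, `gateway-impl`)

lazy val `projectmanager-api` = (project in file("projectmanager-api"))
  .settings(libraryDependencies += lagomScaladslApi)

lazy val `projectmanager-impl` = (project in file("projectmanager-impl"))
  .enablePlugins(LagomScala)
  .settings(libraryDependencies ++= Seq(
    lagomScaladslPersistenceCassandra, // event-sourced entities
    lagomScaladslKafkaBroker,          // publishing events to Kafka
    lagomScaladslTestKit
  ))
  .dependsOn(`projectmanager-api`)

lazy val `gateway-api` = (project in file("gateway-api"))
  .settings(libraryDependencies += lagomScaladslApi)

lazy val `gateway-impl` = (project in file("gateway-impl"))
  .enablePlugins(LagomScala)
  .settings(libraryDependencies ++= Seq(
    lagomScaladslPersistenceCassandra, // the User entity
    lagomScaladslTestKit
  ))
  .dependsOn(`gateway-api`, `projectmanager-api`)
```

The gateway-impl depends on projectmanager-api so that it can use the generated service client to forward calls.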

Writing our Behaviours

Now we can take the behaviours that we defined for our system and move them into the test Specs. For example, we can write all the CRUD cases that we expect in the ProjectManagerEntitySpec like so:

“Project entity” should {

“create a new project when it receives a CreateProject Action” in {

Similarly we can map our behaviours for how we access the service (our API) to the ProjectManagerServiceSpec. These will be stubbed out cases that look something like this:

“ProjectManager service” should {
“add a new project” in {

Why are we doing this?

By going a step further and describing the behaviour of our system, we took the time to map out what our system should and shouldn’t do. With these tests in place we have established a very important metric: we already have a clear idea of what “done” or “complete” looks like for this project iteration. Making all of our behaviour tests pass therefore gives us a clear goal.

In practice it is a bit more of a back-and-forth dance, with the behaviours and acceptance criteria being modified as you go, but it certainly provides a solid frame of reference where most projects have only vague assumptions.

Here are links to some of the final behaviour specs:

Coding the Project Entity

The Project Entity is made up of a Project’s State, the actions that we want to perform, and finally the events that mutate the state. Since I was under time constraints I opted for large actions like “UpdateProject” that encapsulate any change made to the project (change the name or the description, etc.), rather than a more fine-grained approach with actions like “ChangeName” and “ChangeDescription”. When coming up with your own actions and events you will often want to weigh the benefits of having these more fine-grained event semantics downstream. For example, it might be important to respond to a Kafka event only when there is a name change.
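Lagom’s PersistentEntity API wires the state, commands, and events together for you, but the core triad can be sketched in plain Scala. The names below follow the coarse-grained approach just described; everything else is my own illustration:

```scala
// State held by the entity.
final case class ProjectState(name: String, description: String)

// Coarse-grained command: one UpdateProject carries any field change.
sealed trait ProjectCommand
final case class UpdateProject(name: String, description: String) extends ProjectCommand

// Events record what happened; replaying them rebuilds the state.
sealed trait ProjectEvent
final case class ProjectUpdated(name: String, description: String) extends ProjectEvent

object ProjectBehaviour {
  // Command handler: validate and emit events (no state mutation here).
  def handle(state: ProjectState, cmd: ProjectCommand): Seq[ProjectEvent] = cmd match {
    case UpdateProject(n, d) if n.nonEmpty => Seq(ProjectUpdated(n, d))
    case _                                 => Seq.empty // reject empty names
  }

  // Event handler: the only place state actually changes.
  def applyEvent(state: ProjectState, event: ProjectEvent): ProjectState = event match {
    case ProjectUpdated(n, d) => state.copy(name = n, description = d)
  }

  // Recovery is just a fold over the persisted event log.
  def replay(initial: ProjectState, events: Seq[ProjectEvent]): ProjectState =
    events.foldLeft(initial)(applyEvent)
}
```

The separation matters downstream: a subscriber could watch only for a name change if the events were fine-grained, which is exactly the trade-off discussed above.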




Extra Data Transfer Object Types

In order to give your Entity (which is just a sharded Akka actor) better type safety, I used Akka Typed. This had the side effect of forcing you to define your Action types with a Result type parameter.

One of the unfortunate consequences is that you must define your Command types in your `Impl` project. For me this meant a lot of translation from a service API call to a more or less identical Command object. For example, I have a call in my ProjectApi that is used for creating a project. On the implementation side I am forced to translate this into a Command object with the same name, seen here:
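The duplication looks roughly like this; the field names are illustrative, not copied from the project:

```scala
// In the *-api project: the wire-level request type.
final case class CreateProjectRequest(name: String, description: String, owner: String)

// In the *-impl project: the command the entity actually receives.
// (With Akka Typed the real command would also carry a typed reply-to.)
final case class CreateProject(name: String, description: String, owner: String)

object Translation {
  // The boilerplate hop: API request -> near-identical command.
  def toCommand(req: CreateProjectRequest): CreateProject =
    CreateProject(req.name, req.description, req.owner)
}
```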

Adding the Gateway API

The gateway API will be our interface to the public-facing clients. This service will take care of performing authentication, user management, OAuth third-party providers, and websocket negotiation. One thing that may make sense is to split user management into its own service; I chose against this because of my self-imposed time constraints.

I will walk through the service and reference some code for each of the main parts that it exposes:

  • Project API exposure + User Management
  • Authentication and JWT
  • OAuth server flow
  • Browser websocket connection

User Manager Entity

This pattern follows the same one outlined for the Project Entity. In fact, once you have built a few CQRS Entities, you will find Lagom does a great job of standardizing your workflow.

The User Entity stores information about the user as well as a Set of “ProjectRef”. This acts as a simple way to aggregate a user’s projects. It would probably be better done with a read-side view in the Project Service. Storing a read-side view in an RDBMS or Elasticsearch would let us perform much richer aggregations and protect against the data inconsistencies that might occur with my approach. My choice to store this in the User Entity again comes back to the time allotted for this project.

One final shortcut I should mention: I made the User Entity key the user’s email address. This is a terrible key choice and will likely result in what are referred to as “hot spots”, an uneven balance of Entities across our nodes. The reason I chose email was to avoid a read-side lookup, again in the interest of time.

Exposing the Project API

Since all the traffic will be handled through our gateway service we simply want to perform authorization and then forward calls on to the Project Manager Service. These are simple routes that then use the service client to call the service that we are looking to hit. An example of this is the “createProject” route that is in the Gateway Service. The implementation of which looks like this:
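The forwarding shape can be sketched in plain Scala. To keep the sketch self-contained I define a minimal stand-in for Lagom’s ServiceCall and a stubbed service client; the request/response types are my own placeholders:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

// Minimal stand-in for Lagom's ServiceCall so the sketch compiles alone.
trait ServiceCall[Req, Res] { def invoke(req: Req): Future[Res] }

final case class NewProject(name: String)
final case class ProjectCreated(id: Long, name: String)

// Stand-in for the generated Project Manager service client.
trait ProjectManagerClient {
  def createProject: ServiceCall[NewProject, ProjectCreated]
}

class GatewayService(projects: ProjectManagerClient) {
  // After authorization succeeds, the gateway just forwards the call
  // to the Project Manager service via its client.
  def createProject: ServiceCall[NewProject, ProjectCreated] =
    (req: NewProject) => projects.createProject.invoke(req)
}
```

In the real service the route is additionally wrapped in the “authenticated” composition described in the next section.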

Authentication with JWT

The Authentication API makes heavy use of the code found here:


However, I did a fair amount of code consolidation and cleanup to make things smaller and more readable. The flow can be followed by tracing the code to first register a user and then log that user in.

The actual authentication is built from a ServerServiceCall composition. The code for creating that composition is listed here:

We can now simply wrap any call that requires authorization with the “authenticated” composition. Here is an example of this:
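At the heart of that composition is checking the token’s HMAC signature over the header and payload. A minimal JDK-only sketch of the idea, not the exact code from the linked recipe:

```scala
import java.nio.charset.StandardCharsets.UTF_8
import java.util.Base64
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec

object Jwt {
  private def sign(data: String, secret: String): Array[Byte] = {
    val mac = Mac.getInstance("HmacSHA256")
    mac.init(new SecretKeySpec(secret.getBytes(UTF_8), "HmacSHA256"))
    mac.doFinal(data.getBytes(UTF_8))
  }

  private def b64(bytes: Array[Byte]): String =
    Base64.getUrlEncoder.withoutPadding.encodeToString(bytes)

  // Issue a token for a JSON payload (header fixed to HS256).
  def create(payloadJson: String, secret: String): String = {
    val header  = b64("""{"alg":"HS256","typ":"JWT"}""".getBytes(UTF_8))
    val payload = b64(payloadJson.getBytes(UTF_8))
    s"$header.$payload.${b64(sign(s"$header.$payload", secret))}"
  }

  // The "authenticated" wrapper would run a check like this before
  // invoking the inner service call, rejecting the request on false.
  // (A production version should use a constant-time comparison.)
  def verify(token: String, secret: String): Boolean = token.split('.') match {
    case Array(h, p, s) => b64(sign(s"$h.$p", secret)) == s
    case _              => false
  }
}
```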

OAuth Server flow

A pretty aggressive goal for my project was to include a way to OAuth with GitHub and link code to tasks and such. I got as far as getting the OAuth token flow working but was not able to add any useful GitHub features in my remaining time. The auth flow can easily be expanded to include other providers as well (Facebook, Twitter, etc.).

OAuth and related concepts can be found here: https://oauth.net/2/

Our service consists of two endpoints:

The simple implementation of this can be seen inside the GatewayService companion object. Here:

Simply adding your API keys to the application config should be enough to get you rolling with any service provider.
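For GitHub, the first endpoint just redirects the browser to an authorize URL built from the configured client id, and the callback exchanges the returned code for a token. A sketch of the URL-building half; the parameter names follow GitHub’s OAuth web flow, while the helper names are mine:

```scala
import java.net.URLEncoder
import java.nio.charset.StandardCharsets.UTF_8

object GithubOAuth {
  private def enc(s: String) = URLEncoder.encode(s, UTF_8.name)

  // Step 1: where we redirect the browser to ask for consent.
  def authorizeUrl(clientId: String, redirectUri: String, state: String): String =
    "https://github.com/login/oauth/authorize" +
      s"?client_id=${enc(clientId)}" +
      s"&redirect_uri=${enc(redirectUri)}" +
      s"&state=${enc(state)}" +
      "&scope=repo"

  // Step 2 (the callback endpoint) would POST the returned "code" to
  // https://github.com/login/oauth/access_token to obtain the token.
}
```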

Browser Websocket Support

Websocket support is built right into Lagom by way of the Source[T] type. In my case I wanted a single websocket connection to the browser that supported the full range of message types we declare for the Gateway Service. I accomplished this by defining an algebraic data type, SocketApi. A websocket message and response then take the form of:

case class SocketEvent(payload: SocketApi)

My websocket route then looks like the following:

def stream(token: String): ServiceCall[Source[SocketEvent, NotUsed], Source[SocketEvent, NotUsed]]

Notice that I take the JWT token as part of the route to the websocket. This is due to a browser limitation: you cannot set request headers for websockets, so we could not use the “Authorization” header.

At this point, making all of my API types extend the SocketApi trait allows them to travel over the socket as well as be used for the REST endpoints. This has the added benefit that our websocket handler can simply call those existing implementations. Here is the protocol:

Our handler then becomes a simple matter of mapping SocketApi types to the respective handlers that we have already written. The only additional hurdle is to extract the token and put it back into an Authorization header that works with our ServerServiceCall composition. The final code for this is:
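Stripped of the Akka Streams plumbing, the envelope-and-dispatch idea looks like this in plain Scala; the message names and stubbed handlers are illustrative, not the real protocol:

```scala
// Every message that may travel over the socket extends SocketApi.
sealed trait SocketApi
final case class AddTask(project: String, task: String) extends SocketApi
final case class CompleteTask(project: String, task: String) extends SocketApi
final case class Ack(message: String) extends SocketApi

// The single envelope type exchanged over the websocket.
final case class SocketEvent(payload: SocketApi)

object SocketHandler {
  // Dispatch maps each payload to the handler already written for the
  // matching REST endpoint; here those handlers are stubbed with Acks.
  def handle(event: SocketEvent): SocketEvent = event.payload match {
    case AddTask(p, t)      => SocketEvent(Ack(s"added $t to $p"))
    case CompleteTask(p, t) => SocketEvent(Ack(s"completed $t in $p"))
    case a: Ack             => SocketEvent(a) // echo stray acks
  }
}
```

In the real service the same dispatch runs inside a Source mapping, after the token from the route has been re-inserted as an Authorization header.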

What’s Next?

There were a few things that I still wanted to highlight about Lagom but did not have enough time to (and still meet my goal), namely event streams. I was hoping to have an additional service called the Notification Service. This service would simply listen to Kafka for ProjectUpdated events. When it received one of these events it would respond by sending out an email notification or by pushing some information into a Slack channel. The Kafka/event-streaming model in Lagom is a powerful feature, and you should make sure you are aware of it and know how and when to use it.

In a follow up post I will talk about how to deploy this project into a production Kubernetes environment. I will also take you through creating a CI/CD pipeline that is extremely powerful.

Part 2: Deploying Lagom with a CI/CD pipeline on Kubernetes