Project Dashboard — serving first screen content 10x faster.
Unleashing the potential of microservices and event streams
Like many big tech companies (Uber, Netflix), Azimo started building its tech stack in a monolithic architecture. There is nothing wrong with that approach. When you want to build the core of your product quickly, with a small team — two engineers, in our case 😎 — there is no better choice.
But you can’t use a monolith forever. Over time your centralized code becomes too complex to be handled by a few engineers, which blocks the team’s growth. The system becomes difficult to scale and adding relatively simple features costs too much to justify their development. This is especially true for user experience improvements in client apps, which will always be deprioritised versus core platform improvements.
Since we started rethinking our monolith a couple of years ago, Azimo has already come a long way. We migrated from managed hosting to the cloud (AWS), introduced DevOps culture and built a new stack with microservices architecture that replaces our monolithic system piece by piece. This allows us to work on the core platform and user experience improvements simultaneously.
The Azimo dashboard is the first screen a user sees after signing up. It has countless functions, from welcoming new users to providing status updates on Azimo money transfers.
Every customer is different. There are people who have just joined, people who send money once a day, once a month or even once a year. There are also personal users and business users. We personalise the dashboard so that, no matter how long you have been using Azimo, you are always shown what’s most relevant/important to you. We call this the dynamic dashboard.
The first dynamic dashboard for Azimo’s apps was built in 2016 and was integrated with a REST API exposing data from our monolithic system. The simplified version of the system worked like this:
- When the app was launched, it triggered a couple of requests for data — such as a list of transactions and a list of recipients.
- After the data for all requests was loaded, the app presented them as a single list of items.
- The list was enriched by some custom items like a hardcoded welcome banner shown to new users.
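The old flow can be sketched roughly like this (a minimal illustration, not Azimo's actual code; the endpoint names, item shapes and the welcome banner are placeholder assumptions). The key point is that the app could not render until every request had finished, so perceived latency was governed by the slowest call:

```python
import concurrent.futures

# Hypothetical fetchers standing in for the old monolith's REST endpoints.
def fetch_transactions(user_id):
    return [{"type": "transaction", "id": "tx-1", "status": "In progress"}]

def fetch_recipients(user_id):
    return [{"type": "recipient", "id": "rc-1", "name": "Jane"}]

def build_dashboard(user_id, is_new_user):
    # The app fired the requests in parallel...
    with concurrent.futures.ThreadPoolExecutor() as pool:
        transactions = pool.submit(fetch_transactions, user_id)
        recipients = pool.submit(fetch_recipients, user_id)
        # ...but could only present a single list once *all* of them
        # had returned, so one slow endpoint stalled the whole screen.
        items = transactions.result() + recipients.result()
    # Enrich the list with custom items, e.g. a hardcoded welcome banner.
    if is_new_user:
        items.insert(0, {"type": "banner", "id": "welcome"})
    return items
```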
This approach severely limited our speed. Because we were building solutions on top of the monolith, our endpoints became really slow. The median response time was more than one second to display a user's transfers and more than 700ms to show their recipient list.
Tail latencies for those endpoints reached 3–7 seconds each. The result was that, in edge cases, our users had to wait up to 10 seconds to see any meaningful content.
Unfortunately, neither the caching policy nor local app storage helped us very much. They improved the UX a bit, so the user saw the previous app state instead of an empty screen. But our customers don’t need to see where their money was the last time they checked the app. They need to know where their money is right now.
In 2019 a big part of the Azimo monolith was migrated to microservices and event streams, which allowed us to finally build an app dashboard that is fast and up-to-date.
What has changed? Most critical events now go through our Kafka cluster. This changed the way we handle data coming from multiple sources. Instead of reading the information via REST calls from our core system (transactions, recipients) and then composing it on the fly (client-side or endpoint-side), we build the dashboard reactively.
Our dashboard system has two important flows. The first, responsible for data composition, starts in our Kafka cluster. We have topics responsible for transactional events (e.g. transfer status change — “In progress”, “Available for pick-up”, “Money delivered”), recipient events (e.g. recipient was created or updated) or customer support events (new chat was created, a new message was sent).
One of the consumers of those events is the Dashboard Aggregator. This service is connected to Amazon DynamoDB, a fully managed database with single-digit-millisecond latency, which stores the state of each user's dashboard. The Aggregator listens for new events coming from our Kafka topics and updates the models stored in DynamoDB accordingly.
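The aggregation flow can be sketched as follows. This is an assumption-laden illustration: the event names, payload fields and table layout are invented for the example, and a plain dict stands in for the DynamoDB table keyed by user id. What it shows is the core idea, that replaying events from the stream keeps a per-user read model up to date:

```python
# In-memory stand-in for the DynamoDB table that holds each user's dashboard.
dashboard_table = {}

def handle_event(event):
    """Consume one event from a Kafka topic and upsert the user's dashboard model."""
    dashboard = dashboard_table.setdefault(
        event["user_id"], {"transfers": {}, "recipients": {}}
    )
    if event["type"] == "transfer_status_changed":
        # e.g. "In progress", "Available for pick-up", "Money delivered"
        dashboard["transfers"][event["transfer_id"]] = event["status"]
    elif event["type"] == "recipient_upserted":
        dashboard["recipients"][event["recipient_id"]] = event["name"]

# Applying the stream in order builds the read model incrementally:
for event in [
    {"type": "transfer_status_changed", "user_id": "u1",
     "transfer_id": "t1", "status": "In progress"},
    {"type": "transfer_status_changed", "user_id": "u1",
     "transfer_id": "t1", "status": "Money delivered"},
]:
    handle_event(event)
```

Because the model is updated as events arrive, the read path never has to recompute anything; it just fetches the current state.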
The second flow is built for the client-side app. When a user launches Azimo on their device, the app calls the BFF (Backend For Frontend) Dashboard Service, which is responsible for building a client-consumable data model that requires no additional transformation on the app side. Our BFF talks to two services: the translations service that serves content in the user's language, and the Dashboard Aggregator service. In this flow the Aggregator only reads the user's dashboard data from the DynamoDB database.
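A rough sketch of that read path, under stated assumptions: the dicts below stand in for the DynamoDB table and the translations service, and the field names are hypothetical. The point is that the BFF resolves everything server-side, so the app only renders:

```python
# Stand-ins for the pre-aggregated dashboard store and the translations service.
DASHBOARDS = {"u1": [{"type": "transfer", "status_key": "status.delivered"}]}
TRANSLATIONS = {
    "en": {"status.delivered": "Money delivered"},
    "pl": {"status.delivered": "Pieniadze dostarczone"},
}

def get_dashboard(user_id, locale):
    """Build a client-consumable model: pre-aggregated data plus resolved text."""
    items = DASHBOARDS.get(user_id, [])
    strings = TRANSLATIONS.get(locale, TRANSLATIONS["en"])
    # Resolve every translation key here so the app does no further
    # transformation, only rendering.
    return [{**item, "status_text": strings[item["status_key"]]} for item in items]
```

Compared with the old flow, there is no fan-out to slow core endpoints at request time: the expensive composition already happened when the events were consumed.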
Our dashboard system is built with many interesting solutions that we will describe in the near future:
- Polymorphic models stored in DynamoDB,
- Polymorphic API in our Dashboard BFF with automatically generated code for client-side implementation,
- Flexible UI Kit implemented on Android and iOS apps allowing us to build truly dynamic content,
- …and more.
The most important takeaway here is that by moving to a microservices architecture, adopting event streams and using a highly scalable database like DynamoDB, we reduced the dashboard's loading time from several seconds to less than 200ms.
Most of our users now see the first meaningful content up to 10x faster, usually without even seeing any loaders. We were able to achieve that without an additional caching layer or local app storage. By using a fully managed database we also don't have to worry about our growth: response times do not vary much, even under extreme traffic.
The new dashboard is available for our Android users right now, and it’s coming to the iOS app in the following weeks. Follow Azimo Labs for more insights from Azimo’s engineering teams!
Building that kind of system wouldn’t be possible without the hard work of our DevOps, backend and client engineering teams. And the biggest credit should go to my colleagues — Dominik Barwacz and Kamil Dziublinski, who led the client and backend side of Dashboard system work.