Adopting an Event-Driven Architecture for Our Real-Time Vehicle Analytics

Abhigyan Kole
Published in fleetx engineering
Mar 11, 2024 · 4 min read

Introduction

At Fleetx, we specialise in providing real-time analytics and tracking for jobs in progress with our clients’ vehicles. However, we encountered latency issues, especially for clients with large fleets exceeding 10,000 vehicles. After investigation, we identified a slow analytics API as the culprit: it fetched data from multiple sources at request time, causing delays.

The Latency Challenge

The analytics API returned extensive details for each vehicle, including ETA information, KM data, and job-specific details. The slow performance resulted from fetching data through complex MySQL joins, Redis and Cassandra lookups, and runtime calculations.

Existing Architecture

The Plan: Precalculating Responses

To address this, we devised a plan to precalculate API responses upon relevant job events and store them for quick retrieval. This approach aimed to eliminate the need for numerous database calls during runtime, allowing us to consolidate all the required data into a single, efficient retrieval process.
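The read path before and after, in miniature: instead of assembling the response from several stores on every request, a precomputed blob is written whenever a job event fires, and the API becomes a single key-value lookup. This is only an illustrative sketch; the class and field names are ours, not Fleetx’s actual code.

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal sketch of the precomputation idea: write once per job event,
// read with a single lookup. All names here are illustrative.
public class PrecomputedStore {

    // Stands in for the real store holding the precalculated responses.
    static final Map<Long, String> byJobId = new ConcurrentHashMap<>();

    // Called from the event pipeline whenever a job changes: run the
    // joins/calculations once and serialise the result.
    static void precompute(long jobId) {
        byJobId.put(jobId, "{\"jobId\":" + jobId + ",\"eta\":\"12:30\",\"km\":118}");
    }

    // Called by the analytics API: one lookup, no runtime joins.
    static String analytics(long jobId) {
        return byJobId.get(jobId);
    }

    public static void main(String[] args) {
        precompute(7L);
        System.out.println(analytics(7L));
    }
}
```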

Planned Architecture

Challenges

Our strategy involved capturing update events, keeping the precomputed data in sync, and handling new job creations. The challenge, however, lay in dealing with updates originating from over 100 places in our service code without introducing repetitive and error-prone boilerplate.

Execution

Designing an application-level CDC (Change Data Capture)

To overcome this, we implemented an Aspect-Oriented Programming (AOP) interceptor for the save and update queries in the JobRepository. This interceptor operated in three stages:

1. Intercepting relevant events

In our JobRepository, we focused on three kinds of methods: delete and save, which are provided by JPA (Java Persistence API), and custom update-query methods identified by our own annotation, @EventTrigger. To handle these operations, we created a Spring AOP (Aspect-Oriented Programming) interceptor.

AOP based interceptor
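To make the interception stage concrete, here is a self-contained approximation. Spring AOP weaves advice around repository methods via proxies; in this sketch a plain JDK dynamic proxy plays that role, matching save/delete by name and custom queries by the annotation. The @EventTrigger fields, JobRepository methods, and the `intercepted` list are all illustrative stand-ins, not Fleetx’s actual code.

```java
import java.lang.annotation.*;
import java.lang.reflect.*;
import java.util.*;

public class InterceptorSketch {

    // Illustrative version of the custom annotation marking update queries.
    @Target(ElementType.METHOD)
    @Retention(RetentionPolicy.RUNTIME)
    @interface EventTrigger {
        int idPosition() default -1;      // position of the job id parameter
        String reverseQuery() default ""; // SELECT returning affected ids
    }

    interface JobRepository {
        void save(long jobId);
        void delete(long jobId);

        @EventTrigger(idPosition = 0)
        void updateStatus(long jobId, String status);
    }

    // Records which calls were intercepted, standing in for the later
    // "collect ids and push to Kafka" stage.
    static final List<String> intercepted = new ArrayList<>();

    // A JDK dynamic proxy approximating the Spring AOP advice.
    static JobRepository proxy(JobRepository target) {
        return (JobRepository) Proxy.newProxyInstance(
            JobRepository.class.getClassLoader(),
            new Class<?>[]{JobRepository.class},
            (p, method, args) -> {
                boolean relevant = method.getName().startsWith("save")
                        || method.getName().startsWith("delete")
                        || method.isAnnotationPresent(EventTrigger.class);
                Object result = method.invoke(target, args); // run the real query
                if (relevant) intercepted.add(method.getName());
                return result;
            });
    }

    public static void main(String[] args) {
        JobRepository repo = proxy(new JobRepository() {
            public void save(long id) {}
            public void delete(long id) {}
            public void updateStatus(long id, String s) {}
        });
        repo.save(1L);
        repo.updateStatus(1L, "DONE");
        System.out.println(intercepted); // [save, updateStatus]
    }
}
```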

2. Extracting id(s) of affected entities

The interceptor, through join points, extracted affected IDs for save, delete, and custom update methods.

The save method in Spring Data’s JpaRepository accepts either a single entity or a list of entities to save. We were able to get hold of the passed entities from the join point, and from them the ids.

The delete method in JpaRepository accepts a single entity, a list of entities, or ids to delete. In all these cases we were able to get hold of the affected ids from the join point.
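The extraction for save and delete can be sketched as one normalising helper: whatever the join point’s argument array holds (one entity, a collection of entities, or raw ids), reduce it to a list of affected job ids. The `Job` record and method names below are illustrative assumptions.

```java
import java.util.*;

// Sketch of stage 2 for save/delete: normalise whatever the join point
// hands us into a list of affected job ids. Job stands in for the entity.
public class IdExtraction {

    record Job(long id) {}

    // Mimics inspecting JoinPoint.getArgs()[0] for save(entity),
    // saveAll(entities), delete(entity), deleteById(id), etc.
    static List<Long> affectedIds(Object arg) {
        if (arg instanceof Job job) return List.of(job.id());
        if (arg instanceof Long id) return List.of(id);
        if (arg instanceof Iterable<?> many) {
            List<Long> ids = new ArrayList<>();
            for (Object o : many) ids.addAll(affectedIds(o));
            return ids;
        }
        throw new IllegalArgumentException("unsupported argument: " + arg);
    }

    public static void main(String[] args) {
        System.out.println(affectedIds(new Job(7)));                      // [7]
        System.out.println(affectedIds(List.of(new Job(1), new Job(2)))); // [1, 2]
        System.out.println(affectedIds(42L));                             // [42]
    }
}
```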

And in the custom update queries, we encountered two scenarios:

1. One which had the id of the affected job provided as part of the method parameters, like this:

A custom update query that updates the job with a given id

2. Another which updates an arbitrary set of jobs based on some condition, like this:

A custom update query that updates an arbitrary set of jobs

For the first type, we realised that the id of the affected job is already provided through the method parameters. If we know the position of the id among the parameters, we can read its value straight from the join point, so for each query of this type we added the event trigger specifying that position.

Event trigger with positional parameter 1
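The positional-parameter case can be sketched as follows: the annotation records where the job id sits in the parameter list, and the advice reads it out of the join point’s argument array. The annotation field name, the example method, and its parameters are illustrative assumptions, not the real repository code.

```java
import java.lang.annotation.*;
import java.lang.reflect.*;

public class PositionalTrigger {

    @Target(ElementType.METHOD)
    @Retention(RetentionPolicy.RUNTIME)
    @interface EventTrigger { int idPosition(); }

    // Stand-in for a custom update query: the job id is the second
    // parameter, so the trigger records position 1.
    @EventTrigger(idPosition = 1)
    static void markDelivered(String updatedBy, long jobId) {
        // in reality: UPDATE ... WHERE id = :jobId
    }

    // What the AOP advice would do with JoinPoint.getArgs().
    static long extractId(Method m, Object[] args) {
        int pos = m.getAnnotation(EventTrigger.class).idPosition();
        return (Long) args[pos];
    }

    public static void main(String[] args) throws Exception {
        Method m = PositionalTrigger.class
                .getDeclaredMethod("markDelivered", String.class, long.class);
        System.out.println(extractId(m, new Object[]{"ops", 99L})); // 99
    }
}
```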

For the second type, the only way to identify the affected id(s) is to run a reverse query with the same WHERE filters. For each query of this type, we therefore added an event trigger carrying a reverse query that returns exactly those ids.

Event trigger with reverse query

We designed the reverse query so that it can reuse the positional parameters from the original method signature.
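The reverse-query case can be sketched like this: the annotation carries an id-returning SELECT with the same filters as the update, and the advice runs it with the same positional parameters. Since this sketch has no database, a lambda stands in for executing the query through the EntityManager; the annotation field, method, and query strings are illustrative assumptions.

```java
import java.lang.annotation.*;
import java.lang.reflect.*;
import java.util.*;
import java.util.function.*;

public class ReverseQueryTrigger {

    @Target(ElementType.METHOD)
    @Retention(RetentionPolicy.RUNTIME)
    @interface EventTrigger { String reverseQuery(); }

    // Stand-in for: UPDATE Job j SET j.stale = true WHERE j.status = ?1
    @EventTrigger(reverseQuery = "SELECT j.id FROM Job j WHERE j.status = ?1")
    static void markStaleByStatus(String status) {}

    // The advice recovers the affected ids by running the reverse SELECT
    // with the original call's parameters; runQuery fakes the database.
    static List<Long> affectedIds(Method m, Object[] args,
                                  BiFunction<String, Object[], List<Long>> runQuery) {
        String reverse = m.getAnnotation(EventTrigger.class).reverseQuery();
        return runQuery.apply(reverse, args); // reuses the positional parameters
    }

    public static void main(String[] args) throws Exception {
        Method m = ReverseQueryTrigger.class
                .getDeclaredMethod("markStaleByStatus", String.class);
        List<Long> ids = affectedIds(m, new Object[]{"IN_TRANSIT"},
                (sql, params) -> List.of(3L, 8L)); // pretend the SELECT matched jobs 3 and 8
        System.out.println(ids); // [3, 8]
    }
}
```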

3. Collecting Information and Pushing to Messaging Service:

With the affected IDs in hand, we retrieved the updated data and pushed it to Kafka, our messaging layer. The real-time service consumed these messages and promptly updated the precomputed analytics for the affected vehicles.
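The final stage can be sketched as: load a fresh snapshot per affected id and publish one message per job. To keep the sketch runnable without a broker, a BlockingQueue stands in for the Kafka topic; in production this would be a `KafkaProducer.send(new ProducerRecord<>(topic, key, payload))` call. The payload shape and method names are illustrative.

```java
import java.util.*;
import java.util.concurrent.*;

// Sketch of stage 3: fetch updated data for each affected job id and
// publish it for the real-time consumer. The queue fakes the Kafka topic.
public class PublishSketch {

    static final BlockingQueue<String> topic = new LinkedBlockingQueue<>();

    // Fake "retrieve updated data" step keyed by job id.
    static String loadJobSnapshot(long id) {
        return "{\"jobId\":" + id + ",\"etaMinutes\":42}"; // illustrative payload
    }

    static void publishAffected(List<Long> ids) {
        for (long id : ids) topic.add(loadJobSnapshot(id));
    }

    public static void main(String[] args) {
        publishAffected(List.of(1L, 2L));
        System.out.println(topic.size()); // 2 messages queued for the consumer
    }
}
```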

Results

Implementing this event-driven architecture and the AOP-based interceptor significantly improved our system’s efficiency. With vehicle-wise analytics readily available, we streamlined the data retrieval process, reducing response times by over 75%.
