From Monitoring to Data Streaming

Iñaki Alzorriz
adidoescode
3 min read · Jun 19, 2018

The Seed

Three years ago, we started developing Business Activity Monitoring 2.0 at adidas. It was a re-platforming project based on open-source technologies, with real-time event processing and scalability as the main goals of the new platform.

BAM 2.0 reference architecture

In this technology stack, Kafka was one of the pillars — it was the technology selected to transport events from the multiple source systems at adidas to be processed by the complex event processing engine. We selected it because it met all the requirements we wanted to cover: horizontal scalability in consumption and production of messages, an open-source project extensively adopted by the community, high retention by design…

The Pilot

The more we worked with Kafka, the more versatile we perceived it to be, covering use cases beyond event sourcing strategies. It is designed to implement the pub-sub pattern in a very efficient way, enabling Data Streaming cases where multiple consumers can easily subscribe to the same stream of information. The fan-out in the pub-sub pattern is implemented so efficiently that it achieves high throughput on both the production and the consumption side. It also enables other capabilities like Data Extraction and Data Modeling, as well as Stateful Event Processing.
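The fan-out idea above can be sketched in a few lines. This is not Kafka's actual API — just a toy in-memory log, with invented names, that illustrates why pub-sub fan-out is cheap in a log-based design: the stream is retained, and each subscriber only tracks its own read offset, so adding a new consumer does not duplicate or remove any data.

```python
from collections import defaultdict

class MiniLog:
    """Toy append-only log (illustrative only, not Kafka's API).

    Every subscriber keeps its own offset into the same retained
    stream, so many independent consumers can read the full history
    without interfering with each other — the essence of fan-out.
    """

    def __init__(self):
        self.records = []                 # the shared, retained stream
        self.offsets = defaultdict(int)   # per-subscriber read position

    def publish(self, record):
        self.records.append(record)

    def poll(self, subscriber):
        # Reading advances only this subscriber's offset; the data
        # itself stays in place for every other consumer.
        start = self.offsets[subscriber]
        batch = self.records[start:]
        self.offsets[subscriber] = len(self.records)
        return batch

log = MiniLog()
log.publish({"event": "order_created", "id": 1})
log.publish({"event": "order_shipped", "id": 1})

# Two independent consumers each see the complete stream.
analytics = log.poll("analytics")
reporting = log.poll("reporting")
```

In real Kafka the same effect comes from consumer groups committing their own offsets against a partitioned, retained topic; the sketch only captures that one property.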

The Product Creation IT area at adidas had one big initiative in its roadmap: to stream the information from the main sources in the area and enable real-time analytics, reporting, data replication… They concluded that Kafka was the right technology for that, and we enabled their initiative by preparing the platform for them — this is the DNA of our department, Platform Engineering.

Today, 74 topics, 29 sources and 6M messages per day demonstrate that the platform is being adopted by the product teams. It is still a humble pilot, but it shows that the solution is production ready.

We are on the right track!

In April we had the pleasure of attending the Kafka Summit in London, and it was really exciting to see our direction validated by presentations from companies like Apple, Audi, IBM, BBC, ING… We confirmed that this is the right moment to create Data Streaming Platforms offered as an internal service.

The Journey

This story is not trying to sell Kafka; in the end it is ‘only’ a technology. But depending on how new technologies are adopted, they can create an impact capable of triggering a transformation of a company’s IT landscape.

This is only the first step, since the strategy of Event Driven / Data Streaming architectures is being adopted by many companies around the globe. We are working with Confluent, ‘the creators of Kafka’, to assess our solution, and they have created a diagram that illustrates perfectly the adoption of this technology:

An exciting journey in front of us! Stay tuned!



Director of Platform Engineering — @adidas. Software engineer passionate about DevOps and transforming the way we deliver software.