Azkarra Streams: Create streaming microservices based on Kafka Streams, faster than ever before!

I am pleased to announce the release of Azkarra Streams 0.8. Azkarra has benefited from several internal optimizations since the last version. This new release includes some new features as well as some minor breaking changes of the public API.

What is Azkarra Streams?

For readers discovering the framework: the Azkarra Streams project is dedicated to making the development of cloud-native streaming microservices based on Kafka Streams simple and fast!

I recommend checking out these blog posts to discover the many possibilities Azkarra has to offer:


Today, it’s easy to say that almost everything we do, everything we use, and even everything around us is capable of producing data. But what is even more true, is that this data is produced in real-time to describe something that is happening.

Therefore, it’s logical to think that data must be also harnessed in real-time to be able to extract the most value from it. In addition, and perhaps most importantly, data must be stored and processed with a temporal context to retain its full significance. …


Photo by Ross Sokolovski on Unsplash

This is the third article in the “Streaming data into Kafka” series. In the first two, we saw how easy it is to use Kafka Connect to load records from CSV and XML files into Apache Kafka without writing a single line of code. To do this, we used the Kafka Connect FilePulse connector, which comes with a lot of nice features to parse and transform data.

Now, let’s see how to integrate JSON data, another file format that is widely used on most projects (and much more appreciated than XML for web-based applications). …


Photo by Ross Sokolovski on Unsplash

In the previous blog post, Streaming data into Kafka S01/E01 - Loading CSV file, I illustrated how easy it can be to integrate data into Apache Kafka using the Kafka Connect framework.

In particular, we saw how to parse and transform CSV files to produce clean records into Kafka by using the Kafka Connect FilePulse connector.

XML (Extensible Markup Language) is another well-known data format. The XML format is usually not very appreciated by developers because of its verbosity (or complexity). However, it is still used by many organizations to make systems interact with each other.
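To make the structure concrete, here is a minimal sketch of what an XML payload of records looks like and how it maps to key/value fields. The sample data is hypothetical, and Python's standard library is used purely for illustration; the connector itself handles this parsing for you, without code.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML payload: one <record> element per data record.
data = """<records>
  <record><id>1</id><name>Alice</name></record>
  <record><id>2</id><name>Bob</name></record>
</records>"""

root = ET.fromstring(data)
# Flatten each <record> into a dict of tag -> text, one dict per record.
rows = [
    {child.tag: child.text for child in record}
    for record in root.findall("record")
]
print(rows)  # [{'id': '1', 'name': 'Alice'}, {'id': '2', 'name': 'Bob'}]
```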

In this second article, we will see how to read records from XML files and load them into Kafka. To do this, we will once again use the Kafka Connect FilePulse connector, which offers native support for reading XML files. …


Photo by Ross Sokolovski on Unsplash

Ingesting data files into Apache Kafka is a very common task. Among the many file formats out there, CSV is probably the most popular for moving data between systems, thanks to its simplicity and the fact that it can be used to export or import data from one (small) database to another.

A CSV file is nothing more than a text file (with a .csv extension). Each line of the file represents a data record and each record consists of one or more fields, separated by a comma (or another separator).

Here is a chunk of example…
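The record/field structure described above can be sketched with Python's standard csv module. The data below is hypothetical and only illustrates how each line becomes a record with named fields; the FilePulse connector performs this kind of parsing without any code.

```python
import csv
import io

# Hypothetical CSV content: a header line, then one record per line,
# with fields separated by commas.
data = "id,name,city\n1,Alice,Paris\n2,Bob,Berlin\n"

# DictReader maps each line to a record keyed by the header fields.
records = list(csv.DictReader(io.StringIO(data)))
print(records[0])  # {'id': '1', 'name': 'Alice', 'city': 'Paris'}
```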


Apache Kafka + ksqlDB + ClickHouse + Superset = Blazing Fast Analytics Platform

Recently at StreamThoughts, we have looked at different open-source OLAP databases that we could quickly experiment with in a streaming architecture based on the Apache Kafka platform.

Our goal was to be able to respond to analytical needs on large volumes of data that were ingested in real-time. For this, we were looking for a solution that would allow us to execute ad-hoc queries, interactively, with “acceptable” latencies (a few seconds or more).

In addition, to allow us to quickly evaluate different ideas, we were looking for a solution that:

  • Is easy to set up out of the box.
  • Offers a SQL-like query language (with JDBC support if possible). …


Azkarra Streams — Release 0.7

I am pleased to announce the release of Azkarra Streams 0.7. This new release packs several major new features.

For readers discovering the framework: the Azkarra Streams project is dedicated to making the development of streaming microservices based on Apache Kafka Streams simple and fast.

With this new release, the Azkarra framework enhances Kafka Streams applications and makes them easier to operate in production.

This blog post summarizes the most important improvements.

Monitoring Kafka Streams Consumers Lags

A fundamental indicator to monitor on an Apache Kafka data streaming platform is the “consumer group lag”. The consumer group lag tells us how far behind our consumer applications are from the producers, i.e. whether our applications are up to date in terms of record processing. Generally, the offset-lag of a consumer is computed as (last_produced_record_offset - last_consumed_offset) for each topic/partition. …
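The formula above can be sketched in a few lines. This is a minimal illustration with hypothetical offsets, not a real Kafka client call: in practice the end offsets and committed offsets would come from the broker.

```python
def compute_lags(end_offsets, committed_offsets):
    """Return (last_produced_record_offset - last_consumed_offset)
    for each topic/partition, treating a missing commit as offset 0."""
    return {
        tp: end_offsets[tp] - committed_offsets.get(tp, 0)
        for tp in end_offsets
    }

# Hypothetical offsets, keyed by (topic, partition).
end = {("orders", 0): 120, ("orders", 1): 200}
committed = {("orders", 0): 100, ("orders", 1): 200}
print(compute_lags(end, committed))  # {('orders', 0): 20, ('orders', 1): 0}
```

A lag of 0 means the consumer is fully caught up on that partition; a growing lag means the application is falling behind the producers.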


Photo by Brandon Green on Unsplash

Most of the projects I’ve worked on in the last few years have involved ingesting data into systems such as Apache Kafka® and Apache Hadoop® to perform processing and data enrichment both in real-time and in batch.

One of the recurring challenges of each of these projects has always been to manage the complexity of legacy systems in which data was frequently exported, shared and integrated through the use of files.

I am convinced that organizations that operate today with applications that rely on export files will probably continue to do so in the future. …


Create Kafka Streams applications faster than ever before!

I have the pleasure to announce the release of Azkarra Streams 0.5.

This release includes not only significant new features but also a few breaking changes in public APIs.

When I started the Azkarra project, it was just a few utility classes used to wrap the Kafka Streams and Topology classes. But the project has quickly evolved into a more elaborate development framework. Since its first open-source release, we have received feedback from the community on how the project is being used and which features could be improved. …


One week ago we announced Azkarra Streams, a new open-source micro-framework for simplifying the development of Kafka Streams applications.

https://medium.com/streamthoughts/introducing-azkarra-streams-the-first-micro-framework-for-apache-kafka-streams-e13605f3a3a6

Azkarra is a Basque word for “fast”

Azkarra Streams provides developer-friendly features such as an embedded HTTP server to expose metrics and states about the local Kafka Streams instances. It also provides the ability to query Kafka Streams state stores. Last but not least, it offers a Web UI to manage the application and to visualize the DAG of streams topologies.

If you are interested in learning more about Azkarra, check out our “Getting Started” guide.

Why does Azkarra need to be secured?

Security is one of the most important concerns (if not the most important) when it comes to deploying an application in production. …

About

Florian Hussonnois

Co-founder @Streamthoughts, Apache Kafka evangelist, passionate data streaming engineer, and Confluent Kafka Community Catalyst.
