Connecting to Kafka using redux-lenses-streaming

Sebastian Dragomir
lenses.io
Jul 12, 2018 · 4 min read

More and more companies are embracing the data streaming revolution to gain better insight into their business and drive growth.
With Apache Kafka gaining so much momentum in this space, it is natural to want to connect a JavaScript UI application to it. Streaming data to the browser has long been possible, and solutions for getting data out of Apache Kafka already exist.

However, there are new ways to achieve low-latency updates from Kafka in the browser. Lenses offers a new data streaming platform for Apache Kafka, and it exposes endpoints that let a JavaScript application use LSQL (Lenses SQL, the Landoop SQL engine for Apache Kafka) to get messages from a Kafka topic to the browser using consumer group semantics.
The communication is bi-directional; from your web app you can also push records back to Kafka.

Consider an FX trading application: getting a list of currency exchange tickers is as easy as firing a request with the following query:

SELECT bid, ask, _key.* as ticker
FROM tickers
WHERE _key in ('EURUSD', 'GBPEUR') and _ktype=STRING and _vtype=AVRO

In this tutorial I will show you how to set up a React/Redux application and retrieve data from Kafka via SQL, or how to connect an existing Redux application and stream data in a matter of minutes.

For this we will be using redux-lenses-streaming, a Redux middleware built to make the process as fast and easy as possible.

You can see the end result in the video below.

This guide assumes some basic knowledge of React and Redux. The demo project has the minimal setup needed to connect, log in, publish, subscribe and view streaming Kafka messages over the Lenses WebSocket connection.

You can check out the code on GitHub.

If you already have an existing Redux project that you want to update with the library, you will need to install it alongside its peer dependency, RxJS. (Although no knowledge of RxJS is required, I highly recommend looking into it, especially if your application deals with large streams of data, events or messages.)

npm install rxjs redux-lenses-streaming --save

Setup and connection

We start off by adding the Lenses reducer to our root reducer.
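A minimal sketch of that wiring, assuming the library exports its reducer as lensesReducer and leaving the mount point up to you (the project README has the exact export name):

import { combineReducers } from 'redux';
import { lensesReducer } from 'redux-lenses-streaming'; // assumed export name
import { sessionReducer } from './sessionReducer';

// The lenses slice will hold the connection object and the subscriptions array.
export const rootReducer = combineReducers({
  lenses: lensesReducer,
  session: sessionReducer, // our own application reducer
});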

Next we'll create the Lenses middleware and add it to our Redux store; while doing so we can customise the connection options (this way the library will attempt to connect on initialisation).
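A sketch of the store setup; the middleware factory name (createLensesMiddleware) and the way options are passed to it are assumptions on my part, so check the README for the exact API:

import { createStore, applyMiddleware } from 'redux';
import { createLensesMiddleware } from 'redux-lenses-streaming'; // assumed factory name
import { rootReducer } from './rootReducer';
import { connectionOptions } from './config';

// Passing options here makes the middleware try to connect on initialisation.
const lensesMiddleware = createLensesMiddleware(connectionOptions);

export const store = createStore(rootReducer, applyMiddleware(lensesMiddleware));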

Currently we support the following options.
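As an illustration only, a config could look like the snippet below. The user and password properties are discussed later in this post; host is an assumed name for the Lenses WebSocket endpoint option, so treat the whole shape as a placeholder and check the README for the real option names:

// config.js -- illustrative values; option names other than user/password are assumptions
export const connectionOptions = {
  host: 'ws://lenses.example.com:3030', // assumed name for the Lenses WebSocket endpoint
  user: 'admin',
  password: 'admin',
};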

In case you need to wait for data or user input, leave the options empty when setting up the middleware and dispatch a connect action with the options as payload:
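For example, once the user has submitted a login form (Kafka.connect is an assumed name for the connect action creator, as with the other names in these sketches):

import { Kafka } from 'redux-lenses-streaming'; // assumed action-creator namespace
import { store } from './store';

store.dispatch(
  Kafka.connect({
    host: 'ws://lenses.example.com:3030', // assumed option name
    user: 'admin',
    password: 'admin',
  })
);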

On successful connection, an action will be dispatched that you can listen to and respond to in your reducers/middleware.

import { Type as KafkaType } from 'redux-lenses-streaming';

Once KafkaType.CONNECT_SUCCESS has been received, you will be able to access the connection via the reducer, which holds a connection object and a subscriptions array (empty at this point).
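For instance, our own sessionReducer could keep a simple flag; only KafkaType.CONNECT_SUCCESS comes from the library, the state shape here is just an example:

import { Type as KafkaType } from 'redux-lenses-streaming';

const initialState = { connected: false, messages: [] };

export function sessionReducer(state = initialState, action) {
  switch (action.type) {
    case KafkaType.CONNECT_SUCCESS:
      // The lenses reducer now holds the connection; we only flag it locally.
      return { ...state, connected: true };
    default:
      return state;
  }
}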

As you may have noticed, the options payload includes user and password properties. If they are provided at the connect stage, the library will automatically attempt to log in. Another way to achieve this is to dispatch a LOGIN action later using the provided action creator (see the docs for the full list).
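A sketch of the second approach, where Kafka.login is an assumed name for that action creator (store and Kafka as imported in the earlier sketches):

// Log in after connecting, e.g. once credentials arrive from a form.
store.dispatch(Kafka.login({ user: 'admin', password: 'admin' })); // assumed action-creator name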

By this point we should also see KafkaType.KAFKA_HEARTBEAT messages, which we can track in our reducers.

Subscribing to and unsubscribing from topics

The next step is to subscribe to a topic. For this we will use the subscribe action creator, imported in the same way as connect, and send as the payload an object with an sqls property (a string).

An example SQL subscription would look like this:
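Here the tickers query from the start of the post is passed as the sqls string; Kafka.subscribe is an assumed name for the action creator, and dispatch is the store's dispatch (or the one injected by react-redux):

dispatch(
  Kafka.subscribe({
    sqls: `SELECT bid, ask, _key.* as ticker
           FROM tickers
           WHERE _key in ('EURUSD', 'GBPEUR') and _ktype=STRING and _vtype=AVRO`,
  })
);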

On a successful subscription, a SUBSCRIBE_SUCCESS action will be dispatched and the subscriptions array in the lenses reducer will be updated with the new subscription.

To listen for Kafka messages, we'll add an action handler for KafkaType.KAFKA_MESSAGE in our application reducer (named sessionReducer in the example). The payload has a content property containing the array of messages, which we can visualise in a data grid or chart.
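A stripped-down version of sessionReducer showing just that handler; only KafkaType.KAFKA_MESSAGE and the payload's content property come from the library, the rest is illustrative:

import { Type as KafkaType } from 'redux-lenses-streaming';

export function sessionReducer(state = { messages: [] }, action) {
  switch (action.type) {
    case KafkaType.KAFKA_MESSAGE:
      // payload.content is the array of messages received for our subscriptions
      return { ...state, messages: state.messages.concat(action.payload.content) };
    default:
      return state;
  }
}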

In order to unsubscribe, all we have to do is dispatch an unsubscribe action with the following payload:

const request = {
  topics: [topic],
};
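Then hand the request to the unsubscribe action creator (Kafka.unsubscribe is an assumed name, as above):

dispatch(Kafka.unsubscribe(request)); // assumed action-creator name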

Publishing to a topic

In a similar way to subscribing, for publishing we dispatch a publish action with a payload along the following lines.
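The exact payload shape is defined by the library, so treat the property names below (topic, key, value) and the Kafka.publish action-creator name as placeholders and check the README; the idea is simply a record aimed at a topic:

dispatch(
  Kafka.publish({
    topic: 'tickers', // assumed property names throughout
    key: 'EURUSD',
    value: JSON.stringify({ bid: 1.0921, ask: 1.0924 }),
  })
);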

The acknowledgement will be of the PUBLISH_SUCCESS type. If you have already subscribed to that topic, you should see the message streamed back to the client.

What’s next?

We are working on adding additional features to the library, such as message batching, delay and timeout customisation.

We recommend trying out the demo project and letting us know what features you would like to be implemented next.

Additional Resources

How to explore data in Kafka topics with Lenses

Find out about Lenses for Apache Kafka

Lenses documentation

Lenses SQL for Kafka

Lenses for Kafka Development
The easiest way to start with Lenses for Apache Kafka is to get the Development Environment, a Docker image that bundles Kafka, ZooKeeper, Schema Registry, Kafka Connect, Lenses and a collection of 25+ Kafka Connectors, plus example streaming data.

Sebastian Dragomir
lenses.io

Software Engineer. Consultant. Coder. Sometimes bug fixer. Developing Web Apps for Investment Banking, Healthcare, Educational Sector and startups since 2007.