React/D3 Graph Visualisation

Vsevolod (aka Seva) Dolgopolov
captureriver lab
Aug 16, 2018 · 4 min read

Every application running in production generates data, and when it comes to application logs, the amount of generated data can grow huge very fast. If you analyse it line by line in raw format, you will get overwhelmed trying to make any sense of it. To be able to focus on the relevant information, we need the data visualised as a graph (or table), so we can understand the general trends at scale.

The ELK Stack does a great job of making data visible and has actually become the standard when it comes to visualising really big, heterogeneous amounts of data. Though Kibana is a powerful interface for exploring your logs persisted in Elasticsearch, there are use cases where you would like to have this data visualised within some custom interface. For example, if you need to provide time series charts over business-meaningful logs (like domain events) for your customers.

The ELK (Elasticsearch, Logstash, Kibana) Stack; Beats is an alternative to Logstash

Our use case is exactly that one: showing statistics of captured object movements persisted in an Elasticsearch index. After a short research we found the Brush & Zoom example, provided by Mike Bostock, the creator of the d3.js library. It pretty much fits the main requirements our visualisation has to provide:

  • possibility of a detailed view
  • keeping track of the big picture
  • good browser performance

Mike Bostock’s Brush & Zoom graph visualisation example

The last one, good performance, was guaranteed in the first place by the power of the d3.js library, which smoothly animates all the transitions taking place while we explore the graph’s timeline. But there are also a few problems to solve, since our user interface (UI) has more than just that single graph to render. The problems are:

  • isolation from other UI elements
  • efficient re-rendering of the graph for new time series data

Isolation

To resolve these issues we used React.js to provide a clean isolation of our graph from the outside context, and redux with redux-observable to be able to efficiently fetch and update the time series state we visualise.

To do that, we first create a React component that renders an SVG DOM element:

Here we have our foundation for the graph visualisation. Before we delegate the rest to d3.js, we need to create an entry point for d3.js, so that it interacts with the SVG we have created and nothing else. To do that, we apply React refs to make sure d3 is dealing with exactly the SVG element we want.

From here on we can apply the whole awesomeness of d3 and explicitly target the already existing SVG DOM element pre-rendered by React. So the isolation issue is resolved. What about rendering the time series data?

Re-rendering

Not a big deal: we just use redux’s connect to bind the BrushZoom component to the current application state. Each time new time series data arrives, the BrushZoom component gets it and its componentDidUpdate method is triggered. The actual data fetching and updating of the application state is handled by redux-observable. The only thing left to do is convert the incoming data: since we run an Elasticsearch date histogram aggregation, we need to parse Elasticsearch’s date format into d3’s.
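A date_histogram bucket carries its timestamp as an epoch-millisecond `key`, which converts straight into the `Date` objects d3 time scales consume. A conversion sketch (the aggregation name `per_day` is an assumption, not part of the original post):

```javascript
// Convert an Elasticsearch date_histogram response into the
// { date, value } rows a d3 time scale can consume directly.
// `key` is the bucket timestamp in epoch milliseconds and
// `doc_count` the number of hits in that bucket; the aggregation
// name `per_day` is illustrative.
function toTimeSeries(esResponse) {
  return esResponse.aggregations.per_day.buckets.map(bucket => ({
    date: new Date(bucket.key),
    value: bucket.doc_count,
  }));
}
```

For string timestamps (`key_as_string`), `d3.timeParse` with the matching format specifier would do the same job.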

And we are done: our d3.js graph, wrapped in a React environment, is ready to go.

You may have noticed that we only draw the graph when the React component gets updated. So what is the user going to see when the component renders for the first time? We actually don’t care, because initially the application state is empty anyway. All we need is to trigger redux-observable to fetch the data right after React has rendered our SVG DOM element, and from then on we are already in the component’s update loop. So the re-rendering issue can be closed as well.

Conclusion

Visualising data is a lot of fun. Visualising it properly brings a competitive advantage. To do it, you first need a proper interface for aggregating the data. Elasticsearch, with its REST API and powerful JSON query DSL, makes data aggregation pretty flexible for any JavaScript-speaking client. Through its outstanding scalability, it also ensures you will get a lot of data to visualise.

Once you have your data in place and ready for rendering, a lot of things can still go wrong. Browsers were designed first of all for navigating hypertext links, not for visual data analysis, so you need transparency over what is going on with the data you loaded and the DOM elements it is applied to. React.js and redux were designed for exactly that. d3.js takes on the fanciest part and applies the transitions that React.js is not really good at, at least at a level as low as we have here.

The complete implementation of the graph we talked about here can be found at https://github.com/captureriver-lab/brushzoom
