
Measuring Visual Complexity with Pixel Approximate Entropy

Gabriel Ryan · Published in thewulab · 4 min read · Sep 5, 2018

When designing visualizations, making charts that are straightforward to read and interpret is usually desirable. This becomes especially important in settings where users may have to make fast judgements based on the visualizations, such as in emergency rooms or on financial trading floors. In our recent paper, we introduce a new method for quantifying visual complexity, Pixel Approximate Entropy, that can be used to develop better visualizations in these types of settings.

Examples of visually simple and complex charts.

For an example of visual complexity, look at the chart on the right. Intuitively, it is more difficult to read than the chart on the left. To improve readability, a visualization program could emphasize the important aspects of the data to make them easier to read. However, current visualization programs have no way to identify which charts are difficult to read. Pixel Approximate Entropy solves this problem by providing a “visual complexity score” that can be used to identify difficult charts.

This research was a collaboration between Gabriel Ryan and Eugene Wu of the WuLab, and Abby Mosca and Remco Chang at Tufts. We will present this work at the IEEE VIS 2018 conference in Berlin, Germany, during the week of Oct 21–26, 2018. Come by and say hi if you will be attending!

Pixel Approximate Entropy scores for charts.

What is Pixel Approximate Entropy?

Pixel Approximate Entropy adapts Approximate Entropy for use as a quantitative visual complexity measure. Approximate Entropy was originally developed for low-dimensional systems such as time series [1]. It works by running a sliding window over the data, comparing each pair of windows by the largest distance between their corresponding points, and counting how many pairs fall under a given threshold.

The following image shows a snapshot of the comparison of two windows in the Approximate Entropy calculation. The two windows are lined up and the distance between each pair of points is calculated. The maximum distance between a pair of points (the leftmost points in this example) is taken as the overall distance score for the two windows, and the pair is added to the count of close windows if that distance is under the threshold.
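Sketched in Python, a single window comparison looks like the following (the window values and threshold here are illustrative, not taken from the paper):

```python
import numpy as np

# Two length-3 windows drawn from a hypothetical series.
w1 = np.array([4.0, 5.0, 3.0])
w2 = np.array([4.5, 5.2, 3.9])

# Distance between each pair of corresponding points.
pair_dists = np.abs(w1 - w2)

# The maximum pair distance is the overall distance for the two windows.
dist = pair_dists.max()

# The pair is added to the count of close windows if that distance
# falls under the threshold.
threshold = 1.0
is_close = dist <= threshold
print(dist, is_close)
```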

Windowing comparison process for Approximate Entropy.

Intuitively, in more random or “noisy” data, fewer windows remain under the distance threshold as the window size increases, resulting in a higher Approximate Entropy.

In Pixel Approximate Entropy, we first scale the data so that each point represents a pixel, then calculate the Approximate Entropy score of the scaled data. This ensures the resulting complexity score reflects the chart the user sees, and not the underlying data.
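A minimal Python sketch of this pipeline might look like the following (the pixel dimensions, window length m, and threshold r are illustrative choices, not the pae package's defaults):

```python
import numpy as np

def approx_entropy(x, m=2, r=2.0):
    """Approximate Entropy: compare every pair of length-m windows by the
    maximum distance between corresponding points, count the close pairs,
    and measure how that count drops when the window grows to m + 1."""
    n = len(x)

    def phi(m):
        windows = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Pairwise window distances: maximum over corresponding points.
        dists = np.max(np.abs(windows[:, None, :] - windows[None, :, :]), axis=2)
        # Fraction of windows close to each window (self-matches included).
        close = np.mean(dists <= r, axis=1)
        return np.mean(np.log(close))

    return phi(m) - phi(m + 1)

def pixel_approx_entropy(series, width=200, height=100, m=2, r=2.0):
    """Rescale the series to the chart's pixel resolution, then score it."""
    x = np.asarray(series, dtype=float)
    # Resample so each point corresponds to one horizontal pixel.
    xs = np.interp(np.linspace(0, len(x) - 1, width), np.arange(len(x)), x)
    # Rescale values to the chart's vertical pixel range.
    span = xs.max() - xs.min()
    ys = (xs - xs.min()) / (span if span > 0 else 1.0) * (height - 1)
    return approx_entropy(ys, m=m, r=r)

# A smooth trend should score lower than random noise at the same pixel size.
ramp = np.linspace(0.0, 1.0, 300)
noise = np.random.default_rng(0).standard_normal(300)
print(pixel_approx_entropy(ramp), pixel_approx_entropy(noise))
```

Because the score is computed on the pixel-scaled series, the same underlying data rendered at different chart sizes can receive different scores, which is exactly the point: the measure tracks what the user sees.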

How do we know it works?

To determine whether or not Pixel Approximate Entropy is a useful visual complexity measure, we conducted a series of user studies to see how well it predicted user performance on visual tasks. Specifically, we used two visual tasks:

  1. Chart Matching: Identifying which of two charts was identical to a previously seen chart
  2. Chart Classification: Identifying what type of shape a chart was showing

Screenshots of the Chart Classification task.

Our studies found a clear correlation between the Pixel Approximate Entropy of the charts and user performance on both tasks, showing that Pixel Approximate Entropy works as a visual complexity measure.

Correlation between entropy and performance on Chart Classification task. Both user accuracy and confidence decrease to minimal values as Pixel Approximate Entropy increases!

What are some use cases?

We foresee several potential uses for Pixel Approximate Entropy, including highlighting changes in visualizations, approximate visualizations of large datasets, parameter selection for visualization, and guided chart simplification techniques.

If you want to try out Pixel Approximate Entropy, it's available in both Python and JavaScript. Check out the project at https://github.com/cudbg/pae, or install the package with pip install pae or npm i pae.

[1] S. M. Pincus. Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences, 1991.

[2] “turned on flat screen monitor” by Chris Liverani on Unsplash



Gabriel Ryan is a PhD Student at Columbia University, where he conducts research in Machine Learning, Security, and Visualization. cs.columbia.edu/~gabe