Data Visualization, Ants, and Synesthesia

Shahid Karim Mallick
4 min read · Jun 24, 2016


My goal is to improve the way we interact with and share information.

A major way we share information, knowledge, and discoveries is through the depiction of data. All information is data in different forms, if you think about it. A huge application of natural interfaces and other communication tools is therefore data visualization: graphical depictions of data that are more intuitively interpreted and understood, e.g. infographics.

Infographic made with Visual.ly, designed by Ivan Cash

With all the data being acquired on our health, movement, online activity, shopping preferences, etc., data vis tools are becoming increasingly popular. Some of the better-known ones include D3, Tableau, Looker, and Visual.ly, along with more programming-based solutions such as MATLAB and Python.

A fascinating step is being taken by a neuroengineer named Dave Warner, MD, PhD, who has designed a tool for the “spatial visualization of data,” called ANTz after its initial use: examining ant social behavior.

Physiometric data collected from a bicyclist riding uphill and downhill on Cowles Mountain in San Diego. Data points are mapped to geographic location and altitude, and are pulsing according to heart rate, respiration, etc. (data files)

In ANTz, data points are breathing objects in a 3-dimensional world: each has a certain color, size, shape, and texture, and moves or mutates in 3-space according to certain parameters. Built on extensive perception research, the program uses high-contrast shapes and colors that are easy to visually discern, even in a dense cluster of data points. The data can also be placed on topographic maps, and ANTz has been used to study physiometric data from bicyclists, academic profiles of students, even regional conflict in Africa and Asia. These “data worlds” are so useful because they can be spatially traversed and examined from any angle or distance.
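ANTz itself is a standalone 3D application, but the kind of mapping it performs, from data fields to visual attributes, can be sketched in a few lines. The field names, value ranges, and color choices below are invented for illustration; they are not ANTz's actual scheme.

```python
# Rough sketch of an ANTz-style mapping from data records to visual
# attributes. Field names and ranges are illustrative, not from ANTz.

def lerp(value, lo, hi):
    """Normalize value into [0, 1] over the range [lo, hi]."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def to_glyph(sample):
    """Map one physiometric sample to a 3-space glyph description."""
    return {
        # geographic position and altitude become x, y, z
        "position": (sample["lon"], sample["lat"], sample["alt_m"]),
        # heart rate drives the pulse rate of the "breathing" object
        "pulse_hz": sample["heart_bpm"] / 60.0,
        # respiration rate drives size, normalized over a plausible range
        "size": 0.5 + lerp(sample["resp_bpm"], 10, 40),
        # a high-contrast hue picked from a small, easily discriminable set
        "hue": ("red", "green", "blue")[sample["rider_id"] % 3],
    }

# One made-up sample from a ride up Cowles Mountain
sample = {"lon": -117.03, "lat": 32.80, "alt_m": 486.0,
          "heart_bpm": 150, "resp_bpm": 28, "rider_id": 1}
glyph = to_glyph(sample)
print(glyph["pulse_hz"])  # 150 bpm -> 2.5 pulses per second
```

The point of a mapping like this is that every column of the dataset gets its own perceptual channel, so nothing has to be read off an axis label.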

ANTz is currently being ported over to VR so that it can be truly experienced in an immersive way. With VR, you could actually dive into datasets and observe the datapoints breathing all around you. You could explore it from any perspective, at any scale — as a giant or (naturally) an ant.

When I first talked to Dave in April 2015, he explained that ANTz was proving to be incredibly efficient at shedding light on trends in the data. He likes to say that it is especially good at “finding the needle in the haystack.”

How long does it take you to find all the 2's?

Here’s an analogy I like to use: if I showed you a drawing of a bunch of 5s, and there were a few 2s hidden somewhere in there, how long would it take you to find them all? At least a few moments, depending on the size of the cluster of 5s. Grapheme-color synesthetes, however, who see colors when they read symbols, can find the 2s instantly. They simply pop out, perhaps as red dots in a sea of green 5s. (This is actually a common diagnostic test for grapheme-color synesthesia, developed by V.S. Ramachandran and Edward Hubbard.)

How a grapheme-color synesthete might view the same image. The great author Nabokov described his synesthesia: “I see q as browner than k, while s is not the light blue of c, but a curious mixture of azure and mother-of-pearl.” (src)

What if, a week later, I asked whether you remembered where the 2s were, even just their general location? I’d wager the synesthetes would once again outperform the others. You might say, “Well, they’re able to do that because they have more information available to them, another data point. They have color AND shape, while we just have the shape of the numbers to look for.” And that’s true; that’s exactly why they can find the needle in the haystack much more quickly.
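The difference between the two groups can be modeled as a toy search problem: a non-synesthete must inspect cells one at a time, while a color cue makes the targets available all at once. The grid size and counts below are arbitrary, and the "parallel" lookup is only a stand-in for perceptual pop-out.

```python
# Toy model of the Ramachandran-Hubbard pop-out test: plant a few 2s in
# a field of 5s, then contrast a cell-by-cell "serial" search with the
# instant lookup a color cue gives a synesthete.
import random

random.seed(7)
ROWS, COLS, N_TWOS = 10, 10, 3

# Build a grid of '5's and plant N_TWOS '2's at random cells.
grid = [["5"] * COLS for _ in range(ROWS)]
planted = random.sample(
    [(r, c) for r in range(ROWS) for c in range(COLS)], N_TWOS)
for r, c in planted:
    grid[r][c] = "2"

def serial_search(grid):
    """Non-synesthete: inspect every cell's shape, one at a time."""
    hits, inspected = [], 0
    for r, row in enumerate(grid):
        for c, ch in enumerate(row):
            inspected += 1
            if ch == "2":
                hits.append((r, c))
    return hits, inspected

def popout_search(grid):
    """Synesthete: color labels every glyph at once, so the red 2s
    stand out from the green 5s without a serial scan."""
    color = {"5": "green", "2": "red"}
    return [(r, c) for r, row in enumerate(grid)
            for c, ch in enumerate(row) if color[ch] == "red"]

hits, inspected = serial_search(grid)
print(f"serial search inspected {inspected} cells to find {len(hits)} twos")
```

Both searches find the same 2s; the serial one just pays for every cell along the way, which is the "few moments" a non-synesthete spends scanning.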

Now think about virtual and augmented reality. These tools literally add another DIMENSION to data visualization: color in 3d, motion in 3d, multiple parameters extended at once. It makes perfect sense: of course we should be able to visualize data better in a 3d environment and find the needle in the haystack more easily.

Using ANTz to look at regional conflict in Angola, 1960–2002 (details)

This is precisely the motivation behind Dave’s ANTz tool. Interacting with and easily manipulating high-contrast data in 3-space allows us to step into the questions we are trying to answer. Dave theorizes that this ability to fluidly view the data from any perspective lets us create mental models that we can wrap our heads around. Through exploration, we can better conceptualize the entire dataset and how it moves — making it easier to see trends, to find the needle in the haystack, or the 2s in the 5s.

Next, Dave is tackling data sonification — the representation of data through sound — so that one may not just enter a visual world of data, but also hear a complete soundscape. A person might listen for the needle in the haystack instead of just visually scanning for it. Given the many advantages of combined audio-visual interaction (catalogued in my series on auditory processing), I believe that a sonified version of ANTz would have immense potential.
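To make the sonification idea concrete, here is one minimal way a stream of readings could be mapped onto pitch, so an outlier is heard rather than seen. The mapping (60 bpm anchored at 220 Hz, one semitone per 5 bpm) is invented for illustration and is not Dave's actual design.

```python
# Hedged sketch of data sonification: map heart-rate samples onto pitch
# so that an anomalous spike becomes an audible "needle in the haystack".
# The anchor point and bpm-per-semitone scaling are made up.

def bpm_to_freq(bpm, base_bpm=60, base_hz=220.0, bpm_per_semitone=5):
    """Map heart rate to a frequency on an equal-tempered scale."""
    semitones = (bpm - base_bpm) / bpm_per_semitone
    return base_hz * 2 ** (semitones / 12)

readings = [72, 75, 74, 73, 148, 76]  # one anomalous spike
freqs = [bpm_to_freq(b) for b in readings]
# The 148 bpm spike sounds well over an octave above its neighbors,
# so it stands out to the ear without any visual scanning.
print([round(f, 1) for f in freqs])
```

Feeding these frequencies to any tone generator would turn the dataset into a short melody where the spike is unmistakable.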

For more info on grapheme-color synesthesia, see this paper I wrote while studying neuroanatomy at Brown:


Shahid Karim Mallick

I build natural interfaces (see my latest work at smallick.com). Studied neuroscience @BrownUniversity. Product @NeoSensory, previously @CalaHealth