Empowering the Blind with Data

Designing data displays that play to the strengths of blind individuals.

--

This image shows a blind individual with a cane on a sidewalk. The individual is facing a car that is moving in the opposite direction. The image depicts sound waves traveling from the car and from a faraway construction site. The individual is using the sound waves and haptic cane feedback to construct a mental map of the layout.
Navigating outdoor space using non-visual senses. Blind individuals build mental maps by sensing environmental sounds and by using their cane for echolocation and haptic feedback; in other words, they combine sound and touch for non-visual sensemaking of spatial concepts. Our work explores how these capabilities can be used for accessible data visualization.

Blind people deserve universal access to information, but current data access mechanisms are not always up to the task. In particular, as many disability advocates have pointed out, data visualizations are typically inaccessible to blind users. Furthermore, as the world becomes increasingly inundated with data, the need to access that data will only become more pressing in the future.

To begin to address this problem, we (a team of researchers at the University of Maryland, including two faculty, two Ph.D. students, and a Blind professional) asked ourselves how to design data displays that not only help blind people understand large datasets, but that also leverage the strengths that blind individuals bring to the table. Just like data visualization draws on the intrinsic strengths of the human visual system, surely there are other skills and abilities that a blind person could use in a similar fashion to efficiently explore large amounts of data? To answer this question, we conducted interviews with a group of Orientation & Mobility (O&M) instructors on how blind individuals perceive the world. Our findings will be presented at the end of October in a research paper at the upcoming IEEE VIS 2021 conference, titled “Towards Understanding Sensory Substitution for Accessible Visualization: An Interview Study” (see the end of this article for the full citation).

This image shows the Data Visualization pipeline where raw data is transformed into structured data tables, and then subsequently mapped to visual structures, and views. The human interacts with these views to complete various data tasks. How do we successfully replace the visual sense here with sound and touch?

Sensory substitution is standard practice in accessibility research where the loss of one sense, such as vision, is balanced through other senses, such as touch or hearing. O&M instructors, who teach blind individuals to navigate the world, are excellent people to ask about this practice because many of the skills they teach deal with navigating and interacting with the world using non-visual senses. Many, if not most, O&M instructors are also blind; that was true for all ten of the instructors we spoke with as part of this study. In other words, these are precisely the people you want to hear from when it comes to designing technology to aid blind individuals in understanding data!

Visualization accessibility challenges are currently addressed using sound, speech, and touch. These solutions support access through assistive technology such as screen readers, non-speech sounds (sonification), and tactile (touch) representations.

This contains images of tactile maps, sonified maps, and screen reader icons from past research work.
Sensory substitution using sound, speech, and touch.
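To make the non-speech-sound approach concrete, here is a minimal sketch of pitch-mapped sonification, a common technique in this space. This is a hypothetical illustration, not code from our study: the function name and frequency range are our own choices. Each data value is mapped linearly onto a pitch, so higher values sound as higher tones when played back.

```python
# Minimal pitch-mapping sonification sketch (hypothetical example).
# Data values are linearly rescaled into an audible frequency range,
# so a listener can hear the shape of a data series as rising and
# falling pitch.

def value_to_frequency(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Linearly map a data value in [vmin, vmax] to a frequency in Hz."""
    if vmax == vmin:
        # Degenerate series: every value maps to the middle pitch.
        return (fmin + fmax) / 2.0
    t = (value - vmin) / (vmax - vmin)  # normalize to [0, 1]
    return fmin + t * (fmax - fmin)

# Example: sonify a small bar-chart-like series.
series = [3, 7, 1, 9, 5]
lo, hi = min(series), max(series)
tones = [value_to_frequency(v, lo, hi) for v in series]
# The maximum value (9) maps to the top of the range: 880.0 Hz.
```

In a real system, each frequency would then drive a tone generator (for example, a Web Audio oscillator) for a fixed duration per data point; the mapping above is only the data-to-sound encoding step.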

In analyzing the transcripts from these interviews, we collected many insights and design guidelines for how best to build the next generation of accessible data displays. Our insights focused both on the use of touch and sound as the primary substitution mechanisms for vision, and on practical aspects of how to use these sensory channels to best effect. For example, our participants consistently stressed that touch is more akin to vision than sound because of its two-dimensional nature and its superior ability to give an overview of a tactile data display. At the same time, sound is easier to produce and can be particularly effective for a trained listener, especially a blind person who already relies heavily on hearing to navigate the world.

Our findings also give clear guidelines for further work in this area. Perhaps most important is the realization that blind individuals already tend to be familiar with traditional data visualizations, such as bar charts, pie charts, and line charts. Beyond the fact that some individuals were born sighted and saw such charts before they became blind, blind and sighted people alike encounter basic visual representations such as maps and charts in everyday life. In other words, the term “accessible visualization” is not a misnomer, and continuing to use standard visualization techniques as a starting point for sensory substitution is a good idea. Overall, these results provide ample encouragement for continuing the work on accessible data displays in the future.

One O&M instructor (OM5) on chart accessibility:

“…And in most of the companies that I’ve ever worked in, in my life, I was the only blind person there, so what was useful to me, if it wasn’t useful to other people on my team, or in the company, then it wasn’t going to be useful. In other words, we live in a sighted world, and if we live in a sighted world then we have to figure out how am I going to communicate the information I have to other sighted people in a way that makes sense to them…”

Full citation:

  • Pramod Chundury, Biswaksen Patnaik, Yasmin Reyazuddin, Christine W. Tang, Jonathan Lazar, Niklas Elmqvist. Towards Understanding Sensory Substitution for Accessible Visualization: An Interview Study. IEEE Transactions on Visualization & Computer Graphics, 2021. [PDF]

--
