Responsive audio and tactile way-finding methods for the vision impaired

According to Vision Australia, 384,000 people in Australia were living with a vision impairment in 2016, a figure projected to grow to over 500,000 within the next decade. This is a substantial number of people who will face difficulty accessing the built environment. Giudice & Legge (2008, p. 479) state that “to facilitate safe and efficient navigation, blind individuals must acquire travel skills and use sources of non visual environmental information that are rarely considered by their sighted peers”. Currently, much of the technology provided to the vision impaired is static and takes a ‘one size fits all’ approach, meaning people with vision impairments must adapt to their environment. But what if the environment adapted to the specific needs of the user? Smart cities draw on diverse sources of data to react and adapt to the requirements of their citizens, which creates a new opportunity to help those who have difficulty accessing our cities. This thesis explores how new Internet of Things (IoT) and non-visual forms of technology can be used to improve wayfinding accessibility for people with vision impairment.

Adaptive and responsive technologies can enable people with vision impairments to find their way independently and efficiently in unfamiliar environments. To understand the needs of people with vision impairments, research will be undertaken into the adaptive and responsive technologies that already exist in the market and their limitations. Through further exploration and case study analysis, specific requirements will be defined for an adaptive, non-vision-based technology to be installed in a public space.

According to Vermeersch et al. (2011, p. 732), “processes of action, perception and cognition turn out to occur in a haptic exploration of a design proposal similar to sighted practice”. For this reason, a tactile model for wayfinding will be explored. Audio wayfinding instructions that “incorporate sensory, motion, and social contact information” will be included, as “visually impaired people rate their level of workload less” when messages have these features (Bradley & Dunlop 2005, p. 402). The design solution will make use of Arduino (a simple electronics prototyping platform), Particle (an internet-connected, Arduino-compatible board), tactile buttons and a 3D-printed tactile model to provide audio and tactile information to users. These technologies will be consolidated into a product that can be placed in the reception or lobby area of a public building to convey important, specific information to vision-impaired visitors.

From the research undertaken, it will be seen that the adaptive and responsive nature of digital technologies can improve wayfinding conditions for the vision impaired. Non-visual methods of communication are necessary and can be produced using Arduino and related technologies. The design solution applies this technology by providing specific, real-time information to users through auditory feedback and tactile interactions.

From this research, it can be seen that existing static technologies for the vision impaired are insufficient. Because the number of people with vision impairments is increasing, the way places are designed must fundamentally change. Digital technologies are an effective means of providing real-time audio and tactile information and are therefore needed to assist the vision impaired.


Visual impairment, tactile, audio, real-time, wayfinding.


Blindness and vision loss 2016, Vision Australia, accessed 27 June 2017, <>.

Vermeersch, PW, Nijs, G & Heylighen, A 2011, ‘Mediating artifacts in architectural design: a non-visual exploration’, in Proceedings of CAAD Futures 2011: Designing Together, July 2011, p. 732.

Giudice, N & Legge, G 2008, ‘Blind Navigation and the Role of Technology’, in A Helal, M Mokhtari & B Abdulrazak (eds), The Engineering Handbook of Smart Technology for Aging, Disability, and Independence, John Wiley & Sons, Inc., Hoboken, p. 479.

Bradley, N & Dunlop, M 2005, ‘An experimental investigation into wayfinding directions for visually impaired people’, Personal and Ubiquitous Computing, vol. 9, no. 6, p. 402, DOI: 10.1007/s00779-005-0350-y.