IRIS

Ignacio Barboza
13 min read · Apr 15, 2020

--

INTRODUCTION

In sighted people, messages from the retina are processed in a region at the back of the brain called the primary visual cortex. Neuroanatomy credits this region with the “human ability to receive, integrate and process visual stimuli” (Furlan, 1). When someone lacks vision, however, the visual cortex seems to apply its spatial mapping ability to the other senses.

For someone who is lacking vision or has a vision disability, the brain can rewire itself to recognize the world in very different ways by using the remaining four senses. The activation of the other senses launches information deeply into the heart and mind, mapping out the very rich and wonderful world of shapes and sounds, aromas and tastes; rendering a vivid impression of the surroundings in a complex, intertwined and profound way.

Growing up, I always questioned how my uncle Marcus perceives the world. He was born with a vision disability in the late fifties in San José, Costa Rica, and with limited access to technological, medical, psychological and pedagogical advancements, his life hasn’t been easy. Nonetheless, he’s one of the happiest individuals I’ve ever met.

This is my story of how User Experience Design can raise the technological bar for the promotion of independence and inclusivity for populations living with visual disabilities.

How can design and biology merge to improve the quality of life of its users and enhance human interactions by creating dignified experiences?

IRIS VISION

Over coffee with my uncle, it came to my attention that he had never done an online banking transaction, because the prevailing user interface architecture relies on visuals. Since then I’ve felt driven to apply my knowledge and expertise toward a more inclusive vision.

IRIS is an online platform that provides a user-friendly and accessible auditory interactive layer for sites of banking and financial institutions. The responsive system allows for a secure and private experience by incorporating Web Knowledge Authentication, Face and Touch Identification. The system takes advantage of emergent technologies like voice user interfaces and will ultimately incorporate the latest artificial intelligence software.

User Experience Design Timeline

BACKGROUND

According to the World Health Organization (WHO), two in ten adults have a disability, corresponding to approximately “one hundred-and-ninety million people worldwide” (WHO, 1). WHO states that:

“Disability is thus not just a health problem. It is a complex phenomenon, reflecting the interaction between features of a person’s body and features of the society in which he or she lives. Overcoming the difficulties faced by people with disabilities requires interventions to remove environmental and social barriers” (WHO, 2).

Many people believe that the vision disability itself is the obstacle, but often the greater challenge is managing those around you. People living with disabilities experience a narrower margin of health, both because of poverty and social exclusion and because they may be vulnerable to secondary conditions. Global evidence suggests that people with disabilities have inequitable access to services in the health, technology and financial sectors, among others.

At the Society for Neuroscience Annual Meeting in San Diego California, scientists discovered “a unique group of individuals with superhuman mental intelligence” (Fields 1). Functional brain imaging unveiled that some blind people’s brains can rewire themselves, giving them “extraordinary auditory comprehension” (Fields, 2).

Researchers at the Hertie Institute for Clinical Brain Research at the University of Tübingen in Germany have found systematic support for this hypothesis: “blind people can easily comprehend speech that is sped up far beyond the maximum rate that sighted people can understand” (Duffy, 1). Sighted people can process speech at a maximum of about six syllables per second. This study reported that people with visual disabilities can comprehend and analyze speech at about twenty-five syllables per second.

Subjects in this study were closely analyzed under functional magnetic resonance imaging (fMRI). Their primary visual cortex seems to activate, powering a high-speed comprehension mechanism. In sighted people, this region of the brain, also called V1, is “only stimulated by light” (Duffy, 2). When sight is not present, the brain rewires to work cross-modally. Users with visual disabilities can take full advantage of today’s technology, since this ‘superpower’ facilitates easier access to user interfaces. Technology and biology can merge to provide an optimal experience.

BACKGROUND TECHNOLOGY

Stanford University led a study in 2016 in which subjects could choose between tapping on their phones or talking to them through a voice user interface. The researchers concluded that “talking to your phone is about three times faster than tapping” (Carey, 1). One of the most striking findings showed that, with the latest speech-to-text software, talking to your phone was also more accurate than tapping. These findings open a space to design inclusive proposals that advance both technological progress and the conditions of people with visual disabilities.

The latest development in the Voice User Interface (VUI) industry is the Conversational Voice User Interface (CVUI), which differs structurally from a conventional VUI. Rather than command and answer, CVUIs simulate natural language, learn from experience and are enhanced by artificial intelligence. Herbert Paul Grice, a British philosopher of language whose work on meaning has influenced “the philosophical study of semantics” (Encyclopædia Britannica, 1), explained natural conversation as “accountable by how people speak cooperatively” (Grandy, 1). His Cooperative Principle introduces four maxims:

  1. Quality — “Only say things that are true”
  2. Quantity — “Don’t use more information than is necessary”
  3. Relevance — “Only state content that is relevant”
  4. Manner — “Be brief and avoid ambiguity”

IRIS follows these CVUI principles and presents a business and financial solution through a voice user interface, generating an experience that is eyes-free, hands-free and omnidirectional. The system can be adopted without interfering with or altering the behavior of its users, and it can be applied to any smart device. Although the device can be switched over time, the UX core can remain the same, thereby lowering the costs of production, maintenance and software updates.
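As an illustration of how these maxims shape a voice response, here is a minimal Python sketch. The function name, wording and currency are hypothetical, not actual IRIS code:

```python
# Hypothetical sketch of a CVUI reply shaped by Grice's maxims.
# None of these names come from IRIS itself; they are illustrative.

def balance_reply(account_name: str, balance: float, currency: str = "CRC") -> str:
    """Compose a spoken reply for a balance inquiry.

    Quality:   the figure comes straight from the account record.
    Quantity:  state only the requested balance, nothing more.
    Relevance: no promotions or unrelated account details.
    Manner:    one short, unambiguous sentence.
    """
    return f"Your {account_name} balance is {balance:,.2f} {currency}."
```

A maxim-violating reply, by contrast, would bury the balance in greetings and promotions, forcing the listener to parse filler before reaching the answer.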

IRIS PROCESS

The framework of the IRIS system is constructed upon four main pillars: user experience, accessibility features, employment, and community engagement. Through the application of user experience design tools, IRIS reduces the friction between design and disability and incorporates accessibility features that are embedded into its operation.

In a further development, the service would deliver employee training and sensitivity for a wider inclusive banking experience. Additionally, the service would work with a dynamic and relational vision that facilitates communication, interaction, involvement, and exchange between the users with visual disabilities and the organization.

LITERATURE STUDY

Initially, the work consisted of an extensive literature study across a variety of scholarly sources, ranging from design articles, books and previous interviews to videos and blog posts. The research drew mainly on Stanford University’s panel “Creating AI Conversations Panel Series: Voice User Interface Design & Nonverbal Communication in AI” and on “The Senses: Design Beyond Vision,” a book by Andrea Lipps and Ellen Lupton.

PRE-STUDY

As part of a pre-study, interviews were conducted with experts in the field. This phase was done in collaboration with “Nueva Luz,” an organization based in San José, Costa Rica, that serves as an educational care center for adults with various disabilities. “Nueva Luz” is a pioneering private initiative that offers education and development options for populations with disabilities.

They work closely with the Ministry of Education and offer personalized programs for students in order to promote social independence. The goal of this collaboration was to gain a deeper understanding of the ethical concerns and limitations of design work with vulnerable populations.

Additionally, as part of the pre-study and before the user testing was held, several interviews were carried out with Cirsa Alvarado, director of the User Experience Design team at BAC International Bank. Alvarado is responsible for UX strategies across Latin America.

Diverse questions were sent concerning the inclusivity initiatives and programs that the bank runs at a regional level. An analysis of the different digital channels concluded that users with vision disabilities don’t have the same access to these platforms as sighted people. A chart was created to map the limitations of digital and non-digital services across the different types of disabilities.

USER DATA ANALYSIS

During the work, interviews and user-testing sessions were held with users with visual disabilities and later analyzed. The primary objectives were to investigate the contextual auditory information around financial transactions, to explore the digital tools currently in use, and to assess the possibility of implementing voice user interfaces. The study found that multimodal interfaces expedite the process by combining voice, haptics, and quality visuals.

A semi-structured interview format provided a space for open dialogue, in contrast to a questionnaire or other more formal methods. All participants live in Costa Rica and were recruited from different online platforms. The participants’ degree of visual disability varied with their condition: some lost their sight at birth, while others lost their vision in later years.

The interviews were first transcribed, and the transcripts served as the starting point for the user data analysis. The most relevant feedback was then written on post-it notes and grouped into themes by similarity, ordered chronologically by interview.

THEMES

The following themes were predominant in almost all the interviews:

  1. There is a difference between users with low vision and users who were born blind.
  2. CVUIs should be able to accommodate the full spectrum of visual disability, giving users complete independence and control over the different functionalities.
  3. Using CVUIs is not ideal in public spaces.
  4. Privacy is always a barrier when using online tools.
  5. Security measures that are currently being implemented are not user friendly or inclusive.
  6. Some online platforms don’t follow accessibility norms like the “Accessible Rich Internet Applications (ARIA)” specification.
  7. Some security measures have a time constraint, which in some cases, is not compatible with screen readers.
  8. Online platforms are usually not user-friendly with screen readers. Users require turning them off, switching their phone settings, etc.
  9. The visual navigation of newer online platforms is initially hard to learn.
  10. Users learn to navigate from A-B and B-A.
  11. A linear navigation method would be the most ideal and inclusive.
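
The last two themes suggest a simple model: navigation as an ordered list of steps traversed forward (A-B) and backward (B-A), with a repeat command. Below is a minimal Python sketch; the step names are illustrative, not the actual IRIS flow:

```python
class LinearFlow:
    """Ordered steps a user can traverse forward (A-B) and back (B-A)."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.index = 0

    def current(self) -> str:
        return self.steps[self.index]

    def next(self) -> str:
        # Advance one step, stopping at the last step instead of wrapping.
        self.index = min(self.index + 1, len(self.steps) - 1)
        return self.current()

    def back(self) -> str:
        # Retrace the same path in reverse (B-A).
        self.index = max(self.index - 1, 0)
        return self.current()

    def repeat(self) -> str:
        # Re-announce the current step without moving.
        return self.current()
```

Because the path back is the exact reverse of the path forward, a user who has learned A-B automatically knows B-A.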

The goal of this qualitative study was to identify and understand the pain points of users with visual disabilities in managing their personal finances, to empathize with how they have attempted to relieve those struggles with existing tools, and to gather insights about VUIs and haptics.

Photo by JJ Ying on Unsplash

CONCEPT

CONCEPT GUIDE INFORMATION

The concept of the IRIS system was formulated with information gathered from literature studies, expert interviews, user interviews and prototyping. The initial concept was built with the platform “Voiceflow,” which creates high-fidelity auditory mockups. For this phase, user journeys were built and tested iteratively.

Language settings were tested with two concepts, in English and Spanish. Users with little to no knowledge of English were only shown the Spanish concept, whereas bilingual users were shown both.

CONCEPT FUNCTIONALITIES

Through the analysis of the data, conclusions were drawn for each concept functionality. Future iterations of the concept can accommodate different types of visual disability, incorporating a multimodal function that combines voice, vibrations, sounds, and high-quality inclusive visuals.

Since some users indicated that they had low vision, feedback that takes visual interfaces into consideration will also be included. IRIS is designed inclusively for users with low vision, following the “GOV.UK, Home Office Digital, Data and Technology (DDaT)” guidelines. The layout of the screens was originally designed with a high-contrast mode and bold, readable text. The system also applies a combination of colors, shapes and text, and follows a linear, logical layout. Buttons and notifications are put in context.

Original Wireframes
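The high-contrast layouts above can also be checked programmatically. The sketch below implements the WCAG 2.x contrast-ratio formula, which accessibility guidelines such as DDaT build on; it is an illustration, not code from the project. WCAG asks for a ratio of at least 4.5:1 for body text and 3:1 for large text.

```python
# Sketch of the WCAG 2.x contrast-ratio calculation for 8-bit sRGB colors.

def _channel(c: int) -> float:
    # Linearize an 8-bit sRGB channel (gamma expansion).
    s = c / 255
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def luminance(rgb) -> float:
    # Relative luminance from linearized red, green, blue.
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    # (L_lighter + 0.05) / (L_darker + 0.05); ranges from 1:1 to 21:1.
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)
```

Black on white reaches the maximum 21:1 ratio; a design tool can run this check over every text/background pair in a layout.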

For users with no vision at all, the system runs solely on voice, sounds, and vibrations. Users receive integrated auditory feedback, like the sound of a piano key indicating a successful transaction. If users are stuck or need to go back in the process, they can give a verbal command for the system to repeat itself or step back.
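One way to organize that multimodal feedback is an event-to-cue table pairing each system event with a sound, a vibration pattern, and a spoken message. A hedged Python sketch follows; the event names, cue files and wording are hypothetical, not part of IRIS:

```python
# Hypothetical mapping from system events to multimodal feedback cues:
# (sound file, vibration pattern in ms, spoken message).

FEEDBACK = {
    "transaction_success": ("piano_key.wav", (100,), "Transaction completed."),
    "input_needed":        ("soft_chime.wav", (50, 50, 50), "Please repeat the amount."),
    "error":               ("low_buzz.wav", (300,), "Something went wrong. Say 'back' to retry."),
}

def feedback_for(event: str):
    """Return the (sound, vibration, speech) cue for an event,
    falling back to the error cue for unknown events."""
    return FEEDBACK.get(event, FEEDBACK["error"])
```

Delivering the same information through sound, vibration and speech at once lets each user rely on whichever channel reaches them best.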

CONCEPT STRATEGIES

IRIS is a cloud-based service, which allows development and design teams to ship bug fixes, updates, and service expansions on the fly. The marketing structure of the service is shared across multiple financial institutions, splitting the development cost.

The system is designed to be marketed with a Year One Exclusivity, giving the first banks that acquire the service exclusive pricing. The license is sold as a monthly subscription that includes service and maintenance costs, making IRIS innovative and inclusive by raising the bar for user experience in the banking field.

CONCEPT METRICS

The financial metrics and marketing plan started with the creation of a Value Proposition Canvas (VPC), following “Strategyzer’s” template. The chart below visualizes the IRIS format:

“Strategyzer”

The marketing plan covers the first four operational months. In the first month, the team would hold acquisition and sales meetings with the initial financial institutions, providing a learning curve for other institutions in later months. At the same time, the team would run product samplings and live demos. From the second to the fourth month, the team would work closely with a marketing team in charge of creating brand interactions, weekly trend-based content, and scheduled content for continual promotion.

Concept — Four Month Market Plan

USER JOURNEY

The goal of a user journey is to map out the experience of a user or customer when utilizing a service or product. Ordinarily, user journeys are defined by one of two scenarios: what the service or product currently does, or what it could do.

For this work, the IRIS service is mapped out in a user journey to demonstrate the vision of the project, communicating the possible stakeholders, user pains, needs and possible outputs. The user journey also served to identify functionality levels, dependent on the degree of visual disability. This user experience tool helped define user flows and the overall information architecture.

FUTURE

Executing the proposed concept would be deeply satisfying: it could become a much-needed technological solution for populations with visual disabilities, and a design breakthrough that would make me feel personally accomplished.

IRIS will create autonomy, independence, and inclusivity for a population that has been excluded from dignified digital experiences. Having the empathy to understand the struggles that people with disabilities face day-to-day ignited the need to design for those in vulnerable conditions. I felt driven to use my design skills and talent to create a platform that facilitates the banking experience for so many users. IRIS will change and enhance the quality of life with its vision of User Experience Design.

CITATIONS

Carey, Bjorn. “Smartphone Speech Recognition Can Write Text Messages Three Times Faster than Human Typing.” Stanford News Service, 24 Aug. 2016, news.stanford.edu/press-releases/2016/08/24/stanford-study-sn-faster-texting/.

Duffy, Maureen. “Two for Blindness and Neuroscience.” Vision Aware Organization, VisionAware Blog, 24 Sept. 2012, visionaware.org/blog/visionaware-blog/two-for-blindness-and-neuroscience/12/.

Fields, R. Douglas. “Why Can Some Blind People Process Speech Far Faster Than Sighted Persons?” Scientific American, Scientific American, 13 Dec. 2010, www.scientificamerican.com/article/why-can-some-blind-people-process/.

Furlan, Michele, and Andrew T. Smith. “Global Motion Processing in Human Visual Cortical Areas V2 and V3.” Integrative Neuroscience Research, 2016, doi:10.35841/neuroscience.

Grandy, Richard E. and Warner, Richard, “Paul Grice”, The Stanford Encyclopedia of Philosophy (Winter 2017 Edition), Edward N. Zalta (ed.), URL = <https://plato.stanford.edu/archives/win2017/entries/grice/>.

“Semantics.” Edited by The Editors of Encyclopaedia Britannica, Encyclopædia Britannica, Encyclopædia Britannica, Inc., 1 May 2017, www.britannica.com/science/semantics.

WHO, “World Report on Disability.” World Health Organization, World Health Organization, 2020, www.who.int/.

REFERENCES

Watson, Leonie. “What Is a Screen Reader?” Nomensa, 2005, www.nomensa.com/blog/2005/what-screen-reader.

Lupton, Ellen, and Andrea Lipps. The Senses: Design beyond Vision. Copper Hewitt, Smithsonian Design Museum, 2018.

Olofsson, Stina. “Designing Interfaces for the Visually Impaired.” Master’s Thesis in Interaction Technology and Design, Umeå University, Department of Applied Physics and Electronics, Fall 2017, www.diva-portal.org/.

Pearl, Cathy, et al. “Creating AI Conversations Panel Series: Voice User Interface Design & Nonverbal Communication in AI.” Stanford Arts, 14 Feb. 2018, arts.stanford.edu/event/75479/.

Oviatt, Sharon. “Ten Myths of Multimodal Interaction.” Communications of the ACM, vol. 42, no. 11, Nov. 1999, pp. 74–81.

SOFTWARE

“Adobe XD.” 2020.

“Adobe Premiere Pro.” 2020.

“Miro.” 2020.

Ream, Braden. “Voiceflow.” Voiceflow: Design, Prototype and Build Voice Apps, 2020, www.voiceflow.com/.
