nav — An HCI Case Study
“With the passage of time, the psychology of people stays the same, but the tools and objects in the world change.” — Don Norman
Don Norman is one of the founders of human-computer interaction (HCI) as it is known today. In his book The Design of Everyday Things, Norman suggests that the principles of design remain constant, but their application requires modification to account for new activities, new technologies, and new methods of communication and interaction (Norman, 2013). For example, humans have used kettles for centuries, but as technology has advanced, we have moved away from copper kettles, which were unsafe (risk of copper poisoning), inefficient, and inconvenient, and towards electric kettles. This is a basic example of how human needs remain the same while the tools around us change. In the case of the kettle, designers have moved away from the top handle-bar and towards side handles, which allow for easier pours. Additionally, on modern kettles you will find a button or switch to turn the kettle on/off, and a see-through portion to show the user how much water is in the kettle. Although these improvements may seem minuscule, they have made a difference in our lives by making boiling water easier and safer.
Introduction
Two decades ago, Fischer (2001) explored the historical context of human-computer interaction (HCI). At the time, the primary focus of HCI research was to inform better user experience (UX) design practices by ensuring systems were more usable, useful, and user-specific. For example, researchers would focus on how everyday users navigate websites. Since then, HCI has expanded into a multidisciplinary field with distinct facets for researchers to explore. Some examples of these new explorations include optimizing how people use technology, think, communicate, learn, critique, explain, debate, calculate, augment, simulate, and design (Fischer, 2001).
HCI research can be approached from different perspectives, such as an emulation approach and a complementing approach (Fischer, 2001). The emulation approach is based on the idea that society wants to build technology with more human-like abilities (e.g., artificial intelligence (AI)). The complementing approach holds that technology cannot simulate human behaviours, so research should focus on the interaction between humans and computers and optimize that collaborative relationship. Norman (2013) argues that HCI's fields are not well defined, but with a high-level understanding of the emulation and complementing approaches we can conceptualize three fields:
Industrial Design: Form & Material | Professional service of creating and developing concepts that optimize the function, value, and appearance of products and systems for the mutual benefit of both the user and manufacturer.
Interaction Design: Understandability & Usability | Increasing the user's awareness and understanding of how they can interact with technology.
Experience Design: Emotional Impact | The practice of designing products, processes, services, events, and environments with a focus placed on the quality and enjoyment of the total experience.
The design principles above are used globally to foster a better understanding of what HCI is and how designers can use it to improve the way we interact with technology. However, you may have noticed there is little to no mention of accessibility above, which suggests that users with accessibility limitations have been somewhat neglected.
In 2019, a reporter from Slate interviewed individuals with accessibility limitations and documented how disheartening living in an inaccessible world can be:
“…a constant feeling of being devalued. It doesn’t matter about the stupid button that I can’t press in that moment. It’s that it keeps happening. … And the message that I keep receiving is that the world just doesn’t value me, and that people really don’t care.” — Chemel, blind
I don’t know about you, but this broke my heart.
It is unkind of humanity to continue moving forward with new technologies without addressing the severe problems faced by populations who cannot access them.
In this Medium story, I will review current research on accessibility and why it matters. I will then define what biosignals are and address how we can integrate biosignals into wearable technology to aid persons with accessibility limitations, followed by a discussion of how we can optimize UX using biosignals. Finally, I will propose an app concept that fuses accessibility, biosignals, and UX.
Discovery & Research
Accessibility
The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect. — Tim Berners-Lee, W3C Director and inventor of the World Wide Web
Web accessibility means that websites, Web tools, and apps are designed and developed so that people with disabilities can use them (WCAG, 2021). All individuals accessing the Web should be able to perceive, understand, navigate, and interact with the Web, and contribute to it.
The World Wide Web Consortium (W3C) publishes accessibility guidelines as the Web Content Accessibility Guidelines (WCAG) 2.0. In Canada, WCAG conformance can be enforced at the provincial level, which means resources published on the Web must meet a certain level of accessibility, otherwise the owner faces legal consequences.
According to WCAG (2021), Web accessibility encompasses all disabilities that affect access to the Web, including:
- auditory
- cognitive
- neurological
- physical
- speech
- visual
WCAG (2021) outlines that Web accessibility also benefits people without disabilities, for example:
- people using mobile phones, smart watches, smart TVs, and other devices with small screens, different input modes, etc.
- older people with changing abilities due to ageing
- people with “temporary disabilities” such as a broken arm or lost glasses
- people with “situational limitations” such as in bright sunlight or in an environment where they cannot listen to audio
- people using a slow Internet connection, or who have limited or expensive bandwidth
Websites, web tools, and apps need to be properly designed and developed so that individuals with disabilities can use them. However, many sites and tools are currently developed without accessibility in mind, making them difficult or impossible for some people to use (WCAG, 2021). Making the web accessible benefits individuals, businesses, and society.
According to the Global Digital report, the number of Internet users reached 4.39 billion in 2019, with year-on-year growth of 9% (Kemp, 2019). There are now more than 1.5 billion websites on the World Wide Web, and the number continues to grow at an accelerated pace. Acosta-Vargas et al. (2019) state that websites related to social networking, education, government, business, and research have a high impact on social and technological development. Therefore, the information and communication tools offered through websites and apps have become the ideal medium for reaching users, exchanging information, and circulating research (Acosta-Vargas et al., 2019).
Web accessibility has recently come into focus for the reasons outlined above: our whole world is slowly being digitized.
Web accessibility allows the inclusion of all types of users, improves access to web content, helps to obtain better results in search engines, and enables the reuse of content across multiple formats and devices. Web accessibility can help to reduce the digital divide, improve efficiency and response times, lower the cost of developing and maintaining websites, and demonstrate social responsibility (Acosta-Vargas et al., 2019).
Accessible websites and apps allow users with a permanent or temporary disability to receive and understand the content of a website, and to navigate it correctly. According to data from the World Health Organization, an estimated 15% of the population, approximately one billion people worldwide, live with some kind of physical or mental disability (WHO, 2019). Hence, web accessibility is crucial, not only because it increases digital equality, but also because it provides better Internet interaction and the benefit of using multiple technologies.
Biosignals
Registered biomedical signals, commonly referred to as biosignals, are a core component of medical diagnosis, subsequent therapy, and assistive applications, such as daily driver monitoring (Schmidt, 2015). Biosignals can be defined as physiological phenomena, such as heart rate acceleration, whose electrical activity within a biological being is captured by external technology and converted into data (Giannakakis et al., 2019; Schmidt, 2015).
Biosignals can be measured reliably in relation to stressors, and include physiological measures (electroencephalography (EEG), electrocardiography (ECG), electrodermal activity (EDA), electromyography (EMG)) and physical measures (respiratory rate, speech, skin temperature, pupil size, eye activity) (Giannakakis et al., 2019).
A familiar example of biosignal technology is the heart rate recording on fitness trackers (e.g., the Apple Watch). Biosignals have multiple therapeutic applications, such as functional muscle stimulation (e.g., on the arm) or measurement of neural activity (e.g., frontal lobe activity). In this story, I will outline the historical progression of biosignals, examples of modern applications of biosignals, and how we can use biosignals to aid persons with disabilities.
To better understand how biosignals work, consider Schmidt's (2015) example of a biosignal from its generation to its registration. To assess cardiorespiratory pathology, acoustic biosignals are used. The source in the heart is the heart sound produced by the closure of the heart valves. The sound propagates through the tissue, and the coupling (amplification) of the sound is heard via a stethoscope; a microphone then converts the acoustic vibrations into an electrical signal. Biosignal formation can be modelled as an equivalent circuit in which the source of the biosignal is represented by a sinusoidal source (Schmidt, 2015).

Another example is electrocardiography (ECG or EKG), which records electric biosignals from the human body (Schmidt, 2015). An ECG records the timing and strength of the electrical signals produced by your heartbeat. Historically, if you wanted to view your biosignals using a measurement tool such as ECG, you would have to visit a doctor. Today, devices such as fitness trackers (e.g., the Apple Watch) can record your heartbeat and detect atrial fibrillation (AFib) using the electrical heart sensor built into the watch (Figure 1; Apple, 2021). If your heartbeat is irregular or out of rhythm, the watch will notify you and record the data in your Health app. The ECG app has multiple practical applications, for example for the doctors of patients with heart concerns, because real-time data is recorded and readily available (Apple, 2021). The app cannot detect heart attacks, blood clots, strokes, or other heart-related conditions, but it can provide doctors with useful information that may help prevent such occurrences (Apple, 2021).
Figure 1
Apple Watch ECG Recording
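To make the timing idea concrete, here is a minimal sketch (not Apple's implementation) of how the spikes in a raw ECG trace can be turned into a heart rate estimate: detect the R-peaks, measure the intervals between them, and convert to beats per minute. The sampling rate and thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def estimate_heart_rate(ecg: np.ndarray, fs: float) -> float:
    """Estimate heart rate (BPM) from a raw single-lead ECG trace.

    ecg: 1-D array of ECG samples (arbitrary units)
    fs:  sampling rate in Hz (assumed known, e.g., 250 Hz)
    """
    # R-peaks tower over the rest of the waveform, so a crude threshold at
    # half the maximum amplitude plus a 400 ms refractory period suffices here.
    peaks, _ = find_peaks(ecg, height=0.5 * ecg.max(), distance=int(0.4 * fs))
    if len(peaks) < 2:
        raise ValueError("Not enough beats detected to estimate heart rate")
    rr_intervals = np.diff(peaks) / fs  # seconds between consecutive beats
    return 60.0 / rr_intervals.mean()   # beats per minute

# Toy example: a 10-second synthetic "ECG" at 72 BPM with background noise.
fs = 250
ecg = 0.05 * np.random.randn(10 * fs)
beat_indices = (np.arange(12) * fs * 60 / 72).astype(int)
ecg[beat_indices] += 2.0  # fake R-peaks
print(f"Estimated heart rate: {estimate_heart_rate(ecg, fs):.0f} BPM")
```

A real wearable adds filtering, adaptive thresholds, and rhythm classification on top of this, but the core of an AFib alert is exactly this kind of beat-to-beat interval analysis.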
The therapeutic application of biosignals has grown far beyond measuring physiological phenomena into more complex brain functions, such as stress, memory, and movement. Giannakakis et al. (2019) surveyed how psychological stress can be detected using biosignals. The term "stress" was initially introduced by Selye in 1926, but it remains elusive due to its subjective nature (Giannakakis et al., 2019). The human stress response is multidimensional and can comprise psychological, behavioural, and physiological stress (Giannakakis et al., 2019). Stress responses can be measured using a multitude of technical tools that record brain activity to better understand the neural and cognitive connections triggered by stressful stimuli (exogenous or endogenous). Stress can be detected through physical signals (e.g., facial expressions, extremity position/movement, blinks) and physiological signals (e.g., ECG, electroencephalography (EEG)). EEG allows observers to measure changes in neurological activity after subjecting a person to external stress (e.g., performing stressful tasks). Scientists can utilize these physical and physiological signals to localize brain activity and design technology that helps people communicate with the external world (Giannakakis et al., 2019).

For example, Gómez-López et al. (2019) used this form of data to assist elderly persons with Parkinson's disease. They developed a brain-computer interface (BCI) application, BCI Touch, that allows users to control their touchscreen devices through facial and head movements captured by the camera, which steer the pointer on the phone screen. The researchers modelled BCI Touch on the EVA Facial Mouse, the primary difference being that the latter utilizes only facial biosignals. This technology has the potential to give persons with disabilities, such as amputations, cerebral palsy, and multiple sclerosis, back some control over their daily device usage.
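As a rough illustration of camera-based pointer control (a simplified sketch, not the authors' BCI Touch implementation), the snippet below uses OpenCV's bundled Haar cascade face detector and maps the detected face centre to screen coordinates. The pyautogui calls and the linear mapping are illustrative assumptions.

```python
import cv2
import pyautogui  # assumed available for cursor control

# OpenCV ships with pre-trained Haar cascades for frontal face detection.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
screen_w, screen_h = pyautogui.size()

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        # Map the face centre from camera coordinates to screen coordinates.
        cx = (x + w / 2) / frame.shape[1]
        cy = (y + h / 2) / frame.shape[0]
        # Mirror horizontally so moving your head left moves the cursor left.
        pyautogui.moveTo((1 - cx) * screen_w, cy * screen_h)
    cv2.imshow("face-controlled pointer (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
```

A production system like BCI Touch would add smoothing, calibration, and a dwell- or gesture-based click, but the head-position-to-pointer mapping idea is the same.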
The authors tested this technology with seven elderly persons with Parkinson's disease who had prior experience using mobile devices. The participants were assigned three tasks, set an alarm, find a contact (son, daughter, or sister), and make a phone call, using both the newly developed BCI Touch and the earlier EVA Facial Mouse. Functional effectiveness was measured as the time spent on tasks in seconds. A paired-sample t-test yielded significant results, showing that task completion times were lower with BCI Touch than with EVA Facial Mouse. Facial recognition effectiveness was measured by the number of errors participants made during the tasks.
The results show that participants made fewer mistakes using BCI Touch than EVA Facial Mouse. Lastly, UX was measured through participants' verbal descriptions of their experience with the two tools, using the Usability Metric for User Experience Lite. Results revealed that BCI Touch was perceived as more user-friendly than EVA Facial Mouse. In conclusion, designing BCI Touch to assist persons with disabilities, specifically Parkinson's disease, allowed users to utilize mobile devices in a way that was both functional (i.e., able to perform meaningful tasks, such as phone calls) and user-friendly (Gómez-López et al., 2019).
User Experience x Biosignals
User experience (UX) emerged as a specialized field within HCI, enabling researchers to focus on end users and go beyond traditional HCI research practices by studying, for example, emotion (Liapis et al., 2021). UX tools can measure emotions using different approaches, such as post-test questionnaires, interviews, and observation. However, these methods are difficult to quantify and can be subjective.
To address this pain point for UX researchers, new technologies that quantify emotional data have emerged, such as facial recognition, speech tone analysis, and heart rate and respiration monitoring (Liapis et al., 2021). One of the most well-studied psychophysiological markers of the autonomic nervous system's stress response is the galvanic skin response (GSR), which is among the signals captured in the Wearable Stress and Affect Detection (WESAD) dataset. Liapis et al. (2021) set out to examine how UX researchers can utilize biosignal data such as WESAD to capture subtle stress responses while users work through UX evaluation tasks (e.g., "show us how to delete your Amazon account online"). WESAD is publicly available and includes blood volume pulse, electrocardiogram, electrodermal activity, electromyogram, respiration, body temperature, and three-axis acceleration (Liapis et al., 2021). All of these measures inform the researcher or practitioner of the participant's physiological state.
The researchers recruited 30 participants to complete a usability study while their biosignals were tracked. The participants were asked to record any usability issues they encountered while performing the usability tasks.
The results were analyzed by comparing the participants' self-reported usability issues with the timing of the recorded GSR signal. The WESAD-based classification identified participants' subtle stress responses to usability issues with 95.8% accuracy. Although GSR is generally used to measure stress during intense events, such as watching violent movies, it can also effectively capture subtle UX stress responses (Liapis et al., 2021).
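As a rough illustration of this kind of analysis (a sketch under assumed data formats and thresholds, not the study's actual pipeline), the snippet below finds skin conductance peaks in a GSR trace and checks whether each self-reported usability issue falls within a few seconds of a detected peak.

```python
import numpy as np
from scipy.signal import find_peaks

def match_issues_to_gsr(gsr, fs, issue_times, window_s=5.0):
    """Flag which self-reported usability issues coincide with a GSR peak.

    gsr:         1-D skin conductance trace (microsiemens)
    fs:          sampling rate in Hz
    issue_times: timestamps (seconds) at which issues were reported
    window_s:    how close a peak must be (seconds) to count as a match
    """
    # Skin conductance responses appear as local rises above the baseline;
    # the prominence threshold below is an assumed value, not from the study.
    peaks, _ = find_peaks(gsr, prominence=0.05)
    peak_times = peaks / fs
    return [bool(np.any(np.abs(peak_times - t) <= window_s))
            for t in issue_times]

# Toy example: a drifting baseline with stress responses at 30 s and 75 s.
fs = 4  # a common wrist EDA sampling rate
t = np.arange(0, 120, 1 / fs)
gsr = 2.0 + 0.001 * t
for onset in (30, 75):
    gsr += 0.3 * np.exp(-((t - onset) ** 2) / 8)
print(match_issues_to_gsr(gsr, fs, issue_times=[31.0, 100.0]))
# [True, False]: only the first report lines up with a stress response
```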
Haratian and Timotijevic (2018) also explore the use of GSR to detect emotions. They propose that GSR, combined with a pulse rate measuring tool, can measure the affective states of users. Their algorithm can recognize the user's emotional state in real time while the user interacts with the technology. Haratian and Timotijevic (2018) conducted single-subject trials in which participants were exposed to standardized images associated with specific emotional states. The participants were asked to label how they felt when viewing the images on an emotion quality indicator scale. The results showed that the biosignal technology could accurately predict the emotional states reported by the participants.
The research above suggests that incorporating such prediction models directly into computer systems would improve the UX for individual users. For example, if a user appears to be in a high-intensity emotional state, the website could speed up to accommodate the user, thus improving overall usability.
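As a toy sketch of this idea (all names, weights, and thresholds here are hypothetical, not from the cited studies), the snippet below derives a naive arousal score from GSR and pulse readings and uses it to switch an interface into a simplified mode:

```python
from dataclasses import dataclass

@dataclass
class BiosignalSample:
    gsr_microsiemens: float  # skin conductance level
    pulse_bpm: float         # heart rate

def arousal_score(sample: BiosignalSample,
                  baseline_gsr: float = 2.0,
                  baseline_bpm: float = 70.0) -> float:
    """Naive arousal estimate: weighted deviation from resting baselines."""
    gsr_delta = max(0.0, sample.gsr_microsiemens - baseline_gsr)
    bpm_delta = max(0.0, sample.pulse_bpm - baseline_bpm)
    return 0.6 * gsr_delta + 0.4 * (bpm_delta / 30.0)

def choose_ui_mode(score: float, threshold: float = 0.5) -> str:
    # When arousal is high, fall back to a simplified, low-friction UI.
    return "simplified" if score >= threshold else "standard"

calm = BiosignalSample(gsr_microsiemens=2.1, pulse_bpm=72)
stressed = BiosignalSample(gsr_microsiemens=3.4, pulse_bpm=95)
print(choose_ui_mode(arousal_score(calm)))      # standard
print(choose_ui_mode(arousal_score(stressed)))  # simplified
```

A real system would learn per-user baselines and use a trained classifier rather than fixed weights, but the adapt-the-UI-to-the-signal loop is the essential idea.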
I plan to explore how we can combine research on accessibility, biosignals, and UX to create better technologies for individuals who have accessibility limitations.
nav —
an accessible app for persons with accessibility limitations.
The research above summarizes what accessibility is and why we should care, what biosignals are and how they are applied, and how we can combine accessibility and biosignals to create better UX for individuals with accessibility limitations.
Below I will take you through my process of ideating and conceptualizing an app that can be useful for people with web and app accessibility limitations.
Comparative Competitive Analysis
To help visualize and compare the apps that are currently available for people with accessibility barriers, I created a comparative competitive analysis (CCA).
As you can see above, none of the apps currently available to individuals with accessibility limitations offer all of the tools they may need. The apps are limited to one type of disability, only two offer live help, and only one is available on both iOS and Android.
The CCA allows me to easily refer back to my research and understand the information I gathered with a simple glance.
While conducting my research and CCA, it became evident that I need to conceptualize an app that is available to users who are deaf, blind, or have cognitive impairments, and that offers live help to these users.
User Personas
User personas are archetypical users whose goals and characteristics represent the needs of the project's target group.
The research outlined previously helped create the user personas (below) for this project.
The user personas were a helpful tool throughout the design conceptualization process and helped bridge the gap between research and planning.
The Design Process
The “Why”?
The first step in establishing the visual language of a website or app is for the designer to establish the “why”. The “why” allows designers to set the stage for who they are designing for and why they are designing.
This is why I am creating an app for users who experience inaccessibility:
To create more equitable technology that allows all people to utilize it, regardless of [dis]ability.
In addition to establishing the why for my app, I also wanted to name it because it brings it to life (for me). The name I picked is:
nav
I named the app nav because its purpose is to help individuals navigate through the digital world. It’s also part of my brother’s name, Navchetan.
The Objective
Use Case
nav will be an app that uses biosignal technology (e.g., ECG) to detect frustration when a user is navigating an app, website, or real-life scenario.
The app will ask the user if they need help with navigation and prompt them to select their preferred method (call or written).
The app will have access to the user’s device (i.e., remote access), and a live agent will help the user troubleshoot via their preferred method of contact. User permissions and access are set up during the preliminary sign-up stages of the app, so there is no need to grant access at this point.
The live agents (trained remote volunteers) will help the user with the website or app they are having trouble with and end the call/chat once they are done.
nav will allow users to use the Internet freely, without worrying that they will not be able to get past certain barriers, opening up a world of opportunities.
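Since nav is still a concept, here is a minimal sketch of the flow described above, modelled as a simple state machine; the state names, frustration score, and threshold are all hypothetical:

```python
from enum import Enum, auto

class State(Enum):
    MONITORING = auto()
    OFFERING_HELP = auto()
    LIVE_SESSION = auto()

class NavApp:
    FRUSTRATION_THRESHOLD = 0.7  # hypothetical normalized score

    def __init__(self):
        self.state = State.MONITORING

    def on_biosignal(self, frustration_score: float):
        # Step 1: biosignals (e.g., ECG-derived features) flag frustration.
        if self.state is State.MONITORING and frustration_score >= self.FRUSTRATION_THRESHOLD:
            self.state = State.OFFERING_HELP
            return "Need help navigating? Reply 'call' or 'chat'."

    def on_user_reply(self, preferred_mode: str):
        # Step 2: the user picks call or chat; permissions were granted at sign-up.
        if self.state is State.OFFERING_HELP and preferred_mode in ("call", "chat"):
            self.state = State.LIVE_SESSION
            return f"Connecting you to a trained volunteer via {preferred_mode}..."

    def on_session_end(self):
        # Step 3: the volunteer finishes troubleshooting; resume monitoring.
        self.state = State.MONITORING

app = NavApp()
print(app.on_biosignal(0.85))    # frustration detected, help offered
print(app.on_user_reply("chat")) # hand off to a live agent
app.on_session_end()
```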
Design Inception
The “why” establishes the purpose and drive behind the mood and visual language, which are discovered using design inception.
The mood I set out to create was an optimistic, engaging, playful, and friendly vibe, catered to individuals who are too often excluded.
I took the principles of biomimicry into consideration when establishing my visual language.
Biomimicry is the art and science of designing products and systems that mimic biological entities and processes to create more sustainable designs.
I have elaborated on my visual language below and how it encompasses biomimicry.
Visual Language
Space | To foster an optimistic and playful vibe, I chose to include positive space, which is a fancy way of saying white space. White space gives the user room to breathe and has an organic feel to it, the same way a user has room to breathe in nature. This lends itself to organized and balanced space. To avoid overwhelming my users, organization and balance are key to a positive experience.
Shape | I wanted to keep the shapes organic with a little bit of depth to mimic the unique properties of nature.
Colour | The colours blue and green are found everywhere in nature; from mountains to the depths of the sea. These colours were chosen to give the app life.
Movement | Everything in nature has a purpose and is fluid. To model this in the app, I chose to keep the movement logical and continuous.
Moodboard
A moodboard is a digital or physical collage of ideas that reflect the mood you plan to achieve with your design.
To the left is the moodboard for nav: Views in the Valley.
I named this moodboard “Views in the Valley” because our ecological footprint extends beyond the boundaries of our backyard. Contributing to a more sustainable planet includes (but is not limited to) supporting all forms of nature, such as forests, oceans, mountains, and much more.
In this moodboard, you will find elements of greenery and nature to encompass a wholesome and friendly atmosphere. This is contrasted with blue imagery and gradients to create a down-to-earth and calm space.
The textures are primarily flat, which fosters a simple and meaningful design. The features are minimalistic with vibrant colours to bring life to the picture.
Style Tile
Style Tiles are a design deliverable consisting of fonts, colours and interface elements that reflect the visual brand for the app.
The typography chosen for the app is Open Sans. Open Sans is a humanist sans serif typeface designed by Steve Matteson. It is a neutral, legible, and friendly typeface, optimized for print, web, and mobile interfaces. It is utilized in other educational apps for youth and fits the mood of the current project.
Design
The design objective was to utilize the UX research to foster simplicity, ease of use, and responsive design.
Wireframes are the blueprints of the app or website. UX/UI designers take these blueprints and bring the design inception of the project to life.
In the next iteration of this project (January 2022), I will be taking my conceptualized app design and fleshing it out completely. This next iteration will include the completed UX/UI, user testing, usability testing, and a prototype of the app.
I am looking forward to continuing this project and updating Medium and my supervising professor, Dr. Jennifer Jenson, with the finished project.
That’s a wrap, for now! If you liked my case study, claps would be appreciated and follow for part 2!
Let me know what you think in the comment section below, all feedback is welcome!
Thank you,
Kirn Bhela
References
Acosta-Vargas, P., Salvador-Ullauri, L. A., & Luján-Mora, S. (2019). A heuristic method to evaluate web accessibility for users with low vision. IEEE Access, 7, 125634–125648.
Aizpurua, A., Harper, S., & Vigo, M. (2016). Exploring the relationship between web accessibility and user experience. International Journal of Human-Computer Studies, 91, 13–23.
Apple (2021). Take an ECG with the ECG app on Apple Watch. https://support.apple.com/en-us/HT208955
Cubero, C. G. & Rehm, M. (2021). Intention Recognition in Human Robot Interaction Based on Eye Tracking. In INTERACT 2021. Springer.
Kemp, S. (2019). Digital 2019: Global Internet Use Accelerates. We Are Social. https://wearesocial.com/uk/blog/2019/01/digital-in-2019-global-internet-use-accelerates/
Fischer, G. (2001). User modeling in human–computer interaction. User modeling and user-adapted interaction, 11(1), 65–86.
Giannakakis, G., Grigoriadis, D., Giannakaki, K., Simantiraki, O., Roniotis, A., & Tsiknakis, M. (2019). Review on psychological stress detection using biosignals. IEEE Transactions on Affective Computing.
Gómez-López, P., Montero, F., & López, M. T. (2019, June). Empowering UX of Elderly People with Parkinson’s Disease via BCI Touch. In International Work-Conference on the Interplay Between Natural and Artificial Computation (pp. 161–170). Springer, Cham. https://link.springer.com/chapter/10.1007/978-3-030-19591-5_17
Gupta, N., & Bruce, C. (2021, July). Accessibility Practices for Prototype Creation and Testing. In International Conference on Human-Computer Interaction (pp. 89–98). Springer, Cham.
Haratian, R., & Timotijevic, T. (2018, September). On-body Sensing and Signal Analysis for User Experience Recognition in Human-Machine Interaction. In 2018 4th International Conference on Frontiers of Signal Processing (ICFSP) (pp. 50–55). IEEE.
Introduction to Web Accessibility. (2021). W3C Web Accessibility Initiative (WAI). https://www.w3.org/WAI/fundamentals/accessibility-intro/
Kim, W. J., Kim, I. K., Kim, M. J., & Lee, E. (2018, December). Effect of UX Design Guideline on the information accessibility for the visually impaired in the mobile health apps. In 2018 IEEE International Conference on Bioinformatics and Biomedicine (BIBM) (pp. 1103–1106). IEEE. https://ieeexplore.ieee.org/abstract/document/8621471
Khowaja, K., Al-Thani, D., Aqle, A., & Banire, B. (2019, July). Accessibility or usability of the user interfaces for visually impaired users? A comparative study. In International Conference on Human-Computer Interaction (pp. 268–283). Springer, Cham.
Kulkarni, M. (2019). Digital accessibility: Challenges and opportunities. IIMB Management Review, 31(1), 91–98.
Norman, D. A. (2013). The design of everyday things: Revised and expanded edition. Basic Books.
Rosa, J. R. D. S., & Valentim, N. M. C. (2020, October). Accessibility, usability and user experience design for visually impaired people: a systematic mapping study. In Proceedings of the 19th Brazilian Symposium on Human Factors in Computing Systems (pp. 1–10). https://dl.acm.org/doi/abs/10.1145/3424953.3426626
Liapis, A., Faliagka, E., Katsanos, C., Antonopoulos, C., & Voros, N. (2021). Detection of subtle stress episodes during UX evaluation: Assessing the performance of the WESAD bio-signals dataset. In IFIP Conference on Human-Computer Interaction (pp. 238–247). Springer, Cham.
Schmidt, A. (2015). Biosignals in human-computer interaction. interactions, 23(1), 76–79. https://dl.acm.org/doi/abs/10.1145/2851072
World Report on Disability. (2019). World Health Organization. https://www.who.int/teams/noncommunicable-diseases/sensory-functions-disability-and-rehabilitation/world-report-on-disability