Implementing vocabulary packs for AAC devices

OriannaWYQ
9 min read · Mar 5, 2023


Introduction

For this Design Sprint, our group was tasked with designing for communication. We worked on identifying problems and proposing solutions for users of Augmentative and Alternative Communication (AAC) devices, using the Value-Sensitive Design framework to guide our thinking. AAC devices range from low-tech boards with a few words or symbols to dedicated technology that reads out typed or selected words. After analyzing examples of AAC technology, we attended a panel with AAC users and their aides to better understand the problems facing AAC users and to inspire new designs. For our project, we are proposing a set of guidelines and a prototype interface for downloading and creating vocabulary packs that can be added to the AAC app TouchChat Discover. Making it possible for AAC users to create and share vocabulary packs built on a standard set of guidelines would let them easily communicate about the topics that most interest them, without spending excessive time programming and/or learning new buttons.

Value-Sensitive Design and evaluating the existing tech

Our design process was guided by a framework of value-sensitive design, as discussed in the paper “Value Sensitive Design: Theory and Methods” by Batya Friedman, Peter H. Kahn Jr., and Alan Borning. Value-sensitive design centers the design process around understanding how a solution to a problem might support or compromise human values. For example, accepting cookies from a website might support a more personalized and navigable browsing experience, but it may come with a trade-off in one’s privacy. By keeping this framework in mind, we sought to promote certain values and avoid contradicting others.

Next, we familiarized ourselves with the TouchChat Discover app, and analyzed it with the value-sensitive design framework in mind. Although some aspects of the app helped make the communication process smoother, the overall experience of using the app was often slow and tedious. It takes time to learn the locations of the buttons, and in general, typing and button pressing is always going to be slower than speech. Other usability issues that we noticed were that the interface wasn’t always the easiest to read and that there weren’t clear instructions for first-time users. The interface also used many symbols and drawings to help those with reading comprehension difficulties, but those symbols could also be confusing, demeaning, or childish to an older user group. In terms of values, the app supported accessibility for users with reading difficulties, but its slowness compromised efficient communication, and its childish symbols could compromise the dignity of adult users.

A view of the TouchChat Discover app showing the home page
Another view of the TouchChat Discover app showing the word finding and typing page

Panel of AAC Users

Next, we had the opportunity to get the valuable perspectives of AAC users themselves! We hosted a panel discussion of several AAC users and their aides (family members, speech & language pathologists). Hearing from AAC users themselves was incredibly helpful in understanding the most pressing problems that they faced. Sitting down on our own and trying to generate ideas about how to improve AAC devices was much less useful than getting to hear directly from users.

Two of our panelists, “Ella” and “Elena” (names changed for privacy), explained that they wished their devices sounded more natural or more like their own voices. They also faced issues when it came to communicating easily in a conversation. Ella explained that people often talked over her, or didn’t understand that it would take her longer to write a response. As a result, it was often easier to compose something ahead of time instead of trying to slowly write it out in real time. Mobility also posed a bigger issue than we had initially anticipated. For example, Ella explained that it was basically impossible to use her device without a steady surface. She wanted a practical solution for storing her device and keeping it accessible, while at the same time not interfering with the other functions of her wheelchair. Ella also explained that one thing she wanted from her AAC device was to talk about her favorite things — for example, Spongebob. Her inability to type effectively, combined with the difficulty of programming unique buttons for her device, made it hard to use vocabulary that mattered to her. During the panel discussion, we were also introduced to another helpful resource: the Communication Bill of Rights (shown below), which carefully outlines the rights of AAC users. After the panel, we were able to go into our brainstorming session with a clearer idea of the most pressing problems faced by AAC users.

This image shows the Communication Bill of Rights, which we used to guide our discussion with our classmates and helped us think about the values supported by our design.

Affinity Diagramming, Prototype Brainstorming and the Idea

In order to decide on what communication aspect(s) of AAC devices we wanted to improve, we had a prototyping session in the idea lab, a creative space for entrepreneurs and innovators to experiment with different materials. First, we engaged in affinity diagramming, where our project group wrote about what problems and potential solutions we could identify for AAC users. Then we worked together to sort out our ideas into different categories and themes. We took the time to quickly draw out several potential prototypes based on the problems and values we identified in our diagramming activity.

These photos show our affinity diagramming process. In order to brainstorm potential prototype ideas, each member of our group spent five minutes writing down ideas, values and/or problems faced by AAC users onto sticky notes. Afterwards, we came together as a group to sort the sticky notes and find connections between our ideas.
After our affinity diagramming activity, the members of our group spent time drawing out quick sketches of prototypes, shown in the four photos above. These included personalized interfaces, AAC devices with the ability to connect to one another, different voices expressing emotional tones, and more.

After sharing these ideas, we identified three main categories of issues to improve:

  • Personalizing devices. For example, keeping a history of the user’s most-used words that they can refer to at any time during conversations, or letting the user change the words available on their device or the aesthetics of the symbols and interface.
  • Connecting devices together. By this, we meant a type of peer-to-peer network in which various AAC devices could connect and exchange data.
  • Adapting AAC devices for emergency situations. We were thinking of an emergency “tab” on the device, where the user could call for help or seek assistance in an urgent situation.

With these themes in mind, the idea that we settled on was based on the main problem with AAC devices that one of our panelists, Ella, identified during our discussion: being unable to talk about the subjects she wanted (e.g., Spongebob). The current application allows users to type in words or program new buttons, but doing so is very time-consuming, and there’s no easy way to quickly add those words. Our proposed solution is to add the ability to create, share, and download vocabulary packs built around popular media or other topics. Each vocabulary pack would share a basic, standardized structure, making it easy to navigate without spending too much time learning a new layout, and making these packs shareable would save users the time of programming new buttons on their own. This, in theory, allows AAC users to express themselves and their interests with greater ease.
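To make the “standardized structure” idea concrete, here is a minimal sketch of what a shareable vocabulary pack might look like as data. All of the field names here (`label`, `spoken_text`, `schema_version`, and so on) are our own assumptions for illustration, not TouchChat Discover’s actual file format:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class VocabButton:
    label: str            # word shown on the button
    spoken_text: str      # text sent to the speech synthesizer
    symbol: str = ""      # optional symbol/image identifier

@dataclass
class VocabPack:
    name: str
    topic: str
    schema_version: int = 1          # shared layout version every pack follows
    buttons: list = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize the pack so it can be shared and downloaded."""
        return json.dumps(asdict(self), indent=2)

# Example: a small hypothetical pack for talking about Spongebob
pack = VocabPack(name="Spongebob", topic="TV shows")
pack.buttons.append(VocabButton(label="Krusty Krab", spoken_text="the Krusty Krab"))
pack.buttons.append(VocabButton(label="jellyfishing", spoken_text="jellyfishing"))
```

Because every pack carries the same fixed fields and layout version, a user who has learned one pack’s navigation can pick up any other pack without relearning the interface.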

To assist those already familiar with the UI layout of the AAC device, our vocabulary packs will behave the same way a group does in TouchChat Discover. You can select words from a group, like “bugs” or “furniture”, and still have access to common words like “I” and “a”. This aims to keep current TouchChat users familiar and comfortable with the introduction of vocabulary packs.
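The idea of a pack behaving like a group can be sketched as follows: when a pack page is shown, the common core words stay visible alongside the pack’s topic words. This is only an illustration of the layout rule; the core word list and function names are assumptions, not TouchChat’s internals:

```python
# Hypothetical core vocabulary that stays available on every page,
# mirroring how common words persist across groups in the app.
CORE_WORDS = ["I", "a", "want", "like", "not"]

def build_page(pack_words, core_words=CORE_WORDS):
    """Return the button labels for one pack page: core words first,
    then the pack's topic words (skipping any duplicates)."""
    return core_words + [w for w in pack_words if w not in core_words]

page = build_page(["Spongebob", "Patrick", "jellyfish"])
```

Keeping the core words in a fixed position means a pack never takes away the sentence-building vocabulary a user already relies on.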

We decided that for our final product, we would create a mockup of an example vocabulary pack interface (assuming compatibility with TouchChat’s interface) and write a document/manual explaining how to use vocabulary packs. In addition, we created a storyboard demonstrating how our prototype would be used in context.

Storyboard Draft

Storyboard Final Version

Our final storyboard version.

Refining our Idea and Getting Feedback from Classmates

One of our last steps was returning to the VSD framework by discussing our project with another group in our class. We wanted to analyze our proposed prototype to get a better understanding of what values our design supported and/or compromised. We also referred back to the Communication Bill of Rights. Ultimately, our prototype most strongly supported the values of personal expression, agency, and convenience. Similarly, it also serves to support a few specific communication rights, including the right “to have real choices,” “to be a full and equal member of my community,” and the right “to share my feelings.”

However, there is some trade-off between the values of expression and ease of use. The format for creating a new vocabulary pack is fixed, which restricts the user in several ways. We think this trade-off is worth it, as users can avoid the more painstaking process of programming additional words themselves (even if it is a limited solution). In addition, our solution is only useful for current users of the app. Our classmates also provided helpful feedback by asking us questions about the process of creating and uploading vocabulary packs and how that would work with the rest of the app. It was this feedback that persuaded us to include that information in a manual document.

Conclusion

Our final product for this design sprint consists of a Figma mockup of the interface for a vocabulary pack, this document explaining the structure and use of the packs, and our storyboard demonstrating how our prototype addresses the problem. The Figma mockup gives a sense of how the vocab pack would be integrated into the current interface, although it doesn’t contain the comprehensive list of vocab words that the final product would include. The following screenshots show the general idea of how the words in a pack are displayed and the page for installing packs from the Menu.

Our next steps with this project would be to test our prototype with actual users and use their feedback to inform our next iteration — most likely a revised prototype with more features fully implemented. Just as consulting with AAC users was helpful during the brainstorming process, it would be just as important to have them try out our design and tell us what did and didn’t work for them. Our intention is that our solution would allow AAC users to better communicate about their own interests and make it easier and faster to talk about their passions. Another step would be to implement a better way to create words and buttons than the options given by TouchChat. Currently, installing a pack requires uploading a file, but we could add an option to create a pack within the app itself.
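Since installing a pack means uploading a file, the app would need to check that an uploaded file actually follows the shared pack structure before adding it. Here is a minimal validation sketch; the required field names and schema version are hypothetical, carried over from our illustration of the pack format rather than taken from TouchChat:

```python
import json

# Fields every pack file must contain under our assumed format
REQUIRED_KEYS = {"name", "topic", "schema_version", "buttons"}

def validate_pack(raw: str) -> dict:
    """Parse an uploaded pack file and verify it follows the shared
    structure; return the pack dict, or raise ValueError if it doesn't."""
    pack = json.loads(raw)
    missing = REQUIRED_KEYS - pack.keys()
    if missing:
        raise ValueError(f"pack is missing fields: {sorted(missing)}")
    if pack["schema_version"] != 1:
        raise ValueError("unsupported schema version")
    return pack
```

A check like this is what would make shared packs safe to install: a malformed file is rejected with a clear message instead of producing a broken page of buttons.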

Overall, this design sprint helped us gain experience in identifying a variety of problems faced by a group of people and choosing how to prioritize human values when proposing a solution. In particular, getting the chance to speak with stakeholders directly was really important, especially since we were designing for problems outside of our lived experience. Ella’s desire to talk easily about Spongebob was what ultimately inspired our prototype. In addition, using a framework like value sensitive design helped us understand how different design choices are interconnected with human values. We know that we will always have to make trade-offs, but we did our best to do so thoughtfully and with intention.
