Promoting equity through usability testing with vulnerable populations

Kristin Taylor
10 min read · Dec 31, 2019



How might we help the Office of Police Oversight make the complaint process more accessible?

Office of Police Oversight page on

Project background

The City of Austin’s Office of Design and Delivery, in collaboration with the Austin Tech Alliance, partnered with the Office of Police Oversight to understand the current pain points and obstacles residents face when submitting complaints about their experiences with the Austin Police Department.

In 2018, the Service Design Lab conducted an in-depth examination of the processes, policies, experiences, and perceptions surrounding the police complaint process in the City of Austin. The biggest barriers to the complaint process they cited were:

  • only paper complaint forms were accepted, and residents were required to file them in person at a location on the outskirts of the city,
  • little trust that submitted complaints were actually filed and that anything would come of them,
  • lack of transparency about what happens after a complaint is filed,
  • the documents and data required to file a complaint created fear of retribution among complainants.

At the time of the Service Design Lab’s examination, most complaints against police were filed by middle-aged white women. That fact made it clear that the police complaint process was broken.

The Service Design Team asked: How might we make it easier for anyone to complain?

Our service designers provided recommendations around the process, service delivery and policy. My team, working on a digital transformation, took on the recommendations that touched the digital realm. Those recommendations were:

  • create an online complaint form,
  • make communications, including the form, multilingual,
  • and communicate what happens during an investigation.

Project outcomes

Austin residents can now file complaints against police officers online in both English and Spanish. This has made complaining easier, more transparent, and less intimidating for people who have witnessed or experienced a bad interaction with a police officer.

In the short time since the digital form has been live, it has already made a huge impact.

  • Prior to the form’s release, fewer than 10 complaints and thank-yous were filed per month on average.
  • In the first 2 months after its release, the Office of Police Oversight received approximately 70 complaints and thank-yous.
  • Since its launch in May of 2019, 182 complaints and 94 thank-yous have been completed online.

This online form is a first step toward making policing in Austin equitable and transparent.

My role

I was the lead design researcher on the digital implementation of the Office of Police Oversight complaint form.

After the form was developed, I facilitated usability testing to ensure functionality, refine the content, and make sure nothing about the form’s design would prevent people from using it.

My process

1. Understand the work that already happened

I joined the Office of Design and Delivery as a user researcher towards the end of this project. By that point, the Service Design Lab had already handed over their recommendations, processes had been changed, new policy had been implemented, and a new digital form had been designed.

Because I came in towards the end of the project, it was essential that I take time to understand the project goals, the requirements and limitations of the form, and the user groups.

2. Determine the appropriate research method

This form needs to be trustworthy, transparent, and equitably accessible. We decided usability testing was the most appropriate method to test the functionality of the form and to understand how people would feel and think as they filled it out. Usability testing would also help us make sure we were giving users confidence that they would not be re-victimized by filing a complaint and that the information they entered would be taken seriously.

3. Recruit, schedule, and figure out logistics

Our goal was to test the form with people who previously had a negative experience with the police. They could draw on their own experiences to help us understand what it would be like filling out this form after a traumatic event.

We also wanted to run usability tests with English and Spanish speakers, as well as people who would rely on a screen reader to complete the form.

Since there had not been a design researcher on the digital transformation team before me, recruiting, scheduling and figuring out general logistics took some time. I also used this time to understand what processes we could put into place to make user research more nimble for future projects.

4. Facilitate usability test

Once recruiting and scheduling were mostly done, it was time to facilitate tests. Because of the complexities of this project, it was important that I be flexible but persistent so that I could get relevant insights quickly to improve the experience of filing a digital police complaint.

5. Synthesize & share

Whiteboard of synthesized insights

After the usability tests were all complete, I gathered a few team members together to synthesize what we observed while facilitating usability tests.

As a new team member, I relied on the rest of my team to understand what to share and with whom, so that we could feasibly make a positive impact on the complaint form experience before launch.

What we learned from usability testing

Mobile view

Finding 1: Missing some “officialness”

Participants had a hard time locating visual cues that gave them confidence that this form is in fact provided by the City of Austin. They commented:

“I see the name ‘City of Austin’ but I don’t see a logo that gives me the confidence that this is from the City of Austin.”

“This makes me think the page could be fraudulent.”

“This page looks generic. I have to look really hard for the seal.”

To help users trust the form it is important that it look official.

Finding 2: Maps are hard

Map screen on the form

Within the form, users pinpoint where the incident happened by using a map. Almost all of our participants had trouble with the map in one way or another. They couldn’t orient themselves on it, or they didn’t see the part of the map that would allow them to type in an address, so they were stuck dragging the map around, trying their best to drop the pin in the right place.

We found that participants with lower literacy levels struggled to read a map this complicated. One participant noted, “If I was at an elementary level in Spanish, I wouldn’t know how to read a map.” This challenge isn’t isolated to speakers of one language or another; reading a map is a complex process and a developed skill. The lack of that skill shouldn’t prevent someone from completing this form. This led us to a bigger question: Is a map the best method to gather location information, if using it is so challenging?
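One alternative worth considering is treating a typed address as the primary input and the map pin as optional refinement. A minimal sketch of that idea, assuming a hypothetical `LocationInput` shape (not the real form’s data model):

```typescript
// Hypothetical sketch: a typed address is preferred, and map
// coordinates are only a fallback, so no one is forced to use the map.
interface LocationInput {
  address?: string; // free-text address typed by the resident
  coords?: { lat: number; lng: number }; // optional map pin
}

// Returns the best available location description, preferring the
// typed address over raw coordinates.
function describeLocation(input: LocationInput): string {
  if (input.address && input.address.trim().length > 0) {
    return input.address.trim();
  }
  if (input.coords) {
    return `${input.coords.lat.toFixed(5)}, ${input.coords.lng.toFixed(5)}`;
  }
  return "Location not provided";
}
```

The design choice here is that the map becomes a confirmation aid rather than the only way in, which sidesteps the map-literacy barrier entirely.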

Finding 3: We need to reinforce anonymity for sensitive fields

Areas of the form that ask for demographics and potentially personal identifying information

The form asks complainants for some information that could potentially identify them. While those fields are intentionally not required, many participants thought they couldn’t move on to the next section of the form without completing them.

While completing the form, they commented:

“It seems like they’ll use [the demographic information] to identify me.”

“Do I have to put my email in there?”

“The only thing that would give me hesitation is if I wanted to remain anonymous, how would I get a guarantee of anonymity, because I gave my name and email address.”

Users wanted reassurance that they could stay anonymous and did not have to fill out certain fields, even when the fields weren’t labeled “required.” On many occasions, it seemed like users assumed that if there is a field for something on a form, it is required.

Any question asked on the form shapes the complainant’s expectations of what will happen with the information. We always need to be thinking about how we might continually reassure users that they will remain anonymous while simultaneously gathering the information necessary.
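One way to act on this finding is to label optional fields explicitly and validate only the truly required ones. A small sketch under assumed field names (the real form’s fields may differ):

```typescript
// Hypothetical sketch: only truly required fields block progress;
// optional demographic and contact fields can be left blank.
interface Field {
  name: string;
  label: string;
  value: string;
  required: boolean;
}

// Render an explicit "(optional)" marker so users don't assume
// every visible field must be filled in.
function displayLabel(field: Field): string {
  return field.required ? field.label : `${field.label} (optional)`;
}

// Returns the names of empty required fields; an empty result means
// the user may move on to the next section.
function missingRequired(fields: Field[]): string[] {
  return fields
    .filter((f) => f.required && f.value.trim() === "")
    .map((f) => f.name);
}
```

Surfacing “(optional)” directly in the label addresses the observed assumption that any field on a form must be required.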

Finding 4: Spanish & English

When facilitating tests with Spanish-speaking participants, we quite often had bilingual participants.

“I would help my mom fill this out. But she would want to read it in Spanish and I am much more confident in English.”

There isn’t a clear boundary between English and Spanish users. For example, kids who are stronger in English might be helping their Spanish-speaking parents fill out the form. In this case, the parent may prefer a form in Spanish and their child may prefer it in English.

In a multilingual city like Austin, how might we design tools that allow users to easily toggle between languages?
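The key idea behind such a toggle is to keep the display language separate from the saved answers, so switching never loses progress. A minimal sketch, with an illustrative string table (not the form’s actual copy):

```typescript
type Lang = "en" | "es";

// Hypothetical string table: labels exist in both languages, and
// switching languages changes only what is displayed, never the
// answers already saved in the form.
const labels: Record<string, Record<Lang, string>> = {
  submitComplaint: { en: "Submit a complaint", es: "Enviar una queja" },
  online: { en: "online", es: "por internet" },
};

// Look up a label in the current language, falling back to the key.
function t(key: string, lang: Lang): string {
  return labels[key]?.[lang] ?? key;
}

// A toggle lets a bilingual household flip the whole form between
// languages at any point.
function toggleLang(lang: Lang): Lang {
  return lang === "en" ? "es" : "en";
}
```

Because answers are stored independently of the string table, a parent reading in Spanish and a child helping in English can pass the same in-progress form back and forth.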

Finding 5: Word choice matters — in Spanish too!

Testing with Spanish-speaking users revealed preferences for common, easy-to-understand language.

“The word ‘fiscalización’ stands out to me — to me, it doesn’t mean anything.”

“‘En línea’ could be mistaken for being in a line. I prefer ‘por internet.’”

How might we ensure that our translated content is as resident-friendly as our English content?

How might we use user research earlier in the translation process to test translated content?

Form review page

Finding 6: Editing the review page

While on the review page we regularly heard comments like:

“How would I edit this?”

Multiple users moused over the review sections but didn’t dig in until we explicitly asked them to.

The review section is a very important part of the form process. We need the information to be correct and as complete as possible. This step gives users time to review the information they put into the form and an opportunity to easily add or remove information without the hassle of moving back and forth through the form. But if users don’t understand that the sections open up to reveal more, we need to improve the design to make that more obvious.
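One direction for making the affordance obvious is to give each review section an explicit action label rather than relying on hover discovery. A rough sketch, with invented names for illustration:

```typescript
// Hypothetical sketch: each review section carries a visible action
// label, so the expand/edit affordance isn't hidden behind a hover.
interface ReviewSection {
  title: string;
  expanded: boolean;
}

// "Edit" invites users in; "Close" confirms what expanding did.
function actionLabel(section: ReviewSection): string {
  return section.expanded
    ? `Close ${section.title}`
    : `Edit ${section.title}`;
}

// Toggling expansion is non-destructive; the section's data is untouched.
function toggleSection(section: ReviewSection): ReviewSection {
  return { ...section, expanded: !section.expanded };
}
```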

Finding 7: Accessibility issues

The last round of usability testing was done with users relying on screen readers. In these sessions we found some common usability problems.

  • The date and time calendar widget was not usable for manually entering a date.
  • The form’s progress indicator didn’t indicate the total number of steps in the process, only the current step number.
  • On the review page, there is no true/false indication when reviewing the statement “I would like an interpreter for all my interactions.” Because of this, participants worried that they had mistakenly asked for an interpreter.

While many of the findings from testing with screen readers are technical issues rather than human-centric findings, discovering and documenting them was essential to the long-term success of this project. Without this step, we would have released a form that caused confusion and was unusable by people relying on screen readers, which would have extended the very inequities we are trying to overcome with the form.
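Two of the fixes above are simple to express in code: announcing the total number of steps, and stating yes/no answers explicitly on the review page. A minimal sketch (function names are illustrative, not from the actual implementation):

```typescript
// Hypothetical sketch: a progress label that includes the total, so a
// screen reader announces "Step 2 of 6" rather than just "Step 2".
function progressLabel(current: number, total: number): string {
  return `Step ${current} of ${total}`;
}

// A review line that always states the answer explicitly, so
// "I would like an interpreter..." reads as a clear Yes or No
// instead of leaving the answer implied.
function reviewAnswer(statement: string, answered: boolean): string {
  return `${statement}: ${answered ? "Yes" : "No"}`;
}
```

In a real form, the progress label would typically feed an `aria-label` or visually hidden text so sighted and screen-reader users get the same information.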


Challenges

Recruiting was one of the biggest challenges of this project. The user groups we were trying to reach were very specific and had to be comfortable talking about personal and potentially traumatizing experiences. These participants also had to trust us as City employees.

Another recruiting challenge was finding monolingual Spanish-speaking participants. The usual channels and tactics for recruiting did not work, so we ended up intercepting people at neighborhood libraries.

Researching with vulnerable people is always a challenge. We are asking them to re-live traumatizing experiences, all while maintaining the rigor and structure of a usability test.

What will I do differently next time?

In order to get more participants, I will tackle the logistics and operations earlier in the process. This would mean looking over our database early on to see if we have access to the right people (in this project we did not) and then spending time engaging with community leaders as a way to find the right participants.

We facilitate a lot of our usability tests in the meeting rooms of neighborhood libraries. In one particular testing session, we had a small room that branch librarians needed to walk through to get to a supply closet. During that session, a couple of different librarians walked through the space, and it was pretty distracting. In the future, when that is the case, I will communicate with the librarians to see what we can do to keep the space free of distractions.

One of the biggest struggles I had during this round of testing was knowing whom to communicate findings to, and how. This project was long and had been worked on by so many people. I needed an informal way to share relevant findings with the appropriate people.