Designing & Testing the Common App

Using user-centered design to get more families into social service programs faster


At One Degree, we build tech for an audience that is typically not served by tech companies, but we use the same user-centered design principles and processes to inform, build, and test our products.

Recently, we saw an opportunity to tackle part of a big problem: an estimated $65 billion in public benefits goes unused every year. Here’s how we did it by designing and prototyping a new tool.

Using user-centered design, we built a new online tool for low-income families that cuts the time spent applying for social services by more than half.
Photo by Monica Melton on Unsplash


Once an individual finds a resource that can help them meet a basic need, we often see that the application and enrollment processes for public assistance and social service programs, like food stamps or Medicaid, are so lengthy, complicated, and laborious that families frequently decide not to apply or drop off before submitting, leaving those needs unfulfilled.

Millions of eligible individuals are simply not enrolling in benefits, with an estimated $65 billion in government programs going unused.

Research has shown that enrollment in these programs is critical to making a positive impact on families in the long term.

In our efforts to help alleviate this problem, One Degree was awarded a grant by the John S. and James L. Knight Foundation to develop and prototype a tool that would streamline the enrollment process for individuals seeking these programs.

We set out to create a single application for nonprofit and government social services that we’re calling the Common App.

You can read more about the background and an overview of the Common App project here.

As One Degree’s product designer and researcher, I led our process, which included initial research, product design, a pilot, and measurement of outcomes.

Chatting with members of the community

Initial Research

While there are dozens of programs that could be valuable to families, we wanted to first prototype the concept with three programs. In order to develop an understanding of how the application and intake process works for these different programs, we interviewed 10 community members as well as five frontline caseworkers and directors at three government/nonprofit organizations:

  • San Francisco Human Services Agency (for CalFresh)
  • Children’s Council (for SF3C, a child care subsidies program)
  • San Francisco Recreation & Parks Department (for their scholarship program, which helps low-income residents access a range of public sports and recreation activities at significantly discounted rates).

Some clear themes and pain points arose, which helped guide our initial designs:

1. Applying for multiple benefits means double or triple the work

Multiple applications mean not only a redundant and overwhelming process, but also that many community members may have to take time off work or away from family duties in order to travel to and from each organization and wait on-site. Some community members said they were overwhelmed and quit the process halfway through, or that they only felt confident completing their application with the help of a caseworker.

We saw an opportunity with Common App to simplify and consolidate the experience of applying for multiple programs. Our goals were to:
  • Reduce cognitive load when possible. Applying for programs can feel complex and overwhelming, so it was important that we reduced cognitive load — or mental effort — on the user whenever possible by presenting a clean, simple UI and by controlling the number of questions that users saw at a time.
  • Keep text and language simple. Ideally, all copy would be written at a fourth-grade reading level, as suggested by our partner organizations and informed by our own experience running our core platform.
  • Reduce repetitive actions. We not only streamlined common questions across applications into a single section but also sought every opportunity to do calculations for users when possible (e.g. totals for income calculations), as well as pre-fill fields for users, to reduce re-entering.
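The consolidation described above can be sketched in code. This is a minimal, hypothetical Python example (the program names and field lists are illustrative, not the actual Common App schemas) showing how questions shared by several applications can be asked once, and how a calculation like an income total can be done for the user:

```python
# Hypothetical field lists per program; the real application schemas
# are far more detailed.
PROGRAM_FIELDS = {
    "CalFresh": ["name", "address", "household_size", "monthly_income"],
    "SF3C": ["name", "address", "household_size", "child_ages"],
    "Parks Scholarship": ["name", "address", "monthly_income"],
}

def shared_questions(programs):
    """Questions common to every selected program, asked only once."""
    field_sets = [set(PROGRAM_FIELDS[p]) for p in programs]
    return sorted(set.intersection(*field_sets))

def total_income(income_sources):
    """Do the arithmetic for the user, e.g. summing income sources."""
    return sum(income_sources.values())

print(shared_questions(["CalFresh", "SF3C", "Parks Scholarship"]))
# -> ['address', 'name']
print(total_income({"job": 1200, "child_support": 300}))  # -> 1500
```

In practice the shared section would also pre-fill program-specific pages, so an answer given once never has to be re-entered.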

2. Tech literacy varies

Our community uses smartphones as their primary or only device. Tech literacy (a person’s ability to appropriately and effectively use technology tools to access, manage, integrate, evaluate, create, and communicate information) varies with age and exposure to technology. As a result, we needed to account for varying levels of tech literacy by ensuring that those less comfortable with technology could still reasonably use Common App. This means making fields, buttons, links, and other UI elements obviously interactive.

3. Document submission is one of the largest barriers in application processes

Our community may struggle to find all the necessary documents. In many cases, this means making multiple trips to the organization’s office to present all the required documentation, resulting in many hours of travel and potential wait time. As a result, it was important to design the document upload page with flexibility: the applicant should know which documents are expected and, if they do not have a required document, be able to request a follow-up from the organization, to prevent applicants from giving up at this step.

4. Opportunity to increase exposure to different programs

Many first-time resource seekers find and apply for resources in daisy-chain fashion; that is, they find out about a second resource while they are applying for their first resource, and so on and so forth. We saw an opportunity for Common App to help educate and expose community members to all the benefits and programs they might be interested in by first prompting applicants with a needs assessment and suggesting programs based on their response.
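The needs-assessment idea can be sketched as a simple mapping. This hypothetical Python example (the needs, program names, and mapping are illustrative, not the actual Common App logic) shows how a family’s selected needs could surface every relevant program up front, instead of leaving them to discover programs one at a time:

```python
# Hypothetical mapping from needs-assessment answers to programs.
NEEDS_TO_PROGRAMS = {
    "food": ["CalFresh"],
    "child_care": ["SF3C"],
    "recreation": ["SF Rec & Parks Scholarship"],
}

def suggest_programs(selected_needs):
    """Return every program relevant to the needs a family selects,
    preserving the order in which needs were chosen."""
    suggestions = []
    for need in selected_needs:
        for program in NEEDS_TO_PROGRAMS.get(need, []):
            if program not in suggestions:
                suggestions.append(program)
    return suggestions

print(suggest_programs(["food", "child_care"]))
# -> ['CalFresh', 'SF3C']
```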

Low-fidelity Mocks of Common App


We went through iterative stages of design from wireframes to low-fidelity mocks to build the Common App prototype.

Guided by our initial research, we designed for potential barriers by:

  • Limiting the amount of information per page. By being conscious of the cognitive load required of users, we divided up the application questions page-by-page, seeking a balance between related groupings of questions and the energy or amount of consideration that each question would require.
  • Writing simple, easily comprehensible copy. Our members are most comfortable with copy that reads at or below a fourth-grade reading level, so we double-check readability as we write.
  • Pre-filling fields wherever possible to reduce repetitive actions.
  • Designing for explicit affordance wherever possible. In other words, making the function or intention of buttons, field selectors, and other patterns or components as obvious as possible, for example, with clear labels, to anticipate varying levels of tech literacy.
  • Streamlining the document upload page so that all documents are requested on one page, and it is clear which documents are required and which are optional.

Pilot & Testing

Through partner organizations and other forms of outreach (e.g. flyer distribution, email blasts), we were able to connect with a total of 29 families over two weeks with the help of my team and six partner organizations. We gave participants $20 Target gift cards in appreciation for their time.

Each 40-minute session was a mix of generative (i.e. exploratory) and evaluative (i.e. usability testing) research:

  1. Baseline information and context-setting: After gathering some demographic information, we asked some background questions to understand participants’ tech usage, tech literacy, and their previous experiences in applying for benefits.
  2. Usability testing: As participants used Common App, we asked them questions about specific tasks, regularly prompting for thoughts and feedback. We also captured any areas of confusion or frustration, as well as moments of delight, and made sure to measure time to completion.
  3. Feedback survey: Once testing was complete, we asked participants to complete an anonymous feedback survey. The post-testing survey gathered Likert-scale data on ease of use and user satisfaction, a Net Promoter Score (an index used to measure people’s willingness to recommend an organization’s products or services to others), and related open-ended feedback.
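The Net Promoter Score mentioned in step 3 is computed mechanically from 0–10 ratings: the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal Python sketch, using made-up ratings rather than our actual survey data:

```python
def net_promoter_score(ratings):
    """NPS on a 0-10 scale: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count toward the total but toward neither group."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Illustrative ratings only: 5 promoters, 2 passives, 1 detractor.
print(net_promoter_score([10, 9, 9, 8, 7, 10, 6, 9]))  # -> 50
```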


Estimated Time Saved

With traditional offline methods of applying (i.e. in-person or over the phone), it takes an applicant on average:

  • About 1 hour to complete the CalFresh application
  • About 30 minutes to complete the SF3C application
  • About 30 minutes to complete the SF Recreation & Parks scholarship application.

This means that applicants applying for all three programs would spend about 2 hours. Additionally, most applicants spend at least one hour traveling to and from the physical site in order to apply for benefits.

Using Common App, participants applying for:

  • Two programs took an average of 29 minutes
  • Three programs took an average of 32 minutes.

The Common App can be used on a desktop computer, smartphone, or tablet, so it requires no travel time.

Thus, depending on the programs they choose to apply for, participants using Common App can save 1 hour to 2.5 hours compared to the traditional offline method.
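The savings estimate can be checked with simple arithmetic, using the average times reported above (in minutes) and the one-hour travel figure:

```python
# Average offline application times reported above, in minutes.
OFFLINE_MINUTES = {"CalFresh": 60, "SF3C": 30, "Parks": 30}
TRAVEL_MINUTES = 60  # at least one hour of round-trip travel

def offline_total(programs):
    """Total offline time: each application plus one trip."""
    return sum(OFFLINE_MINUTES[p] for p in programs) + TRAVEL_MINUTES

# Common App pilot averages; no travel required.
COMMON_APP_MINUTES = {2: 29, 3: 32}

three = ["CalFresh", "SF3C", "Parks"]
print(offline_total(three))                          # -> 180 (3 hours)
print(offline_total(three) - COMMON_APP_MINUTES[3])  # -> 148 minutes saved (~2.5 hours)
```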

Feedback Survey Results

Here is what we learned from the feedback survey:

Ease of use — 86.7% of participants “Agree” or “Strongly agree” that the website was easy to use. Those who rated lower ease of use cited tech literacy limitations: “This would be easy for those who have basic computer knowledge or have a gadget.”

Satisfaction — 91.7% of participants were “Satisfied” or “Very satisfied” with their experience of the website. When asked why, participants found the site “simple and easy” and appreciated “the idea that I can apply one time and submit to multiple organizations.”

Would you recommend to friends or family — 95% of participants were “Likely” or “Very likely” to recommend the Common App to friends and family.

What they like the most — Many participants expressed that they appreciated not having to go to the office:

“[Common App] is actually useful in so many ways; you don’t have to physically go to the many locations to apply and you can keep all of your documents safely at home while submitting online.”

Some direct feedback from people we interviewed:

  • For first-time resource-seekers: “It’s good if you don’t know about programs; if it’s your first time, it’s hard to know how to apply for stuff.”
  • For those in-need and possibly undocumented: “People right now are very needy. Because of their immigration status… Many people do not apply because of fear, even though they do not work and there is no food for their children.”

What they like the least — The most common theme for improvement from participants was that this “needs to be an app on phones, would make this app so much easier.” Once we implemented a mobile version of the website, the feedback reflected this change: “I like [Common App] a lot because I can send the documents from my cell phone and I didn’t have to go into the office.”


Key Learnings

Here are the main learnings from our sessions with participants:

Optimize for mobile to account for varying tech literacy

Before the Common App was mobile-friendly, the most common request from participants was for a mobile app version. We noticed during testing that many participants had limited tech literacy with computers. When using a laptop or desktop computer, many were uncomfortable using a mouse and/or unfamiliar with the concept of scrolling. However, the majority of participants own a smartphone as their primary or only device and are therefore very comfortable using it. As a result, we optimized the Common App for use on smartphones and saw that participants were much faster and more comfortable completing the application on a mobile device.

Document upload

Participants initially were unsure which documents were required and which were optional, so we differentiated the groups through use of different headings and heading colors. For proof of income documentation (which is required for all the applications), some participants were in circumstances in which they were not working or did not have traditional forms of income documentation, like a pay stub. As a result, we worked with organizations to determine what was minimally acceptable for documentation and added a checkbox that, if checked, would alert the organization to follow up with the participant to obtain information or alternate forms of documentation.
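The follow-up checkbox described above can be sketched as a small data model. This is a hypothetical Python illustration (the document names and fields are made up) of the rule we landed on: a missing required document doesn’t block submission if the applicant has asked for follow-up, and the organization is alerted instead:

```python
from dataclasses import dataclass

@dataclass
class DocumentRequest:
    """One requested document; names and fields are illustrative."""
    name: str
    required: bool = True
    uploaded: bool = False
    needs_follow_up: bool = False  # the "I don't have this" checkbox

def submission_status(docs):
    """A required doc that is missing blocks submission unless the
    applicant checked the follow-up box, which alerts the org instead."""
    follow_ups = [d.name for d in docs
                  if d.required and not d.uploaded and d.needs_follow_up]
    blocked = [d.name for d in docs
               if d.required and not d.uploaded and not d.needs_follow_up]
    return {"can_submit": not blocked, "alert_org_about": follow_ups}

docs = [
    DocumentRequest("Photo ID", uploaded=True),
    DocumentRequest("Proof of income", needs_follow_up=True),
    DocumentRequest("Utility bill", required=False),
]
print(submission_status(docs))
# -> {'can_submit': True, 'alert_org_about': ['Proof of income']}
```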

Clearer copy

Although participants found most of the language in the Common App to be straightforward and easy-to-understand, some questions, like “How much money do you have on hand?,” prompted many questions from participants. Was this question asking for how much money they literally had on hand in that moment? If this question is asking about savings, then should they include their partner’s accounts? And how will this affect their eligibility or subsidy? Moving forward, we will continue to refine and test the copy to prevent confusion or misinterpretation.

Common App on mobile

Next Steps

Following a successful pilot, the One Degree team is excited to announce that we’ll be formally building out the Common App to include more programs and partners and to continue to improve on the user experience.

In the meantime, we’d love to hear from you. What was interesting? What stood out to you? Would you be interested in partnering or offering the Common App to your communities?

We are so grateful for our supporters that made this work possible, including a generous gift from the John S. and James L. Knight Foundation.

We also want to thank our partners for their help during the prototype phase.

(NOTE: The privacy of One Degree members is paramount to any solution that requires soliciting and storing personally identifiable information. All user information collected in the Common App is stored in our HIPAA-compliant platform, and all sensitive personal information and documents are encrypted.)