Redesigning Project Screening

Dan Betz
Making DonorsChoose
6 min read · Dec 13, 2018

The DonorsChoose crowdfunding model works by connecting teachers who need funding for their classroom with generous donors who want to help. We’ve already written about how we take great care to give our teachers the tools they need to craft a compelling and successful project and how we’ve worked to make sure that we present those projects in a way that resonates with donors. An important but invisible piece of that whole process is project screening, where projects are checked for eligibility, safety, and quality.

A project on our site generally has a photo of the classroom, a catchy title, a short essay introducing the classroom and the teacher’s idea for their students, and a list of the materials or experiences the teacher is trying to get funded. And we need all of that to be reviewed by a human before we show it to our citizen donors.

Every single classroom project that is posted on DonorsChoose is manually reviewed by our team of volunteer screeners.*

Project screening is incredibly important to both the integrity and transparency of our site and part of the reason we’ve earned the highest possible ratings from all of the major charity watchdog sites. So while we’ve always valued the process of screening and the volunteers who make it happen, until recently these brave souls were using an outdated interface with a confusing workflow. As DonorsChoose continues to grow and reach more teachers, we prioritized an overhaul of the screening process to make sure that this integral tool scales with us.

Of course it wouldn’t be a proper redesign write-up without showing you a before and after, but this project wasn’t simply a facelift. We made huge, functional changes to the process to increase operational efficiency…while also modernizing the interface and improving the experience for both screeners and teachers. Here’s what it used to look like:

BEFORE: Follow the numbers to see the workflow that we trained our screeners to follow. Clearly there was room for improvement.

Top-down design

As the product team dug into this project by talking to our in-house experts, watching our volunteers screen projects, and using the tool ourselves, it was pretty clear that there was an ideal workflow and an opportunity to bring order to the experience. While the workflow had evolved over the years, the interface just hadn’t kept up.

In the redesign, we moved to a single-column layout to make it easier for screeners to follow the recommended workflow…and you know, actually read the content that they’re supposed to be screening for clarity and accuracy! One of our volunteers even remarked “Wow, this is so easy to read!” totally unprompted when testing an early prototype. 😎

Scrolling through the new top-down project screening flow.

Screening photos for safety

Safety is our top concern when reviewing photos that will appear on our site. We work with a third-party safety expert to create rules for student and teacher safety, and we train our volunteer screeners to enforce them. We had previously instructed our screeners to make sure that no student’s face was larger than 1/4 of the photo, but that proportion was tricky to judge by eye. During a user testing session, one of our volunteers told us she had figured out that a nickel was about the right size, and would hold one up to the computer screen to make sure the faces were smaller than the coin.

While we all appreciated the ingenuity, keeping a nickel on hand shouldn’t be a requirement for screening photos. So we built tools directly into the interface to make this check easier and more accurate.

No nickels needed to screen for photo safety now!
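The post doesn’t describe how the new in-interface tool works, so purely as an illustration, here’s a minimal sketch of the rule the screeners were applying. It interprets “larger than 1/4 of the photo” as bounding-box area relative to photo area, and assumes a face detector has already supplied pixel boxes; the function name and box format are hypothetical.

```python
# Hypothetical sketch of the face-size rule: flag any detected face whose
# bounding box covers more than 1/4 of the photo's area. Assumes a face
# detector has already produced (x, y, width, height) boxes in pixels.

MAX_FACE_FRACTION = 0.25  # the "no face larger than 1/4 of the photo" rule

def oversized_faces(photo_width, photo_height, face_boxes):
    """Return the face boxes that exceed the allowed fraction of the photo."""
    photo_area = photo_width * photo_height
    return [
        box for box in face_boxes
        if (box[2] * box[3]) / photo_area > MAX_FACE_FRACTION
    ]

# A 1000x800 photo with one small face and one face covering ~31% of it:
flagged = oversized_faces(1000, 800, [(10, 10, 120, 150), (300, 100, 500, 500)])
```

An automated check like this frees the screener to focus on judgment calls rather than eyeballing proportions (or reaching for a coin).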

Rejecting a project

Whenever one of our screeners has to send a project back to the teacher for edits, we like to provide contextual, personalized feedback to get the teacher back on track. We have a library of pre-saved text snippets with directions for correcting the most common problems. Previously this experience was disconnected from the project (it was on a different screen) and didn’t allow the screener to reference the content for which they were providing feedback.

Helping teachers correct their projects is now easy and inline.

Keeping the “needs teacher edits” section inline was an efficiency win for our screeners (no more toggling between tabs), but we were also able to make changes that improve the experience for teachers whose projects are drafted, i.e. sent back for edits.

We know it’s frustrating for our teachers when their project gets sent back to them for edits. Previously, teachers would get a lengthy email explaining why they were drafted, then click into our site to fix their project with no indication of what needed to be fixed or why. By attaching feedback to a specific section of the project instead of to the project as a whole, we can now repeat the coaching instructions from the email right next to the offending field when the teacher returns to make the necessary edits, which we hope will lead to higher resubmission rates.
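The key change here is structural: feedback belongs to a section, not to the whole project. A minimal sketch of that idea, with illustrative names rather than DonorsChoose’s actual data model:

```python
# Hypothetical sketch of section-level feedback: instead of one note on the
# whole project, each saved snippet is attached to the field it concerns,
# so it can be displayed inline next to that field when the teacher edits.
from dataclasses import dataclass

@dataclass
class Feedback:
    section: str   # which project field the note applies to
    snippet: str   # pre-saved coaching text chosen by the screener

notes = [
    Feedback("photo", "Please crop so no student's face dominates the photo."),
    Feedback("essay", "Tell us more about how students will use the materials."),
]

def feedback_for(section, notes):
    """Collect the notes to display next to a given field."""
    return [n.snippet for n in notes if n.section == section]
```

With feedback keyed by section, the edit page can look up exactly which notes to render beside each field, instead of asking the teacher to reconcile one long email with the whole project.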

Re-screening efficiency

After a drafted teacher makes the corrections outlined by the volunteer and resubmits the project, we add it back to the queue for another round of screening. I told you we take this stuff seriously!

Before this redesign, the resubmitted project would come into the queue like any other and the second volunteer would have to review every single section—even if the first volunteer approved all but one section! Now we’re saving the progress from the previous round so that all sections that have already been marked as “looks good” are pre-approved. This means much faster re-screening and lowers the chance of our teachers being drafted multiple times.

Cruising past previously approved sections of the project.
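Carrying screening progress across rounds boils down to resetting only the sections that were sent back. Here’s a minimal sketch of that logic; the status labels and function name are illustrative, not DonorsChoose’s actual implementation:

```python
# Hypothetical sketch of carrying screening progress across rounds: sections
# already marked "looks good" stay approved, so a re-screen only surfaces
# the sections that were sent back for edits.

APPROVED = "looks good"
NEEDS_EDITS = "needs teacher edits"
PENDING = "pending review"

def requeue_for_rescreen(section_statuses):
    """Reset only the rejected sections; keep prior approvals."""
    return {
        section: (PENDING if status == NEEDS_EDITS else status)
        for section, status in section_statuses.items()
    }

first_round = {"photo": APPROVED, "title": APPROVED,
               "essay": NEEDS_EDITS, "materials": APPROVED}
second_round = requeue_for_rescreen(first_round)
# Only "essay" comes back for review; the other sections stay approved.
```

Because the second screener sees only the pending sections, re-screening is faster and a teacher is less likely to be drafted again over sections that already passed.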

Defining success

These are just some of the changes we made while overhauling the screening process. Our goals were to keep turnaround time (from submission to approval) low as we scale, to be operationally efficient and cost-effective, to make screening a positive experience for our volunteers and teachers, and to keep project quality on DonorsChoose high, all while maintaining safety. There are many metrics we’re watching, but it’s still too early to share conclusive results.

Like any major change to a system built for a group of trained, dedicated users, this one will take time to bear fruit. The re-training period almost always creates a productivity dip while everyone adjusts to the new system. On the positive side, we have already learned that on-ramping new volunteers is much smoother than in the past. We’re also excited to collect data to make sure the changes we made are on track, and to continue improving the experience for our screeners, who we’re finally treating like first-class citizens!

*We compensate our screening volunteers with DonorsChoose credits. Since they’re all active DonorsChoose teachers, this arrangement works well for everyone!
