Reporting Program Outcomes: What Not To Do

A cautionary tale

Kelly Bell
3 min read · Apr 23, 2014

Vox recently ran an article with a very provocative headline: “A nonprofit in Queens taught people to write iPhone apps — and their incomes jumped from $15k to $72k”. That’s quite a lofty claim. The attention-grabbing headline begs the reader to click through and confirm something they already want to believe: that learning to code can transform your life.

Unfortunately, the data supplied in the article just doesn’t support that assumption. To quote:

Six months after the first Access Code class of 21 students completed the 18 week course, the 15 graduates who accepted job offers have seen their income rise from under $15,000 to an average of $72,190; the other six students are either still in college or have chosen to launch their own startups.

The astounding numbers reported in the headline refer to a small sample of 21 students; more specifically, to the 15 participants who accepted job offers and saw their salaries rise. The other six (nearly 30% of the sample), whose incomes aren't reported at all, are quickly brushed over.

Perhaps more concerning than the incredibly small sample size is the lack of any comparison group. When establishing the impact of a program like Access Code, we must first ask: who are the participants? What would their outcomes be if they hadn’t participated in the program? The article doesn’t give us much to go on.

“We saw lots of people in the City University of New York system who graduated as computer science majors but weren’t going into the tech industry,” says Jukay Hsu, founder of Coalition for Queens and a member of Bill De Blasio’s mayoral transition team.

The Access Code website is similarly sparse on details, but it does say that the program “partnered with local community institutions such as Upwardly Global, Chhaya, Queens College, and the CUNY Macaulay Honors Program” to find “the most promising students”. Based on this information, we can assume the Access Code participants are current or recently graduated computer science students, and good students at that. It's reasonable to expect that they would see a salary increase upon graduation whether or not they participated in Access Code.

According to SimplyHired, the average salary for computer science graduates in New York City is $77k. In that context, the $72k salaries of Access Code participants seem much less impressive. And once we account for the fact that only about 70% of participants reported a salary increase, the average salary across all participants would actually be much lower than the headline figure.
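A rough back-of-envelope check makes the point. The article doesn't tell us what the other six participants are earning, but if we assume, purely for illustration, that they remain near their pre-program incomes of under $15,000 while in school or bootstrapping startups, the average across all 21 participants looks more like:

(15 × $72,190 + 6 × $15,000) / 21 ≈ $55,850

That figure rests on an assumed number for the six unreported incomes, but it shows how quickly the headline number shrinks once the whole cohort is counted.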

Now, you might be tempted to point out that these participants were all making less than $15k before they started the program, so surely Access Code had some impact! Not so fast. Most participants appear to have been undergraduate students, probably working part-time in jobs that don't require an advanced skill set. After graduating with a computer science degree and finding full-time work in their field, we would expect their incomes to jump dramatically anyway. In this context, the salary increases of Access Code participants seem par for the course for any recent college graduate. We aren't given any evidence that Access Code raised salaries beyond what we would expect for comparable students who didn't participate in the program.

Nonprofits that teach technical skills have a lot of potential. But misleading articles like this one only hurt them in the long run — when we ignore good science, we run the risk of funding programs with exciting but unsupported claims, rather than ones with realistic outcomes based on actual research.
