by Jon Stowe, President, Dev Bootcamp
Darrell Silver’s recent post about the remarkable rise of bootcamps and the need for greater transparency to drive continued adoption has the right premise: students should understand the value proposition they are signing up for. However, Silver calls for the wrong solution: that schools should simply report their outcomes along with a school-specific methodology. In reality, the metrics currently being analyzed and touted perpetuate the myth that a universal standard for measuring program efficacy exists. Silver’s “solution” leaves individual students with the burden of deciphering the underlying methodological flaws that often produce exaggerated outcomes claims, claims that typically come with neither a disclaimer nor a link to the methodology on which they were ostensibly built. Worse, he celebrates recent examples of schools that are “beginning to take transparency seriously” without fully appreciating that the very schools he cites may be giving prospective students a false sense of their true outcomes. All of this encourages side-by-side comparisons of bootcamp programs using outcomes derived from drastically different reporting methodologies, as if those numbers were a universal measure of quality. The reality is far more complicated.
Currently, even those schools touting “transparency” publish claims that fall short. In evaluating outcomes claims, students must consider whom the schools included in their calculations (everyone who started the program? Only those who finished? Only the subset of students who opted into career training?), since some schools base their outcomes claims on less than half of their overall student population. One must also consider the survey methodology (who is doing the outreach: an independent auditor or someone paid by the school?), and, most importantly, one must understand what constitutes employment. Is it a full-time coding job, or any job in a “tech-related” field, a term with no common definition? Are internships counted? Should they be? What about graduates who are hired as marketing representatives by the very bootcamps they graduated from?
Wittingly or unwittingly, many schools are cherry-picking subsets of students to include in their outcomes data and passing off the resulting questionable math as transparency. There is ample precedent in advertising law calling out such approaches as misleading, but many players in this young, immature industry do not fully appreciate the ramifications of such selective reporting. A reasonable student would assume that “99% employment” means just that: that 99% of all students at a given school get a programming job. The reality typically falls short.
Silver points out this risk correctly and directly, lamenting that all too often “marketing trumps education.” This marketing arms race risks marring an industry that has little to hide: independent research from Course Report confirms that students cite their bootcamp experience as a critical factor in getting jobs, employers are increasingly turning to bootcamps to fill their hiring needs, and even traditional Computer Science programs are considering curricular changes to match the practical orientation that bootcamps are pioneering.
Rather than flooding prospective students with the kind of exaggerated claims that sully other industries, such as the weight-loss industry, we believe students are far better served by evaluating real differences in the programs they are considering: curricular design, career support, teaching methodologies, and the strength and diversity of the student body and alumni community. They should speak to alumni who have been through the programs; assess the placements recent graduates have landed, their success in the workplace, and their career progression in the years following graduation; and generally gather informed opinions from those with first-hand experience. They should also be aware of which schools are properly licensed, since licensure ensures that programs abide by certain rules and regulations and has a direct bearing on credibility and consumer protection (to our knowledge, Dev Bootcamp is one of only three schools with proper licensure in California).
At Dev Bootcamp, our goal is to pursue a path where students evaluate their decisions based on relevant program qualities, credible feedback from alumni, and third-party data on long-term outcomes, rather than on claims that are inconsistently derived, difficult to parse, and easily misconstrued. Our industry would certainly be better served by transparency, but transparency on the right measures.