It was bound to happen: we got our first non-5-star review on CourseReport. I wouldn’t normally take time to write a response to a negative review, but I wanted to this time — not because it’s the first, but because there were some very specific details in the review that I want to reply to, just in case prospective students take the negative remarks at face value.
As I sat down to write this, it struck me that crafting a response to a negative review is a uniquely difficult document to write. I want to reply to the specific complaints in a factual manner, but I don’t want to come across as tone-deaf, defensive, or critical of the author of the negative review.
So with that in mind, the first thing I want to do is acknowledge the author of the negative review. I want to acknowledge that he didn’t feel good in our program and that those negative feelings built up enough to prompt him to write a negative review on a public website. I’m sorry about that and I can’t change that, but I sympathize with it.
Sometimes a business lets a customer down; when it doesn’t fulfill a promise or makes a mistake, an apology and some compensation are in order. I don’t think that’s what happened here. I actually feel that, except for the negative review itself, the net outcome here is one that all parties desired. I’ll spend the rest of this document explaining why.
I want to first address the review’s major complaint about fairness in assessment grading. I’m going to mention some numbers, and I want to emphasize that this isn’t to shame or blame, but to set the context. I specifically want to address the idea that we didn’t give feedback.
In all, the student took 6 human-graded tests — 4 written and 2 live interviews — within 2 months at Launch School. Further, our staff, including myself, reached out and had two phone calls with the student after subpar assessment performances. My phone call alone with the student lasted over an hour. I also went through the assessments one by one and read the comments our staff wrote. I won’t litigate the exact questions and responses, but I’ll share some general data for prospective students:
4 human-graded written assessments:
- our staff wrote a total of 2,000 words of feedback in the assessments
2 human-conducted live 1-on-1 interviews:
- 1 hour 40 minutes of live interviewing
- 755 words of written feedback
From the assessment process alone (excluding code reviews, chatroom support, etc.), our staff wrote 2,755 words of feedback to the student. We also spent 1 hour and 40 minutes interviewing, and at least another hour talking on the phone.
In my personal phone call with the student, I wrote afterwards:
I had a great call with [redacted]. I find him to be personable, humble, and willing to improve. We chatted for over an hour and talked about remote work, his goals, and finally about his assessment performance. In the end, we agreed he was going too fast. He was aiming for the fall Capstone cohort, but after our talk, he’s removing that goal from his timeline.
So back to the question of fairness, because all this data still doesn’t prove that the assessments were fair. I can’t easily do that without litigating each question and answer, but what I want to show here is that we provide a lot of feedback, and we won’t leave students hanging. On top of giving feedback, we also solicit a lot of student feedback after every assessment. Any Launch School student can attest to how fanatical we are about asking for their feedback; this is our calibration process and how we constantly improve. It’s CI/CD for our curriculum, and it’s one of the main reasons why we have such a high-quality program.
Once again, I want to acknowledge that the student felt the feedback was insufficient and that the assessments weren’t fairly graded. That’s a valid feeling. I don’t want to take away from that and I don’t want to argue about whether a particular response was sufficient.
The main things I want readers to know are:
- we give a lot of feedback to students, especially in assessments; we take them very seriously and we try very hard to set students up for success
- we try really hard to be as fair as we can and make sure students are prepared; these assessments are not “gotcha” exams and are calibrated across hundreds of students for assessing mastery
- we are constantly updating and calibrating our assessments and content based on feedback
I want to take a quick detour about some of my thoughts on education. I promise I’ll tie it back to the negative review.
Education is a very unusual business because there’s a huge time-shift between service rendered and value received. For example, if you go to a restaurant and order a dish, you can find out very quickly whether the dish is worth the price you paid. The time between service rendered and value received is a few minutes. Because of this quick service-to-value turnaround, you can easily compare the value of multiple dishes or even of multiple restaurants.
But what if the value you’re measuring suddenly changes? What if the value you care about changes from taste to long-term health? Evaluating which dishes or restaurants are best for one’s long-term health is much more difficult, because you can’t see the effects until much later.
Education is in this situation. In fact, the best education — the type that has a lasting impact on one’s life — is the most difficult to perceive at the time that service is rendered.
This time-shift in perceived value is also why education is full of hype artists and over-promising. Educational institutions lead with hype and marketing, and then trap students into their curriculum by way of a contract, a promised degree, or some other anti-student mechanism. This is the “marketing → entrapment” strategy most educational institutions use to build their businesses (and yes, universities are businesses).
The truth is that no educational institution can know for certain whether it’s a good fit for any student. No student can know for certain either. And due to the “marketing → entrapment” model that most educational institutions use, shopping around different schools is nearly impossible. This is true whether we’re talking about universities or coding bootcamps.
It’s my opinion that the only way to help students make better educational decisions is to give them an option without entrapment. That is, allow them to leave whenever they want, without major consequence.
That’s what we try to do at Launch School: we have our free prep courses and then we have our no-obligation, month-to-month Core curriculum. This is also why we don’t offer discounted annual plans or play any tricks that hold students here — we feel that being able to leave is a critical feature precisely because the stakes are so high. If we’re not a good fit, we want students to leave.
Alternatives to Launch School
And now we get to the main reason I wrote this response: the part where the author recommends that prospective students skip Launch School in favor of two bootcamps (App Academy and Hack Reactor).
I wrote this article mostly because I felt that the feature the author took advantage of — the ability to leave our program after deciding it wasn’t a fit — is not available in the two alternatives he recommended.
In other words, he doesn’t know whether App Academy or Hack Reactor is better for him than Launch School (because he hasn’t attended either), let alone whether to prescribe that path for anyone else. And yet, those two programs cost about $17k each, and once you join, you can’t leave. The fact that he could leave Launch School after 2 months is an amazing feature that no other educational institution allows.
If the suggestion was written out of frustration, I can accept that. But if it’s a serious suggestion, then I want to talk about risk for prospective students.
Low-Risk, No Entrapment
At Launch School, we want to make sure that this is the right place for students. We spend a lot of time, even before students sign up, talking about the pros and cons of our program and curriculum. And in the worst case, if it turns out that this isn’t a good fit after a few months, then you’ve only spent a few hundred dollars.
I don’t want to trivialize a few hundred dollars, but this is in contrast to tens of thousands of dollars lost.
And this brings me to the final outcome in this situation. The student spent $398 over 2 months, while Launch School also lost money supporting the student. This is an example where the student and school just didn’t fit. This doesn’t mean the student can’t be successful elsewhere, and it doesn’t mean Launch School isn’t a good program for others. It just wasn’t a proper fit between this particular student and this particular school. It happens all the time at every school and it shouldn’t be surprising.
But the key take-away I want people to realize is that the relative cost of finding that out is very low at Launch School. Compared with any other coding bootcamp, whether it’s ISA-bound or pay-up-front, Launch School has the lowest risk when it comes to testing for a fit. We spend no effort trying to trap you at Launch School.
This document has been difficult for me to write because I harbor no ill feelings. I just want to say to the author of the negative review: I acknowledge the way you felt and I sympathize with it. I also want to say that if you read this, know that you are always welcome back and there are no hard feelings. When we say a student received a Not Yet, that means it’s a temporary stumbling block and that there’s always a way forward, whether that’s with Launch School or elsewhere.