Lies, Damned Lies and The Accelerometer’s Statistics

Where’s the harm in a third-party survey employing dubious methodology that ignores critical facts? Funny you should ask…

Earlier today, Charlotte Street Capital (through their Techbridge brand) published The Accelerometer, a report ranking UK accelerator programmes.

As far as the team at Ignite is concerned, the report is woefully inaccurate and deeply unprofessional. We’d like to explain why.


In mid-January, Ignite received an email from Bo Pederson at Charlotte Street Capital:

We’ve produced a report, which shows the growth of the [accelerator] market overall, with over 60 accelerators now operating across the country.
As expected, Ignite performed well, although with some variation of the metrics we tracked, in the face of strong competition.
We’d like to review our findings with you before we publish this data.

This was the first we’d heard about the research. Three days later we met Bo, who explained that Charlotte Street Capital had conducted their research using a technique called the Net Promoter Score (NPS). They approached 55 Ignite alumni, of whom 27 responded.

Contrary to Bo’s email, Ignite did not perform well at all.

We were devastated. The report was completely at odds with all the feedback we’d received from teams, both during and after programmes. Of course there were founders who were dissatisfied with the programme; as hard as we try, we can’t please everyone. But they were in the tiniest of minorities, maybe 12 out of 170 founders? So either our alumni were lying to us, or we were lying to ourselves. We genuinely couldn’t comprehend what we were being told.

At the meeting, Bo admitted the data wasn’t very accurate and that his team didn’t have any expertise or experience in conducting market research. We were promised that before the report was published, Charlotte Street would work with us to improve the data available to them, to ensure their report was at least accurate and fair.

In turn, we requested the names of the alumni contacted, so we had some indication of when respondents had worked with Ignite. Our business is five years old; in 2011 and 2012 we were a different limited company, with different investors and little-to-no connections to London VCs and angels. Lumping together the output of two different companies and investment models is dangerous, since the responses of founders from four or five years ago have no bearing on how we’ve operated since.

Aside from receiving the raw topline data mentioned in the meeting, we heard nothing until yesterday, when it was announced the report would be published today. We asked about the promised follow-up, and again made the point that we felt the research was flawed. We were promised a phone call to discuss the data ahead of the launch. The call never came. To be clear: everyone agreed the data was flawed, but Charlotte Street Capital published it anyway.

Net Promoter Scores (NPS)

We hadn’t heard of Net Promoter Scores until this point. You can read more about them here.

Despite Charlotte Street Capital’s claim that NPS is a well-known marketing tool, it really isn’t. NPS is an internal management tool for predicting growth and improving processes. It’s not intended for external market research, and certainly not for ranking wildly different accelerator models against one another.
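For anyone unfamiliar with the mechanic: NPS asks a single question, “how likely are you to recommend us?”, answered on a 0–10 scale. Respondents scoring 9–10 are promoters, 7–8 are passives, 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch in Python, with invented ratings:

```python
def net_promoter_score(ratings):
    """Standard NPS: 0-10 ratings; 9-10 are promoters, 7-8 passives,
    0-6 detractors. Score = % promoters minus % detractors."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Invented example: 4 promoters, 3 passives, 3 detractors -> NPS of +10
print(net_promoter_score([10, 10, 9, 9, 8, 8, 7, 6, 4, 1]))
```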

We asked Dave Waddell, a market research professional from Fly Research about the use of NPS in a survey like this:

NPS is a good measure of sentiment in an established market, when conducted among people with experience and perspective. So, supermarket shoppers, car buyers etc where you have a sample size of 1,000.
The metrics of NPS mean that a tiny sample size of 27 will produce data that is fundamentally flawed. You could have interviewed four people from one start-up that all thought the same way.
It’s just bad research. Wrong tool, wrong application, wrong sample, wrong conclusions.
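Waddell’s sample-size point is easy to demonstrate. A minimal simulation (ours, not Charlotte Street Capital’s, using an invented population whose true NPS is exactly zero) shows how far a 27-person survey can swing from the truth compared with a 1,000-person one:

```python
import random

def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Invented population: 30% promoters, 40% passives, 30% detractors,
# so the "true" NPS is exactly 0.
population = [10] * 30 + [8] * 40 + [3] * 30

def score_range(sample_size, trials=10_000):
    """The middle 95% of scores produced by repeated random surveys."""
    scores = sorted(net_promoter_score(random.choices(population, k=sample_size))
                    for _ in range(trials))
    return scores[int(trials * 0.025)], scores[int(trials * 0.975)]

print(score_range(27))    # roughly (-30, +30): a 60-point swing on luck alone
print(score_range(1000))  # roughly (-5, +5)
```

On a spread that wide, the gap between a “top” and a “bottom” accelerator in a 27-person survey could be nothing more than sampling noise.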

Furthermore, NPS is meant to be conducted once a quarter to enable improvement within a business, not once across the five-year lifespan of two different businesses. There is so much controversy about its use that even professionals who do recommend NPS are at pains to point out that execution is critical, even when the tool is used correctly. In the words of Forrester:

Stop using NPS. Or, better said, start using it more properly.

Our survey and results

The only company with any right to use NPS in our name is us. That’s the point of NPS. So we did. Yesterday, we sent a survey to alumni of every accelerator programme that Ignite has managed over the past five years. Our mailing list doesn’t reach every alumnus (people move on and change email addresses, especially in startups), but it covers the majority. Here’s the email (grammatical errors and all) making the request, so there’s transparency about the context:

The survey was anonymous. We wanted founders to feel they could be honest and transparent with us. We received 58 responses, over double the number received by Charlotte Street Capital. We received responses from founders representing every programme we’ve managed over the past five years. It’s still a small sample size, but it represents about a third of the founders we’ve worked with, and we received more responses from founders we’ve worked with from 2013 onwards, under our new company and structure; our data at least feels robust, defensible and, more to the point, relevant.

We asked slightly different questions, since Charlotte Street Capital didn’t share theirs with us ahead of publishing. We asked founders to rate their experience of being on the programme; we invest almost exclusively in first-time founders at the concept or pre-MVP stage, and as a result the programme is often far more challenging than comfortable:

We then asked if the founder would recommend Ignite to another early-stage startup:

Even when founders had a tough time on the programme, more often than not they would still recommend it.

Charlotte Street Capital rated Ignite at -48, with just 4% of respondents classed as promoters and 52% as detractors (the score is simply promoters minus detractors: 4% − 52% = −48).

To be clear, we’re not saying our results are any more valid than theirs.

The point is this: if two surveys ask the same people the same kinds of questions but show zero correlation, then clearly something is very wrong with the methodology of at least one of them.

But…

If the same questions were asked of every accelerator in the report, why do we feel unjustly penalised?

For one thing, just because you can call a bunch of programmes “accelerators”, it doesn’t mean you can or should compare them. Wayra, Techstars, Ignite and Seedcamp are barely comparable as either businesses or programmes. Anyone with any clue about the market can tell you this. That the report even attempts to compare the output of a fund to that of a programme is idiotic.

For another, we had a hunch that whether a team succeeded in raising investment post-programme would dramatically affect the NPS scores. So we asked the question, and filtered the scores accordingly.

Teams that raised post-programme:

Teams that didn’t:

The difference is far more detrimental to Ignite than the report suggests, because we invest almost exclusively in first-time founders at the concept or pre-MVP stage. Our failure rate will always be higher than that of programmes (or funds) that prefer to invest in established entrepreneurs, or in teams that have already raised.
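For transparency about the mechanics (our survey was anonymous, so the records below are invented and purely illustrative), this is all the segmentation involves, assuming each response carries a 0–10 rating and a flag for whether the team raised post-programme:

```python
# Invented records: (rating, raised_post_programme)
responses = [(10, True), (9, True), (10, True), (8, True),
             (8, False), (6, False), (3, False), (10, False)]

def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

raised  = [rating for rating, did_raise in responses if did_raise]
did_not = [rating for rating, did_raise in responses if not did_raise]

print(net_promoter_score(raised))   # +75 for this invented cohort
print(net_promoter_score(did_not))  # -25: same programme, very different score
```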

In other words, we are always going to score lower than other programmes, because we take more risks. We shouldn’t be penalised for supporting founders that other programmes won’t, or compared using any methodology that doesn’t distinguish between such diverse investment strategies and business models.


The point of this response isn’t that we’re upset we didn’t rank more highly. It’s that the results simply have no grounding in reality. If the only difference between a strong, high-growth business with founders backing it to the hilt, and a low-quality programme with unsupportive alumni is the person asking the questions, then you have to not only question the methodology, but reject it outright. Furthermore, we still have no clue who was sent the survey; amongst the list of alumni on the report’s website, there are teams that never graduated from the programme, and at least one startup that never even participated.

If you’re considering applying to a programme, the best due diligence you can do is to talk to our alumni yourself. We’re going to publish a full list of all our founders in the near future, because we think our best evangelists are the people who have lived and breathed our business and our values, not an ill-conceived, poorly executed report published for the sake of a headline or two.
