What I Learned From Mission Disrupt’s Lean Startup Experiment

My name is Alex Martino, and I recently completed my first lean startup experiment for my marketing company, Mission Disrupt. The people who helped me run the experiment are my co-founder, Dean; our account strategist, Tim; our intern, Nick; and our creative director, Ilana. My original hypothesis is here: https://trello.com/c/fWu84QSB/103-my-name-is-alex-martino-my-twitter-handle-is-tallchainz-i-saw-this-on-medium.

I hypothesized that of the 90 nonprofit companies we reached out to with our campaign, 10 would call us for a marketing consultation. So far, we have received zero calls.
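As a rough back-of-envelope check (a sketch only, assuming each recipient's decision to call is an independent yes/no outcome), zero calls would be extremely unlikely if our hypothesized response rate were accurate, so this looks like a real miss rather than bad luck:

```python
# Back-of-envelope check: if the true response rate really were 10/90 (~11%),
# how likely is it that none of the 90 postcards produced a single call?
hypothesized_rate = 10 / 90
postcards_mailed = 90

p_zero_calls = (1 - hypothesized_rate) ** postcards_mailed
print(f"P(0 calls | ~11% response rate) = {p_zero_calls:.6f}")  # about 0.000025
```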

The nonprofits we reached out to came from the Long Island Browser online catalogue, a directory that advertises nonprofits from a variety of industries. To learn more about the companies we targeted, you can see the list here.

The most surprising part of the experiment is that the results contradict the level of interest we’ve received through other sales techniques. Of the services we offer, website development has drawn the most interest, and we found it to be the greatest need among the nonprofits we reached out to, yet we didn’t receive a single call. We made sure to create a clear call to action by using screenshots of their websites to point directly at areas that could be improved. Furthermore, some of the organizations we mailed the postcards to did not have a website at all. We were confident that these companies would at least call for more information. We are unsure whether using mail instead of cold calling is too indirect to capture their attention and convert into calls.

Before this campaign, we learned about the need for nonprofit website development by working with one client who hired us to help with their website, by talking to another nonprofit that requested a proposal involving a new website, and by talking to or submitting proposals for eight other small businesses, closely resembling these nonprofits, that requested website development.

In this experiment, several variables may have affected the results.

  • The postcard design may not have showcased our web development skills well enough to serve as a strong call to action.
  • The mail may have been received by someone in the company who is not a decision-maker for the nonprofit’s marketing.
  • The mail may have arrived on a Friday and been forgotten over the weekend.
  • Nonprofits may not be the right target customer for our services, because their websites may be less essential tools for their operations. They may not want to allocate funds to internal development rather than their cause. While we have spoken to nonprofits that did want a new website, they may not represent the majority, which doesn’t have this need.

As a result, we will try to salvage as much information as we can from this experiment, likely by calling the businesses we mailed the postcards to and trying to talk to the people who received them. If we’re unable to reach these people, or they do not wish to discuss their decision-making process, we will try another experiment to see if we can validate demand for an outbound marketing technique. We would like to try again because, from direct referrals and other sales opportunities, we’ve validated demand for website development among small businesses, so we believe there may be a better response if we market ourselves in a different way.

Our next step will be to develop a similar campaign, but to run it through email rather than physical mail. The reasons for this are:

  1. We can track results much faster using the analytics tools available through our email delivery platform.
  2. We can target delivery much more precisely and send directly to decision-makers.
  3. It’s much easier to iterate on variables in the future with the increased control we have through an email campaign.
  4. It’s much cheaper than paying for postage and printing.

With $500, we would update our hypothesis and purchase the tools necessary to carry out another experiment.

The main tool we’d purchase is a MailChimp Pro subscription for $199 plus a variable subscriber-list rate, which has features that allow us to be smarter about our delivery. The feature that is particularly useful is multivariate testing, which ensures that the best-performing subject line and content are the ones delivered to the majority of our audience. We will be able to analyze this data and make decisions about our experiment in real time, compared with mail, where we sent all marketing material the same way at the same time. With this and further graphic design work, we expect to see better results from our next experiment.
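As an illustration of the kind of analysis we have in mind (a hypothetical sketch with made-up variant names and numbers, not MailChimp’s actual export format), we would rank each subject-line variant by its response rate and send the winner to the rest of the list:

```python
# Hypothetical per-variant results (variant name -> (emails delivered, replies)).
# The names and numbers below are placeholders, not real campaign data.
variant_results = {
    "subject_a_website_audit": (30, 1),
    "subject_b_free_consult": (30, 0),
    "subject_c_case_study": (30, 2),
}

def response_rate(delivered, replies):
    """Simple conversion rate; enough to rank variants against each other."""
    return replies / delivered if delivered else 0.0

# Rank variants so the best performer can be sent to the remaining audience.
ranked = sorted(
    variant_results.items(),
    key=lambda item: response_rate(*item[1]),
    reverse=True,
)

for name, (delivered, replies) in ranked:
    print(f"{name}: {response_rate(delivered, replies):.1%} ({replies}/{delivered})")
```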

We look forward to the opportunity to test once again!
