Save Time & Money in Your Startup

Personal experience of launching our web app and applying the validated learning principle

Mike Kulakov
Everhour Stories
7 min read · Jun 10, 2014

--

Every single startup inevitably goes through a period of huge uncertainty, not knowing whether the product will actually make it.

The first product version represents only your own and your team’s vision of the problem and its solution, and that vision may be quite far from what customers actually need.

As soon as you get your first users, you’ll be able to look at your idea from their perspective, and that’s the most interesting part.

We at Everhour are still moving towards the so-called product/market fit, learning through the build-measure-learn cycle. Here I’d like to share our experience of launching the app and applying the validated learning principle.

The story

Right after we came up with the product idea, we immediately started attracting people to our project-to-be, so that by the time we launched we’d already have early customers. Taking the easy route, we tried to create a coming soon page with http://launchrock.co/. The process turned out to be really quick and simple at first, but later we ran into some issues and eventually decided to go with a custom design.

Our first coming soon page

When the page was ready, our first thought was to submit it to Beta List, which helps early-stage startups gain subscribers. The service is free, but it can take a while before your startup is reviewed and published, which we weren’t aware of at the time.

Betali.st notification email

Of course, we hurried to pay for an expedited review. It was only possible to do that via PayPal (not sure if that’s still the case), and unfortunately we didn’t have an account; even worse, PayPal didn’t operate in Belarus at all.

Lesson learned: test the waters in advance so you don’t end up in tricky situations.

Though financially we were powerless, we hoped the human factor would step in. I reached out to Marc Köhlbrugge (@marckohlbrugge), the Beta List founder, explained the situation, and asked if there was any other option. Guess what? The reply came immediately, and Everhour was published the very next day. Marc, many thanks again for your help!

Lesson learned: never underestimate the power of communication.

Beta List brought us 1,000 visits and about 800 subscribers, which was awesome, especially compared with fewer than 300 visitors from Startupli.st and only 3 from New Startups.

At that time we didn’t have detailed analytics to track traffic acquisition, but we felt that the majority of these new users were simply “wanderers” who checked out a new place, hung around, and then left. The number of actual users turned out to be much smaller, which might be partly due to the nature of our app; Snapchat or Instagram have never suffered from a lack of users, that’s for sure ☺ But anyway, every opportunity to get early feedback is a great opportunity.

In the couple of days after the launch, we got dozens of tweets, emails, and questions, as well as a few bug reports. Users also started suggesting tons of ideas and new features through UserVoice and have been doing so ever since. But we’ve never rushed to add every piece of functionality we were asked for. Why?

First of all, feedback from non-paying users is different from what paying users say, so don’t let it misguide you in the early stages of development.

You can learn more about this and other valuable product pricing principles in an awesome article by Intercom.

Image from awesome blog post by @intercom

Second of all, we’d already seen plenty of other teams packing their tools with useless features, pivoting endlessly, and, even worse, launching products only to realize that users simply don’t need them. Obviously, we wanted to go through none of that.

Besides, as a lean startup, we couldn’t afford to waste a fortune building dozens of app versions to test the concept. Instead, we chose thorough, detailed analytics as the best way to learn what our customers really needed.

What do our users like and dislike? Is the main app concept (single-line, geeky time input) popular with them? Which features do they use the most and the least? How do they behave while browsing the app? These were the questions we needed answers to.

As our first analytics option, we considered Kissmetrics, which we’d already used on one of our consulting projects and knew quite well. But as soon as we checked the pricing, our mood changed. They offered a starter package limited to 500K events/month for as much as $150. I don’t know about you guys, but for our newly born startup that was a heck of a sum to pay out of pocket every month.

Looking for more startup- and budget-friendly alternatives, we decided to start small and try Google Event Tracking first. It has some limitations as well, but it is free and worth trying.

Unlike general Google Analytics tracking, Event Tracking requires a bit more effort, i.e. writing a small piece of custom code, which is actually child’s play. You attach an event to the HTML element you care about, be it a click or a scroll, and send it to your Google Analytics account via a snippet of JavaScript, as in the sketch below.
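Here’s a minimal sketch of what such a snippet might look like with the analytics.js (Universal Analytics) library that was current at the time. The element id and the “Timer” / “start” / “header button” names are hypothetical examples for illustration, not our actual event list.

```typescript
// Minimal sketch: sending a custom event with analytics.js (Universal Analytics).
// The element id and the 'Timer' / 'start' / 'header button' names are hypothetical
// examples for illustration, not Everhour's real event list.
declare const ga: (
  command: 'send',
  hitType: 'event',
  category: string,
  action: string,
  label?: string,
  value?: number
) => void;

const startButton = document.querySelector<HTMLButtonElement>('#start-timer');

startButton?.addEventListener('click', () => {
  // Each category/action/label triple shows up as a row under Behavior => Events
  ga('send', 'event', 'Timer', 'start', 'header button');
});
```

The naming you pick here is the naming you’ll later be reading reports with, so it’s worth deciding on it before writing any tracking code.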

The first thing we did was set up a Google Spreadsheet listing all the metrics (events) we wanted to keep track of.

Our initial list included about 60 events, but we decided to test only the most important ones first and see how the whole thing worked. The entire process, from brainstorming to integration, took us 22 hours.
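One way to keep the tracking code consistent with such a spreadsheet is to define the event names in a single place and send everything through one helper. A purely illustrative sketch, assuming the same analytics.js setup as above (none of these names are our actual events):

```typescript
// Purely illustrative: keep event names in one place so the code mirrors the
// spreadsheet. None of these names are Everhour's actual tracked events.
declare const ga: (command: 'send', hitType: 'event',
                   category: string, action: string, label?: string) => void;

const EVENTS = {
  timerStart:   { category: 'Time tracking', action: 'timer start' },
  manualEntry:  { category: 'Time tracking', action: 'manual input' },
  reportExport: { category: 'Reports',       action: 'export to Google Drive' },
  reportShare:  { category: 'Reports',       action: 'share via link' },
} as const;

function track(name: keyof typeof EVENTS, label?: string): void {
  const { category, action } = EVENTS[name];
  ga('send', 'event', category, action, label);
}

// Usage: track('timerStart', 'header button');
```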

Below you’ll see an example of what the analytics look like (Google Analytics => Behavior => Events).

Since the setup, we’ve been getting tons of helpful info on our customers’ behavior, such as:

  • people use the timer far more often than manual input, with a ratio of roughly 3:2;
  • the average time entry is about 2 hours;
  • lots of users edit the logged time and its comment;
  • on average, an entry contains 1-2 tags;
  • “this week” is the most popular time range selected for reports;
  • and lots more.

Examining all these results, we’ve seen that some features and functionality we thought would be useful are in fact underused by customers:

  • few users export reports to Google Drive (we’ve relocated the corresponding button);
  • few share reports via links (that’s why we’ve removed this option);
  • saved filters are rarely used (we’ve got rid of this functionality as well);
  • only a few report types are used, and usually for small time periods (we’ve significantly changed our reports based on this);
  • billable vs non-billable time differentiation is unpopular (the feature has been removed).

Don’t get me wrong: we didn’t rush to deprecate functionality right after seeing the stats. Every time we think about removing a feature, we compile a list of everyone who has ever used it and send them a letter with a detailed explanation, asking for feedback.
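A rough, hypothetical sketch of that first step, assuming we keep an internal usage log with the user’s email and the feature they touched (the data shape and the feature name are invented for illustration):

```typescript
// Hypothetical sketch: collect everyone who has ever used a given feature,
// based on an internal usage log. The data shape is invented for illustration.
type UsageEvent = { email: string; feature: string; at: Date };

function usersWhoUsed(feature: string, log: UsageEvent[]): string[] {
  const emails = new Set<string>();
  for (const event of log) {
    if (event.feature === feature) emails.add(event.email);
  }
  return Array.from(emails);
}

// Usage: const recipients = usersWhoUsed('saved-filters', usageLog);
// These are the people who get the letter explaining why we plan to remove the feature.
```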

Thanks to the priceless data we receive, we can understand what is really used in our app and which features might need to be reconsidered.

We can make deliberate decisions based on actual numbers instead of intuition, guesstimates, or whatever you want to call it.

Besides, the stats are a perfect basis for prioritizing tasks for every update: the most-used features should be improved first. As I see it, knowing how customers behave is the #1 criterion for success, and Google Event Tracking is a real helper here.

It’s funny to watch how a feature can seem so-o-o vital and indispensable at first, and then, after you finally remove it, you realize the app actually feels much better without it. It also helps to remind yourself from time to time that every feature has to be supported and tested with each new update.

No matter how strong your gut feeling is, I’d recommend using actual stats to analyze your product, learn more about your customers, improve what is important, and get rid of what is useless.
