How to Conduct a Closed Beta Test with Zero Budget
1.5 Months, 100+ Companies, 20 Industries — KeepSolid Sign Story.
The goal of this post is to show, step by step, how we conducted a closed beta test for KeepSolid Sign from scratch, within a limited amount of time and with limited resources. So if you are a startup founder, marketing specialist, or entrepreneur interested in putting your MVP or beta product into the hands of its first users, stay awhile and listen.
When developing your own product, it would certainly be cool to do everything like Hotjar did.
But what if you’re more limited in time, people, resources, and budget?
And if, despite that, you still want to let first users test your product to gather valuable feedback from them, while you’re still in the development phase?
This is exactly the situation we faced when we were deep in the development of our latest app, KeepSolid Sign.
It is a cool (obviously) service for e-signing contracts and other documents. At that point, we had a bit more than an MVP for three platforms: macOS, iOS, and Android. So we decided it was time to conduct a closed, exclusive beta.
We only had up to 2 months (it actually took us just 1.5), since we were planning an open beta for 4 platforms (Windows included) and a web application by the end of September.
So here is how we managed to get these modest yet decent results step-by-step.
First off, we determined our goals for this closed beta as follows:
- Put the product into the hands of first users and get feedback about its current look and feel, its advantages and disadvantages.
- Understand how users deal with signing documents and papers now: what solutions they use, what the pros and cons of their current solution are, etc. Learn how we fare against the solutions they use now.
- Get a better picture of the future product development and build a roadmap.
- Build customer Personas for our product.
After setting the goals, we developed a plan, or a concept, of our beta test. We thought over the stages of the beta, decided how we were going to interview the participants, created a landing page and other content, and developed a communication system. More details ahead.
Having finished the beta concept and the planning stage, we started inviting users. For this, we used three channels:
- Our own email base of users of our previous products (brought us ~80% of participants)
- Specialized services like Betabound (~18%). We gathered a full list in this Google doc; feel free to use it, add other services, or share it.
- Reddit (~2%)
This resulted in a whopping 232 beta testers (a conversion rate (CR) of over 62% from finishing our introductory survey to applying).
And while the first two sources are quite self-explanatory, we’ll explain the Reddit part a bit.
It’s important to note that we chose subreddits that attract the kind of audience we were looking for: small business owners, realtors, finance experts, etc. We didn’t prioritize subreddits for beta testers, because we needed broad feedback rather than professional feedback. Then, after asking the moderators’ permission, we posted an invitation to our closed beta.
Here is an example of such a post:
We also chose the proper tools:
- To create a landing page for the beta, we used the Tilda service. We chose it because it has a free (though limited) version (we actually ended up buying a subscription because we needed some additional functions) and, more importantly, it allows you to create beautiful pages quickly.
- For the survey itself, we used Survey Planet. It’s a decentish solution, though it has certain flaws that we will possibly discuss in another article someday.
- We used Calendly (with its free 2-week trial) for scheduling calls with those testers who agreed to this type of survey. It didn’t turn out too great, by the way, but we’ll get back to this later. And for the calls themselves we used Skype, big surprise =)
- For bulk email sending, we used reply.io. Again, simply because it has a free 2-week trial. It turned out to be a really good service, and we are considering subscribing to it.
- Another tool that we used, and that users were especially happy about, was Zendesk, a ticket-based customer support tool. Obviously, our testers most often used it to send us bug reports and such. But they also really appreciated the opportunity to contact us from within the app.
Down to business
So after getting all armed and dangerous, we launched the beta test.
At first we planned to assign two of our team members as beta managers — one for call surveys and one for online surveys, but… Read on to learn what happened.
We started by sending all participants an application form and an introductory survey. There we asked some general questions, like occupation and job position. The most important question was which platforms a participant could use for testing. Since we only had KeepSolid Sign available for macOS, iOS, and Android at that time, we had to ask all Windows users to wait for our upcoming open beta.
After users filled in the application, we granted them a free 1-year subscription to KeepSolid Sign.
Next, we had them download the app and use it for 1.5 weeks.
After that, we gave testers a choice: would they like to take a text survey online, or via a Skype call? And that’s where we made a mistake.
Users were still too cold; they didn’t know much about us or our app. Because of that, even though a number of them agreed to a call, most changed their minds afterwards and leaned towards the online survey.
So, straight from the classic psychology of persuasion: if you want a user to take a specific action (fill in the survey, if that is your real goal), offer them a worse alternative (like having a call with you). If you do manage to get people to jump on a call, make sure you follow up after each one and summarize what you talked about.
By the way, it is a good idea to record your calls with users (try Callnote if you use Skype), but always ask beforehand whether you may do so. People normally don’t mind you recording the conversation, and it is polite to send them the recording in your follow-up.
The first online survey (25% CR relative to the number of applications) consisted of various questions about the participants and the ways they used our app: how often, for which purposes, that kind of stuff. We read the results as soon as possible after testers completed the survey, so that we could contact them for additional info if needed.
After that, we had them use Sign for another 1.5 weeks.
In the second and last online survey (>23% CR relative to the number of applications), we asked testers about our product: what they liked, what they didn’t like, general feedback, etc. We also asked them to rate us via a special form on our website, since they’d used Sign for some time by then and could give a balanced review.
In both surveys we had a lot of mandatory questions. We are aware that some marketing folks are afraid of them, but our testers responded positively. We also asked questions multiple times in various forms to ensure that the answers covered all aspects we were interested in. The average completion time for both surveys was 11–12 minutes.
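If you want to track the same kind of funnel numbers in your own beta, the CR arithmetic is trivial to script. Here is a minimal sketch of how we think about it; the stage names and counts below are hypothetical examples, not our actual figures:

```python
def conversion_rate(stage_count: int, base_count: int) -> float:
    """Conversion rate of a funnel stage, as a percentage of the base stage."""
    return 100.0 * stage_count / base_count

# Hypothetical funnel: applications -> first survey -> second survey
funnel = {"applied": 400, "survey_1": 100, "survey_2": 92}
base = funnel["applied"]
for stage, count in funnel.items():
    print(f"{stage}: {count} ({conversion_rate(count, base):.1f}% of applications)")
```

Reporting every stage against the same base (applications, in our case) makes drop-off between stages easy to spot.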
After the beta was over, we made a short, sweet thank-you video featuring our beta managers personally, and sent it over to our users along with a request for testimonials ;) This small, neat hack triggered a wave of positive feedback from users that we can now use on our website.
We conclude that the beta test was a success and we managed to reach our goals.
- We’ve learned a lot about how users perceive our product: what they like, what they understand, and what they’d like to see in it. For example, we realized that users really enjoyed the fact that Sign is a native app for various platforms, not only a browser solution. This alone enhanced their experience thanks to better performance and offline availability.
- We got a lot of insight into how users sign their documents. For example, over 30% of all respondents said that they still use the manual “print — scan — sign — send” approach. Though it depends a lot on the industry and on state/government regulations, the acceptance of electronic and digital signatures is much higher in the US and UK compared to, say, Poland.
- There weren’t enough testers to build a legitimate statistical sample and draw a solid roadmap from it, so we are looking forward to the open beta to fix this. However, we now clearly understand what to add to our core functionality (like more options for creating and editing signatures, or digital signature certificates).
- We managed to define and describe three Personas: small business owner, IT director, and self-employed/freelancer.
But the greatest thing that we finally ascertained (to our relief) is that Sign is commercially viable and able to compete. This is arguably the most important lesson you can learn from conducting a beta test!
Among the lesser lessons:
- Never propose call surveys too early, this might scare off your audience.
- But at the same time, don’t ignore call surveys. Although we only had a few calls, they were really helpful and informative, and we highly recommend this method.
- Be ready to edit your survey on the go. As we started receiving the first answers, we had to adjust our questions after realizing users didn’t perceive them the way we expected.
- The fewer testers you have, the fewer quantitative questions you should include, since the answers will not be representative enough.
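To put a number on “representative enough”, you can estimate the margin of error on any percentage you report. Here is a minimal sketch using the standard 95%-confidence formula for a proportion; the sample size and percentage are hypothetical, chosen only to illustrate the idea:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p observed in a sample of n."""
    return z * math.sqrt(p * (1.0 - p) / n)

# A "30% of respondents" result from only 60 testers:
moe = margin_of_error(0.30, 60)
print(f"30% +/- {moe:.1%}")  # roughly +/- 11.6 percentage points
```

With a margin that wide, fine-grained quantitative questions tell you little; qualitative ones are a better use of your testers’ time.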
Overall, it was an exciting beta and surely a helpful one.
As we’ve mentioned before, we are planning an open beta soon (launching the app on all platforms at the end of September; by the way, you can sign up for our launch list here). We’re looking forward to writing another article about it once that beta is finished. So consider this piece the beginning of a developer diary in which we describe our work on Sign from the very beginning.
Do you have experience arranging a beta test? What part of it was the most challenging for you? And if you don’t, would you like to hear more about our experience?
… and a share would encourage us to write more for you. ❤