11 Data Findings from A/B Testing & Growth Marketing for a SaaS Product
Sitting before you is a group of test tubes filled with mysterious, colorful liquids. A hand dips down, picks a tube, and tips its contents into its neighbor.
Bubbles, white smoke. BOOM!
Experimentation (or “A/B testing”) is fun. It lets us test out our hunches. Some tests let us build from successes; others prove us wrong, humbling our guesses and feeding us data to move on.
Below are 11 findings from growth experiments that we carried out here at Yesware — including tests on our website, emails, and more. Winner or loser, takeaways are listed so you know exactly what to try on your own.*
Finding 1: Don’t narrow website language to one target audience when others are visiting your site.
We’ve traditionally sold to salespeople, which is why we’ve used the slogan “Sell more. Work less” to explain the benefit of our product.
But the truth is, we organically acquire customers on teams outside of sales, like recruiters and PR people. So we wanted to see if we could increase our signup conversion rate by broadening our messaging to a wider audience.
The old:
The new:
After 22 days of testing and 3,101 page views, the variation won with a 9.1% higher conversion to trial signup (clicking the orange button and filling out a form on the next page).
The takeaway: Be inclusive.
Finding 2: Don’t just talk about yourself on your signup page.
As a marketer, it’s easy to make two assumptions about a landing page: 1) that a visitor knows what to do based on a headline and form, and 2) that description text should focus on benefits.
We had been making both of these assumptions, and wanted to see if they were right. So we made a variation that did two things differently. It:
- Told people what to do, and
- Explained what we were actually offering (a trial of what?)
vs.
With 5,608 visitors over 7 days, the variant saw a 3.2% higher conversion rate. It turns out people are more likely to act on an imperative verb and a clear offer.
The takeaway: Keep landing page language action-oriented, and make sure that your description answers the question: “What is ___?” (In our case, we needed a noun with an action, “plug-in that adds superpowers,” to tell people what Yesware is, and the directive “check out.”)
Finding 3: Use your support pages for current customers to also pull in new ones.
Something easily forgotten:
Your help pages that describe how to use your product are being crawled by Google.
What it means: People searching online (prospective customers) for the end benefit you offer see these pages in their search results and click them.
Which means dedicating your support pages entirely to current customers, jumping right into the nuances of getting started, is a missed opportunity.
Visitors coming to your brand for the first time need to know what they can get in the first place, and that they can try it now for free.
To test this angle out, we added two things to the top of our help pages: 1) benefits in bullets, and 2) a trial signup link.
Like this:
What happened over the next nine months:
- 242 trial signups from our Email Tracking help page
- 181 signups from our Mail Merge help page
- 60 signups from our Salesforce Integration help page
The takeaway: Add a trial CTA to your help pages if you haven’t already.
Finding 4: Add filters to your trial onboarding emails to lower your unsubscribe rate.
Here are two realities I think you’ll agree with:
- Not everyone who fills out a form to start a trial on your site actually starts one.
- Of those people who do start a trial, a percentage stops actively using your features mid-trial.
Our onboarding email program in Customer.io historically sent to every email address that filled out our signup form, resulting in an 8.02% unsubscribe rate. We wanted to lower this.
So we added a condition for secondary emails beyond our initial “Welcome to Yesware” email. This filter told our email program to send feature-specific emails only if a user had sent at least one tracked email (the action that best signals active use of our product).
After testing for 2 weeks, our unsubscribe rate dropped by 45.5%, to 4.37%.
The takeaway: If your onboarding emails assume that people who signed up are actively using your product, narrow your send triggers to that audience. Then segment out people who a) never started their trial and b) started but fell off, so you can send them re-engagement and win-back emails.
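To make that trigger logic concrete, here’s a minimal sketch in Python. The event names and routing function are hypothetical (in practice, a tool like Customer.io expresses these rules as segment conditions in its UI, not code):

```python
# Minimal sketch of the send-trigger logic described above.
# Event names are hypothetical placeholders, not our actual schema.

def pick_email_stream(user_events: set) -> str:
    """Route a signup to an email stream based on observed activity."""
    if "started_trial" not in user_events:
        # Filled out the form but never activated: win-back, not onboarding.
        return "winback"
    if "sent_tracked_email" not in user_events:
        # Started a trial but shows no active usage: re-engagement.
        return "re_engagement"
    # Actively using the product: safe to send feature-specific emails.
    return "feature_onboarding"

# Example: a user who started a trial but never sent a tracked email
print(pick_email_stream({"started_trial"}))  # -> re_engagement
```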
Here’s a re-engagement email we send to people who aren’t using their trial:
Finding 5: Shorten preview text on your emails with white space to increase open rates.
We all know it’s easy for marketing emails to blend together in an inbox. Which is why we — like you — wanted to stand out from the noise.
This one was a great idea from our graphic designer. We had already been customizing preview text for our marketing emails, recognizing its effect on open rates. But we realized that a part of each email’s body was also being included in the preview text, after our custom copy.
So we wanted to try to shorten ours to stand out. The test:
What happened: the variant saw a 16.4% increase in the open rate. 🙌
We went on to test three other email subject lines and saw similar results:
The takeaway: Run an A/B test on your own to see if adding white space at the end of your preheader text increases your open rate. You’ll need to add these characters to your HTML:
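In case it helps, here’s a minimal sketch of the common version of this trick (these are the characters typically used for it, not necessarily the exact ones in our template): pad the end of your custom preheader with repeated zero-width non-joiner and non-breaking space entities, so inbox clients render white space instead of pulling in body copy.

```python
# Sketch: build a hidden preheader <div> padded with invisible characters.
# "&zwnj;&nbsp;" repeated is the padding commonly used for this trick;
# it may differ from the exact characters in our own template.
padding = "&zwnj;&nbsp;" * 80  # enough to push body copy out of the preview

preheader_html = (
    '<div style="display:none;max-height:0;overflow:hidden;">'
    "Your custom preview text here."
    + padding +
    "</div>"
)
print(preheader_html)  # paste the output at the top of your email body
```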
Finding 6: Using “But You Are Free” (BYAF) language on your Plans & Pricing page could drive more trial conversions.
If you think your pricing page is being looked at just by current trial users or customers looking to upgrade, you’re wrong.
Website visitors new to your product are on there, too — many ready to trial.
As long as you don’t push them.
Here’s what our original Plans & Pricing page said to newbies:
But cornering someone is a huge turnoff. The threat of losing our freedom to choose makes us more likely to avoid a commitment in the first place.
That’s why we used the But You Are Free technique to A/B test whether it led to a higher conversion to trial. Here’s the new copy we tested:
The result? A 30.3% increase in signups with 85% statistical significance:
The takeaway: If you don’t have a trial call-to-action (CTA) above the fold on your Plans page, add one. If you have one, test the language to encourage a trial without any pressure to subscribe.
Finding 7: “See For Yourself” > “Get Started For Free” as a free trial CTA on a Product Tour page.
Once “See For Yourself” won as the language in the experiment above, we wanted to take this a step further. We’d been using “Get Started For Free” across most of our website pages, and we wanted to challenge it.
So we tested the “See For Yourself” CTA as a button change to our Product Tour page.
The result: We were right! After 7 days of testing and 1,078 unique visitors, conversion from viewing the Product Tour page to signing up for a trial increased by 52.5%, from 7.85% to almost 12%.
Finding 8: Don’t automatically apply a test winner to other pages without testing there, too.
Before taking this new language “See For Yourself” and replacing the CTA “Get Started For Free” on all feature pages of our marketing site, we wanted to do another test.
So we tried it out on our Homepage.
Here’s the interesting part:
The “See For Yourself” button saw higher overall engagement (aka clicks on the page) with 80% statistical significance. But the signup conversion was lower than the control…🤔
What that means: while people clicked to go to our signup page at a higher rate, they filled out and submitted the signup form at a lower rate.
Our guess why: Homepage visitors are earlier in the conversion funnel, so the intent to sign up isn’t there yet. They haven’t seen the product, which means “See For Yourself” can lead them to believe they’ll get an intro to the product, when they ultimately get pushed to a trial signup page. That’s a bit misleading; it skips the tour step and drives them straight to trial.
In comparison, the “See For Yourself” CTA on the Product Tour page (from Finding 7) makes sense. We had already provided a tour, so it aligns with the intent of people who click: to now try the product for themselves.
What to do: When you’re developing a new page or optimizing an old one, keep in mind the context of your audience.
What sources are they coming from? What prior knowledge do they have? Base your CTA language on the most TOFU (top of funnel) context possible, because gunning for the end conversion could actually hurt your conversion rate.
Finding 9: Switch up your CTA colors, and don’t ask someone if they’re ready to start a trial.
This one was a multivariate experiment. We built out a pillar page about email subject lines, and we wanted to try to drive more trial signups from it.
So we created this Sumo slide-in ad to appear at the bottom-right of the page:
But fewer than 10% of people who saw the ad clicked on it. So we jumped into optimization mode.
We kept the image, the second line of description text, and CTA language the same while experimenting with the left button color and the headline.
Both variants had a different headline that added context. We tailored it to the intent of someone viewing the page: To know which subject lines to use.
So “Ready to track your own emails?” → “Which subject lines work for you?”
Then we added a third variation changing the button from orange to green:
Here’s what happened:
- Version B (with just the headline changed) increased the CTR from 8.12% to 11.3%
- Version C (both the new headline and the green button) saw a 56% improvement in click-through with 98% statistical significance.
The takeaway: Never settle for the click-through rate you’re seeing. You should always have a test running on a slide-in (provided the other elements of your page remain constant).
Finding 10: Keep website page meta descriptions action-oriented.
Our content team focuses at least 10% of its time on revisiting old posts and optimizing CTR and SEO. That way, we make sure all of the work that we put into researching, writing, and publishing our posts doesn’t go to waste. And we contribute to the bottom line by driving organic traffic that converts viewers to trialers (and later, trialers to paying users).
With that in mind, getting more views on our blog posts is one of our goals.
So we tested a hunch that action-oriented meta description text would increase our SERP click-through rate (CTR).
As marketers, we’re taught to use verbs in our CTA buttons. Why shouldn’t we do the same with text that’s tied to an action, even if it isn’t hyperlinked?
Here’s what happened over the course of a month (see % change in green):
The takeaway: Set up a quick spreadsheet to test this on your own. Using Google Search Console, choose a time window of the past four weeks. Sort the view from lowest to highest CTR, and pick 10–20 posts with low CTR to test. Come up with action-oriented new meta descriptions, make the changes, and flag yourself with a reminder email for 28 days from now to follow up.
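If you’d rather script the triage than eyeball a spreadsheet, here’s a rough sketch, assuming you’ve exported a Search Console performance report to CSV with Page, Clicks, Impressions, and CTR columns (the filename and column names below are assumptions; adjust them to match your export):

```python
# Rough sketch: surface low-CTR posts from a Google Search Console export.
import pandas as pd

df = pd.read_csv("gsc_last_28_days.csv")  # hypothetical export filename

# GSC exports CTR as a string like "2.3%"; convert it to a float.
df["CTR"] = df["CTR"].str.rstrip("%").astype(float)

# Skip pages with too few impressions to measure a CTR change reliably.
candidates = df[df["Impressions"] >= 500].sort_values("CTR")

# The 10-20 lowest-CTR posts are your meta-description rewrite candidates.
print(candidates.head(20)[["Page", "Impressions", "CTR"]])
```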
Finding 11: In videos, real footage > Shutterstock footage.
When an issue comes up, you work around it.
That’s what we did on film day for a promo video launching Touchpoints. We were budgeted one day of filming, and by the end of it we hadn’t captured every shot we’d scripted. So we filled in the rest with stock footage.
What happened when it went live?
Engagement drops significantly right where the stock footage comes in. (In the screenshot below from Wistia, the blue line shows unique views; the gap between the red and blue lines signifies rewatches.) Engagement was projected to finish around 75%, but dropped to 50%. That means 25% of people who could have seen the Contact Sales CTA at the end of the video dropped off instead. For 5,140 views, that’s 1,285 viewers lost.
Watch the video to see for yourself.
The takeaway: When you have the option of filming real people in your office (even if it’s low budget), go with that over stock photography or video footage. Think about the opportunity cost of what you lose (for us, it’s a matter of multiplying: (viewers lost) x (video view → contact sales CR) x (contact sales → closed deal CR) x (average closed deal revenue)).
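As a worked example of that math (every rate and dollar figure below is hypothetical; only the viewers-lost count comes from our test above):

```python
# Worked example of the opportunity-cost multiplication. All rates and
# the deal size are hypothetical; only viewers_lost is from our data.
viewers_lost = 1285          # from the Wistia drop-off above
video_to_contact_cr = 0.02   # hypothetical: video view -> contact sales
contact_to_close_cr = 0.25   # hypothetical: contact sales -> closed deal
avg_deal_revenue = 3000      # hypothetical average closed-deal revenue ($)

lost_revenue = (viewers_lost * video_to_contact_cr
                * contact_to_close_cr * avg_deal_revenue)
print(f"Estimated revenue lost to stock footage: ${lost_revenue:,.0f}")
# -> roughly $19,275 under these assumptions
```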
That’s all for now! Do you have any conversion optimization experiments that you’ve run on your company’s website, email programs, or product? Please, do share 🤗
*You’ll notice that not all tests above hit 95% statistical significance. While that is the gold standard, we operate in an environment where experiment ideas are vying for activation attention and the cost of betting on a winner at 80%+ significance is low.
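If you want to sanity-check significance on your own tests, here’s a minimal two-proportion z-test sketch using only the Python standard library (the counts are made up to show the mechanics, not taken from any test above):

```python
# Minimal two-sided two-proportion z-test for a conversion-rate lift.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 80/1000 control conversions vs. 104/1000 variant.
z, p = two_proportion_z(conv_a=80, n_a=1000, conv_b=104, n_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # significant at 95% only if p < 0.05
```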