Retaining Users and Enhancing Engagement: A Deep Dive into A/B Testing

Guillermina Giovanelli
4 min read · Dec 13, 2023


In the competitive tech landscape, retention, user engagement, and A/B testing are key to a company's sustained success. Retention metrics indicate a company's ability to keep users interested in and committed to its platform or service over time. High retention rates signify not just initial interest but sustained value and satisfaction, crucial for long-term growth and profitability.

User engagement signifies the depth of user involvement with a product or service, driving customer loyalty, referrals, and heightened lifetime value. It acts as a barometer for customer satisfaction and fuels sustainable business growth.

A/B testing empowers companies to compare versions of a product or feature, revealing invaluable insights into user behaviors and preferences. This optimization process elevates the user experience, fostering increased retention and engagement.

Over the last few months, Columbia Business School and Engineering students worked closely with Slate, an all-in-one content creation and brand management platform for teams, to analyze their data and identify opportunities to improve user engagement and reduce churn. We conducted a comprehensive analysis of user data, performed an A/B test, and developed recommendations for future iterations.

Churn Analysis: Decoding Customer Dynamics

Our journey into understanding churn dynamics involved the following steps:

  1. Data Cleaning and Consolidation: We cleaned and consolidated over 750,000 rows of data from 500+ accounts, ensuring data uniformity and reliability.
  2. Comparative Analysis: Comparing churned and non-churned customers revealed intriguing insights. Non-churned customers showed higher average daily usage, and hypothesis tests confirmed the difference was statistically significant (a sketch of one such test follows this list).
  3. Feature-wise Frequency Analysis: Zooming into churned customer behavior uncovered notable patterns. We found that some clients exhibited significantly higher product usage pre-churn, presenting prime targets for retention strategies.
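
The write-up above doesn't name the exact test, but a one-sided Welch's t-test is one plausible form of the step 2 comparison. The sketch below assumes arrays of per-account average daily usage; the array names and synthetic numbers are illustrative, not Slate's data.

```python
import numpy as np
from scipy.stats import ttest_ind

def compare_usage(retained_usage, churned_usage, alpha=0.05):
    """One-sided Welch's t-test: is mean daily usage higher for
    retained (non-churned) accounts than for churned ones?"""
    t_stat, p_value = ttest_ind(retained_usage, churned_usage,
                                equal_var=False, alternative="greater")
    return t_stat, p_value, p_value < alpha

# Synthetic stand-ins for the real usage logs:
rng = np.random.default_rng(7)
retained = rng.gamma(shape=3.0, scale=2.0, size=400)  # higher mean usage
churned = rng.gamma(shape=2.0, scale=2.0, size=150)
print(compare_usage(retained, churned))
```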

Challenges Encountered: Data Limitations

However, delving deeper into churn prediction posed challenges. Limited historical data for certain features and sample size restrictions hindered robust conclusions, emphasizing the need for more extensive data sets.

A/B Test: Impact on User Engagement

Our A/B test aimed to measure the impact of email communications on driving user engagement, specifically focusing on the utilization rate of a select feature.

Methodology and Structure

  1. Objective Definition: Our primary goal was to measure the impact of email reminders or video guides on platform utilization. In particular, we looked into utilization of Slate’s desktop platform, Web Creation Studio.
  2. Target Audience and Grouping: To ensure unbiased representation, we divided client accounts into Control (A) and Test (B) groups using a pairing methodology. We first separated all accounts into Sports and non-Sports categories, since the two differ in seasonality and usage patterns. Within each category, we ranked accounts in descending order of average web exports per day, assigned a random indicator of 0 or 1 to each even-indexed account, and gave each odd-indexed account the complement of the preceding even row's value. This guaranteed that each adjacent pair of similarly active accounts was split across the two groups (see the sketch after this list).
  3. Group Validation: After assignment, each group contained 295 accounts. To confirm the split was balanced, we compared the mean of average web exports per day between the two groups and validated their similarity before launching the experiment.
  4. Email Strategy and Timing: Two distinct emails were deployed: a standard text email about the Web Creation Studio sent to both A & B groups on 11/8, followed by a founder video email sent exclusively to the B group on 11/10.
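
As a minimal sketch of the pairing logic in step 2 (the account schema is assumed; `avg_web_exports_per_day` is a hypothetical column name):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def pair_assign(accounts: pd.DataFrame) -> pd.DataFrame:
    """Rank accounts by usage, then split each adjacent pair across A/B."""
    df = (accounts
          .sort_values("avg_web_exports_per_day", ascending=False)
          .reset_index(drop=True))
    # Even-indexed rows draw a random 0/1; the odd-indexed row in each
    # pair takes the complement, so every pair straddles both groups.
    draws = rng.integers(0, 2, size=(len(df) + 1) // 2)
    indicator = np.empty(len(df), dtype=int)
    indicator[0::2] = draws
    indicator[1::2] = 1 - draws[: len(df) // 2]
    df["group"] = np.where(indicator == 0, "A", "B")
    return df

# Applied once per category; the step 3 check is then a comparison of
# group means, e.g.:
# assigned = pair_assign(sports_accounts)
# assigned.groupby("group")["avg_web_exports_per_day"].mean()
```

Because adjacent accounts in the ranking have near-identical usage, splitting each pair keeps the two groups' export distributions closely matched.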

Analysis and Results

We divided our analysis into two parts to better understand customer behavior and engagement: one focused on email performance, the other on usage of the Web Creation Studio.

Email Performance

We tracked clients’ responses to the standard text and founder video emails and used two-proportion z-tests to check for statistically significant differences in open and click-through rates between the A and B groups.
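
A two-proportion z-test for open rates might look like the sketch below; the open counts are illustrative placeholders (only the 295-account group size comes from the experiment), and the same test applies to click-through rates.

```python
from statsmodels.stats.proportion import proportions_ztest

opens = [150, 205]      # [group A, group B] -- illustrative counts only
delivered = [295, 295]  # each group contained 295 accounts

z_stat, p_value = proportions_ztest(count=opens, nobs=delivered)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```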

Clients responded significantly better to the video email, registering notably higher open and click-through rates than the standard text email. This highlighted the efficacy of video content and repeated touchpoints in capturing attention and directing users to Slate’s platform.

Platform Usage

To understand how these emails influenced user actions on the platform, we examined four critical metrics:

  • Total Exports: Aggregate number of export events triggered per account.
  • Unique Users Triggering Exports: Number of unique users under each account triggering export events.
  • Percentage of Active Users: Proportion of active users per account.
  • Exports from Active Accounts: Total export events limited to active accounts.
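
As a sketch of how these metrics might be derived, assuming a hypothetical event log with `account_id`, `user_id`, and `event_type` columns plus a per-account seat count (Slate's actual schema wasn't published):

```python
import pandas as pd

def engagement_metrics(events: pd.DataFrame, seats: pd.Series) -> pd.DataFrame:
    """Per-account engagement metrics from a hypothetical event log.

    events: one row per event, columns [account_id, user_id, event_type]
    seats:  licensed users per account, indexed by account_id (assumed)
    """
    exports = events[events["event_type"] == "export"]
    m = pd.DataFrame(index=seats.index)
    # Total Exports and Unique Users Triggering Exports, per account
    m["total_exports"] = exports.groupby("account_id").size()
    m["unique_exporters"] = exports.groupby("account_id")["user_id"].nunique()
    # Percentage of Active Users: users with any event / licensed seats
    active = events.groupby("account_id")["user_id"].nunique()
    m["pct_active_users"] = active / seats
    m = m.fillna(0)
    # Exports from Active Accounts: keep totals only where there was activity
    m["exports_if_active"] = m["total_exports"].where(m["pct_active_users"] > 0, 0)
    return m
```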

We computed daily averages for each metric across the A and B groups and used z-tests to compare them, looking for significant differences in engagement patterns post-email distribution. The tests returned p-values above 0.1 for all four metrics, indicating no substantial difference in engagement between the A and B groups. This outcome suggested that despite the video email's success in driving attention, it did not notably impact user actions on the platform, particularly content exports.
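
A minimal version of that comparison, assuming arrays of per-account daily averages for each group (names hypothetical):

```python
import numpy as np
from statsmodels.stats.weightstats import ztest

def compare_groups(daily_avg_a: np.ndarray, daily_avg_b: np.ndarray,
                   alpha: float = 0.1):
    """Two-sample z-test on one metric's daily averages for groups A and B.

    alpha = 0.1 mirrors the threshold above: p-values over it indicate
    no detectable difference in engagement.
    """
    z_stat, p_value = ztest(daily_avg_a, daily_avg_b)
    return z_stat, p_value, p_value < alpha
```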

Key Insights and Future Recommendations

While the A/B test revealed the effectiveness of engaging communication methods in driving attention to Slate’s platform, it failed to translate this heightened engagement into increased content exports. This discrepancy suggests a possible impediment in the customer journey, warranting further exploration into barriers hindering users from completing their assets on the Web Creation Studio.

Moving forward, we recommend an in-depth examination of user interface functionalities and exporting capabilities through focused A/B tests, interviews, historical trend analysis, and comprehensive user feedback mechanisms.
