SaaS Retention Metrics: Lessons from Free-to-Play Games
This article covers how free-to-play games monitor and analyze their player base and retention metrics. Many of these metrics apply to other SaaS products as well, though depending on the product's usage cycle it may make sense to adapt them from the strong daily and weekly focus of free-to-play games.
One of the most important sets of core metrics for free-to-play games (and any SaaS product) relates to retention. The product manager for a product in live ops, especially as the product matures, needs a strong focus on maintaining and even growing the user base over an extended period by continually improving the retention metrics, giving the product the longest life possible. Obviously, a large user base is essential for the ultimate goal of monetization because it is very hard to get someone to monetize in a product they aren't using. But it is also crucial for social reach, since a larger user base generates more social activity that reaches more potential new users and touches existing users in compelling ways more times per day. Thus, user base management is one of the most important elements in a product's long-term success. Ultimately, if you can't retain your users, you will have no one left to monetize far sooner than you would like, and any new users you do acquire will bleed right out in short order.
This article begins with the conceptual framework for understanding what factors contribute to the product's user base, then covers the rigorous set of metrics and visualizations that provide the numbers to continually monitor for issues and to develop informed user retention strategies. Finally, it describes a few methods for building models that are useful for projecting the size of the active user base as well as for sensitivity analysis on which specific retention metrics have the potential to provide the greatest impact.
At the end of this article, you should be able to:
- Describe the basic active user metrics (DAU, WAU, MAU), what each signifies, and what specific forces are behind their movements.
- Describe the three user return rates (NURR, RURR, CURR), why and how they are used, and how they are calculated.
- Understand how to investigate category DAU and category installs and correctly discern between send-side, click-side, and cross-channel cannibalization issues.
- Understand how to analyze each active user metric and provide value by interpreting what each is telling you about the product.
- Model the future size of the user base using multiple methods to develop more accurate projections.
- Use the retention DAU model to inform your feature strategy on what specific metrics should be targeted for improvement based on potential.
Active User Theory
Active user theory describes the study and analysis of the fundamental components that influence and comprise your current active users. In essence, it is simply a way to explain where the active user base comes from and what is causing it to increase or decrease.
Reachable Player Onion
The user population of products, especially freemium products, is diverse, composed of users at many different engagement levels and phases of usage. We call the users who are engaging with the product with some frequency the reachable players. There are many layers of these reachable players depending on usage frequency. Together they form the reachable player onion.
The Daily Active Users (DAU) are the innermost layer and include all the users active on a particular day. These players provide the revenue for the day and include the most highly engaged daily users. Thus, significant focus is spent on maximizing DAU.
The Weekly Active Users (WAU) form the middle layer and include all the most immediately reachable users. This group contains all the users who are still active in the core product and includes both the highly and moderately engaged users. The WAU is a superset that includes the deduplicated DAU of each day in a seven-day period.
The Monthly Active Users (MAU) form the outermost layer and are all the users your product messaging should still be regularly and actively targeting. It includes all the highly and moderately engaged users as well as the infrequent users or those who have recently lapsed. The MAU is a superset that includes the deduplicated DAU of each day in a thirty-day period as well as the deduplicated WAU of each seven-day period contained within it.
Users who have lapsed out of the MAU are considered to have quit the product and are referred to as quitters. They are no longer as likely to return, having ignored thirty days' worth of messages targeted at them, though you should never totally give up on them; rather, target them differently.
While DAU is the lifeblood of the day-to-day operations, its movement is significantly noisier than WAU and MAU due to any number of contributing factors on a given day. Since many SaaS products (especially games) have a weekly cyclical nature to the DAU and since the WAU includes the most immediately reachable players (the highly and moderately engaged player base), we use WAU as the basis for our analysis of the overall growth and decline of the product instead of DAU.
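The onion layers above can be computed directly from a log of per-user activity. A minimal sketch, assuming a simple list of (user_id, day) activity records (the data and field names are illustrative, not a standard API):

```python
from datetime import date, timedelta

# Hypothetical activity log: one (user_id, day) pair per active day.
events = [
    ("alice", date(2024, 3, 1)), ("alice", date(2024, 3, 2)),
    ("bob",   date(2024, 3, 2)), ("carol", date(2024, 2, 10)),
]

def active_users(events, end_day, window_days):
    """Deduplicated users active in the window ending on end_day (inclusive)."""
    start = end_day - timedelta(days=window_days - 1)
    return {user for user, day in events if start <= day <= end_day}

day = date(2024, 3, 2)
dau = active_users(events, day, 1)    # innermost layer
wau = active_users(events, day, 7)    # middle layer: superset of each day's DAU
mau = active_users(events, day, 30)   # outermost layer: superset of WAU and DAU
```

Note that each layer is a deduplicated set, so a user active every day of the month still counts only once in WAU and MAU.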
Relating DAU to WAU: Weekly Engagement
Since we analyze the growth or decline using WAU, we need to translate it to DAU using what is called the weekly engagement metric. Weekly engagement is the ratio of daily to weekly users and represents the percentage of weekly users that are using the product on any particular day:
Weekly Engagement = DAU / WAU
If we rearrange the equation:
DAU = Weekly Engagement * WAU
We can see that one can inflect DAU by either increasing engagement (existing users use more frequently) or by growing WAU (more users using). While there can be some potential in improving weekly engagement, unless the product has a significantly low engagement number, there is more often greater upside in increasing WAU than improving weekly engagement.
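The relationship is trivial to compute; a quick sketch with illustrative numbers:

```python
def weekly_engagement(dau: int, wau: int) -> float:
    """Fraction of weekly users who are active on a given day."""
    return dau / wau

# With 40,000 daily users out of 100,000 weekly users,
# weekly engagement is 0.4, so DAU = 0.4 * WAU.
engagement = weekly_engagement(40_000, 100_000)
```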
The Leaky Bucket Model
When analyzing what contributes and detracts from the number of weekly players (WAU), a leaky bucket is a useful mental model. The leaky bucket has water pouring in from faucets that represent users coming into the product. There is also water sitting in the bucket representing people who are currently using the product during a given seven-day period. However, there are holes in the bottom of the bucket out of which water is leaking, representing users leaving.
Using this mental model, it is easy to envision that the determining factor as to why the water level in the bucket would increase or decrease is whether the holes in the bottom of the bucket are leaking out water faster or slower than water is pouring into the bucket from the faucets. In other words, the change in the water level is equal to the difference between the amount of water pouring in and the amount of water leaking out. Therefore, the goal is to patch the holes in the bucket that are leaking water and increase the water flow from the faucets pouring water into the bucket in order to keep as much water in the bucket as long as possible.
Taking it to the specifics of WAU, there are two faucets pouring water into the bucket and three holes leaking water out of the bucket. These represent the five control factors we have over WAU.
The two faucets providing inflow to the product for the week are new installs and returners. New installs are users who came into the product for the first time. Returners are users who have lapsed (7+ day absence) but have come back to use the product. Included in these are a special class of returners called reactivators, which are returners who have quit (30+ day lapse) and decided to come back and try the product again. Overall, the user inflow is controlled by many factors including marketing spend, exposure within social networks, organic factors, app updates (mobile), and app store featuring (mobile), among numerous others.
The three holes leaking water out of the bucket are separated by type of user. New user lapse is how many users who used the product for the first time the prior week did not continue to use the product the current week. Current user lapse is how many users who used the product two weeks ago and the prior week did not continue to use the product the current week. Returner lapse is how many users who used the product the prior week, but not two weeks ago, did not continue to use the product the current week. In general, however, we look at each of these lapse rates as its complement, the probability that we retain each type of user, called the user retention rates.
In summary, plain and simple, if your outflow is consistently greater than your inflow, your WAU will decrease. Thus, it is essential to maximize your user return rate metrics to minimize the weekly user loss. Each user type (new, returners, current) has its own unique and often proven strategies for positively influencing the retention rates (the specifics of which are beyond the scope of this article).
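The five control factors above reduce to a simple net-flow calculation; a sketch with illustrative weekly counts:

```python
def net_wau_change(new_installs, returners,
                   new_lapsed, current_lapsed, returner_lapsed):
    """Weekly change in WAU: two faucets flowing in, three holes leaking out."""
    inflow = new_installs + returners
    outflow = new_lapsed + current_lapsed + returner_lapsed
    return inflow - outflow

# 12k installs + 3k returners in; 8k new, 5k current, 1.5k returner lapsers out.
delta = net_wau_change(12_000, 3_000, 8_000, 5_000, 1_500)  # +500: WAU grows
```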
Active User Base Analysis
Now that you have a basic understanding of the forces that influence the size of the active user base, we can look at the set of essential metrics that illuminate the current state of the user base and what is driving its changes. These metrics provide the framework for rigorous top-level analysis that, in many cases, provides direction for deeper investigations.
DAU vs Projections Graph
Since DAU is the lifeblood of the product, it is imperative to have an eye on DAU every day, not only in comparison to the DAU yesterday, a week ago, and two weeks ago according to the product's natural weekly usage cycle, but also in comparison to the projected or quarterly target DAU. To that end, use a simple line graph of actual vs. projected DAU, annotated with important events on the timeline. This graph is important for understanding where you are versus goals on a daily basis.
Of course, there are numerous useful sub-slices of DAU such as DAU by device platform (iOS vs Android vs Facebook, Chrome vs Firefox vs Safari vs Edge, Mac vs PC vs mobile, etc.) or DAU by geography. However, those are generally useful mostly for doing a deeper dive to investigate specific issues rather than the day-to-day player base monitoring and management.
WAU Flow Graph
The WAU Flow Graph is another of the most important active user base visualizations because it shows the daily positive and negative forces influencing WAU, the most immediately reachable actively engaged users. Oftentimes, WAU is the preferred basis for calculating the weekly net loss in users, which becomes the target for user acquisition activities to keep the user base steady. The graph is shown as a stacked column graph with the resulting daily net delta as a line graph. It is often useful to pair it with a WAU line graph so that the line graph contextualizes the WAU movements and the WAU Flow beneath gives the underlying details.
Going back to the leaky bucket model, the two positive forces that comprise inflow are seven-day lapsed returners and new installs, and the negative forces that comprise outflow are seven-day lapsers among existing, returning, and new users. The goal is to keep inflows greater than outflows so the black net WAU change line remains at or above the x-axis as much as possible, keeping WAU expanding rather than shrinking. While there are normally minor weekly cycles in the inflow and outflow, the root cause of larger changes needs to be identified and annotated.
The inflows are more straightforward to analyze for the root cause of changes because they represent users being drawn into the product from outside. If the platform has the capability to identify installs and returners by channel, it greatly helps, since the change can be narrowed down to the underlying cause by the channel through which it came. Otherwise, greater care has to be taken to know what external changes (new install or reactivation targeted marketing campaigns, app store featurings, new app updates or content releases, new viral features, etc.) are taking place that could be the root cause, supplemented with any external data available.
The outflows are more difficult because not only do the channels need to be analyzed for changes in their effectiveness at bringing users back into the product, but the various retention rates for new, returning, and current users also need to be checked to ensure they did not drop due to other internal reasons (lack of content, economic or systemic forces, a new feature, etc.), which may require finding behavioral correlations in user data. Also, keep in mind that a significant positive inflow is often accompanied by an increased outflow seven days later, as some of that inflow fails to retain and lapses at the seven-day mark.
MAU Flow Graph
The MAU Flow is useful for tracking the total current user base of all the users, including the moderately to low engaged users, with whom the product is still actively communicating and thus represents the totality of the product’s current user base. The MAU Flow is similar to the Annotated WAU Flow but tracks the daily positive and negative forces on MAU. The MAU Flow is shown as stacked column graphs with their net result as a line graph in black and it is also paired with an MAU line graph for context. Due to the thirty-day time horizon of MAU, the positive forces are installs and thirty-day lapsed returners and the only negative force is quitters (thirty-day lapsers).
Just like with the WAU Flow, the inflows are more straightforward because they are directly related to the channel that brought the person to or back to the product that day. Similarly, the outflows are more difficult because the root cause of an increased outflow could be a channel losing effectiveness, a change in retention rates, or an inflow thirty days prior that is lapsing out now.
User Return Rates (*URR)
While user acquisition analysis looks at new user retention from a milestone standpoint and a consecutive daily retention standpoint to optimize the new user flow, overall DAU retention analysis looks at retention across all users, which is critical for managing the user base for an extended life. For new users, the Dx curve retention (D1, D7, D14, etc.) is aggregated into a single new user retention metric. We also add current user and lapsed returner retention metrics that can be independently optimized and help model longer-term user behavior. Using these three retention metrics is essential for products that regularly engage users for the long term beyond the two-week new user experience.
The three user return rates (new, return, and current) are expressed positively as the complements of the corresponding lapse (churn) rates, indicating the weekly retention of users that needs to be maximized. They are all expressed as week-over-week retention rates because user analysis is done using WAU, so they all base their retention on comparable units. Collectively, these user retention rates are referred to as the *URR retention metrics.
New User Return Rate (NURR) is how well we retain new users week over week. Quantitatively, NURR is the percentage of users who used the product for the first time between t-7 and t-13 that returned to use it between t-0 and t-6.
Return User Return Rate (RURR) is how well we retain users week over week who return to the product after a break of seven or more days. Quantitatively, RURR is the percentage of users who had used the product at some point in the past, did not use it between t-14 and t-20, and did use it between t-7 and t-13, that returned to use it between t-0 and t-6.
Current User Return Rate (CURR) is how well we retain current users week over week. Quantitatively, CURR is the percentage of users who used the product between t-14 and t-20 and again between t-7 and t-13, that returned to use it between t-0 and t-6.
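Given deduplicated weekly active-user sets, all three rates reduce to set operations. A minimal sketch; the set names and helper function are illustrative, not a standard API:

```python
def return_rate(cohort, active_this_week):
    """Fraction of a cohort that came back this week (t-0 .. t-6)."""
    return len(cohort & active_this_week) / len(cohort) if cohort else 0.0

def urr_metrics(week0, week1, week2, first_time_week1, active_before_week2):
    """week0 = users active t-0..t-6, week1 = t-7..t-13, week2 = t-14..t-20.

    first_time_week1: users whose first-ever use fell in week1 (NURR cohort).
    active_before_week2: users active at any point before t-20 (for RURR).
    """
    nurr_cohort = first_time_week1
    rurr_cohort = (week1 - week2) & active_before_week2  # back after a 7+ day lapse
    curr_cohort = week1 & week2                          # active both prior weeks
    return (return_rate(nurr_cohort, week0),
            return_rate(rurr_cohort, week0),
            return_rate(curr_cohort, week0))
```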
CURR, in particular, often has the greatest potential impact on the long-term profitability and success of the product, because if you can't retain your users to monetize for the long term, your user acquisition costs will easily overwhelm your user lifetime value (LTV).
Dx Retention vs *URR Retention
The Dx retention rates (D1, D7, D14, D30, etc.) and the *URR retention metrics are complementary. While NURR overlaps with much of the most interesting part of the Dx retention curve, it does not provide the level of detail the Dx curve provides when attempting to pinpoint new user retention issues. Conversely, beyond the new user experience, Dx retention curves provide less insight than the RURR and CURR metrics. Therefore, it is critical to understand how to use both the Dx retention curve and the *URR retention metrics.
*URR Retention Graph
The visualization for the *URR retention metrics is a line graph for each metric on an annotated timeline marking the important events that explain movements. The movements are generally small enough that you will likely need to zoom in on the y-axis to make them visible. Additionally, it is often useful to highlight each of the three most recent weeks, since the *URR metrics indicate week-over-week retention, with an average retention rate called out for each week.
When investigating retention metric changes, it can be difficult to understand how a stimulus affected *URR retention, since the underlying cause can lie anywhere within a three-week period, so great care is needed to properly uncover the root cause of a movement.
Category Active User and Category Install Analysis
For products that have multiple trackable ways to enter, it is useful to analyze DAU by source to make sure all entry points are working to bring users into the product, as well as to see how the most recent release has impacted or shifted those traffic patterns. While this section discusses category DAU, the exact same analysis works for category installs.
Currently on mobile, the useful channels are much more restricted: primarily push/local notifications and directly choosing the app. By contrast, on the web there are numerous channels, including viral messages on social network platforms, platform bookmarks, platform directories, and cross-promotions from other products, among multitudes of others.
If the platform allows for category DAU attribution, it is essential to monitor each channel into the product (and sub-channels as deep as the platform will let you). While a person may enter the product through multiple sources in a day, each category DAU gets credit for a user only when it is the first channel that user uses to enter the product for the given day.
The performance of each category DAU is generally compared to the DAU that the same category drove the previous day (called Day over Day, abbreviated DoD or d/d), the same day the previous week (called Week over Week, abbreviated WoW or w/w), and/or the same day two weeks ago (called week over two week, abbreviated Wo2W or w/2w).
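These day-over-day and week-over-week comparisons are simple to compute from a trailing daily series. A sketch, assuming a list where index 0 is today and index n is n days ago:

```python
def pct_change(today: float, baseline: float) -> float:
    """Percentage change of today's value versus a baseline day."""
    return (today - baseline) / baseline * 100

def channel_comparisons(series):
    """series[0] = today's category DAU, series[n] = the value n days ago."""
    today = series[0]
    return {
        "DoD":  pct_change(today, series[1]),   # vs yesterday
        "WoW":  pct_change(today, series[7]),   # vs same day last week
        "Wo2W": pct_change(today, series[14]),  # vs same day two weeks ago
    }
```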
When investigating Category DAU for a general (non-viral) channel, the number of unique channel users or uClickers is important because it indicates whether the change in a particular category DAU has its underlying root cause in a proportional change in channel usage.
Viral Category DAU
Viral category DAU sources are unique because they depend on users themselves to participate in spreading invitations to interact in your product, most commonly through a social media platform. Therefore, finding the root cause of changes requires us to analyze the funnel of each step of the viral behavior starting with the number of users eligible to participate in viral activity at the top all the way until another user responds to the viral invitation to interact and enters the product. What makes the viral channels so powerful is the positive feedback loop from these new installs and returning users who respond to the viral invitation back up to the top with increased DAU and more potential eligible participants to send out more viral invitations to engage.
Because we have a funnel of viral engagement behavior, it is possible to look at the funnel to see the root cause of an increase or decrease in a viral category DAU. Changes higher in the funnel cascade to the lower levels, so it is important to determine the origin step of a change and then use secondary metrics as a starting place to gain more insight into the root cause. The product manager also needs to keep platform changes top of mind, because those can have a significant impact apart from updates to the product.
Each step of the viral funnel is described below from top to bottom with useful related secondary metrics.
1) DAU — Total users in the product is the top of the funnel: everyone using the product who could potentially participate in viral activities.
2) Eligible Senders — Not everyone who uses the product is necessarily eligible to participate in a particular viral feature, thus it is useful to know the actual users who could participate in viral activities based on feature restrictions (feature gating, etc.).
3.1) % Eligible Senders — Calculated as Eligible Senders / DAU, changes in % Eligible Senders help distinguish whether a decline in eligible senders is due to a general change in DAU or a more specific change in Eligible Senders.
3.2) Send Opportunities / Eligible Senders — Weakness here can help determine whether a decline in sends is due to underlying surfacing issues.
4) Unique Senders (uSenders) — Unique users who performed one of the viral actions under consideration.
4.1) % uSenders (uSenders / DAU) — Useful to distinguish between changes to uSenders and general DAU changes across the whole population of users.
4.2) uSenders / Eligible Senders — This is the sender conversion rate and is largely incentive driven: users will often only send if it is perceived to be mutually beneficial to both themselves and the recipients.
4.3) Sends / uSender — This is send volume and changes here are often related to the volume of allowable viral activity.
5) Unique Recipients (uRecipients) — Measure of viral reach to new, former, and existing users.
5.1) Total Sends — For one-to-many virals such as Facebook feed posts, uRecipients is often not available, so total sends is used as an alternate measure of viral reach.
6) Unique Clickers (uClickers) — Number of unique responders to published virals. A viral is not good unless the recipients act upon it.
6.1) Clicks / uClicker — How many virals, on average, a unique clicker acts upon.
7) Installs, Returns, Current Players — How many users of each type come into the product through the viral actions and count as viral category DAU.
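The step-to-step conversion rates above can be computed from daily counts. A sketch with hypothetical numbers (the dictionary keys and values are illustrative):

```python
# Hypothetical daily counts for one viral channel, matching the funnel steps.
funnel = {
    "dau": 100_000, "eligible_senders": 60_000, "usenders": 9_000,
    "sends": 27_000, "uclickers": 5_400, "entries": 4_050,
}

def funnel_rates(f):
    """Conversion rate at each step of the viral funnel."""
    return {
        "pct_eligible":      f["eligible_senders"] / f["dau"],
        "sender_conversion": f["usenders"] / f["eligible_senders"],
        "sends_per_usender": f["sends"] / f["usenders"],
        # Clicks per send as a proxy for reach when uRecipients is unavailable.
        "clickthrough":      f["uclickers"] / f["sends"],
        "entry_rate":        f["entries"] / f["uclickers"],
    }
```

Tracking these rates daily makes it easy to spot which step of the funnel a change originated in.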
For each channel, it is important to determine what portion of change to source DAU is due to true underlying changes in performance in a channel and what portion is from cannibalization to or from another channel. Thus, we need to keep track of daily unique channel users (unique clickers or uClickers) as well as, to a lesser extent, total usage volume (total clicks) in order to determine the underlying stability or lack thereof in the channels the users are utilizing despite a change in the DAU each channel is contributing.
Thus, we use the ratio Category DAU / uClicker as a second-level metric to gain deeper insight into a movement in category DAU. When the category DAU is changing but the unique clickers are relatively stable, it indicates cross-channel cannibalization to or from another channel. However, if the category DAU is stable but the unique clickers are changing, it indicates changes in underlying channel engagement. Only when both category DAU and uClickers move together in the same direction is there a true performance change in the channel.
Needless to say, in real data the answer is not always clear-cut because both category DAU and uClickers have some natural movement from day to day and week to week, but this framework is useful for determining the primary and contributing factors beneath a given movement in category DAU.
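The decision rules above can be sketched as a simple classifier; the 3% threshold and the labels are illustrative, not a standard:

```python
def diagnose(dau_delta_pct: float, uclicker_delta_pct: float,
             threshold: float = 3.0) -> str:
    """Classify a category DAU movement from DAU and uClicker deltas."""
    dau_moved = abs(dau_delta_pct) >= threshold
    clickers_moved = abs(uclicker_delta_pct) >= threshold
    if dau_moved and not clickers_moved:
        return "cross-channel cannibalization"
    if clickers_moved and not dau_moved:
        return "underlying channel engagement change"
    if dau_moved and clickers_moved and dau_delta_pct * uclicker_delta_pct > 0:
        return "true channel performance change"
    if dau_moved and clickers_moved:
        return "mixed signal"  # opposite directions: investigate further
    return "no significant movement"
```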
DAU or Install Walk Graph
When the platform allows DAU broken out by channel, an annotated DAU walk is useful for visualizing and summarizing the biggest contributors and detractors to the week over week (WoW) seven-day average DAU change. A similar graph works for install sources.
In this type of graph, the leftmost column contains the starting value and the rightmost contains the new value, with the deltas of the contributing channels, which net to the overall change, listed in between. The contributing channels are ordered from largest positive to largest negative, annotated with their percentage change and the numeric change they contributed, as well as callout notations on what underlying factors drove those changes per the specific channel analysis. The starting and ending values, as well as the contributing channels, are all based on seven-day averages. As a general rule, any channel that changes more than three percent in either direction deserves a callout.
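Assembling the walk is straightforward once per-channel deltas are available. A sketch, assuming per-channel (absolute change, percentage change) pairs over the seven-day averages; the channel names and numbers are hypothetical:

```python
# Hypothetical week-over-week 7-day-average deltas per channel:
# (absolute DAU change, percentage change vs the prior week's average).
channels = {
    "push":       (+1200, +8.0),
    "organic":    (-300,  -1.5),
    "viral_feed": (-900,  -6.0),
}

def dau_walk(start_avg, channels, callout_pct=3.0):
    """Order channels largest-positive to largest-negative; flag 3%+ callouts."""
    ordered = sorted(channels.items(), key=lambda kv: -kv[1][0])
    end_avg = start_avg + sum(delta for delta, _ in channels.values())
    callouts = [name for name, (_, pct) in ordered if abs(pct) > callout_pct]
    return end_avg, [name for name, _ in ordered], callouts
```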
Other Engagement Metrics
The remaining engagement metrics are essential to know and track for your product, but they have limited day-to-day usefulness. Nevertheless, there are circumstances where they provide an important link to understanding the root cause of a trend or to modeling user behavior.
If you remember from before, weekly engagement is the ratio of daily to weekly users and represents the percentage of weekly users (still active in the core loop for a game or highly engaged for another type of product) that are using the product on any particular day:
Weekly Engagement = DAU / WAU
When analyzing weekly engagement, it is helpful to bucket the players into engagement buckets based on frequency of usage and track the movement of users between buckets when seeking to inflect weekly engagement or examine a significant change in weekly engagement.
Weekly Engagement buckets:
- Low engaged users: 1–2 days per week
- Medium engaged users: 3–5 days per week
- High engaged users: 6–7 days per week
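These buckets can be assigned directly from each user's count of active days in the week; a minimal sketch of that bucketing:

```python
def engagement_bucket(days_active_this_week: int) -> str:
    """Map a user's active days in a 7-day window to an engagement bucket."""
    if days_active_this_week >= 6:
        return "high"      # 6-7 days per week
    if days_active_this_week >= 3:
        return "medium"    # 3-5 days per week
    if days_active_this_week >= 1:
        return "low"       # 1-2 days per week
    return "inactive"      # not in WAU at all
```

Tracking week-over-week transitions between these buckets shows whether engagement is deepening or eroding before it shows up in DAU.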
In particular, if DAU is dropping significantly faster than WAU, it indicates that users are moving from higher to lower engagement buckets; those users run a greater overall risk of lapsing and need more compelling reasons to return more often.
Monthly engagement is calculated as DAU / MAU and is also referred to as "stickiness". As rough rules of thumb:
- Less than 10% — retention issues
- Greater than 30% — really addictive
Session length measures the average length of a usage session, and sessions per day measures the number of sessions a user has per day. As with weekly engagement, it is useful to look at the session metrics by bucket. Both session metrics are useful for progression modeling and simulation in games, in addition to their usefulness as secondary metrics alongside weekly engagement that provide further insight into user engagement.
Projecting DAU
Building accurate DAU projections is an essential task for the product manager. It is often prudent to build multiple projection models using different methodologies to verify your DAU projections. The more agreement there is between projections that use different methodologies, the greater confidence you can have in them.
Regression DAU Model
Oftentimes, a basic regression is more than sufficient to project DAU, especially for established products. The trick is to try a few regression models and pick the one that best fits the existing data and thus is most likely to accurately predict future DAU. Compare the R² values to determine the best fit between models, and use common sense when looking at the data and projections.
Oftentimes, a linear regression model (y = m * x + b) is a reasonable starting place to form a DAU projection: project the WAU decline over a quarter, then use the weekly engagement ratio to determine daily DAU. See the Excel functions SLOPE(), INTERCEPT(), and RSQ() to help calculate m, b, and R² respectively. Another frequently used model is an exponential WAU model (y = s0 * (1 + r) ^ x), which projects WAU using a small average percentage change each week. In this model, s0 is the starting WAU, r is the percentage change each week, and x is the number of weeks elapsed.
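Both model fits can be done without a spreadsheet. A sketch using ordinary least squares, with the exponential model fit on the log of WAU; the weekly WAU figures are illustrative:

```python
import math

def linear_fit(xs, ys):
    """Ordinary least squares for y = m*x + b (cf. Excel SLOPE()/INTERCEPT())."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def r_squared(xs, ys, predict):
    """Coefficient of determination (cf. Excel RSQ())."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - predict(x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Illustrative weekly WAU observations; x = weeks elapsed.
xs = [0, 1, 2, 3, 4]
wau = [100_000, 97_000, 94_100, 91_300, 88_500]

# Linear model: y = m*x + b.
m, b = linear_fit(xs, wau)

# Exponential model y = s0 * (1 + r)**x, fit on log(y).
lm, lb = linear_fit(xs, [math.log(y) for y in wau])
s0, r = math.exp(lb), math.exp(lm) - 1  # r is roughly -3% per week here
```

Comparing r_squared for each model against the data tells you which fit to trust for the projection.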
Retention Rate DAU Model
A retention rate model is a more sophisticated model built around the user return rates, projecting weekly users from historical averages or comparable data. The advantage of a retention rate DAU model is that it allows sensitivity analysis showing which of the five forces on WAU have the biggest effect on inflecting the user base, which can inform your feature strategy. It also allows more granular week-by-week control, so one can build a single model from launch through cadence and maintenance that handles both the initial growth and the long-term decline of the user base.
Assuming data or projections for week t, week t+1 can be projected using historical averages or comparables for the six metrics that underlie the model: weekly engagement, RURR, NURR, CURR, weekly installs/WAU, and weekly reactivations/WAU.
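A single projection step of such a model can be sketched as follows. The decomposition of WAU into new, returning, and current cohorts, and treating every retained user as a current user the following week, are assumptions about how such a model is typically wired together, not the author's exact formulation:

```python
def project_week(new_t, returners_t, current_t, params):
    """Project week t+1 cohorts and DAU from week t cohorts."""
    wau_t = new_t + returners_t + current_t
    new_next = params["installs_per_wau"] * wau_t           # fresh installs
    returners_next = params["reactivations_per_wau"] * wau_t
    # Everyone retained from last week counts as a current user this week.
    current_next = (params["nurr"] * new_t
                    + params["rurr"] * returners_t
                    + params["curr"] * current_t)
    wau_next = new_next + returners_next + current_next
    dau_next = params["weekly_engagement"] * wau_next
    return new_next, returners_next, current_next, dau_next

# Illustrative parameter values drawn from historical averages.
params = {"nurr": 0.35, "rurr": 0.30, "curr": 0.80,
          "installs_per_wau": 0.05, "reactivations_per_wau": 0.02,
          "weekly_engagement": 0.40}
```

Iterating project_week forward gives the WAU and DAU projection; varying one parameter at a time gives the sensitivity analysis on which retention lever matters most.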
Critical Importance of Retention
Now that we have a base-level understanding of active user theory and the metrics used to analyze user base trends, it is important to develop a clear and consistent strategy to make the desired impact on the metrics you are seeing.
While it goes without saying that you want to keep your DAU as high as possible as long as possible, it is important to understand what acceptance of even the smallest weekly decline means for the product over the course of a year.
- 1.0% WoW decline every week turns into a 40.7% decline over a full year
- 3.0% WoW decline every week turns into a 79.5% decline over a full year
- 5.0% WoW decline every week turns into a 93.1% decline over a full year
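The figures above follow directly from compounding the weekly decline over 52 weeks, which is easy to verify:

```python
def annual_decline(weekly_decline_pct: float, weeks: int = 52) -> float:
    """Total percentage of the user base lost after compounding a weekly decline."""
    remaining = (1 - weekly_decline_pct / 100) ** weeks
    return (1 - remaining) * 100

round(annual_decline(1.0), 1)  # 40.7
round(annual_decline(3.0), 1)  # 79.5
round(annual_decline(5.0), 1)  # 93.1
```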
As you can see, seemingly small WoW declines compound into significant losses over the course of a year. Thus, the primary job should be to keep the number of users steady or growing week over week.
Live ops product management for a software-as-a-service product is a mix of art and science. You should now understand the science behind long-term user retention and be ready to practice the art, using the tools presented here as a starting point to understand exactly what forces are behind the changes in your user base.
Thanks for reading! I’d love to hear your thoughts on this topic in the comments below. What was unclear? Are there additional retention metrics you have found useful? Let me know!