What are A/B Tests in Advertising?
How we use them to test our targeting strategy
Many companies that develop mobile apps periodically need to change something in their apps' functionality and then evaluate the results of those changes. One of the tools for such evaluation is A/B testing.
In our iFunny mobile app, A/B tests are usually used to introduce a new feature or make changes to the interface design. And since advertising is an integral part of our app, we do A/B tests for our advertising inventory.
A/B testing in iFunny
Every one of our A/B tests starts with a hypothesis. Hypotheses are developed by our product team based on analytics and numbers. In iFunny, A/B testing is a process where the app's audience is divided into two groups. The first group is the "control" group: for it, the product remains unchanged. Users from the second group see a new version of a feature or of the app's design.
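For illustration, here is a minimal sketch of how such a split can be done deterministically, assuming each user has a stable identifier; the names below are hypothetical, not iFunny's actual code.

```kotlin
import java.security.MessageDigest

// Hypothetical deterministic bucketing: hashing the user ID together with the
// experiment name keeps a user's group stable and makes assignments
// independent across experiments.
enum class Variant { CONTROL, TEST }

fun assignVariant(userId: String, experiment: String): Variant {
    val digest = MessageDigest.getInstance("MD5")
        .digest("$experiment:$userId".toByteArray())
    // The low bit of the first hash byte splits users roughly 50/50.
    return if ((digest[0].toInt() and 1) == 0) Variant.CONTROL else Variant.TEST
}
```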
Next, we describe the A/B test in our documentation, and we run the test based on that description.
After the A/B test goes into production, we monitor the product metrics. If the test shows promising results in one group of users, we run it on another group to check whether the metrics hold, observing their dynamics over a certain period. If the dynamics remain positive, we roll the experiment out to all users.
If the test shows negative results within the given period, we look for errors in the hypothesis: if we find any, we change the test conditions and run the test again; otherwise, we discard the test.
A/B tests in advertising
Since advertising is an integral part of our application, any change to it may affect product metrics.
We’ve already run several A/B tests to check hypotheses about changes in ad revenue metrics. For example, we ran an A/B test in which we set different ad parameters for different user groups, such as the following (a sample configuration is sketched after the list):
- Frequency of ad impressions
- Ads displayed by specific ad partners
- Timeout values for network ad requests
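Per-group parameters like these can be expressed as a simple config; the field names and values below are hypothetical, for illustration only.

```kotlin
// A hypothetical sketch of per-variant ad parameters.
data class AdConfig(
    val impressionIntervalSec: Int,    // minimum time between ad impressions
    val enabledPartners: List<String>, // ad partners allowed to fill requests
    val requestTimeoutMs: Long         // timeout for network ad requests
)

val controlConfig = AdConfig(
    impressionIntervalSec = 30,
    enabledPartners = listOf("partnerA", "partnerB", "partnerC"),
    requestTimeoutMs = 5_000
)

val testConfig = AdConfig(
    impressionIntervalSec = 45,
    enabledPartners = listOf("partnerA", "partnerB"),
    requestTimeoutMs = 3_000
)
```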
Thanks to this test, we improved technical performance: reducing the number of network ad requests cut both power and RAM consumption, and we also saw a slight increase in revenue from ad impressions.
A/B test with the new native ad design
On iFunny, native ads are ads that are inserted into the content feed.
Native ads in iFunny
A native ad can include the following UI components (modeled in the sketch after this list):
- Ad header
- Icon or logo of the partner who placed the ad
- Advertising text
- Picture or video
- Additional buttons: call-to-action, audio, and video buttons (if the ad contains them).
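To make the component list concrete, here is a hypothetical model of such an ad cell; none of these types come from the actual iFunny codebase.

```kotlin
import android.net.Uri

// Hypothetical model of the native ad UI components listed above.
sealed class AdMedia {
    data class Image(val url: Uri) : AdMedia()
    data class Video(val url: Uri) : AdMedia()
}

data class NativeAdViewModel(
    val header: String,             // ad header
    val partnerIcon: Uri?,          // icon or logo of the ad partner
    val adText: String,             // advertising text
    val media: AdMedia,             // picture or video
    val callToActionTitle: String?, // optional call-to-action button
    val hasAudioControls: Boolean   // audio/video buttons, if the ad has them
)
```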
We can change the font color and size, the background color, and the placement of elements in ads, and set everything up however we want. But like any other UI component, native ads have their own guidelines and best practices. Here are the main ones:
- Native ads should look and be perceived as app content.
- All UI components of an ad should be within its container view, placed organically and aesthetically, and be highly visible.
In our apps, both content and native ad containers are placed in their own cells, and the cells form the feed. Since ads are embedded directly into the content feed, they can affect product metrics. For example, if users see an inappropriate ad, they may leave the app forever. Or, because of an awkward arrangement of elements, a user may accidentally tap an ad and open the app store or the site the ad leads to, after which they may never return to the app.
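A minimal sketch of such a mixed feed, assuming an Android RecyclerView with one view type per cell kind; the classes are illustrative, not the actual iFunny implementation.

```kotlin
import android.view.View
import android.view.ViewGroup
import androidx.recyclerview.widget.RecyclerView

// Hypothetical feed items: regular content cells and native ad cells.
sealed class FeedItem {
    data class Content(val contentId: String) : FeedItem()
    data class NativeAd(val adId: String) : FeedItem()
}

private class CellHolder(view: View) : RecyclerView.ViewHolder(view)

class FeedAdapter(private val items: List<FeedItem>) :
    RecyclerView.Adapter<RecyclerView.ViewHolder>() {

    override fun getItemViewType(position: Int) = when (items[position]) {
        is FeedItem.Content -> TYPE_CONTENT
        is FeedItem.NativeAd -> TYPE_AD
    }

    override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): RecyclerView.ViewHolder {
        // A real app would inflate a different layout per view type;
        // a plain View stands in for both cell layouts here.
        return CellHolder(View(parent.context))
    }

    override fun onBindViewHolder(holder: RecyclerView.ViewHolder, position: Int) {
        // Bind content or ad data depending on the item type.
    }

    override fun getItemCount() = items.size

    private companion object {
        const val TYPE_CONTENT = 0
        const val TYPE_AD = 1
    }
}
```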
We decided to check whether changing the design of the native ad cell could improve our app's product metrics, and we came up with this hypothesis:
- If native ads look more like a content cell, they will be less annoying to the user, and the depth of view will increase.
An increase in depth of view (that is, in the amount of content viewed per session) can increase the number of ads viewed in the content feed.
In this experiment, we’ve divided users into two groups:
- A — “control” (the design of native ads without changes)
- B — new design
We changed the icon position to make it look like a user avatar, changed the background color to white, increased the font size, and added a new style for the buttons. For comparison, below is an example of a content cell in the iFunny feed.
Content cell in the iFunny feed
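In code, the variant B changes can be thought of as a per-group style. Everything below, including the concrete values, is a hypothetical sketch rather than our actual implementation.

```kotlin
import android.graphics.Color

// Hypothetical per-variant styling for the native ad cell experiment.
enum class Variant { CONTROL, TEST }     // same split as in the bucketing sketch
enum class IconStyle { DEFAULT, AVATAR }
enum class ButtonStyle { DEFAULT, NEW }

data class AdCellStyle(
    val backgroundColor: Int,
    val iconStyle: IconStyle,
    val textSizeSp: Float,
    val buttonStyle: ButtonStyle
)

fun styleFor(variant: Variant): AdCellStyle = when (variant) {
    // Group A keeps the existing native ad design (values illustrative).
    Variant.CONTROL -> AdCellStyle(Color.LTGRAY, IconStyle.DEFAULT, 14f, ButtonStyle.DEFAULT)
    // Group B: white background, avatar-like icon, larger font, new buttons.
    Variant.TEST -> AdCellStyle(Color.WHITE, IconStyle.AVATAR, 16f, ButtonStyle.NEW)
}
```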
Over the course of the experiment, we monitored the following product metrics:
- Depth of view
- Views of native ads
- Banner ad views
- Retention
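One common way to compare such metrics per group is to tag every analytics event with the experiment group. The sketch below assumes a hypothetical Analytics object rather than any specific SDK.

```kotlin
// Hypothetical event logger standing in for a real analytics SDK.
object Analytics {
    fun logEvent(name: String, params: Map<String, String>) {
        println("event=$name params=$params")
    }
}

// Tagging views with the group lets depth of view, ad views, and
// retention be compared between control and test cohorts.
fun trackNativeAdView(experimentGroup: String) =
    Analytics.logEvent("native_ad_view", mapOf("experiment_group" to experimentGroup))

fun trackContentView(experimentGroup: String) =
    Analytics.logEvent("content_view", mapOf("experiment_group" to experimentGroup))
```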
We launched the experiment and began to monitor the metrics. Our hypothesis wasn't confirmed: the metrics we were tracking didn't change. This seemed strange to us, so we checked other metrics and found differences there:
- The number of clicks on native ads increased by 77%.
- The save rate (the amount of content saved by users) decreased by 2%.
- The smile rate (the number of "likes" on content) decreased by 1.5%.
- The average session length decreased.
We concluded that the increase in ad clicks led to shorter sessions: users left the app by following a link to the store or website. Shorter sessions, in turn, meant fewer user actions.
Based on this, we decided to stop the experiment and return to the previous design.
A/B test with sound in native advertising
There are two main types of native ads in iFunny:
- Static ads
- Video ads
As a rule, video ads are more expensive than static ads, but they are also harder to get. The fill rate (the ratio of successfully filled ad requests to all requests) depends on how willing advertisers are to send us video ads; both this ratio and the CTR are sketched in code after the list below.
Advertisers will send video ads more often if:
- The app has a high CTR (click-through rate), meaning users often click on video ads when viewing them.
- Users watch video ads for a longer time.
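Both ratios mentioned above are straightforward to compute; the sketch below uses hypothetical field names and example numbers.

```kotlin
// Hypothetical per-period ad statistics.
data class AdStats(
    val requests: Long,    // all ad requests sent
    val filled: Long,      // requests that returned an ad
    val impressions: Long, // ads actually shown
    val clicks: Long       // clicks on shown ads
)

fun fillRate(s: AdStats) = if (s.requests > 0) s.filled.toDouble() / s.requests else 0.0
fun ctr(s: AdStats) = if (s.impressions > 0) s.clicks.toDouble() / s.impressions else 0.0

fun main() {
    val stats = AdStats(requests = 10_000, filled = 6_500, impressions = 6_000, clicks = 90)
    println("fill rate = ${fillRate(stats)}") // 0.65
    println("CTR = ${ctr(stats)}")            // 0.015
}
```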
To improve these metrics, we decided to conduct an A/B test in which we changed the logic of the mute button in video ads so that it behaved the same as the mute button in video content cells in the iFunny feed.
In iFunny, you can mute or unmute video content by pressing the mute button. The audio state the user sets with this button is shared by all video content in the feed. But this rule didn't apply to video ads: their sound was turned off by default.
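Variant B of this test can be sketched as extending the feed-wide audio state to ad players; the names below are hypothetical, not the actual iFunny code.

```kotlin
// Hypothetical feed-wide audio state shared by all video content cells.
object FeedAudioState {
    var isMuted: Boolean = true
        private set

    fun toggle() { isMuted = !isMuted }
}

interface VideoPlayer { fun setMuted(muted: Boolean) }

fun bindVideoCell(player: VideoPlayer, isAd: Boolean, inTestGroup: Boolean) {
    val muted = when {
        // Control behavior: video ads always start muted.
        isAd && !inTestGroup -> true
        // Test behavior: ads follow the same shared audio state as content.
        else -> FeedAudioState.isMuted
    }
    player.setMuted(muted)
}
```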
We developed a hypothesis that if video ads played with sound, users would pay more attention to them: the CTR and view depth of video ads would increase.
We built and launched the A/B test and began to monitor product metrics, but the first results of the experiment came to us as feedback in the App Store and Google Play. Users started leaving negative reviews complaining about the sound in ads. As we suspected, they did start paying more attention to video ads; the problem was that the sound was annoying and distracted them from watching the content.
We decided to stop the experiment rather than wait for the product metrics to worsen.
Summary
Even though the two advertising experiments didn't confirm their hypotheses, they showed that changes in ads can directly impact product metrics and user experience. We are confident that the lessons learned will help us make fewer mistakes when integrating ads into the product in the future.