Creating a mechanism for large-scale A/B and multivariate testing on websites using Google Optimize

A little bit on how the Viva Real and ZAP Imóveis websites team developed a mechanism for large-scale A/B and multivariate tests using Google Optimize.

Google Optimize

Before digging into how the websites team developed a mechanism for large-scale experimentation, here is a brief introduction to Google Optimize.

Google Optimize is an online split-testing tool from Google that plugs into your website and enables you to experiment with different ways of delivering your content.

And how does it work? How does it split website traffic to validate experiments?

[Image: a flowchart that sums up how Google Optimize works when using activation events]

One important Google Optimize behavior to highlight:

Experiments are independent of each other. A single visit can participate in multiple experiments, and you need to avoid side effects. While it is perfectly normal to run multiple experiments on the same page, these should be done on different page areas to avoid introducing bias (e.g. where the variant of one experiment promotes a variant of another experiment).

How to create a mechanism that enables large-scale experimentation?

A framework for experimentation

First of all, there needs to be a clear experimentation framework, so experiments are built on strong hypotheses, based on clear, fundamental pieces of evidence.

Here, you can find examples of A/B and multivariate experiments documented using a framework built by me and the Viva Real product team, based on clinical trial frameworks. The simplified version was created by the marvelous Suelen Fenali =]

Enabling large-scale experimentation

Google Optimize offers a Google Chrome extension to facilitate experimentation on a website's visual design. However, this limits the hypotheses that can be tested. An example of this limitation is experiments that validate new ranking algorithms, whether to tackle real estate marketplace problems or to increase a product or business metric.

To broaden the range of hypotheses a website can test, there needs to be a way to run tests in any service that serves content to the user. In other words, experimentation cannot be limited to validating hypotheses that only one or a few specialties of a team can work on, like product management, product design, and front-end engineering. Building hypotheses that can be tackled using the whole knowledge of a team (back-end engineering, data science, and so on) can lead to better experiments and faster validations!

How do Viva Real and ZAP Imóveis enable large-scale experimentation

Google Optimize is used only for drawing traffic to experiments and splitting it among their variations.

Each experiment variant adds only a single script to the page.

<script>
  var actual_location = document.location.href;
  var actual_referrer = document.referrer;
  var new_referrer = "";
  var new_location = "";
  // Preserve the original referrer for Google paid traffic, so Google
  // Analytics attribution is not lost in the redirect.
  if (actual_referrer.indexOf("google.com") !== -1 && actual_location.indexOf("gclid") !== -1) {
    new_referrer = "&utm_referrer=" + encodeURIComponent(actual_referrer);
  }
  // "ooglebot" matches both "Googlebot" and "googlebot": crawlers are never redirected.
  if (navigator.userAgent.indexOf("ooglebot") === -1) {
    // Only redirect when there is no utm_referrer, no fragment, and no variation drawn yet.
    if (actual_location.indexOf("utm_referrer") === -1 && actual_location.indexOf("#") === -1 && actual_location.indexOf("__variation_parameter") === -1) {
      new_location = actual_location + ((actual_location.indexOf("?") !== -1) ? "&" : "?") + "__variation_parameter=experiment_name:variation";
      location.href = new_location + new_referrer;
    }
  }
</script>

The script above redirects users to the URL they are already on, adding two parameters to it:

  • utm_referrer adds the document referrer to the URL, in order to avoid traffic attribution issues with Google Paid Traffic (if you use Google Analytics to track your data);
  • __variation_parameter adds a parameter that Viva Real and ZAP Imóveis websites use to control which running variation must be rendered to the user — experiment_name is replaced by the experiment name given by the team who created the experiment; variation is replaced by an alphabet letter that indicates which variation must be rendered (like b, c, d and so on).
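For example, with a hypothetical experiment named new_ranking, a user drawn to variation b while visiting https://www.example.com/venda/ would be redirected to:

https://www.example.com/venda/?__variation_parameter=new_ranking:b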

The value within the __variation_parameter parameter is then broadcast to all the services consumed by the websites.

If a service is running the experiment designated in the __variation_parameter parameter, it then returns the content variant that is currently being tested.
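For illustration, here is a minimal sketch of how a service could honor the parameter. Express, the new_ranking experiment name, and the in-memory listings are all hypothetical, used only to keep the example self-contained; this article does not describe the actual services' internals.

// Minimal sketch of a service that returns a content variant based on
// __variation_parameter. Assumes Express (npm install express); all names
// here are hypothetical.
const express = require("express");
const app = express();

// Hypothetical listings and ranking strategies for a "new_ranking" experiment.
const listings = [
  { id: 1, score: 0.4 },
  { id: 2, score: 0.9 },
];
const rankings = {
  a: (items) => items,                                        // control: current order
  b: (items) => [...items].sort((x, y) => y.score - x.score), // variant: new algorithm
};

app.get("/listings", (req, res) => {
  // The parameter looks like "experiment_name:variation"; with the
  // multi-experiment variant shown later, it may chain values like
  // "new_ranking:b,new_form:c".
  const raw = req.query.__variation_parameter || "";
  const experiments = Object.fromEntries(
    raw.split(",").filter(Boolean).map((pair) => pair.split(":"))
  );
  const variation = experiments["new_ranking"] || "a"; // fall back to control
  res.json((rankings[variation] || rankings.a)(listings));
});

app.listen(3000);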

Google Optimize's documentation also states that:

The visitor will remain in the assigned variant until the end of the experiment while actions that happen any time later (i.e. goals) are attributed to that variant.

To avoid unnecessary redirects during navigation, whenever the __variation_parameter parameter is appended to the URL, every navigation link on the website also appends it, ensuring that the user sees the drawn variation throughout their whole session.
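The article does not show that link-appending code, but a minimal sketch of the idea could look like the following; this is illustrative only, not the production implementation.

<script>
  // Illustrative sketch: once a variation has been drawn, propagate
  // __variation_parameter to every same-origin link so the user keeps
  // seeing the same variation throughout the session.
  var params = new URLSearchParams(window.location.search);
  var variation = params.get("__variation_parameter");
  if (variation !== null) {
    document.querySelectorAll("a[href]").forEach(function (link) {
      var url = new URL(link.getAttribute("href"), window.location.href);
      if (url.origin === window.location.origin) {
        url.searchParams.set("__variation_parameter", variation);
        link.setAttribute("href", url.toString());
      }
    });
  }
</script>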

The redirect script, however, can bias the experiment: users drawn to a variation pay the cost of a redirect, while users in the original control group do not. How to solve that?

Every experiment also duplicates the original control group on Google Optimize, just like the example below.

[Image: a multivariate test with a duplicate control group]

The script code added to this duplicate control group is similar to the example below.

<script>
  var actual_location = document.location.href;
  var actual_referrer = document.referrer;
  var new_referrer = "";
  var new_location = "";
  // Preserve the original referrer for Google paid traffic.
  if (actual_referrer.indexOf("google.com") !== -1 && actual_location.indexOf("gclid") !== -1) {
    new_referrer = "&utm_referrer=" + encodeURIComponent(actual_referrer);
  }
  // Crawlers are never redirected ("ooglebot" matches "Googlebot" and "googlebot").
  if (navigator.userAgent.indexOf("ooglebot") === -1) {
    if (actual_location.indexOf("utm_referrer") === -1 && actual_location.indexOf("#") === -1 && actual_location.indexOf("__variation_parameter") === -1) {
      // The duplicate control group is tagged with variation "a" (the original
      // content), so it pays the same redirect cost as the other variants.
      new_location = actual_location + ((actual_location.indexOf("?") !== -1) ? "&" : "?") + "__variation_parameter=experiment_name:a";
      location.href = new_location + new_referrer;
    }
  }
</script>

Since the duplicate control group pays the same redirect cost as the variants, experiment data is compared against the duplicate control group, instead of comparing results between variants and the default Google Optimize original control group.

The redirect script also limits users to a single experiment per visit, since it does not redirect when a __variation_parameter is already present. To avoid data bias, every Google Optimize experiment also adds a negation rule to the page targeting section of the experiment details.

[Image: negation rule on page targeting]
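In practice, the rule can be expressed with Optimize's URL page targeting, along these lines (the exact operator wording may vary between Optimize versions):

URL does not contain __variation_parameter

This keeps visitors who already carry a variation in their URL from being drawn into a second experiment.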

If you want a mechanism that allows users to be drawn into multiple running experiments, you can use a script similar to the one below on each experiment variant in Google Optimize.

<script>
  var url_parameters = new URLSearchParams(window.location.search);
  var variation_parameter_value = url_parameters.get("__variation_parameter");
  var actual_location = document.location.href;
  var actual_referrer = document.referrer;
  var new_referrer = "";
  var new_location = "";
  // Preserve the original referrer for Google paid traffic.
  if (actual_referrer.indexOf("google.com") !== -1 && actual_location.indexOf("gclid") !== -1) {
    new_referrer = "&utm_referrer=" + encodeURIComponent(actual_referrer);
  }
  // Crawlers are never redirected.
  if (navigator.userAgent.indexOf("ooglebot") === -1) {
    if (actual_location.indexOf("utm_referrer") === -1 && actual_location.indexOf("#") === -1) {
      if (variation_parameter_value === null) {
        // First experiment drawn in this visit.
        new_location = actual_location + ((actual_location.indexOf("?") !== -1) ? "&" : "?") + "__variation_parameter=experiment_name:variation";
      } else if (variation_parameter_value.indexOf("experiment_name:") === -1) {
        // Chain this experiment onto the value already in the URL, replacing
        // the existing parameter instead of appending a duplicate key, and
        // only if this experiment has not been chained yet (avoids a redirect loop).
        new_location = actual_location.replace(
          "__variation_parameter=" + variation_parameter_value,
          "__variation_parameter=experiment_name:variation," + variation_parameter_value
        );
      }
      if (new_location !== "") {
        location.href = new_location + new_referrer;
      }
    }
  }
</script>
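For example, a user drawn into two hypothetical experiments named exp_one and exp_two would end up on a URL such as:

https://www.example.com/venda/?__variation_parameter=exp_two:c,exp_one:b

Each service then looks up only the experiments it participates in within the chained value.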

A live example

Viva Real is currently experimenting with approaches to mitigate a possible negative impact on conversion rate, since the whole website is being adapted to comply with the LGPD (Lei Geral de Proteção de Dados Pessoais, a Brazilian law similar to Europe's GDPR).

After extensive research to fully understand the LGPD, benchmarking against European real estate websites, and solutions conceived and refined by the whole team, two approaches were selected to start the experimentation on Viva Real's lead form.

[Image: LGPD consent checkbox]
[Image: single CTA to send a lead and get user consent]

After a user has given their consent, the consent action is replaced by a disclaimer that reminds the user of the previously given consent. A CTA link to review the consent is also presented; if the user clicks it, the checkbox or the single CTA is shown once again.

[Image: reminder disclaimer and consent review link CTA]

After about 10 days of experimentation, we were able to collect form usage data and user feedback, in order to iterate on new solutions to mitigate the observed negative impact on conversion rate. Once new solutions are ready to be tested, the running experiment is turned off and a new one, exploring different solutions to the same hypothesis, is turned on. This iteration process goes on until an acceptable solution that matches expected or estimated results is found.

You can see how the described mechanism works by browsing the following pages:

Thanks for reading! If you have any questions on how to replicate the mechanism for large-scale A/B and multivariate tests in products you work on, do not hesitate to ask in the responses section ❤️

Bonus tip

You can add a CSAT (Customer Satisfaction) measurement component to your experiment variants to collect user feedback and help you conceive and support new solutions to validate a hypothesis!
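If your tracking runs on Google Analytics, as mentioned earlier for traffic attribution, a minimal sketch of such a component's reporting could look like the following; the sendCsat helper and the event names are hypothetical.

<script>
  // Illustrative sketch: report a CSAT score (1 to 5) as a Google Analytics
  // event labeled with the running variation, so feedback can be segmented
  // per variant. Assumes analytics.js (ga) is already loaded on the page.
  function sendCsat(score) {
    var params = new URLSearchParams(window.location.search);
    var variation = params.get("__variation_parameter") || "control";
    // ga("send", "event", category, action, label, value) is the standard
    // analytics.js signature; category and action names are hypothetical.
    ga("send", "event", "csat", "score", variation, score);
  }
</script>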
