Designing with Science

Darren Geraghty
Facebook Design: Business Tools
11 min read · Jul 21, 2015


Last year at Facebook, a small team was given a big problem to solve.

This was nothing new, except the problem we were given was unique for both Facebook and the industry. The story of how Conversion Lift came to be is an interesting one, since we needed to convey a complex concept at scale — scientific proof of the real-world value of running ads. Telling the story required some inventive design: we somehow had to figure out a way to ship the science — to deliver a product that communicated a scientific methodology in a simple, meaningful way.

Our journey over the following months led us into a forest of ideas, prototypes, and learnings in an effort to come out at the other end with an effective, elegant product.

Not everything that matters can be measured, and not everything that is measured matters. — Elliot Eisner

In order to understand why we needed to design this product, let’s begin by surveying the landscape.

Today, much of the advertising industry measures the effectiveness of ads by using last-click attribution. In this model, the last click before an actual sale is given credit for the purchase or conversion.
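In code terms, the model really is almost trivially simple. Here is a hypothetical sketch that credits a purchase to the last clicked touchpoint in a consumer’s journey; the data shape and names are invented for illustration, not any real ad platform’s API.

```python
# A hypothetical sketch of last-click attribution: all credit goes to the
# last ad clicked before the purchase. The data shape is invented.

journey = [
    {"ad": "shoes_video_ad", "clicked": False},
    {"ad": "review_site_banner", "clicked": False},
    {"ad": "shoes_retargeting_ad", "clicked": True},  # clicked, then purchased
]

def last_click_attribution(touchpoints):
    """Return the ad that gets 100% of the credit under last-click."""
    clicks = [t for t in touchpoints if t["clicked"]]
    return clicks[-1]["ad"] if clicks else None

print(last_click_attribution(journey))  # shoes_retargeting_ad
```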

Pretty straightforward? Well, not quite …

It’s widely acknowledged that in many ways, last click is a suboptimal way of measuring ad effectiveness. Consumers are often subject to a multiplicity of influences that lead to an eventual purchase, and many purchases wouldn’t have occurred if the person hadn’t seen that very first ad. To make matters worse, sometimes a person was going to purchase regardless of seeing the ad, and sometimes the ad that most influenced their behavior was seen but never clicked.

A Substitute for Reality

Simply counting clicks fails to indicate whether online campaigns will impact consumer attitudes or even offline sales. As a result, ‘click-through rate’ is not the right metric for measuring brand impact, since virtually no relationship exists between clicks and brand metrics or offline sales. The real goal should be to capture the full value of the ad, regardless of whether a person clicked on it.

For example, imagine that you see a compelling ad for a new pair of shoes, which you immediately fall in love with. You might then decide to conduct some online research on the product, read some reviews, and carry out a few searches to see which colors are available. During your internet travels, you see another ad for those very same shoes, and since you’ve already decided to buy them, you click on that ad and make your purchase. Happy customer!

But the last-click model would give all the credit for your purchase to the ad you clicked right before you bought your shoes. And no credit to the ad that introduced you to the shoes, nor to the content you viewed during your research. When you consider that a consumer now interacts with an average of 18 pieces of content before making a purchase decision, the last-click model appears to be a very outdated means of measuring overall ad effectiveness.

For consumers, it’s easy to forget what brought the product to your attention, and in most cases advertisers have no idea, either. So they use the last-click model as a substitute for reality.

Made to Measure

While the industry’s approach to measurement is inadequate, no product has addressed the problem, and so the model has continued to exist — until now. In order to help shift the industry from this archaic means of measuring, our product needed to do one thing well:

Scientifically prove the real-world value of running Facebook ads.

The proposition was simple; the means by which we achieved it were not.

Facebook’s marketing science team had been working on solving this problem for some time, and they’d developed a way to accurately measure the incremental business driven by Facebook ads. This was achieved by running holdout tests.

Basically, when a Facebook ad campaign is created, we establish a randomized test group (people who see the ads) and a control group (people who don’t). By holding other variables such as gender, age, and interests constant, we can then measure the true effect of the ads. The advertiser then securely shares conversion data from the campaign. This data typically comes from sources such as our conversion pixel or secure point-of-sale data and is anonymized to protect customers’ privacy.

We then compare the number of conversions in the test and control groups. The difference is the sales lift generated by the campaign.
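To make the arithmetic concrete, here is a minimal sketch of that comparison in Python. The function name, the scaling step, and all of the numbers are illustrative assumptions of ours, not Facebook’s actual implementation.

```python
# A minimal, hypothetical sketch of the holdout-test comparison described
# above. All names and numbers are invented for illustration.

def conversion_lift(test_conversions, test_size, control_conversions, control_size):
    """Estimate the incremental conversions driven by ads."""
    # Scale the control group's conversions to the size of the test group
    # so the two groups are directly comparable.
    scaled_control = control_conversions * (test_size / control_size)
    incremental = test_conversions - scaled_control
    relative_lift = incremental / scaled_control
    return incremental, relative_lift

# Example: 10,000 people were shown ads; 9,800 similar people were held out.
incremental, lift = conversion_lift(
    test_conversions=320, test_size=10_000,
    control_conversions=245, control_size=9_800,
)
print(f"Incremental conversions: {incremental:.0f}")  # 70
print(f"Relative lift: {lift:.1%}")                   # 28.0%
```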
Now that we had established a rigorous process, the time had come to productize this solution, so more of our customers could understand and take advantage of this means of measuring the effect of their ads.

What’s good for people is good for businesses: if we could help marketers better measure and improve their ad campaigns, consumers would win.

Possible Futures of a Product

At the beginning of a new product initiative there are infinite possibilities, and for designers this can often feel simultaneously exciting and overwhelming. Before the convergence of a solution, there is the divergence of the exploration. This is especially true for complex problem spaces like the one we faced working on Conversion Lift.

In order to explore a new problem you first must throw a lot of clay onto your spinning wheel, considering the many possibilities.

Over time you pare down those possibilities, gradually shaping a solution.

Bill Buxton nicely captures this notion, describing it as overlapping funnels, where the constant generation of new ideas that open up opportunities is balanced by the reduction that results from decision-making — in order to arrive at a more thorough design.

Bill Buxton’s description of an exploratory design process.

With this approach in mind, we fully immersed ourselves in the problem by engaging with the various stakeholders, familiarizing ourselves with the scientific concepts, and exposing ourselves to the breadth of possible solutions.

Metrics, Metrics, Metrics

Once we fully understood the problem and the technical implications of what we needed to communicate, we spent a lot of time considering the various ways in which we could express lift to a broad audience in a simple manner:

How should we communicate, within a reporting framework, the incremental value of running an ad?

After much white-boarding and cross-functional consultation, we identified a set of core metrics that could help summarize the concept.

We then began to sketch out possible ways of communicating these metrics, along with the ancillary numbers that might also support parts of our product’s story.

First Comes Elaboration, Then Reduction…

In searching for an effective solution we found ourselves considering a myriad of ideas — the product’s possible futures. At first we explored a lot of dashboard-related solutions; it seemed like the obvious route. When a product has a lot of value to offer from its conception, it’s often difficult to “say no” and rein in the scope of the story it tells.

For one of our earliest working prototypes, we created a dynamic reporting dashboard that comprised our core metrics wired into some advanced filtering components. Essentially, it enabled a person to see the results of running a lift study and to facet or drill down based on various attributes and metadata.

Early prototypes of our product embraced the many possibilities in lieu of focusing on what mattered the most.

We incorporated features such as confidence interval mapping, dynamic metric visualizations, and browsable product taxonomies.

In other words, we put everything in there, including the proverbial kitchen sink.

The team knew this was potentially a powerful tool, but we were also nervous that it might be a tad confusing — that it might overwhelm a user by trying to say too much at once.

The initial user testing confirmed our suspicions: the report was too complex for some people. Though we abandoned this general direction, it helped shape our thinking for future iterations. We took our learnings and entered a war room to reboot the reporting design for our product.

Simplifying the “Science Bit”

For the next design sprint, we decided that less would indeed be more. Collectively, the team agreed that the new goal for the report should be to educate — to introduce the concept of lift to our customers and to convey the effectiveness of Facebook ads in a no-nonsense, digestible manner. We elected to pare down the experience and focus on the essential questions the product needed to answer in order for users to zoom in on the core value proposition of the product.

Our new goal enabled us to focus on providing the most valuable information a customer needed to know. We decided to express a reduced set of metrics and explain the means by which they were calculated. Once the general concept had been introduced and widely understood, future iterations could reveal greater detail.

How Should Our Product Say Hello?

Now that we had a stronger grasp on the story our product needed to tell, we next needed to figure out how design might enable us to convey the story in a meaningful way. In a way that would establish trust in our platform and methodology — but especially in our product.

The product involved some complex concepts and methodologies such as incrementality, power calculation, and statistical significance. We needed to discover a way to express the results simply, without undermining the rigor of the process.

By using design as a tool to reveal the report’s complexity only when needed, we would be better positioned to inform, not overwhelm.
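For readers who want a feel for what “statistical significance” means in this setting, a standard two-proportion z-test is one textbook way to check whether an observed lift could plausibly be chance. The sketch below is generic statistics with invented numbers, not the marketing science team’s actual methodology.

```python
# A textbook two-proportion z-test, sketched to illustrate the idea of
# statistical significance for a lift result. Not Facebook's methodology.
from math import sqrt, erf

def lift_significance(test_conv, test_n, control_conv, control_n):
    p_test = test_conv / test_n
    p_control = control_conv / control_n
    pooled = (test_conv + control_conv) / (test_n + control_n)
    se = sqrt(pooled * (1 - pooled) * (1 / test_n + 1 / control_n))
    z = (p_test - p_control) / se
    # One-sided p-value: the chance of seeing a lift this large if the
    # ads actually had no effect.
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return z, p_value

z, p = lift_significance(320, 10_000, 245, 9_800)
print(f"z = {z:.2f}, p = {p:.4f}")  # conventionally significant if p < 0.05
```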

Ways of Showing

A scene from a Coen Brothers movie nicely captures how math looks to many people: mysterious, intimidating, and even unknowable. One of the fundamental design challenges our team faced was how to make the mathematical concepts embedded within our product easier to comprehend for those not numerically inclined.

Saying It with Words

We understood that there was a certain amount of complexity we needed to express. It was simply unavoidable. Our product’s reporting surface would have to house many numbers supporting how our key metrics were derived. We needed to find a presentable way of doing so.

Given the complexity of the information, how could we leverage design to simplify?

Too often, dashboards absolve themselves of the responsibility of communicating what matters by putting all of the data in front of the customer.

Many times we had observed our customers wrestle with a screen full of numbers, trying to identify what was important. To avoid falling into this trap with our next design, we devised a means of succinctly summarizing the key takeaways into what we called Natural Language Summaries — a set of plain-English expressions that combined our product’s key metrics into an easy-to-digest sentence.
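A summary like this can be thought of as a template filled in with the report’s key metrics. The sketch below is hypothetical — the wording, parameters, and numbers are our own, not the product’s actual copy — but it shows the basic idea of composing metrics into a single readable sentence.

```python
# A hypothetical Natural Language Summary: a plain-English template
# composed from the report's key metrics. Wording and values are invented.

def natural_language_summary(campaign, incremental, lift_pct, confidence):
    return (
        f"The {campaign} campaign drove an estimated {incremental:,.0f} "
        f"additional conversions, a {lift_pct:.0%} lift over the control "
        f"group, at {confidence:.0%} confidence."
    )

print(natural_language_summary("Spring Shoes", 70, 0.28, 0.95))
# The Spring Shoes campaign drove an estimated 70 additional conversions,
# a 28% lift over the control group, at 95% confidence.
```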

This way, customers who were not mathematically inclined could still quickly read the key takeaways from the product’s reporting interface — something that could be easily socialized within an organization. While we knew that these simple text-based summaries would enable many of our customers to understand the important insights from the report, we were also aware that they wouldn’t be enough to satisfy everyone.

Clarifying Complexity

Our product would be used by a wide spectrum of users in various fields of expertise. At one end are the more casual users, who want a simplified summary; at the other are the more advanced users, who want a deeper, more thorough explanation of what occurred. An executive primarily concerned with top-line metrics might be satisfied with a high-level overview, while a data analyst will want to examine the math behind each calculation.

To serve those advanced users, we next needed to provide access to the scientific evidence behind the report. One of the Facebook Business design team’s core principles, which we recently wrote about in detail, is to bring clarity to complexity. That imperative was especially applicable to the challenge our team faced in surfacing the math within the product.

The metrics in our report often involved multi-step calculations, along with models such as scaled control groups. We had no pre-existing patterns by which to express such concepts within a reporting interface.

Though contemporary approaches to certain problems are often sufficient, many similar problems have already been cleverly solved in the past, even if in different mediums. In order to figure out ways of communicating these models, we looked for ideas from outside the traditional channels.

We sought inspiration from design artifacts that had reduced complexity in a novel way, such as Dieter Rams’ work on learning electronics, The Lectron System; Tom Kamifuji’s Pascal syntax poster for Apple; and Harry Beck’s groundbreaking work on the London Underground map.

Over time we began to develop a framework of visual patterns that let us express each of our core metrics. We called them mathematical explanations, or to coin a portmanteau, mathsplanations.

Simple data visualization can be very helpful for understanding data, but sometimes the visualization alone does not suffice. So we also paired each visualization with a step-by-step textual explanation.

The goal was to provide a condensed expression of a complex set of steps in an easy-to-read manner. For each metric, users are presented with a high-level summary. If they wish to see how the numbers are derived, they can click on “How is this calculated?” By providing complete transparency into our methodology, we hoped to establish trust in the product.
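As a rough illustration, a step-by-step breakdown for one metric (incremental conversions) might read like the sketch below. The steps, wording, and numbers are our own invention, not the product’s actual copy.

```python
# A hypothetical sketch of the step-by-step breakdown revealed by
# "How is this calculated?" for one metric. Wording is illustrative.

def explain_incremental_conversions(test_conv, test_n, control_conv, control_n):
    scaled = control_conv * (test_n / control_n)
    return [
        f"1. {test_conv:,} people in the test group converted.",
        f"2. {control_conv:,} people in the control group converted.",
        f"3. Scaled to the test group's size ({test_n:,} people), the "
        f"control group would have produced {scaled:,.0f} conversions.",
        f"4. {test_conv:,} - {scaled:,.0f} = {test_conv - scaled:,.0f} "
        f"conversions are attributable to the ads.",
    ]

for step in explain_incremental_conversions(320, 10_000, 245, 9_800):
    print(step)
```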
We now had a much simpler reporting model by which we could introduce a wide audience to the concept of conversion lift.

First Steps, Long Journey

Initial user testing indicated that we’d struck a good balance between simplifying the value of the product and communicating the inherent rigor of our methodology. Phrases we received in user feedback included “easy to understand” and “playful yet comprehensive.”

This product is the result of the combined efforts of a diverse, multidisciplinary team comprising a designer, a researcher, a content strategist, several engineers, a product manager, data scientists, and a product marketing specialist. However, it could be said that it takes an entire company to enable a small team to take on such a big problem.

Though this is simply the first step in what is a very long journey, we think it’s a solid foundation on which to build the future of our product. More importantly, it gives our customers a more accurate understanding of how Facebook ads are driving additional business and thus bringing better experiences to consumers.
