Advertising finds itself in a tricky spot these days, serving two masters: creativity and data.
On the one hand, it values creativity, and it’s not hard to understand why. Creativity makes brands distinct from each other and meaningful to consumers; it wins awards and builds recognition for the brands, agencies and people involved. And according to Hurman (2016), who analysed three decades of data, more creative advertising is more effective advertising, driving sales and revenue.
On the other hand, we are seeing the rise and rise of data. From its beginnings in advertising as post-campaign performance measurement to prove the effectiveness of advertisements, and as an input to strategy development, data has morphed into a force that influences every aspect of advertising and can no longer be ignored. We now have programmatic advertising, personalisation at every point of contact, content-matching algorithms and interactive OOH (out-of-home) advertising.
But the two are not easy bedfellows, as Tim Nudd of Adweek says: “data has always been a bit terrifying to creative people. It’s often seen — sometimes fairly, sometimes not — as a replacement for intuition rather than a way to supplement it”.
Yet data science, as a discipline, is not just about ‘data’. It includes research, analysis and ultimately method and reasoning. These other aspects of data science could easily be applied to creative decision making without taking away from intuition, expression and creative experience. As Payam Cherchian of AKQA says: “more and more clients are coming to the table expecting agencies to have data/research on every slide to back up their creative ideas, assumptions and strategies”.
The Creative’s Problem
The time between receiving a request for proposal (RFP) and submitting a response is short, and creatives need to perform a number of mental tasks in that timeframe. They need to understand the brief, ideate, decide on an idea (or two), develop it, detail how to make it happen, and budget it.
Because time is limited, the idea(s) presented in the proposal are not always fully fleshed out. The general idea is there, but either the intricate details are missing or assumptions are made; and once the client has signed off, the idea needs to move into production. This is where conjoint analysis can be employed to put some data behind the creatives’ decisions.
Intro to Conjoint Analysis
Conjoint analysis is a choice modelling technique used in market research primarily to find the optimal combination of attributes for a new product. The methodology helps uncover the preferences individuals have by designing different versions of the same product (e.g. 3 ice cream flavours × 3 cone types = 9 combinations) and asking potential consumers to rank the versions from best to worst.
Choices can be influenced by many psychological, situational and social factors such as habit, inertia, experience, advertising, peer pressure and opinions, but underneath all of that it is assumed there is a quantity called utility (or value) in the consumer’s mind that represents how important, preferred or desirable each option is (Louviere et al., 2000).
While it would be nice to assume we could simply read each attribute’s appeal off the raw rankings (e.g. how highly is chocolate ice cream ranked?), conjoint analysis instead calculates utilities (called part-worths) for each attribute level from the consumers’ rankings (Härdle and Simar, 2012: 413). These part-worths are used to:
- determine the optimal combination of attributes to maximise utility
- determine the importance of each attribute according to consumers
- design alternative combinations and analyse their utility
This is how market research is able to determine consumers are interested in a zero-interest, red credit card with a loyalty program that gives you 300 bonus points upon sign-up vs a 5% interest, blue card without a loyalty program.
What if we made one… small… change…
What if, instead of comparing product attributes, we compared the components of a creative idea we are thinking of pitching to a client? This would tell us which combination of attributes for our creative idea has the most appeal to the target audience. It would also tell us which attributes are most and least important in people’s choices, and therefore which we should emphasise.
Let’s assume we’ve received an RFP and the creative team have struck upon an idea: a secret concert that people have to locate. As the turnaround time for the proposal is incredibly short, that’s about as far as the creative team have got with developing the idea. Many questions remain:
- how many bands are playing at this concert?
- is there just one concert or several, in different cities?
- do we provide clues or should people follow a signal?
The team could make decisions based on their opinions or the budget, or we can use data to see what consumers think using traditional conjoint*.
The first thing we need to do is design the choice model. In this example we have three attributes, each with two levels (options):
- number of bands: one band / several bands
- location: Melbourne only / 5 capital cities
- how people find the concert: send out clues / follow a signal
Because the number of attributes and levels is small, the total number of combinations is also small: 2 × 2 × 2 = 8. This allows a full factorial design, which presents every possible combination:
- one band, in Melbourne, send out clues
- one band, in Melbourne, follow the signal
- one band, in 5 capital cities, send out clues
- one band, in 5 capital cities, follow the signal
- several bands, in Melbourne, send out clues
- several bands, in Melbourne, follow the signal
- several bands, in 5 capital cities, send out clues
- several bands, in 5 capital cities, follow the signal
Side note: when the number of attributes and levels starts to increase, the number of possible combinations gets very high, too high for anyone to rank in a sensible order. In these cases we take only a subset of combinations, chosen so it still gives us enough information to estimate effects for all the attributes. This is called a fractional factorial design. As a rule of thumb, when there are more than 10 combinations, use a fractional factorial design.
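The full factorial above can be generated programmatically rather than written out by hand; a minimal Python sketch (the attribute names are just labels for this example):

```python
from itertools import product

# Attributes and levels from the secret-concert example
attributes = {
    "bands": ["one band", "several bands"],
    "location": ["in Melbourne", "in 5 capital cities"],
    "hint": ["send out clues", "follow the signal"],
}

# Full factorial design: every possible combination of levels
profiles = [dict(zip(attributes, combo)) for combo in product(*attributes.values())]

print(len(profiles))  # 2 x 2 x 2 = 8 profiles
for p in profiles:
    print(p)
```

With more attributes the same code produces the full factorial; a fractional design would then select a balanced subset of `profiles`.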
Now we can design a survey and administer it to a group of consumers in the target audience. According to Peduzzi et al. (1996), the minimum sample size for a choice experiment can be worked out as:

n ≥ 1000c / (q × a)

where q is the number of questions asked in the survey, a is the number of combinations per question, and c is the maximum number of levels of any attribute in the choice model. Given we have 1 question asking respondents to rank 8 combinations, and the most levels of any attribute is 2, we need a sample of at least 1000 × 2 / (1 × 8) = 250.
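In code the sample-size rule of thumb is a one-liner; a small Python sketch (the formula n ≥ 1000c / (q × a) is inferred from the worked example of 1 question, 8 combinations, 2 levels giving 250):

```python
from math import ceil

def min_sample_size(q: int, a: int, c: int) -> int:
    """Minimum respondents for a choice experiment: n >= 1000 * c / (q * a).

    q: number of questions in the survey
    a: number of combinations per question
    c: maximum number of levels of any attribute
    """
    return ceil(1000 * c / (q * a))

# The secret-concert survey: 1 ranking question over 8 combinations,
# no attribute has more than 2 levels
print(min_sample_size(q=1, a=8, c=2))  # 250
```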
We can ask each respondent to rank these combinations from best (1) to worst (8), then plug the raw data into a spreadsheet or R to run the analysis. I’ve set up a Google Sheet to run this type of conjoint analysis that can handle up to 10 combinations of 6 attributes and a sample size of 500, but you could do more in R or specialist software like Sawtooth or Q.
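To show what the analysis itself involves, here is a Python (numpy) sketch of traditional conjoint via dummy-coded least squares. The average ranks below are hypothetical, made up purely so the example has something to fit; real values would come from the survey:

```python
import numpy as np

# Profiles in the order listed above; levels dummy-coded 0/1 per attribute:
# bands (0=one, 1=several), location (0=Melbourne, 1=5 cities),
# hint (0=clues, 1=signal)
X = np.array([
    [0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1],
    [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1],
])

# Hypothetical average ranks across respondents (1 = best, 8 = worst)
ranks = np.array([2, 1, 4, 3, 6, 5, 8, 7])
y = 9 - ranks  # convert to preference scores (higher = better)

# OLS with intercept: y = b0 + b1*bands + b2*location + b3*hint
A = np.column_stack([np.ones(len(y)), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
part_worths = coefs[1:]  # utility change from moving each attribute to level 1

# Importance: each attribute's range of part-worths as a share of the total
# (with two dummy-coded levels, the range is just the coefficient's magnitude)
ranges = np.abs(part_worths)
importance = ranges / ranges.sum()
print(dict(zip(["bands", "location", "hint"], importance.round(2))))
```

With these made-up ranks, “bands” dominates the importance scores and the signed part-worths point to one band, in Melbourne, following the signal, i.e. the same shape of result discussed below.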
Assume we get the results below. The importance metric tells us how ‘important’ or ‘preferred’ each attribute was in the consumers’ ranking of the options; here it tells us the number of bands was the most important factor in their decisions.
The part-worths break the attribute preferences down further, showing which levels ‘increase’ perceived utility (consumer preference) and which detract from it. Here, the results suggest the best combination is:
- one band, in Melbourne, follow the signal
This tells the creative team not to worry about sourcing several bands for the concert, nor about finding locations in other cities, giving them more time and energy to focus on developing a better experience that people actually want.
Conjoint analysis doesn’t replace the role of the creative in ideation, nor does it leave data as an end product to evaluate creative performance. It helps support creative decision making with rigour and enables data to play a larger role than just an API connection to a database for personalising advertisements.
Making it happen
Implementing conjoint analysis in creative decision making does not require much change to current processes; it mainly requires a bit more communication between creatives and analysts, and some patience:
Step 1: create a list of attributes and levels the creatives are interested in
- Keep to 2–6 attributes and 2–4 levels per attribute
- Avoid attributes that are hard to specify or quantify, like high/low quality
- More attributes provide a more accurate picture, but too many become too much information for a respondent to weigh up
- Check how much your mental image of the idea changes with the inclusion of each attribute; if not much, it might not be relevant
Step 2: design alternative combinations that people will rank
- With a small number of attributes and levels, you could have people rank every possible combination
- In cases where the number of combinations is large, you will need to select a smaller subset of combinations using a special methodology (your analyst will need to do this)
- Avoid prohibited pairs, i.e. combinations of levels that are unrealistic together (such as top speed paired with best fuel economy)
- You can work out the minimum number of combinations needed as t - a + 1 (Cordella et al., 2013), where t is the total number of levels across all attributes and a is the number of attributes
- Because choice experiments depend on the design of the study, you cannot simply remove faulty combinations from the survey after the fact, nor add or remove attributes or levels
Step 3: determine the sample size and send out survey
- Who are you surveying? They might be existing customers from the client’s CRM, leads from a web form, or fans from the client’s Facebook page. You might be fortunate enough to run it via Google Surveys for a representative sample of the general population.
- The data collection should be quick given that the survey is rather short (1 question). I’d expect a turnaround time of 1–2 weeks and close it after that point.
Step 4: run the analysis and compare results to creative team’s initial thoughts
- The two key metrics are importance and part-worths. Importance tells you to what extent each attribute influenced people’s preferences and the part-worths break that down into the individual levels of each attribute.
- The analysis should also produce a little simulator that allows you to calculate the utility value of any combination of attributes and levels. This is useful when creative ideas are constrained by budget or client (e.g. we can no longer use the follow-the-signal hint, so what’s our next best option?)
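That simulator can be as simple as summing part-worths; a minimal Python sketch using illustrative, made-up part-worth values (in a real project these come out of the analysis step):

```python
# Hypothetical part-worths (higher = more preferred); values are
# illustrative only, not real survey results
part_worths = {
    "bands": {"one band": 2.0, "several bands": -2.0},
    "location": {"in Melbourne": 1.0, "in 5 capital cities": -1.0},
    "hint": {"send out clues": -0.5, "follow the signal": 0.5},
}

def utility(combo: dict) -> float:
    """Total utility of a combination = sum of its levels' part-worths."""
    return sum(part_worths[attr][level] for attr, level in combo.items())

# Client rules out 'follow the signal': what is the next best option?
print(utility({"bands": "one band", "location": "in Melbourne",
               "hint": "send out clues"}))  # 2.0 + 1.0 - 0.5 = 2.5
```

Scoring each allowed combination this way lets you rank the remaining options instead of guessing.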
I hope that I’ve given you a new opportunity to introduce data and analytical thinking to your creative work, and shown you that the two can be harmoniously integrated in a much better way than they currently are. If you enjoyed this post, please share it with your friends.
Cordella, P., Ammerlaan, S. and Joffre, A. (2013). Menu-Based Conjoint: A new method. SKIM webinar.
Härdle, W. K., and Simar, L. (2012). Applied Multivariate Statistical Analysis, 3rd ed. Springer-Verlag, Heidelberg, Germany.
Hurman, J. (2016). The Case for Creativity, 2nd ed. Cannes Lions Publishing, London.
Louviere, J. J., Hensher, D. A. and Swait, J. D. (2000). Stated Choice Methods: Analysis and Applications. Cambridge University Press, Cambridge, UK.
Peduzzi, P., Concato, J., Kemper, E., Holford, T. R. and Feinstein, A. R. (1996). A simulation study of the number of events per variable in logistic regression analysis. Journal of Clinical Epidemiology, 49, pp. 1373–1379.
*Of the three conjoint methods, traditional (TC), menu-based (MBC) and choice-based (CBC), I think traditional is probably the easiest to apply to creative problem solving because (1) it can be set up in a spreadsheet (MBC and CBC require specialist software), and (2) the number of attributes creatives would be interested in is low (3–5).