Lean marketing: Pause, rewind and go small

By Jon Hoar

A little while back we launched a global content marketing campaign designed to be delivered in 18 markets simultaneously, with hundreds (maybe thousands) of individual assets and complex media plans. The project followed a waterfall delivery methodology, and although we were building on validated user insights and known channel performance metrics, nothing about the campaign itself had been directly tested before we planned to go live with a full global launch. There was nothing unusual about this; it was the old way of delivering marketing campaigns.

On the brink of launch, we hit pause. We asked ourselves: could we be doing this better? Could we apply small batch sizes to marketing campaigns? Could we test and learn our way to more efficient execution and more impactful (more certain) results? In every case, the answer was yes.

We hit rewind

Our first test was to validate the proposition. Our campaign was all about how Skyscanner could help you find the Perfect Flight. We had insights showing that a ‘perfect flight’ was something people understood, but was it something they cared about? This was our riskiest assumption. If the answer was ‘no’, then the campaign was built on sand.

We ran a survey in our target market: we showed people a series of related statements, one of which concerned the importance of ‘finding the perfect flight’. The results came back positive. Of the tested statements, ‘finding the perfect flight’ proved to be one of the most desired outcomes. We had validated our first assumption. People cared.

One small batch

Next, we chose to run the campaign in full but in the smallest, most contained way possible. We decided to go live in one of our smaller markets using our primary channels: Paid Media, Social and PR. We set target ‘pass/fail’ metrics for each channel and campaign asset and deployed as quickly as possible.
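To make the ‘pass/fail’ idea concrete, here is a minimal sketch of how per-channel targets could be recorded and graded after the run. The channel names, metrics and thresholds below are hypothetical illustrations, not the campaign's actual targets.

```python
# Hypothetical per-channel pass/fail thresholds (illustrative numbers only,
# not Skyscanner's actual targets). Each channel gets one metric, a target
# value, and a direction (is a higher number better or worse?).
THRESHOLDS = {
    "paid_media": {"metric": "ctr", "target": 0.010, "higher_is_better": True},
    "social": {"metric": "engagement_rate", "target": 0.030, "higher_is_better": True},
    "pr": {"metric": "pickups", "target": 15, "higher_is_better": True},
}

def grade(channel: str, observed: float) -> str:
    """Return 'pass' or 'fail' for an observed metric value on a channel."""
    rule = THRESHOLDS[channel]
    if rule["higher_is_better"]:
        ok = observed >= rule["target"]
    else:
        ok = observed <= rule["target"]
    return "pass" if ok else "fail"

# Grade a set of observed results once the test campaign has run.
results = {"paid_media": 0.012, "social": 0.021, "pr": 18}
report = {channel: grade(channel, value) for channel, value in results.items()}
print(report)  # {'paid_media': 'pass', 'social': 'fail', 'pr': 'pass'}
```

Writing the thresholds down up front, in one place, is the point: it forces the team to agree what ‘success’ means for each channel before the data comes in.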

We learned a great deal from this short test, probably more about how to test well than about the campaign itself. Here are some of our leading conclusions:

  1. Don’t fly blind: For various reasons (too involved to describe here), we were not able to track many of our key metrics ‘live’. In such a short campaign, it’s fine to review most pass/fail metrics after the campaign has run, but you also need to identify (and be able to respond to) some basic health metrics whilst the campaign is in-flight. For example, our campaign landing page exhibited an unusually high bounce rate, but calculating it was an incredibly convoluted process. We did make changes at the half-way point, but if we’d had live visibility of this number we would have paused activity much earlier and iterated through changes to optimise it.
  2. Make sure the channel matches the campaign: Although we were able to optimise the Google Display Network KPIs down to something close to our ‘pass’ metrics, the high bounce rate suggests that the people clicking through were not getting what they expected when they responded to the ads. The campaign’s value was in rich, educational content, but (whether because of the message or the channel context) respondents to paid marketing seemed to be expecting something far more transactional.
  3. Testing in a single, small market for a short period is imperfect: Although our small market makes sense from a size point of view, it was also a difficult market to prove the campaign in. Content and Social channels were hobbled by the lack of a local Skyscanner news site and social profiles, so we were over-dependent on paid channels. And good PR results can be harder to achieve in smaller markets (without a distinct ‘local’ hook).
  4. But launching to one market saves a lot of work: Various changes were made to the campaign assets in the days running up to launch. The PR team got some invaluable feedback from friendly media contacts and changed the Perfect Flight formula (and associated assets) as a result. A basic usability review also threw up the need for tweaks across social landing pages, ads and beyond. Making these changes to one set of assets was simple and quick. If we had needed to apply them across 18 markets, it would have been very heavy going or (worse) we would have been forced to plunge ahead and ignore the need for improvements.
  5. Think carefully about how you use a landing page: The Perfect Flight campaign was brought to life with a selection of relevant content, and all traffic was directed to this hub. This makes sense intuitively (there are all sorts of ways we can help you understand and get the perfect flight) but, with the page delivering a 90% bounce rate (and two of six links attracting <1% of clicks), perhaps not so much to the user. Thinking (with added hindsight) through the user journey: someone has chosen to click on an ad somewhere online, with the promise of ‘finding their perfect flight’. They are then confronted with a page containing six different choices (all related to the perfect flight). It’s just too much to think about. The ‘reward’ for the user has become too vague and too much work to access, so they hit the back button and continued with their day. A landing page should have just enough information (and no more than that) to persuasively convert the user to the next stage in the funnel. It should not be used as sticky tape to hold a campaign together.
  6. Start with a Minimum Viable Test (MVT): Ours was an unusual case. With the history of this project, we had a fully-formed campaign ready to roll. Hence the decision to launch it as a small batch in one market. But the reality is, we still couldn’t see the wood for the trees. For example, take one ‘fail’ metric: Is the bounce rate high because the channel choices were wrong? Or because the messaging was disconnected? Or the landing page usability was poor? Or the content was not compelling? Or the targeting was off? Starting with an MVT and validating assumptions one at a time, through quick, iterative tests, would have avoided this uncertainty.
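The ‘don’t fly blind’ lesson above can be sketched as a simple in-flight health check: keep running counts of sessions, compute the bounce rate as activity comes in, and flag a pause when it crosses an alert threshold. The 85% threshold and the session numbers below are illustrative assumptions, not figures from the campaign.

```python
# Minimal sketch of a live health metric for a campaign landing page.
# Threshold and session counts are illustrative assumptions only.

def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Bounce rate = sessions that viewed only one page / all sessions."""
    if total_sessions == 0:
        return 0.0
    return single_page_sessions / total_sessions

def health_check(single_page_sessions: int, total_sessions: int,
                 alert_threshold: float = 0.85) -> str:
    """Flag when the in-flight bounce rate crosses the alert threshold."""
    rate = bounce_rate(single_page_sessions, total_sessions)
    return "pause and iterate" if rate >= alert_threshold else "keep running"

# A landing page where 900 of 1,000 sessions bounce is clearly unhealthy:
print(health_check(900, 1000))  # pause and iterate
print(health_check(400, 1000))  # keep running
```

The calculation is trivial; the hard part (as the post notes) is plumbing the numbers through so the check can run whilst the campaign is live, rather than reconstructing them afterwards.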

We want to share more

For more tips, Growth Hacks and job roles from across our global offices, sign up to get Skyscanner Growth Hacks delivered right to your inbox!

About the author

Hi, thanks for reading my post. My name is Jon and I am the Head of Product here at Skyscanner. Come and work with us: check out our latest Developer and Marketing roles within our Growth Tribe.