How to brief a data analyst

Nic English
11 min read · Dec 20, 2021


Note: Quite a few people have correctly pointed out that this piece only talks about how to shape briefs to data analysts, not when to brief, with which people, in which format, or whether an intake form is needed. All of these factors are very important and I hope to cover them in a future piece.

When it comes to data, there most definitely is such a thing as a stupid question.

To be clear, we’re not talking about “What is that x-axis exactly?” or “Sorry, is that thousands or millions?”. No, those are important, sensible questions that you shouldn’t have to ask because the charts should be labelled properly.

I’m talking about stupid, time-wasting, pointless questions that data experts get asked by people who want one thing and one thing only — the data to prove them right.

But I’m also talking about ill-defined, vague, time-consuming questions, where the person asking has barely thought about why they need data, just wants a nice line graph on slide 17, and ends up with someone doing this:

“What are you?” “A laptop sandwich…”

One of the biggest obstacles to a business using consumer data effectively is a failure to ask good questions and, most importantly, to set good briefs.

Bad briefs will:

  • Get you something which is totally different to what you wanted
  • Waste a lot of time (and money)
  • Frustrate your data analyst and make them not want to work with you
  • Clog up your data resource with asks that might not be worth their time

Good briefs will:

  • Get you what you want
  • Really quickly
  • Get you a data analyst who will likely be so happy to be briefed properly that they will do extra bits off their own bat

I’ve seen a lot of bad briefs in my time working in the data field. I’ve also seen a lot of complaints from people who work with data teams saying it’s inefficient/slow/difficult to get what they want.

Here is some advice to the latter group that might help.

What does a bad brief look like?

First of all it’s important to note that your requests to analysts might not feel like briefs, but they are. If an analyst gets a question via email, no matter how small it may seem, that’s a brief.

How a typical data analyst views work requests they receive

The second, super-important thing to note is that your request might seem small but it might actually be a really complicated ask.

Let’s look at an example (that I’ve made up, but is close to personal experience) which illustrates the above.

Helena is a strategist at a digital publishing start-up which manages monetisation for a portfolio of websites covering different verticals such as fashion, wellbeing and baking. She’s been asked by a potential client how their sites perform on engagement, so the client can make a comparison with competitors. Helena needs an answer pronto, so she asks the new wunderkind junior analyst Rach, who joined a few months ago.

Hey Rach,

I need to know what our average site engagement is? Do we perform well on engagement? Is it more than usual at the moment? Can you look at that quickly?

Thanks,

Helena

Seems fairly harmless but I can only imagine the pure panic Rach is feeling as that hits the inbox.

There are so many problems here, let me try to explain from an analyst’s point of view:

  • What is defined as engagement here? Engagement is a subjective term at the best of times.
  • There is likely no agreed methodology/metric for measuring engagement within the business.
  • What is defined as “performing well”? And versus what, exactly: the market, other digital platforms?
  • Engagement probably varies wildly depending on the publication we manage, so this will make the numbers all weird when you average them.
  • “More than usual?” means you need a benchmark. If you need a benchmark, you need a really good understanding of the data, which Rach might not have if she’s never looked at such a dataset before. That takes time.
  • The whole tone of the email is “get me this stuff that you should know about now”. Rach is fairly junior and new, so that’s pretty scary.
  • Erm, what timeframe? Am I looking at a year, a month, a week’s worth of engagement?
  • Finally: how quick is “quickly”?

There are so many problems here that Rach (the analyst) ends up sending a really long email reply asking for clarification on all of the above. All told, she spent half an hour thinking about the problem and writing the email. She hasn’t looked at any data.

Problem is — Helena (the strategist) doesn’t know the answer to most of these questions. Furthermore, she needs something by tomorrow and she’s wasted 30 minutes of analyst resource by asking the wrong questions.

Not only this, the communication pattern is already broken because both parties are on completely different tracks with regard to how to solve this problem. Rach thinks she needs to figure out how to define engagement for the whole business; Helena is just looking for a good “stat” to put in an email to satisfy a client request.

Here’s how that email could have been sent instead, without any extra information from the client.

Hey Rach,

A potential client is asking us how we perform in terms of site engagement. I’m not sure I know what that is?

Do you have any idea and if you do, can you give me a high-level summary? I need to get back to her tomorrow.

Thanks

Helena

To the sender, this email feels more uncertain and less confident, so why is it so much better? Well again, let me give you a data analyst’s point of view:

  1. Immediately, the vagueness of the situation is introduced. Instead of “I expect you to know this” we go to “I don’t expect you to know this, that’s ok”.
  2. Saying stuff like “I’m not sure I know what that is?” implies that they’re looking for advice on how to collaborate to solve the problem.
  3. The scale of the task and the deadline are made very clear: “I want basic information, by the end of tomorrow”.

This conversation now gravitates naturally towards a debate about which engagement metrics are relevant, which aren’t and what can be provided in the available timeframe. Helena might even have time to review the work and ask for some quick builds.

Here are some more examples of terrible briefs which you should avoid at all costs (and why they are terrible):

Can we build a model to see how often our users do X?

Terrible because: Using the word “model” means the complexity of the work in the analyst’s mind has gone up 100-fold, just because you wanted to sound clever.

What are the main triggers of X behaviour?

Terrible because: “Triggers” or “indicators” are words which over-simplify a really complicated task. You’re basically asking the analyst to look at every single piece of information available to find which is most likely to result in your preferred behaviour. This will require some serious heavy data lifting which you might not need.

Can you find any data which indicates that people do X because of Y?

Terrible because: What if you’re wrong? Using analyst resource to try and validate your gut feeling is unfair and a waste of company money. If you’re right then you’re just lucky.

What does a good brief look like?

In the world of consumer data — most briefs fall into three categories:

  1. Prescriptive briefs
  2. Interpretive briefs
  3. Exploratory briefs

Good and bad briefs are possible in all these categories — so it’s important to understand the difference.

A prescriptive brief is one where specific data is needed and only that specific data. Generally it will come in the form of numerous bullet points with sentences like:

% YoY increase in distribution vs current market leader for Cat A products.

These briefs carry an excellent whiff of “give me my f**king data” when read.

An interpretive brief is when we have an established fact or set of facts and we need an analyst to dig into what might be the reason for this. Most often it will look a bit like this:

We know that our customers are buying less frequently since we rolled out our new product set but we don’t understand why — can you help us look into this?

Exploratory briefs are ones where you don’t really know what you want: you have some ideas, but you don’t know where to start. A good example of this would be:

We want to understand what the latest consumer trends are in this category — can you help us?

How to write a prescriptive brief

If you know the dataset that the analyst works with and you feel relatively comfortable reporting on these figures, or you have a very tight turnaround and need numbers immediately, then the ideal brief is the prescriptive one.

When writing a prescriptive brief, it’s a really good idea to follow a structured approach which the data analyst can match to their thought process when building a response. Generally you should brief in the following order:

  • Metrics (the actual number you want, e.g. Visits, Orders or Likes)
  • Dimensions (the type of breakdown you want, e.g. country, category, pages)
  • Filters (which of the dimension values you want, e.g. UK and France, or screwdrivers and ladders)
  • Timeframe (the time range we are looking at, e.g. last quarter, year-to-date, past 7 days)
  • Time interval (the frequency of measurement you want, e.g. daily, weekly, monthly)
  • Comparisons (what we want to compare to, e.g. vs last year, vs page X or vs product Y)
  • Charts (yes or no, and which ones you want)

Aside — if you work with data people a lot you may just want to memorise what the above terms mean as it will help a lot.

An example of this in human would be:

Hey genius,

I was hoping you could help me. I need data on visits, cart adds and sales, broken down by category and country, specifically for products defined as ladders and screwdrivers in the following markets: UK, France and Germany.

I’m looking for this data over the past 24 months on a weekly breakdown. Also, is it possible to compare it to the previous 24 months?

Could you also chart up the visits and sales for all three countries on a weekly basis over the past year please?
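
For what it’s worth, a brief structured like that maps almost line-for-line onto the analyst’s first pass at the data. Here’s a minimal sketch of that translation in Python, assuming a hypothetical weekly dataset with columns like week, country, category, visits, cart_adds and sales (the file and column names are mine, purely for illustration, not from any real system):

import pandas as pd

# Hypothetical dataset: one row per week x country x category,
# with visits, cart_adds and sales as metric columns.
orders = pd.read_csv("weekly_site_metrics.csv", parse_dates=["week"])

# Filters: only the markets and product categories named in the brief.
mask = (
    orders["country"].isin(["UK", "France", "Germany"])
    & orders["category"].isin(["ladders", "screwdrivers"])
)

# Timeframe and comparison: past 24 months vs the 24 months before that.
latest = orders["week"].max()
current = orders[mask & (orders["week"] > latest - pd.DateOffset(months=24))]
previous = orders[
    mask
    & (orders["week"] <= latest - pd.DateOffset(months=24))
    & (orders["week"] > latest - pd.DateOffset(months=48))
]

# Metrics, dimensions and time interval: weekly totals by country and category.
summary = (
    current.groupby(["week", "country", "category"])[["visits", "cart_adds", "sales"]]
    .sum()
    .reset_index()
)

The point isn’t the code; it’s that every block answers one of the bullet points in the structure above, which is exactly why briefing in that order saves so much back and forth.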

Another way of doing this, if you’re handy at basic data work, is to send a spreadsheet with a template of exactly how you want your information. It may seem impersonal, but secretly your data person will love it. Just make sure you feel comfortable doing it this way.

No need to include everything: a basic format which includes all the combinations at a summary level is generally enough, and your data guru will ask for any clarification.
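
If you do go the template route, it doesn’t need to be fancy. Something like the skeleton below, with made-up column names and the metric columns left empty for the analyst to fill in, is usually plenty:

import pandas as pd

# A bare-bones template: one row per combination you care about,
# metric columns left blank for the analyst to populate.
template = pd.DataFrame(
    {
        "week": [],
        "country": [],      # e.g. UK / France / Germany
        "category": [],     # e.g. ladders / screwdrivers
        "visits": [],
        "cart_adds": [],
        "sales": [],
    }
)
template.to_csv("brief_template.csv", index=False)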

Remember these are the “bread and butter” asks of your data analyst. If you can’t get these right then you’re really going to struggle to get them on board to do the trickier stuff…

How to write an interpretive brief

Interpretive briefs tend to revolve around an established understanding of a business problem and a need to understand the consumer behaviour driving it.

As an example:

Finance have informed us that returns are up 40% and this is affecting the bottom line. We want to know if this is a shift in consumer behaviour or something else — can you help us solve this?

The above is largely fine, and no doubt there will be back and forth, but it can be improved a lot from the off.

The key thing here is not to fall into the trap of trying to solve the whole problem with data. This can lead to huge data complexity, serious resource drain and unnecessarily sophisticated statistical models or dashboards, as well as leaving you open to making a business-defining decision entirely off the back of one analyst’s work while ignoring good business instincts. That, and you might stress the hell out of your analyst.

Instead, it’s best to think about how data can help inform the next steps, which are mainly:

  • Validate/debunk your hypotheses
  • Come up with new ones
  • Identify recent trends/changes in consumer behaviour
  • Estimate the impact of future changes in behaviour
  • Potentially inform some decisions

Applying the above to this example, the brief becomes:

Hey Rach,

Finance have informed us that returns are up 40% over the past three months and this is affecting the bottom line. We have some theories, namely X, Y and Z.

Can you help us validate (or invalidate) X, Y and Z please?

It would also be great if you could have a think about other factors that cause returns, and how the data has trended over the past 12 months.

In an ideal world, we’d be able to see how changes in a certain behaviour affect the return rate in a calculation tool we can play around with.

Thanks

The great thing about this brief is that Rach knows exactly what to do next, and the briefer is getting three separate pieces of work:

  1. Data backing up or debunking their hypotheses
  2. A new list of potential hypotheses with trends and analysis
  3. A “model for dummies” where they can input numbers and see their impact on the key metric, return rate (a rough sketch of what that could look like is just below)
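
As a loose illustration of that third deliverable, and nothing more than that, the “calculation tool” could be as simple as a spreadsheet formula or a tiny function like the one below. The baseline rate, the driver names and the sensitivities are all placeholders I’ve invented; in practice the analyst would estimate them from the data.

# A toy what-if calculator for the return-rate example.
BASELINE_RETURN_RATE = 0.12  # placeholder current return rate (12%)

# Placeholder sensitivities: change in return rate per one-unit change in each driver.
SENSITIVITY = {
    "share_of_new_product_range": 0.05,
    "share_of_discounted_orders": 0.02,
    "average_delivery_days": 0.01,
}

def estimated_return_rate(changes: dict) -> float:
    """Estimate the return rate after the given changes to each driver."""
    shift = sum(SENSITIVITY[name] * delta for name, delta in changes.items())
    return BASELINE_RETURN_RATE + shift

# Example: delivery takes two days longer and the new product range
# grows by 10 percentage points.
print(estimated_return_rate({"average_delivery_days": 2,
                             "share_of_new_product_range": 0.10}))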

How to write an exploratory brief

Exploratory briefs are the most difficult ones. Partly because you don’t really know what you’re asking for, partly because you probably already know that the data doesn’t exist and if it does it won’t be easy to find.

If thinking in Rumsfeldian logic, we’re firmly in “known unknowns” and “unknown unknowns” territory.

Rumsfeld was probably evil incarnate but this is a hell of a thought dynamic

A good example of this is one I was set a few years ago when working in a more market-research oriented role:

“Young people are drinking far less alcohol than they used to — why?”

If you don’t work hands-on with data you won’t know this, but data people generally love being set this sort of wide-ranging question that almost seems too hard to answer. It’s carte blanche to go off in any direction without any specific end goal or answer in sight.

A good exploratory brief will not expect a definitive answer to a question but instead encourage imperfect answers and more of a collaborative process. It will also avoid stringent output requests such as charts or decks. An example:

Hi Rach,

I’ve been asked by our senior shower gel client to look into what songs people sing in the shower?

I literally have no idea if you can help me or how you would go about this, any ideas? We have until the week after next, when we next meet them, if that helps…

Thanks

Ok, so chances are low that you work in an industry where this sort of question pops up, but the key thing in this brief is that there is zero expectation of a solution, an acknowledgement that this is not an easy thing to tackle, and a generous timeframe. Basically it’s “go play — get back to me with something or nothing”.

Aside — I’ve always been baffled by people trying to rationalise these sorts of ridiculous questions by senior people into formal briefs — as if data analysts have databases of songs sung in the shower without falling foul of GDPR.

My key tip with exploratory briefs is to actually try them out on your analyst. You’d be amazed how often it comes off, or how often they find some curiosity along the way that informs something else, or stumble on a brand new technique that can be used another time.

If you’ve made it this far then I think it’s only fair to share a cheat sheet of the key dos and don’ts of briefing.

Do

  • Make expected outputs and timeframes clear
  • Be prescriptive when asking for detailed data
  • Leave the door open for questions/corrections
  • Be honest about your level of understanding/expectation
  • Share what you know, and what you don’t about the problem

Don’t

  • Ask for a “model”
  • Use the word “predictive” unless you know exactly what that will entail
  • Ask “Do we have any data that shows x?”
  • Say “Can we prove that x?”
  • Pretend you understand data better than you do
  • Rely on data to solve everything — at best it informs your next decision


Nic English

I am a consumer insights and experimentation expert with over ten years’ experience working at the intersection of data, digital, marketing and strategy.