Dashboard technology has sped up the analytical and discovery process for analysts and business owners, but applying some low-tech best practices really drives analytical and business success. I am a 15-year veteran of the analytics space, meaning I experienced the productivity boost brought to us by better tools first-hand. Even so, the bulk of inefficiency in analytical shops remains in the form of rework that can be prevented by some low-tech (or no-tech) best practices. I recently followed these exact steps while helping Xandr’s buy-side product team monitor the health of their business.
#1: For the hundredth time, requirements meetings, people!
First and foremost, a requirements sync-up is necessary. Analysts may say they don’t need it because they will “boil the ocean,” and business users may say they don’t need it because they know how to request things from analytics, but both would save time if they simply met.
Here are the things that analysts need to do, and that business owners should insist on from analysts, in the requirements meeting:
Understand the real-world process you are modeling or streamlining in this job
Business owners are masters of their domains, but not their data. They shouldn’t be the architect of the data or dashboard solution at the beginning. Analysts like cookbook requests (a simple list of dimensions and measures) until they realize that requirements are missing, or that their work didn’t deliver value to the business.
Understand the magnitude and get a sense of what is closest to the edge of the business’s capacity to care.
Is 100% YoY growth of a single client’s spend important to your senior leaders? It depends on whether that client went from 5 to 10 dollars or from 5M to 10M. Knowing how to watch your long-tail phenomena is an important aspect of quality dashboards.
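To make the point concrete, here is a minimal sketch of why percentage growth alone hides the long tail; the client names and spend figures are hypothetical:

```python
# Hypothetical client-spend data: percent growth alone makes the
# long-tail client look identical to the whale, so pair it with
# nominal change before surfacing either on a dashboard.
clients = [
    {"client": "SmallCo", "spend_prev": 5, "spend_now": 10},
    {"client": "BigCo", "spend_prev": 5_000_000, "spend_now": 10_000_000},
]

for c in clients:
    c["pct_growth"] = (c["spend_now"] - c["spend_prev"]) / c["spend_prev"]
    c["nominal_growth"] = c["spend_now"] - c["spend_prev"]

# Both clients show exactly 100% YoY growth...
assert all(c["pct_growth"] == 1.0 for c in clients)

# ...but only the nominal change reveals which one leadership cares about.
top = max(clients, key=lambda c: c["nominal_growth"])
print(top["client"], top["nominal_growth"])  # BigCo 5000000
```

A dashboard that leads with the percentage column alone would rank both rows as ties; showing both columns keeps the long tail honest.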
Understand your true north.
If there is something in your business owners’ worldview that is either a deeply held belief about the business or part of their informational diet that you will end up commenting on, know that up front. Either start from that place and drill down into the details, or, if the data contradicts something they have held on to, perhaps for years, make sure that contradiction comes out as early as possible in the build.
Understand the factorial impact of dimensions. What is interesting? What isn’t? What is redundant?
Sure, analysts and business owners alike take the shortcut of boiling the dimensional ocean, but at what cost? Clarity of signal. The most important metric should be the biggest, most obvious, inescapable piece of information they see; that is why it is a dashboard, not a mystery novel.
Understand the facts/metrics: what will they tell you, and how will they drive action?
It seems overly simple, but knowing the cadence of calculation (are quarterly goals important to the business?), whether nominal change or percentage change should lead, and what constitutes a trend in your business will save everyone time and money.
#2: For bigger projects, create a conceptual ERD
If you do this, you will know the likelihood that your end solution will require multiple fact tables (and possibly dimensional bridge tables) and you will answer technical questions around granularity, ordinality and cardinality. Sometimes, this step will make you realize that the relationship of certain metrics has mixed granularity, which adds complexity to dashboards, regardless of toolset.
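When granularities do mix, the usual remedy is to roll the finer-grained fact up to the coarser grain before joining. A minimal sketch in plain Python, with hypothetical table contents, of a daily spend fact meeting a monthly goal fact:

```python
# Hypothetical mini-model: a daily spend fact and a monthly goal fact
# live at different grains, so daily rows must be rolled up to the
# month before the two can be compared without fan-out.
daily_spend = [
    {"month": "2023-01", "day": "2023-01-05", "spend": 100},
    {"month": "2023-01", "day": "2023-01-20", "spend": 150},
    {"month": "2023-02", "day": "2023-02-03", "spend": 200},
]
monthly_goal = {"2023-01": 300, "2023-02": 180}

# Roll daily spend up to the monthly grain.
spend_by_month = {}
for row in daily_spend:
    spend_by_month[row["month"]] = spend_by_month.get(row["month"], 0) + row["spend"]

# Now both facts share a grain and goal attainment is safe to compute.
attainment = {m: round(spend_by_month[m] / goal, 2) for m, goal in monthly_goal.items()}
print(attainment)  # {'2023-01': 0.83, '2023-02': 1.11}
```

Spotting this grain mismatch on a conceptual ERD takes minutes; discovering it after the dashboard double-counts spend takes a rebuild.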
#3: Send a sample of outbound data.
When building dashboards or reporting solutions, data exports are a really common use case, but there is an important side bonus: knowing how the data actually looks gives everyone an opportunity to sync up on retrieved values. The exercise of creating a human-readable piece of data will often reveal gaps in an analyst’s understanding of the requirements or business definitions that didn’t come out in the requirements session, because it was assumed that everyone shared the same definitions at that point.
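Producing that human-readable sample can be as simple as a few rows of CSV. A sketch using Python's standard library, where the column names and values are hypothetical:

```python
# Minimal sketch: write a small, human-readable sample of the outbound
# data so analysts and business owners can sync on definitions early.
# Column names and values are hypothetical placeholders.
import csv
import io

sample_rows = [
    {"date": "2023-01-05", "client": "BigCo", "impressions": 12000, "spend_usd": 340.50},
    {"date": "2023-01-05", "client": "SmallCo", "impressions": 800, "spend_usd": 12.25},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["date", "client", "impressions", "spend_usd"])
writer.writeheader()
writer.writerows(sample_rows)
sample_csv = buf.getvalue()
print(sample_csv)
```

Circulating a file like this before the build invites exactly the "wait, is spend net or gross?" questions you want answered early.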
#4: Build a prototype dashboard for the business, focusing on the “north star” metrics
Cross-filtering, drill-downs, and sub-reports are a key part of the value proposition for dashboards, but analysts shouldn’t dive in before demonstrating to the business that they understand the world in which it operates. If the analyst throws the business owner a curve ball by delivering a well-established KPI sliced in a new way, this is a great time to pause for discussion. Otherwise, a lot of analyst time can be wasted enabling features for a metric that has lost relevance.
#5: Analysts: Ask and answer five questions on your own
The point of most dashboards is to not only present KPIs, but also give some analytical depth that can point toward an action. In the real world, my car may be losing speed, but is it because of a low-pressure tire, or an empty tank? You need to know at least a little information beyond the top-level number to give the business real value.
Here is a treasure hunt, defined as generally as I can, to provide ideas on how to “kick the tires” on your mid-build dashboard:
· Pick a metric, preferably a top-level/true north one.
· Establish the top-ranking/best value for the business on that metric, aggregated on a dimension of your choosing. How easy was it to find? Then find the lowest/worst-ranked value by the same criteria. How easy was that to find?
· Looking at each of these high/low categories, try to find a metric where that same category ranks directionally opposite, but not mathematically/causally opposite. Do you see any interplay of metrics? Does this tell you anything new about your business?
· Now, pick a rate or index, if you have one in your dash. Look at highs and lows like you just did for your true north metric.
· In addition to examining high and low categorical values (similar to the previous exercise), now find two categories where the rate/index is similar. Look at the two categories, and compare the constituent parts of the rate/index. Are they also the same? If not, what does this tell you about the relevance of the rate/index and/or the parts that make it up? Are you reporting enough information for one to be the proxy of another? Is there room for less, or do you need room for more?
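The steps above can be sketched in plain Python. Every category name and number here is hypothetical; the point is the shape of the exercise, not the data:

```python
# A stdlib sketch of the "treasure hunt": find the best and worst
# categories on a true-north metric, then check whether categories
# with a similar rate are built from similar constituent parts.
# All category names and numbers are hypothetical.
rows = [
    {"channel": "display", "spend": 900, "clicks": 30, "impressions": 10000},
    {"channel": "video",   "spend": 120, "clicks": 3,  "impressions": 1000},
    {"channel": "native",  "spend": 500, "clicks": 15, "impressions": 5000},
]

# Steps 1-2: highest and lowest category on the true-north metric (spend).
best = max(rows, key=lambda r: r["spend"])
worst = min(rows, key=lambda r: r["spend"])
print(best["channel"], worst["channel"])  # display video

# Steps 4-5: all three channels share an identical rate (CTR)...
for r in rows:
    r["ctr"] = r["clicks"] / r["impressions"]
print([round(r["ctr"], 3) for r in rows])  # [0.003, 0.003, 0.003]

# ...yet their constituent parts differ by an order of magnitude, so the
# rate alone is a poor proxy and the parts deserve their own tiles.
```

If answering these questions in your actual dashboard is harder than in this toy script, that friction is itself a finding about the build.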
#6: Analysts: Work with the end user as they try to ask their own questions & break things
This is far and away the emotionally hardest part, but in a lot of ways this kind of testing is just as critical as the requirement gathering and design steps in getting your work across the finish line. It will ensure that the dashboard will require less maintenance or rebuilding in the near term. This test can go one of two ways:
· You can throw the “keys” over to the business owner, and say, do your worst, and I am here to bail you out if you get stuck, or need an immediate explanation. Be ready to take control, because you may not be able to see clicks or keystrokes that manipulate the dash in unforeseen ways.
· You can say, let me drive, because I know where everything is and can save you some time. This route is really good because, if you built some cool functionality into your tool, you get a chance to show it off. But you absolutely need to go slow, call out exactly which clicks and keystrokes do what as you go, and help them learn the tool as well as you have come to know it. Best-case scenario, you will need to get out of their way, because they will be excited to drive.