Key ingredients to being data driven
Written by Kristen Kehrer, Founder, Data Moves Me
Companies love to exclaim “we’re data driven”. There are obvious benefits to being a data-driven organisation, and everyone nowadays has more data than they can shake a stick at. But what exactly does an organisation need to be “data driven”?
Just because you have a ton of data, and you’ve hired people to analyse it or build models, does that make you data driven? No. That’s not enough.
Although we think a lot about data and how to use it, being data driven needs to be a priority at the executive level and become part of the culture of the organisation; more so than simply having a team with the necessary capabilities.
Here are the baseline qualities that I believe are necessary to be effective in your “data driven-ness”. Now I’m making up words.
To be data driven:
- Test design and analysis are owned by analytics/data science teams.
- Dashboards are already in place that give stakeholders self-serve access to key metrics. (Otherwise you’ll have low-value ad-hoc requests to pull these metrics, and it’ll be a time sink.)
- Analytics/Data Science teams collaborate with the business to understand the problem and devise an appropriate methodology.
- Data governance is in place, with consistent usage of data definitions across departments/the organisation.
- You have a data strategy.
You’ll notice that there is a lack of fancy hype buzzwords above. You don’t need to be “leveraging AI” or calling things AI that are in fact hypothesis tests, business logic, or simple regression.
I don’t believe fancy models are required to consider yourself data driven. A number of the points listed above refer to the attitudes of the organisation and how it partners and collaborates with analytics and data science teams. I love building models as much as the next data scientist, but you can’t build next-level intelligence on a non-existent foundation.
To clarify, I’m not saying every decision in the organisation needs to be driven by data to be data driven. In particular, if you’re going to make a strategic decision regardless of the results of a test or analysis, then you should skip doing that test. I’m a big advocate of only allocating the resources to a project if you’re actually going to USE the results to inform the decision.
Let’s take a look at the points from above.
Test design and analysis are owned by analytics/data science teams:
Although data science and analytics teams often come up with fantastic ideas for testing, many ideas also come out of departments outside analytics. For instance, in eCommerce the marketing team will have many ideas for new offers, and the site team may want to test a change to the UI. This sometimes gets communicated to the data teams as “we’d like to test this thing, this way”. And although these non-analytics teams have tremendous skill in marketing and site design, and understand the power of an A/B test, they often do not understand the trade-offs between effect size and sample size, or what solid test design requires.
I’ve been in the situation more than once, at more than one company, where I’m told “we understand your concerns, but we’re going to do it our way anyways.” And this is their call to make, since in these instances those departments have technically “owned” test design. However, the data resulting from these tests often cannot be analysed. So although we did it their way, the end result did not answer any questions. Time was wasted.
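To make those trade-offs concrete, here is a minimal sketch of the power analysis that should happen before any test launches. It assumes a two-proportion conversion test; the library choice (statsmodels) and every number are illustrative, not anything prescribed above.

```python
# A minimal pre-test power analysis for a two-proportion A/B test.
# Baseline rate, target lift, power, and alpha are all illustrative.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_rate = 0.05   # current conversion rate
target_rate = 0.055    # smallest lift worth detecting (10% relative)

# Standardise the two proportions into an effect size (Cohen's h).
effect_size = proportion_effectsize(target_rate, baseline_rate)

# Solve for visitors needed per variant at 80% power, 5% significance.
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size,
    power=0.80,
    alpha=0.05,
    ratio=1.0,  # equal traffic split between control and treatment
)
print(f"Visitors needed per variant: {n_per_variant:,.0f}")
```

Skipping this arithmetic is exactly how a “this thing, this way” test ends up with too little traffic to ever detect the effect it was built to find.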
Dashboarding is in place:
This is a true foundational step. So much time is wasted if you have analysts pulling the same numbers every month manually, or on an ad-hoc basis. This information can be automated, stakeholders can be given a tour of the dashboards, and then you won’t be receiving questions like “what does attrition look like month over month by acquisition channel?” It’s in the dashboard, and stakeholders can look at it themselves. The time saved can be allocated to diving deep into much more interesting and thought-provoking questions rather than pulling simple KPIs.
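As a sketch of what sits behind such a dashboard, here is the kind of recurring pull that can be automated once instead of being rebuilt every month. The file and column names (customers.csv, churn_date, acquisition_channel) are hypothetical.

```python
# A minimal sketch of an automatable monthly pull: churn counts by
# acquisition channel. File and column names are hypothetical; a true
# attrition rate would also need the active base as a denominator.
import pandas as pd

customers = pd.read_csv(
    "customers.csv", parse_dates=["signup_date", "churn_date"]
)

# Bucket each churn event into its calendar month.
customers["churn_month"] = customers["churn_date"].dt.to_period("M")

# Dashboard-friendly grid: channels as rows, months as columns.
monthly_churn = (
    customers.groupby(["acquisition_channel", "churn_month"])
    .size()
    .unstack("churn_month", fill_value=0)
)
print(monthly_churn)
```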
Analytics/Data Science teams collaborate with the business on defining the problems:
This relationship takes work, because it is a relationship. Senior leaders need to make it clear that a data-driven approach is a priority for this to work. In addition, analytics often needs to invite themselves to meetings that they weren’t originally invited to. Analytics needs to be asking the right questions and guiding analysis in the right direction to earn this seat at the table. No relationship builds overnight, but this is a win-win for everyone. Nothing is more frustrating than pulling data when you’re not sure what problem the business is trying to solve. It’s Pandora’s box: you pull the data they asked for, it doesn’t answer the question, so the business asks you to pull them more data. Stop. Sit down, discuss the problem, and let the business know that you’re here to help.
Data governance and consistent usage of data definitions across departments/the organisation:
This one may require a huge overhaul of how things are currently being calculated. The channel team, the product team, the site team, and others may all be calculating things differently if the business hasn’t communicated an accepted definition. These definitions aren’t necessarily determined by analytics themselves; they’re agreed upon. An established business that has done a lot of growing but not as much governance can feel the pain of trying to wrangle everyone into using consistent definitions. But if two people try to do the same analysis and come up with different numbers, you’ve got problems. This is again a foundation that is required for you to move forward and work on cooler, higher-value projects; you can’t do that if you’re spending your time reconciling numbers between teams.
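One lightweight pattern for enforcing an agreed definition, sketched below with a hypothetical “active customer” rule, is to put it in a single shared function that every team imports rather than re-derives:

```python
# A minimal sketch of a single shared metric definition. The 30-day
# window is a hypothetical agreed-upon rule, not one from this article.
import pandas as pd

def is_active(last_order_date: pd.Series, as_of: pd.Timestamp) -> pd.Series:
    """The organisation's one definition of an active customer:
    at least one order in the 30 days before `as_of`."""
    return (as_of - last_order_date) <= pd.Timedelta(days=30)

# Every team calls the same function, so the numbers reconcile by default.
orders = pd.Series(pd.to_datetime(["2019-02-25", "2019-01-02"]))
print(is_active(orders, pd.Timestamp("2019-03-20")))  # True, False
```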
You have a data strategy:
This data strategy is going to be driven by the business strategy. The strategy is going to have goals and be measurable, and the analyses you plan for have strong use cases. People don’t just come out of the woodwork asking for analysis that doesn’t align to the larger priorities of the business. Questions like “do we optimise our ad spend or try to tackle our retention problem first?” come down to expected dollars for the business. Analytics doesn’t get side-tracked answering lower-value questions when it should be working on the problems that will save the business the most money.
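That “expected dollars” comparison can be as crude as a back-of-the-envelope calculation. A sketch, with every figure invented for illustration:

```python
# A back-of-the-envelope prioritisation by expected dollars.
# Every figure here is invented for illustration.
projects = {
    # name: (estimated annual impact if it works, probability of success)
    "optimise ad spend":     (400_000, 0.7),
    "fix retention problem": (900_000, 0.4),
}
for name, (impact, p_success) in projects.items():
    print(f"{name}: expected value ${impact * p_success:,.0f}")
# optimise ad spend: expected value $280,000
# fix retention problem: expected value $360,000
```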
In summary:
I hope you found this article helpful. Being data driven will obviously help you to make better use of your data. However, becoming data driven involves putting processes into place and having agreement about who owns what at the executive level. It’s worth it, but it doesn’t happen overnight. If you’re not yet data driven, I wish you luck on your journey to get there. Your analysts and data scientists will thank you.
Originally published at digileaders.com on March 20, 2019.