Top 25 Mistakes Corporates Make in their Advanced Analytics Programs

Pedro Uria-Recio
5 min read · Sep 28, 2018

Most organizations make mistakes when putting their advanced analytics programs in place. These mistakes can be grouped into five areas: strategy, people, execution, technology, and finance.

Strategy:

1. Day-dreaming that analytics is a plug & play magic wand that will bring very short-term ROI. Well-executed basic Excel models might have brought quick wins in the 2000s, but advanced analytics takes time. Analytics is never plug & play: plugging data into models is exceptionally lengthy, learnings are not transferable across companies or markets, and programs require high OPEX in people and high CAPEX in systems.

2. Solving problems that are not worth solving, which wastes time and resources. Analytics is not about solutions looking for problems but about problems looking for solutions. Questions like “What can we do with blockchain?” do not make sense. “How can I solve my marketing problem?” is a question that makes sense. The worst mistake a Chief Data Analytics Officer can make is not having a clear view of the key challenges and opportunities each functional area confronts.

3. Relying solely on vendors or consultants for analytics, especially model creation. The post-mortem of how corporates fail to develop capabilities with consultants reads as follows: the client hires a consultant to deliver a project and, simultaneously, to develop internal capabilities. The client has wildly unrealistic expectations about the project’s impact, and consultants never say “No” and oversell the project. The impact does not materialize, and one day the client tells the consultant, “If you do not get some impact in the next month, I will stop your contract.” That day, capability development officially dies, if it ever existed. RIP: a few million dollars in the trash bin. Anyway, analytics is the brain of the company. How could corporates even think they could outsource it?

4. Not developing a short, ruthlessly prioritized list of objectives. Since one hand has only five fingers, management should pick at most five metrics rather than making everything seem essential.

5. Saying yes to random management requests, such as pet projects or glamorous visualizations and reporting, which often results in analysis-paralysis syndrome.

6. Not using external data. Companies have been exposed to their internal data for years; maybe not at its full potential, but most are somewhat familiar with it. Internal data is not going to transform the business radically. Only external data from other companies or the public domain can change the industry: social media, maps, competitors’ products, prices, digital advertising records, etc. Yet most companies are not doing enough to build external data assets.
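
A rough illustration of the point, as a minimal pandas sketch: the file paths and column names below are hypothetical placeholders for an internal customer table enriched with a public-domain dataset.

```python
# Minimal sketch: enriching internal data with an external, public-domain source.
# All paths and column names are hypothetical.
import pandas as pd

internal = pd.read_csv("internal/customers.csv")      # customer_id, zip_code, monthly_spend
external = pd.read_csv("external/census_by_zip.csv")  # zip_code, median_income, population

# Join public demographics onto internal customers to widen the feature set.
enriched = internal.merge(external, on="zip_code", how="left")
print(enriched.head())
```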

People:

7. Organizing analytics under functions that do not drive the business daily, such as IT or strategy. Analytics is only powerful if it is coupled organizationally with daily operations.

8. Letting multiple analytics teams flourish with organizational silos among them. Analytics needs to keep an integrated view of the business.

9. Attracting talent only through base compensation. Instead, it is necessary to build a sense of purpose, create a powerful employer brand, and develop internal talent.

10. Hiring a bunch of PhDs who strive to develop highly nuanced models instead of directionally correct, rough-and-ready solutions, and hence fail to produce actionable insights. So, don’t hire PhDs; hire highly coachable fast learners.

11. Hiring an overly technical Chief Data Analytics Officer, or a non-technical one. Instead, the CDAO needs to be technical enough to coach the team and business-driven enough to understand business problems.

12. Not bringing domain experts and internal business consultants into the analytics teams. These roles bridge the gap between business leaders and data scientists and ensure an end-to-end journey from idea to impact.

13. Neglecting the creation of a data-driven culture through active coaching across the whole organization, from sales agents to the CEO, especially sales agents and the CEO.

14. Not being objective and remaining biased toward the status quo or leadership thinking. Analytics teams deeply embedded in business functions or BUs are more likely to have these troubles than centralized ones. This is why some organizations create quality control teams.

Execution:

15. Not embedding analytics in operating models and day-to-day workflows, which fails to integrate technology with people. Using analytics as part of daily activities helps users make data-driven judgments, take better-informed decisions, build consumer feedback into solutions, and rapidly iterate on new products. Instead, many still rely on gut feeling and HiPPOs (Highest Paid Person’s Opinions).

16. Not co-locating data scientists with the business teams they support. Otherwise, they will not talk to each other.

17. Managing analytics projects in a waterfall fashion. The parameters of a model cannot be known upfront; they are determined through an iterative process that looks more like an art than a science. Therefore, analytics projects need to be iterative, following, for example, an agile framework.
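
A minimal sketch of that iterative reality, assuming scikit-learn and a synthetic dataset: the right hyperparameters are not known upfront; a cross-validated search discovers them.

```python
# Minimal sketch: model parameters emerge from iteration, not upfront planning.
# Assumes scikit-learn; the dataset and parameter grid are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Nobody knows the best depth or tree count in advance; search iteratively.
param_grid = {"n_estimators": [50, 100, 200], "max_depth": [3, 5, None]}
search = GridSearchCV(RandomForestClassifier(random_state=42),
                      param_grid, cv=5, scoring="roc_auc")
search.fit(X, y)

print("Best parameters found by iteration:", search.best_params_)
print("Cross-validated AUC:", round(search.best_score_, 3))
```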

18. Not being able to scale analytics pilots up. Analytics often starts by piloting use cases, and companies often kill pilots as soon as they need to reallocate funding to other shorter-term initiatives.

19. Neglecting data governance as a fundamental enabler. Data governance refers to the roles, processes, and systems an organization needs to manage its data consistently and adequately as an asset, from driving data quality to handling access control and defining data architecture in a standardized way.

Technology:

20. Trying to create data science models without refining your data engineering infrastructure: cleaned repositories, efficient engines, and streamlined extract-load-transform (ELT) processes. Data engineering without actual use cases to model is also wrong. Modeling and engineering must proceed in parallel, iterating together.
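
To make the engineering side concrete, here is a minimal pandas sketch of one extract-transform-load step; the paths and columns are hypothetical.

```python
# Minimal ETL sketch in pandas; file paths and column names are hypothetical.
import pandas as pd

# Extract: read a raw transaction dump.
raw = pd.read_csv("raw/transactions.csv", parse_dates=["timestamp"])

# Transform: clean and aggregate into a model-ready table.
clean = (
    raw.dropna(subset=["customer_id", "amount"])
       .query("amount > 0")
       .assign(month=lambda df: df["timestamp"].dt.to_period("M").astype(str))
)
monthly_spend = clean.groupby(["customer_id", "month"], as_index=False)["amount"].sum()

# Load: persist the curated table for data scientists to consume.
monthly_spend.to_parquet("curated/monthly_spend.parquet", index=False)
```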

21. Not using essential technologies: Hadoop, Spark, R, Python, an advanced visualization tool of your choice, and a granular self-service reporting system open to the whole organization.
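
As an illustration of the Spark piece, a minimal PySpark sketch of the kind of aggregate a self-service reporting layer might expose; the path and columns are hypothetical.

```python
# Minimal PySpark sketch; the parquet path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("self-service-report").getOrCreate()

# Read a curated table and expose a simple aggregate for self-service reporting.
orders = spark.read.parquet("curated/orders.parquet")
report = (orders.groupBy("region")
                .agg(F.sum("revenue").alias("total_revenue"),
                     F.countDistinct("customer_id").alias("customers")))
report.show()
```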

22. Having technological silos among data repositories, which makes integrating different kinds of data into a model challenging. The power of analytics increases exponentially with the diversity of data.

23. Not automating analytics through A.I., which can be a brilliant assistant to data scientists. A.I. automation helps data scientists cleanse data, check for correctness, deploy models, detect relevant prediction features and model obsolescence, or even generate hundreds or thousands of model variations. All in all, the analytics strategy of the business has to be a subset of the overall A.I. strategy, since the datasets need to feed the A.I. systems.
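
One concrete flavor of such automation, sketched with scikit-learn on synthetic data: recursive feature elimination picks predictive features automatically instead of a data scientist screening them by hand.

```python
# Minimal sketch of A.I.-assisted automation: automated feature selection.
# Assumes scikit-learn; the data is synthetic and illustrative.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=30, n_informative=5,
                           random_state=0)

# Recursive feature elimination with cross-validation keeps only the
# features that actually help prediction.
selector = RFECV(LogisticRegression(max_iter=1000), step=1, cv=5)
selector.fit(X, y)

print("Features kept automatically:", selector.n_features_)
```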

Finance:

24. Not allocating enough budget for analytics platforms while still keeping Shangri-La expectations. The opposite is also an error: giving more than enough money with no direct link to business outcomes.

25. Not measuring the ROI of analytics initiatives. ROI materializes in the mid-term, but that does not mean you don’t measure it.
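
At its simplest, the measurement is basic arithmetic; the sketch below uses hypothetical figures.

```python
# Minimal ROI sketch; the figures are hypothetical placeholders.
def analytics_roi(incremental_profit: float, program_cost: float) -> float:
    """Return ROI as a fraction: (gain - cost) / cost."""
    return (incremental_profit - program_cost) / program_cost

# Example: a churn model credited with $1.2M incremental profit on $0.8M spend.
print(f"ROI: {analytics_roi(1_200_000, 800_000):.0%}")  # -> ROI: 50%
```

The hard part, of course, is attributing incremental profit to the initiative in the first place, for example through holdout groups.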

Disclaimer: Opinions in this article are the author’s own and are not endorsed by the author’s employer.

About the author:

Pedro URIA RECIO is a thought leader in artificial intelligence, data analytics, and digital marketing. His career has encompassed building, leading, and mentoring diverse high-performing teams; developing marketing and analytics strategy; commercial leadership with P&L ownership; leading transformational programs; and management consulting.
