Pedro Uria Recio, Vice President, Head of Axiata Analytics, Axiata [MYX: 6888]
The most common mistakes corporates make when putting in place their advanced analytics programs can be grouped into strategy, people, execution, technology, and finance.
1. Day-dreaming that analytics is a plug-and-play magic wand that will bring short-term ROI. Well-executed basic Excel models might have brought quick wins in the 2000s, but advanced analytics takes time. Analytics is never plug-and-play: plugging data into models is extremely lengthy, learnings are not transferable across companies or markets, and programs require high OPEX in people and high CAPEX in systems.
2. Solving problems that are not really worth solving, which results in a waste of time and resources. Analytics is not about solutions looking for problems but problems looking for solutions. Questions such as “What can we do with deep learning?” do not make sense. The worst mistake of the Chief Data Analytics Officer is not having an extremely clear view of the key challenges and opportunities each functional area is confronted with.
3. Relying solely on vendors or consultants for analytics, especially on model creation. Analytics is the brain of your company. How could corporates even think they could outsource it?
4. Not developing a short, focused list of priorities. You can only count five fingers on one hand, so pick at most five metrics rather than making everything seem important.
5. Saying yes to random management requests, like pet projects or glamorous visualizations and reporting, which often results in analysis-paralysis syndrome.
6. Not using external data. Companies have been exposed to their internal data for years, maybe not at its full potential, but most of them are somehow familiar with it. Internal data is not going to radically transform the business. Only external data from other companies or from the public domain can transform the business: social media, maps, competitors’ products and prices, digital advertising records, etc. Yet most companies are not doing enough to build external data assets.
7. Organizing analytics under functions that do not drive the business on a daily basis, such as IT or strategy. Analytics is only powerful if it is coupled organizationally with daily operations.
8. Letting multiple analytics teams flourish with organizational silos among them. Analytics needs to keep an integrated view of the business.
9. Attracting talent only through base compensation. Instead, it is necessary to build a sense of purpose, to create a powerful employer brand and to develop internal talent.
10. Hiring a bunch of PhDs who strive to develop highly nuanced models instead of directionally correct rough-and-ready solutions, and hence fail to produce actionable insights. Preferably, hire highly coachable fast learners.
11. Hiring a purely technical Chief Data Analytics Officer, or a purely non-technical one. The CDAO needs to be both: technical enough to coach the team and business-driven enough to understand business problems.
12. Not bringing domain experts and internal business consultants into the analytics teams to bridge the gap between business leaders and data scientists, ensuring an end-to-end journey from idea to impact.
13. Neglecting the creation of a data-driven culture through active coaching across the whole organization from sales agents to the CEO.
14. Not being objective enough and remaining biased toward the status quo or leadership thinking. Analytics teams deeply embedded in business functions or BUs are more prone to these biases than centralized ones.
15. Not embedding analytics in the operating models and day-to-day workflows, which results in a failure to integrate technology with people. When users apply analytics as part of their daily activities, they make data-focused judgements, make better-informed decisions, build consumer feedback into solutions, and rapidly iterate new products. Instead, many still rely on gut feeling and the highest-paid person’s opinion (HiPPO) for decisions.
16. Not co-locating data scientists with the business teams they support. Otherwise, they will not talk to each other.
17. Managing analytics projects in waterfall. The parameters of a model cannot be known upfront; they are determined through an iterative process that looks more like an art than a science. Therefore, analytics projects need to be iterative, following, for example, an Agile framework.
18. Not being able to scale analytics pilots up. Analytics often starts by piloting use cases, but companies end up killing pilots as soon as they need to reallocate funding to other, shorter-term initiatives.
19. Neglecting data governance as a fundamental enabler. Data governance refers to the organization, processes, and systems a company needs to manage its data properly and consistently as an asset, ranging from managing data quality to handling access control or defining the architecture of the data in a standardized way.
20. Trying to create data science models without refining your data engineering infrastructure: clean repositories, efficient engines, and streamlined extract-load-transform processes. Data engineering without real use cases to model is also wrong. Modelling and engineering must proceed in parallel or iteratively.
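To make the engineering half of this point concrete, here is a minimal sketch of one extract-load-transform step, using Python's built-in sqlite3 as a stand-in for a real warehouse. Table names, column names, and the sample data are all hypothetical.

```python
# Minimal ELT sketch: extract raw data, load it into a landing zone,
# then transform (clean and type) it inside the "warehouse".
# sqlite3 stands in for a real warehouse; all names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw, messy source data (simulated here).
cur.execute("CREATE TABLE raw_sales (customer TEXT, amount TEXT)")
cur.executemany(
    "INSERT INTO raw_sales VALUES (?, ?)",
    [("  Alice ", "120.50"), ("BOB", "not_a_number"), ("carol", "80")],
)

# Load: copy raw records into the landing zone as-is.
cur.execute("CREATE TABLE landing_sales AS SELECT * FROM raw_sales")

# Transform: trim and normalize names, cast amounts to numbers,
# and drop rows that fail validation.
cur.execute("""
    CREATE TABLE clean_sales AS
    SELECT LOWER(TRIM(customer)) AS customer,
           CAST(amount AS REAL)  AS amount
    FROM landing_sales
    WHERE amount GLOB '[0-9]*'
""")

rows = cur.execute("SELECT * FROM clean_sales ORDER BY customer").fetchall()
print(rows)  # cleaned, typed records ready for modelling
```

The point of the sketch is the division of labour: modelling consumes `clean_sales`, but `clean_sales` only exists because an engineering pipeline built it.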
21. Not using any of the following basic technologies: Hadoop, Spark, R, Python, potentially a cloud platform of your choice, an advanced visualization tool and a granular self-service reporting system open for the whole organization.
22. Having technological silos among data repositories, which makes it difficult to integrate different kinds of data into a model. The power of analytics increases exponentially with the diversity of data.
23. Not automating analytics with AI, which can act as an extremely smart assistant to data scientists: helping them cleanse data, check for correctness, deploy models, detect relevant prediction features and model obsolescence, or even generate hundreds or thousands of variations of models.
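One of the automations named above, detecting model obsolescence, can be sketched very simply: monitor whether a feature's live distribution has drifted away from its training baseline and flag the model for retraining when it has. This is an illustrative sketch, not a production monitor; the feature, values, and threshold are all assumptions.

```python
# Minimal data-drift check for model obsolescence.
# Uses only the standard library; threshold and data are illustrative.
from statistics import mean, stdev

def drift_score(baseline, live):
    """Shift of the live mean, in units of the baseline's standard deviation."""
    return abs(mean(live) - mean(baseline)) / stdev(baseline)

def model_is_stale(baseline, live, threshold=2.0):
    """Flag the model for retraining when a monitored feature has drifted."""
    return drift_score(baseline, live) > threshold

training_ages = [25, 30, 35, 40, 45, 50]   # feature values at training time
current_ages  = [55, 60, 58, 62, 65, 59]   # values observed in production

print(model_is_stale(training_ages, current_ages))  # True -> retrain
```

A real deployment would run such checks across every model feature on a schedule and use more robust drift statistics, but the decision logic — compare, threshold, retrain — is the same.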
24. Not allocating enough budget for analytics platforms while still keeping Shangri-La expectations. The opposite is also an error: allocating more than enough money with no direct correlation to business outcomes.
25. Not measuring the ROI of analytics initiatives. ROI in analytics materializes over the mid-term, but that does not mean you should not measure it.