Getting Started with Analytics: The Actionable and Measurable Way
This article is the last in a series of four covering the most frequently discussed points to keep in mind before taking the big leap into analytics. You can read the first three articles via the links below.
Impact at scale needs implementation at scale, and that is what has been missing from the big picture of big data. ‘IT integration’ and ‘limited adoption’ have been cited as two key challenges to the success of enterprise analytics transformation. According to a recent study by EY, while 81% of companies accept that data should be at the core of all decision-making, only 31% have reshaped their operations significantly to achieve this.
Think of the most disruptive businesses today and what forms their moat. It is, of course, data: the 'ever-evolving and everlasting' use of data to continuously improve user experience, streamline processes, and build differentiating capabilities before everyone else. While 'going digital' is a revolutionary idea, the real potential lies in the continual and self-sustaining application of analytics to generate new revenue streams, gain insights into consumer behavior, improve returns (RoI) on marketing initiatives, and make supply chains efficient.
There is a need to move away from treating data collection as just a hobby. Once the analytics roadmap is in place and data maturity has been achieved, organizations need to move beyond small-scale pilots and ad-hoc projects toward embracing analytics across the organization. Here are some challenges that organizations may face in this respect, along with some ideas on how to resolve them.
Collaboration Between Stakeholders
Ownership of implementation is not clearly defined between analytics, IT, and business units. Hence, insights from analytics get lost somewhere on their way to becoming outcomes. Often, what can be easily analyzed might not be easy to implement, and vice versa. Poor communication between departments might also arise from non-standard sources of information, programming languages, metrics, and business goals.
Since operational projects are not as flexible as analytics, it becomes important to adopt a collaborative approach. Specialized resources with complementary skills across multiple functions are needed to ensure successful collaboration and alignment between analytics and business teams. Such ‘translators,’ as shown in the infographic below, can help bridge the gap between functional teams.
The Elephant in The Room
Historically, decision making in organizations has been dominated by gut and intuition rather than by data and evidence. This made sense so far because data was either sparse or unavailable, and computational power was limited. Hence, executives leaned on their years of learning (experience) and wit (intuition) to process information and drive business. This rather old-school HiPPO approach ('the highest-paid person's opinion') of giving the right answer needs to give way to asking the right questions of the data and letting the data talk. To move away from dependency on intuition and conventional wisdom, driving change management from the top and embedding analytics across the organization should be a top priority.
At the same time, managers need to be sensitive to the fact that a sudden change for everybody can be overwhelming. Hence, a realistic, phase-wise shift should be planned, targeting a few chosen, business-critical processes first. Along with a mindset shift, employees need to be trained on how to make sense of data in their day-to-day jobs. Employees should be empowered to do basic data wrangling and reporting operations themselves using MS Excel, SQL, and R/Python to reduce dependency on the IT and analytics teams.
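As a minimal illustration of the self-service wrangling and reporting mentioned above, the sketch below aggregates a small sales extract using only Python's standard library. The data, column names, and the 'units by region' report are invented for the example; a real extract would typically come from IT as a CSV file.

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw export an employee might receive (data invented for illustration).
raw = io.StringIO(
    "region,rep,units\n"
    "North,Asha,120\n"
    "North,Ben,80\n"
    "South,Carla,150\n"
)

# Basic wrangling: total units sold per region, done by the business user
# without involving the IT or analytics teams.
totals = defaultdict(int)
for row in csv.DictReader(raw):
    totals[row["region"]] += int(row["units"])

for region, units in sorted(totals.items()):
    print(f"{region}: {units} units")
```

The same few lines translate almost directly to an Excel pivot table or a SQL `GROUP BY`, which is why these three tools are a natural starting point for frontline self-service.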
The Trust Gap
A Pandora’s Box
Another major challenge in frontline adoption of analytics finds its roots in a lack of understanding and trust in analytics. Analytics is perceived as a 'black box' by business leaders, and this lack of transparency leads them to discount analytics outputs. This trust gap can be attributed to complex deep learning models and to a lack of interest and skills among traditional, 'business-focused' leaders in interpreting analytics. To win their confidence in data findings, data scientists should form a habit of explaining their prediction models and approach: how business hypotheses were framed, how data nuances were handled, the assumptions behind the models, and the different approaches and algorithms that were tried out.
Employees are likely to be more open to change if they are prepared for the transition to analytics-embedded processes. Workflows and business apps also need to be redesigned. For example, deploying a chatbot that addresses customer complaints and queries requires the entire customer experience process to be drawn up again. Analytics can significantly change the distribution of time across activities, rendering many manual tasks redundant. Employees may resist such abrupt change for fear of job losses; hence, the transition needs to be gradual.
For such employees, job responsibilities and expectations need to be re-aligned to develop a future-ready workforce. For instance, automated reporting and sales forecasting may free up a lot of time that a regional sales manager spends on setting sales targets and tracking the performance of sales reps. This time could be spent on more profitable activities, like engaging with high-value customers, analyzing market factors, and sales incentive planning.
Measurement and Attribution
Can’t Measure, Can’t Manage.
While the model evaluation process often focuses on performance in terms of accuracy, precision, and recall, there is usually only a minimal understanding of the factors that contribute to the outcome. In some use cases, e.g., fraud modeling and autonomous driving, model accuracy is of utmost importance. However, for most common commercial processes, like inventory planning or sales forecasting, it is important to understand the key factors and influencers driving the predictions. Unfortunately, many analysts focus on 'what did/does/will happen', while for making impactful process changes the 'why' is often the most important question. Understanding the 'why' reveals the underlying causal relationships, and consequently answers 'Which factors can be changed, and to what extent?' and 'What will be the outcome of such changes?'
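To make the two halves of the argument above concrete, the sketch below first computes accuracy, precision, and recall for a hypothetical binary classifier, then adds a crude 'driver' check: comparing a feature's average between outcome classes. All data and the `discount` feature are invented; in practice, a proper importance or causal analysis would replace the mean-difference shortcut.

```python
# Hypothetical predictions vs. actuals for a binary classifier (invented data).
actual    = [1, 0, 1, 1, 0, 1, 0, 0]
predicted = [1, 0, 1, 0, 0, 1, 1, 0]

# Standard performance metrics: the 'what happened' view.
tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))
fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))

accuracy  = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)
print(f"accuracy={accuracy}, precision={precision}, recall={recall}")

# A crude 'why' check: does a feature differ between outcome classes?
# (A stand-in for proper driver/importance analysis; values are invented.)
discount = [5, 1, 6, 4, 2, 7, 1, 2]  # hypothetical discount offered per case
pos = [d for d, a in zip(discount, actual) if a == 1]
neg = [d for d, a in zip(discount, actual) if a == 0]
print(f"avg discount, positive vs negative class: {sum(pos)/len(pos)} vs {sum(neg)/len(neg)}")
```

The metrics say how well the model scores; the class comparison is the first, simplest step toward asking which levers (here, discounts) might actually be moving the outcome.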
If business impact is not measured, justifying investment in any project becomes difficult. An effective measurement framework should span all levels of the organization. Metrics should be tied directly to tangible business value and indirectly to frontline adoption, employee productivity, and other soft goals. Along with observed historical metrics, an evaluation of expected future values would help in prioritizing long-term analytics initiatives. One such framework covering frequently used function-specific metrics is showcased below.
Maintenance and Improvement
Circle of Life
Changing business scenarios and processes lead to a decay in model performance over time and might render older models faulty or obsolete. Hence, the measurement of RoI and other metrics in analytics projects is a continuous process and should be followed up with model adjustments to reflect changing trends. For example, given the evolving and dynamic nature of fraud, fraud detection models should periodically be fed new rules and/or the latest data to uncover new patterns of fraudulent activity. Wherever possible, dynamic ML approaches such as online learning and incremental learning should be deployed to reduce the need for frequent manual retraining and fine-tuning.
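The online-learning pattern mentioned above can be sketched in a few lines: instead of retraining from scratch, the model absorbs each new observation as it arrives. The minimal logistic-regression learner below is an illustration of the pattern, not a production fraud model; the simulated stream and its labeling rule are invented for the example (libraries such as scikit-learn expose the same idea via a `partial_fit` method).

```python
import math
import random

class OnlineLogisticModel:
    """A minimal logistic-regression learner updated one observation at a time."""

    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1.0 / (1.0 + math.exp(-z))

    def partial_fit(self, x, y):
        # One stochastic-gradient step on a single observation (y is 0 or 1),
        # so the model keeps adapting as new data streams in.
        err = self.predict_proba(x) - y
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Simulate a stream of incoming cases; the label follows the first feature
# (an invented stand-in for a fraud signal).
random.seed(0)
model = OnlineLogisticModel(n_features=2)
for _ in range(2000):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    y = 1 if x[0] > 0 else 0
    model.partial_fit(x, y)

print(model.predict_proba([0.9, 0.0]))   # high probability expected
print(model.predict_proba([-0.9, 0.0]))  # low probability expected
```

Because each update touches only the newest observation, the same loop keeps running in production, letting the model track drifting patterns without a scheduled full retrain.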
Further, one should be on the lookout for newer business cases and advanced algorithms to improve analytics efficiency. Analytics should be conceived as a continuous journey of value discovery through the exploration of new business problems, analytical tools, and frameworks. Projects with a proven potential should be rolled out to production and scaled across functions and regions, while failures attributed to IT limitations, data unavailability, or execution hurdles should not be pursued further.
Businesses take months or even years to develop data systems and analytics from concept to action. Major roadblocks can be organizational change management and finding the right implementation partner. To enable a broader transformation, leaders need to develop a robust ecosystem of analytics developers, integrators, and enablers. Internal capability development can be undertaken to empower decision-makers with data management and enhance workforce proficiency on both the technical and business sides of data.
The transition to organizational analytics requires the collaboration and technical expertise of cross-functional stakeholders during different stages. A well-instituted ecosystem also needs the support of reliable, integrated, and automated processes, managed by a motivated and well-skilled team. Bringing in the right combination of external partners, including software licensors, AI experts, and domain consultants, can help bridge the gap and expedite the process.
About the Author
Amit, a Data Science and Artificial Intelligence professional, is currently working as a Director at Nexdigm (SKP), a global business advisory organization serving clients from 50+ countries. He has over 15 years of experience across industries and has worked from both perspectives, as an internal functional expert (at Vodafone, Aviva Insurance, and GE) and as a consultant. A passionate advocate of data science, Amit constantly endeavors to create optimal, actionable solutions that help derive measurable business value from data.
Connect with Amit – LinkedIn.