Across the broader Data & Analytics (D&A) ecosystem we have seen a number of transitions happening over the last couple of years (e.g. the democratization of data and the augmentation of people with “machines”), and organizations have had to adjust how they create an environment where data can play a key role in driving competitive advantage. In addition, the need to inject data into rapidly evolving digitalization efforts and outcomes puts significant pressure on the ability to execute, especially where macro-forces like evolving customer expectations or disruptive entrants can change the playing field very quickly.
With that context in mind, many (if not all) organizations have started to experiment with advanced analytics, either to drive specific business outcomes or to transform and future-proof their business model and operations. However, there is significant research and empirical evidence to suggest that as many as 80% of all advanced analytics (ML / AI) projects still fail to deliver or meet expectations.
Meeting the expectations of any program requires upfront consideration of what we are trying to achieve (e.g. improving customer experience vs. top-line impact vs. time savings through process automation) and how we can measure attribution. Once that is clearly articulated, several key factors come into play along the journey from inception to experimentation to piloting and ultimately scaling.
First, technology in the broadest sense, from transactional systems to MLOps environments, must be an enabler, and the components should be as frictionless as possible in how they talk to each other. The challenge is that not every organization is digitally native or has the ability to architect greenfield. Thus, organizing and continuously evaluating the components in your tech stack from a holistic perspective (across all touch points) is a key strategy to ensure you are not limiting your execution now or in the future.
Alongside technology, data forms the other foundation of the house. Having a solid strategy in place to understand your strategic data assets across the enterprise, and successfully unifying them into domains that can be activated in a very agile way, is another key piece to be mindful of. Data wrangling should not consume an inordinate amount of time for any initiative.
On top of both technology and data, organizations must focus most of their attention on process and people, as that is ultimately where execution and impact can increase drastically if done effectively.
Process limitations can arise in multiple areas. It is critical to understand minimum expectations and dependencies across teams as one transitions from proof of value / concept to piloting to effectively productionizing at scale. Moreover, how is consumption going to happen? Application integration (think B2C customer personalization across the MarTech stack) and people consumption (how to prescribe the action to be taken by an individual, and how to convey it) differ greatly in orchestration and overall change management. Process failures can happen at any stage, and being able to iron them out pre-emptively or through collective experience is not straightforward.
People, and the alignment and blending of vested stakeholders across the initiative, also contribute to both focus and execution. When organizational structures (across competencies) are well integrated and in sync with the overall strategy, a lot of hurdles can be alleviated. It is equally important to understand how human capital and talent fit into the equation, and how the strategy defines D&A as internal IP (or not) to be harnessed and invested in to drive competitive differentiation and strategy.