Data is a highly valued asset, yet data projects remain a critical challenge for most organisations.
Few of us would deny that the 2021 Tokyo Olympics were a triumph for organisers, athletes, and their support teams, who were finally able to compete on the international stage.
World-class athletes invest critical preparation time as part of their strategy to win, knowing it won't guarantee them a medal. When the Games were postponed due to COVID, few stopped preparing, despite not knowing whether the events would eventually go ahead. As it was, the Tokyo Games gave us some truly inspirational moments and showed that the road to Tokyo was a marathon effort combined with sprints.
Setbacks are nothing new for athletes, and the same holds for business: as the pandemic has continued to disrupt markets, organisations have had to adapt to ever-changing conditions and significant uncertainty in order to sustain performance and manage the complexities of their operations.
One of the bigger challenges for business is tapping into analytics tools to make decisions anchored in trustworthy, accurate data.
Just as athletes rely on past performance data to improve their stride or time, a business needs to see what improvements it must make to grow and remain sustainable. Democratising data across the organisation is no small undertaking, yet it is critical if stakeholders and decision-makers are to act on trusted, valuable, and contextual insights. Ultimately, data is a vital business asset.
As COVID continues to disrupt performance and market confidence, we've seen a distinct shift from retrospective analysis of past performance to a predictive lens on analytics, one that enables organisations to tackle the uncertainty of economic, workforce, supply chain, and financial challenges head-on.
We are also seeing a distinct change in how decisions around data intelligence investment are made: business leaders favour measured investments that deliver quick wins while also creating and sustaining the momentum for wider adoption and change.
Moving fast and slow: Finding the right cadence in data projects
The pandemic has shown us that organisations can move quickly when they balance cadence and speed of delivery with a commitment to the preparation needed to drive change into the business. Projects that move with haste rather than speed often run into significant challenges.
Daniel Kahneman's book Thinking, Fast and Slow outlined a dichotomy between two modes of thought: System 1 is fast, instinctive, and emotional, while System 2 is slower, more deliberative, and more logical. The delineation, and how the two systems complement each other, is an interesting lens to apply to the data intelligence journey.
Delivering quick results that achieve sustainable change comes down to preparedness. Expecting quick wins from data intelligence projects, which are typically long and difficult, seems a paradox. Yet to go faster, some projects need to start slowly and accelerate only once the business case is robust enough.
Given how much organisations depend on their data, it has become increasingly vital to demonstrate value early in the data intelligence journey to reassure executives and stakeholders that the business is on the right path.
Cultivating the use case
As purchasing thresholds change and stakeholders lament 'death by statistics on a slide deck', one of the most effective ways to gain buy-in from the executive team is to engage a delivery team that can help develop the use case(s) most relevant to the business before a substantial data intelligence investment is made.
Pilot projects are nothing new, yet they can be invaluable in helping the business qualify the investment before a wider, and more expensive, data intelligence rollout. They enable organisations to de-risk data intelligence projects early and to establish what matters to the business in terms of outcomes and benefits realisation.
Smaller projects can be designed to demonstrate and measure the potential of data analytics to identify and solve challenges before the larger-scale investment occurs, while at the same time testing and validating project readiness, governance, and in-house capability to deliver. The process can also be invaluable for bringing in the right skills and resources at the right time to set the data intelligence program up for success.
The ability to deliver quick wins at a smaller scale can create momentum for continued, productionised releases into the business that nurture adoption and manage the change required across users. Incremental successes strengthen stakeholder buy-in to the data intelligence journey as the capabilities of the business increase and the data challenges start to recede. Issues such as protecting data from unauthorised access or misuse, preventing data corruption, and, arguably most importantly, sustaining data quality can be flagged early and addressed at the right time.
Establishing the data management framework
Managing multiple data sources and disparate systems that are becoming legacy, are due to be replaced, or need upgrading to meet the needs of the business is a well-trodden path. And while machine learning might be the approach du jour, the reality is that the data intelligence journey remains a significant challenge for most organisations.
As more organisations migrate their data to the cloud, data intelligence project success needs to be anchored to a data management framework (DMF) with components such as quality control, clear governance, the right architecture, user scenarios, effective analytics, and appropriate security.
There is a growing trend of technology vendors integrating and embedding machine learning into their products to:
- Detect anomalies in the data's profile (see the sketch after this list)
- Generate data quality rules automatically
- Reconcile replication errors
- Discover hidden data relationships
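To make the first of these concrete, here is a minimal sketch of what profile-based anomaly detection can look like. It is illustrative only, assuming pandas and two simple per-column statistics; the column names, thresholds, and sample data are our own assumptions, not any particular vendor's implementation.

```python
import pandas as pd

# Illustrative only: profile a batch of data and flag columns whose
# statistics drift from a previously recorded baseline.
def profile(df: pd.DataFrame) -> dict:
    """Capture simple per-column statistics: null rate and mean."""
    return {
        col: {"null_rate": df[col].isna().mean(), "mean": df[col].mean()}
        for col in df.select_dtypes("number").columns
    }

def detect_anomalies(baseline: dict, current: dict, tolerance: float = 0.25) -> list[str]:
    """Flag columns whose null rate or mean moves beyond the tolerance."""
    alerts = []
    for col, stats in current.items():
        base = baseline.get(col)
        if base is None:
            alerts.append(f"{col}: new column not in baseline")
            continue
        if abs(stats["null_rate"] - base["null_rate"]) > tolerance:
            alerts.append(f"{col}: null rate shifted to {stats['null_rate']:.0%}")
        elif base["mean"] and abs(stats["mean"] / base["mean"] - 1) > tolerance:
            alerts.append(f"{col}: mean drifted to {stats['mean']:.2f}")
    return alerts

# Example: yesterday's load versus today's, with a sudden spike in nulls.
baseline = profile(pd.DataFrame({"amount": [10.0, 12.0, 11.5, 9.8]}))
current = profile(pd.DataFrame({"amount": [10.5, None, None, 11.0]}))
print(detect_anomalies(baseline, current))
# ['amount: null rate shifted to 50%']
```

In production tooling the profile would typically cover many more statistics (distinct counts, value distributions, schema changes), and the baseline would be learned across many historical loads rather than a single batch.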
Machine learning allows organisations to centralise and automate data quality workflows, helping them comply with regulations and streamline their data and analytics processes across the enterprise. A DMF, meanwhile, provides the key function of defining clear and simple strategic objectives, such as improving data quality, simplifying the data sources coming into the business, and industrialising data management.
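As a sketch of how a DMF can make an objective like 'improving data quality' measurable, the hypothetical rules below express quality expectations declaratively, so the same vocabulary can be applied to every data source entering the business. The rule names, columns, and thresholds are assumptions for illustration.

```python
import pandas as pd

# Hypothetical declarative rules a DMF might standardise: each rule names
# a column, a check, and a tolerated failure rate, giving every source the
# same quality vocabulary.
RULES = [
    {"column": "customer_id", "check": "not_null", "max_fail_rate": 0.0},
    {"column": "order_total", "check": "non_negative", "max_fail_rate": 0.01},
]

def evaluate(df: pd.DataFrame, rules: list[dict]) -> list[str]:
    """Return a failure message for every rule the batch breaches."""
    failures = []
    for rule in rules:
        col = df[rule["column"]]
        if rule["check"] == "not_null":
            fail_rate = col.isna().mean()
        elif rule["check"] == "non_negative":
            fail_rate = (col < 0).mean()
        else:
            continue
        if fail_rate > rule["max_fail_rate"]:
            failures.append(f"{rule['column']} failed {rule['check']}: {fail_rate:.1%}")
    return failures

# Example batch with a missing ID and a negative total.
batch = pd.DataFrame({"customer_id": [1, None, 3], "order_total": [25.0, -4.0, 18.0]})
print(evaluate(batch, RULES))
# ['customer_id failed not_null: 33.3%', 'order_total failed non_negative: 33.3%']
```

The design point is that rules live as data rather than code, so they can be governed, versioned, and reported on centrally, which is the kind of industrialised data management a DMF aims to standardise.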
Manage the sprints to sustain the marathon
Delivering clear benefits and change into any organisation takes time and focus. As stakeholders invariably shift in and out of longer-term project delivery, maintaining focus on the business case can be difficult, as can managing the project amongst business-as-usual (BAU) demands.
While the speed of digital transformation is shortening data intelligence project timeframes, it's important that organisations manage the balance of sprints against the marathon that is typical of large-scale, organisation-wide data projects. Ultimately, it's about finding the right cadence.
To speak to our team about how we can help your business deliver better projects, please contact us.