Roughly two-thirds of U.S. gross domestic product growth in the first half of 2025 was driven by business spending on software and equipment designed to fuel AI adoption.
With so much investment and energy focused on unlocking the potential of AI, it only stands to reason that many businesses, investors and commentators are starting to question when we might expect to see a payoff.
They point to a growing “AI paradox” where, despite increased investment and adoption of AI, companies are still not able to show concrete benefits.
What’s causing this growing gap between potential and real-world value? Increasingly, it comes down to a lack of understanding of the critical interplay between data, domain expertise and the orchestration of AI workflows.
While most business leaders understand by now that they need to get their data estates in order before they can truly harness the power of AI, many are going about that process the wrong way.
They are undertaking massive data migration initiatives, or focusing on small parts of the business to make the process more manageable, but few are using AI to solve the data problem itself. They treat AI as the end goal of large data-harmonization projects instead of tapping the power of AI to unlock their data.
Data-readiness makes or breaks AI initiatives
I have seen this phenomenon play out in countless businesses that have prioritized AI-driven innovation at all costs without first addressing the data and technology foundation needed to support AI enablement.
I have also seen the flip side, where businesses have been able to reengineer enterprise workflows from the ground up and extract enormous value from AI. In every case, the single defining characteristic that separates those that get the AI formula right from those struggling to get past the proof-of-concept phase is data-readiness.
For AI to truly deliver value at the enterprise level, it needs to be able to draw on a wide variety of structured and unstructured datasets from across the business. In most cases, these datasets are siloed, incomplete, incompatible or inaccessible to different parts of the organization.
That fragmentation is what limits the potential of so many AI projects today. It is also a problem that AI is uniquely suited to help solve. With today’s agentic AI, it is possible to use AI to automate data discovery, curation and transformation, making the right data AI-ready across the enterprise.
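To make that pattern concrete, here is a minimal sketch of an agent that asks a language model to map a siloed system’s idiosyncratic field names onto a canonical schema, then rewrites records to match. Everything here is an illustrative assumption: the call_llm stub stands in for whatever model API you use, and the canonical fields and legacy record are invented for the example.

```python
# A minimal, hypothetical sketch of agentic data curation.
# call_llm is a stand-in for a real model API, not a specific product.
import json

CANONICAL_FIELDS = ["policy_id", "customer_name", "incident_date", "loss_amount"]

def call_llm(prompt: str) -> str:
    """Placeholder: wire this to your LLM provider of choice."""
    raise NotImplementedError

def propose_field_mapping(source_fields: list[str]) -> dict[str, str]:
    """Ask the model to map a silo's column names onto the canonical schema."""
    prompt = (
        f"Map each source field to one of {CANONICAL_FIELDS}, or to 'unknown'.\n"
        f"Source fields: {source_fields}\n"
        "Reply with a JSON object of source -> canonical."
    )
    return json.loads(call_llm(prompt))

def transform(record: dict, mapping: dict[str, str]) -> dict:
    """Rewrite one siloed record into the canonical shape, dropping unknowns."""
    return {canon: record[src] for src, canon in mapping.items() if canon != "unknown"}

# Example: a legacy claims record with idiosyncratic column names.
legacy = {"POL_NO": "H-1234", "CUST_NM": "J. Smith", "DOL": "2025-03-14"}
# mapping = propose_field_mapping(list(legacy.keys()))
# print(transform(legacy, mapping))
```

In practice, the proposed mapping would be reviewed by a human or validated against sample records before being applied across a dataset.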
Reengineering enterprise workflows from the ground up
For example, the property and casualty (P&C) insurance industry has emerged as one of the most fertile test beds for AI because it is a business that thrives on unstructured data and requires a series of standardized and repetitive, yet highly complex, processes to function properly.
By all measures, it is a workflow that is ripe to be transformed by AI. Yet many insurers have struggled to fully integrate AI into this process because they cannot get the underlying data estates in shape to support a truly seamless flow of information.
To be fair, it’s a complex process. From underwriting to claims payout, insurance workflows depend heavily on data—yet each stage handles and stores it differently. Underwriters, customer service, finance, and account management all use separate data sets, and the process is often driven by manual hand-offs.
Despite advances in technology, insurers now take an average of 44 days to process a typical homeowners claim, from filing to payout. Instead of being streamlined, the process has become slower than ever.
Embedding AI agents at critical points in complex workflows
It does not need to be that way. Insurers that are embracing AI not as a point solution or proof of concept, but as an opportunity to reengineer critical data connections and embed AI agents across their entire data architectures, are seeing remarkably different results.
For example, one major national insurer recently embarked on a project to accelerate their first notice of loss processing, with the goal of reducing the time from a customer reporting an incident to receiving payment.
By embedding agentic AI at key points in the process and, critically, structuring all of the data so that it could be handled in a consistent, standardized format, they were able to reduce the claims cycle time from several weeks to two days.
The secret sauce for the entire project was not a bigger, better large language model or more investment in AI. It was better orchestration of AI within the workflow.
The insurer needed to choreograph the full spectrum of inputs and checks: accident images, audio calls, adjuster notes, claims data, historical context and vehicle identification numbers, along with data quality checks, coverage reviews and fraud checks, to support a harmonized, AI-powered workflow.
For example, the insurer was able to introduce AI agents focused on data quality, working in tandem with operations agents that monitor the workflow and intervene when there are breaks or irregularities.
This ability to have AI agents work together, as opposed to working in vertical silos, is what enables a complete, enterprise AI workflow that delivers truly transformative results.
By deploying coordinated AI agents across key datasets, businesses can build a unified data backbone that supports cross-functional operations. This approach enables systems to collaborate, adapt, and execute complex workflows seamlessly.
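As a minimal sketch of how such tandem agents might fit together, the snippet below pairs a data-quality agent that flags irregularities with an operations agent that routes clean claims forward and holds broken ones for intervention. Every class, field and check is a hypothetical illustration, not a description of the insurer’s actual system.

```python
# A hypothetical sketch of two AI agents working in tandem on a claims workflow.
from dataclasses import dataclass, field

@dataclass
class Claim:
    claim_id: str
    vin: str | None = None          # vehicle identification number
    loss_amount: float | None = None
    issues: list[str] = field(default_factory=list)

class DataQualityAgent:
    """Validates a claim and records irregularities instead of failing silently."""
    def review(self, claim: Claim) -> Claim:
        if not claim.vin or len(claim.vin) != 17:
            claim.issues.append("missing or malformed VIN")
        if claim.loss_amount is None or claim.loss_amount < 0:
            claim.issues.append("loss amount not set")
        return claim

class OperationsAgent:
    """Routes clean claims forward and holds broken ones for intervention."""
    def route(self, claim: Claim) -> str:
        if claim.issues:
            return f"{claim.claim_id}: hold for review ({'; '.join(claim.issues)})"
        return f"{claim.claim_id}: cleared for fast-track payout"

quality, operations = DataQualityAgent(), OperationsAgent()
print(operations.route(quality.review(Claim("C-001", "1HGCM82633A004352", 4200.0))))
print(operations.route(quality.review(Claim("C-002"))))  # missing VIN and amount
```

The point is the hand-off: because the quality agent records issues rather than failing silently, the operations agent can intervene instead of letting a break in the data stall the workflow.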
Data, domain and AI
As we approach the third anniversary of the launch of ChatGPT, and of the world’s great awakening to the potential of generative and agentic AI to truly transform businesses, many commentators have been searching for cracks in the foundation: examples that suggest the results aren’t living up to the hype.
In fact, the results for businesses that understand the most practical applications of AI and take the time to not just build tools but reimagine core functions are nothing short of remarkable.
For those facing challenges, it’s typically not a function of the technology itself, but of the ability to integrate it effectively.
This is not the kind of technology you buy off the shelf, plug in and wait for results. It requires AI enablers who understand how to work with the data that powers AI and who have detailed knowledge of the industry-specific workflows that are the best candidates for transformation.
That’s a delicate balance, but the sky’s the limit for those who get the formula right.