Almost three-quarters of enterprises now have AI in production. But only a fraction have rewired their operations around it. The gap between those two numbers is where most organisations are stuck, and it's widening.
What You Need to Know
- Most enterprises have adopted AI. Almost none have executed on it. 72% have at least one AI workload in production, but roughly 6% have rewired operations around AI at scale.
- Adoption without execution has no measurable impact on EBIT. 80% of enterprises report no meaningful financial uplift from their AI investments.
- The primary barrier is human, not technical. 63% of enterprises cite people, change management, and organisational factors as their top challenge.
- Point solutions don't compound. Bolt-on AI creates isolated tools. Platform thinking creates an engine that gets smarter and cheaper with each deployment.
- The window is closing. By 2027, more than 40% of enterprise AI agent projects will be abandoned. The organisations that execute now will be difficult to catch.
- 72% of enterprises have AI in production (McKinsey, The State of AI in 2025)
- ~6% have rewired operations around AI at scale (McKinsey, The State of AI in 2025)
- 80% report no measurable impact on enterprise EBIT (Deloitte, State of AI in the Enterprise, 2026)
Adoption Is Not Execution
There's a version of AI adoption that looks impressive in a board report but changes nothing. You've deployed a chatbot. Your marketing team uses generative AI for copy. Someone in finance built a forecasting model. You can tick the "AI-enabled" box.
But your operations haven't changed. Decisions are made the same way. Workflows run the same paths. Your data sits in the same silos. The AI is a feature bolted onto the side of your business, not woven into how it runs.
That's adoption. It's not execution.
Execution means your AI capabilities inform how work gets done. They change who does what, how decisions flow, where human attention goes. Execution compounds. Each AI capability makes the next one cheaper, faster, and more accurate because it builds on shared infrastructure, shared data, and shared organisational muscle.
The gap between these two states is enormous. And most enterprises are sitting squarely on the wrong side of it.
Why Most AI Initiatives Stall
I see the same three patterns in almost every enterprise AI programme that stalls.
1. Bolt-On AI
The organisation buys or builds AI tools for specific problems without connecting them. A chatbot here, a document classifier there, an automation script somewhere else. Each one works in isolation. None of them learn from each other. None of them share data or infrastructure. The total cost of ownership climbs with every new tool, but the total value doesn't compound.
This is the most common pattern. It feels productive because things are shipping. But it's building a collection, not a platform.
2. Strategy Without Execution
The opposite problem. The organisation invests heavily in AI strategy, runs workshops and assessments, produces roadmaps and maturity models, but never builds anything that touches a real workflow. The strategy keeps evolving because the organisation keeps learning more about AI, and so the target keeps moving.
Strategy without execution is just expensive education. At some point, you have to build.
3. Point Solutions That Can't Scale
The organisation successfully deploys AI for one use case, then tries to replicate it. But the first solution was custom-built, tightly coupled to one team's data and one team's workflow. Replicating it means rebuilding from scratch each time. The second deployment costs as much as the first. So does the third.
This is the scaling problem. And it's an architecture problem, not a budget problem.
What Execution Actually Looks Like
The roughly 6% of enterprises that have crossed the execution gap share three characteristics.
Platform thinking. They built shared AI infrastructure, not isolated tools. A common data layer, shared model orchestration, consistent governance, and reusable components. When a new team wants AI capability, they don't start from zero. They extend what already exists.
Compounding modules. Each AI capability builds on the last. The document intelligence module feeds the compliance module. The compliance module feeds the risk module. The data gets richer, the models get better, and the marginal cost of each new capability drops. A minimal sketch of this pattern appears below.
The 80/20 model. They don't try to automate everything. They identify the 20% of workflows where AI creates disproportionate value, and they execute ruthlessly on those. The remaining 80% gets handled later, or not at all. Focus beats coverage.
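To make the compounding-modules idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the class names, fields, and wiring are invented to show the shape of the pattern, not to describe any particular platform.

```python
# Illustrative sketch only: class names, fields, and wiring are hypothetical,
# intended to show the shape of compounding modules, not a real implementation.

class SharedDataLayer:
    """One governed store that every module reads from and enriches."""
    def __init__(self):
        self.records = {}

    def get(self, key):
        return self.records.get(key, {})

    def enrich(self, key, fields):
        # Each module adds fields, so later modules start from richer data.
        self.records.setdefault(key, {}).update(fields)


class DocumentIntelligenceModule:
    def __init__(self, data):
        self.data = data

    def process(self, doc_id, text):
        # Stand-in for real extraction; a production module would call a model here.
        self.data.enrich(doc_id, {"text": text, "entities": ["policy_number"]})


class ComplianceModule:
    def __init__(self, data):
        self.data = data

    def check(self, doc_id):
        # Builds on what document intelligence already extracted,
        # instead of re-parsing the document from scratch.
        entities = self.data.get(doc_id).get("entities", [])
        flags = [] if "policy_number" in entities else ["missing_policy_number"]
        self.data.enrich(doc_id, {"compliance_flags": flags})
        return flags


platform = SharedDataLayer()
DocumentIntelligenceModule(platform).process("doc-1", "sample claim text")
print(ComplianceModule(platform).check("doc-1"))  # prints [] because the policy number was found
```

The detail is invented; the dependency structure is the point. The compliance module consumes what document intelligence has already enriched, so the second capability is cheaper to build than the first, and a risk module added later starts from richer data still.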
This isn't theoretical. We've seen it work at RIVER Group across insurance, government, health, and professional services engagements. The organisations that move fastest are the ones that invest in the foundation before they invest in the features.
What to Do
If you're an enterprise leader reading this and recognising your organisation in the patterns above, here are three concrete next steps.
1. Audit your current AI for compounding potential. List every AI initiative in your organisation. For each one, ask: does this share data, infrastructure, or learnings with any other initiative? If the answer is no across the board, you have a collection problem, not a scaling problem. Fix the architecture before adding more tools. (A rough version of this check is sketched after this list.)
2. Pick one workflow and rewire it. Not "add AI to it." Rewire it. Take a process that matters to your bottom line, and redesign it with AI as a core part of the decision-making and execution flow. This is harder than bolting on a chatbot. It requires change management, data work, and stakeholder alignment. But it's the only way to close the execution gap.
3. Stop measuring adoption. Start measuring impact. "Number of AI tools deployed" is a vanity metric. "Reduction in processing time," "improvement in decision accuracy," "cost per transaction" are real metrics. If you can't connect your AI investment to business outcomes, you're not executing. You're experimenting. And experiments without measurement are just hope.
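For the audit in step 1, a spreadsheet is enough, but a throwaway script makes the overlap question concrete. The sketch below is purely illustrative: the initiative names, data sources, and platforms are invented.

```python
# Hypothetical inventory: initiative names, data sources, and platforms are
# invented to illustrate the overlap check, not drawn from any real estate.

initiatives = {
    "support_chatbot":    {"data": {"ticket_system"}, "infra": {"vendor_saas"}},
    "invoice_classifier": {"data": {"erp"},           "infra": {"ml_platform"}},
    "demand_forecast":    {"data": {"erp", "crm"},    "infra": {"ml_platform"}},
}

def is_isolated(name, spec):
    """True if this initiative shares no data source or infrastructure with any other."""
    for other, other_spec in initiatives.items():
        if other == name:
            continue
        if spec["data"] & other_spec["data"] or spec["infra"] & other_spec["infra"]:
            return False
    return True

isolated = [name for name, spec in initiatives.items() if is_isolated(name, spec)]
print("Initiatives sharing nothing with the rest of the portfolio:", isolated)
# If most of the portfolio comes back isolated, you have a collection, not a platform.
```

The real audit happens in architecture reviews and conversations with the teams involved, but the question it forces is the same.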
The gap isn't technology. It's execution. And execution is a leadership problem, not an engineering problem.
