
The AI Readiness Assessment: A Framework for Enterprise Leaders

A structured assessment across five dimensions - data, process, team, governance, and leadership - to determine where your AI journey should start.
20 June 2024 · 7 min read
Dr Tania Wolfgramm
Chief Research Officer
Isaac Rolfe
Managing Director
"Are we ready for AI?" is the wrong question. No organisation is fully ready. The right question is: "Which parts of our organisation are ready enough to start, and what do we need to address as we go?"

What You Need to Know

  • AI readiness isn't binary. It's a spectrum across multiple dimensions. Most enterprises are ready in some areas and not others. The goal is to find the intersection of readiness and value.
  • The five dimensions: data readiness, process maturity, team capability, governance foundations, and leadership alignment. Weakness in any one dimension can stall an initiative; you don't need perfection in all five.
  • This assessment takes 1-2 days with the right people. It produces a clear picture of where to start and what to address. Far more actionable than a generic "AI maturity model."
  • The assessment should be repeated every 6 months as capabilities and readiness evolve.
63% of organisations don't have, or aren't sure they have, the right data management for AI.
Source: Gartner, AI Data Management Survey, 2024

The Five Dimensions

1. Data Readiness

| Score | Level | Description |
|---|---|---|
| 1 | Fragmented | Data lives in silos, largely inaccessible, minimal governance |
| 2 | Structured | Key data is digital and somewhat organised, but not integrated or AI-accessible |
| 3 | Accessible | Data can be queried and extracted programmatically; some quality issues |
| 4 | Managed | Data pipelines exist; quality is monitored; governance is in place |
| 5 | Optimised | Data is AI-ready: clean, integrated, well-governed, continuously updated |
What matters most: Accessibility beats quality. AI can work with messy data it can access far better than clean data it can't reach. If you score 2+, you can start an AI initiative. The initiative itself will drive data improvement.

2. Process Maturity

| Score | Level | Description |
|---|---|---|
| 1 | Ad hoc | Processes are informal, inconsistent, undocumented |
| 2 | Documented | Key processes are written down; some variation between teams |
| 3 | Standardised | Processes are consistent, with clear inputs, outputs, and decision criteria |
| 4 | Measured | Process performance is tracked with KPIs; bottlenecks are known |
| 5 | Optimised | Processes are continuously improved based on data; ready for AI redesign |
What matters most: Process clarity: can an expert explain the "right answer" for a given input? AI needs clear decision criteria, even if the current process is imperfect.

3. Team Capability

| Score | Level | Description |
|---|---|---|
| 1 | Unaware | No AI literacy; no technical AI capability |
| 2 | Curious | Some individuals exploring AI tools; no formal capability |
| 3 | Literate | Leadership understands the AI opportunity; some team members skilled with AI tools |
| 4 | Capable | Cross-functional AI team exists; domain experts plus technical skills |
| 5 | Expert | Deep internal AI capability; can lead AI initiatives independently |
What matters most: Domain expertise. You can partner for AI technical capability. You can't partner for deep understanding of your business processes and data.

4. Governance Foundations

| Score | Level | Description |
|---|---|---|
| 1 | None | No AI-specific policies or governance |
| 2 | Basic | AI usage policy exists; basic data classification for AI |
| 3 | Developing | Governance framework defined; risk classification for AI applications; monitoring in place |
| 4 | Mature | Full governance aligned with ISO 42001; regular review cycle |
| 5 | Embedded | Governance is integrated into AI engineering practice; automated compliance checks |
What matters most: Having something rather than nothing. A basic AI usage policy and data classification are enough to start. Governance matures alongside your AI capabilities.

5. Leadership Alignment

| Score | Level | Description |
|---|---|---|
| 1 | Disconnected | No board or leadership engagement on AI |
| 2 | Curious | Leadership is interested but hasn't committed resources |
| 3 | Committed | Executive sponsor identified; budget allocated; expectations set |
| 4 | Aligned | Leadership understands AI as business transformation, not just technology; asks the right questions |
| 5 | Driving | Leadership actively champions AI; organisation-wide mandate and resources |
What matters most: Executive sponsorship. One senior leader who understands the opportunity, commits resources, and shields the initiative from organisational antibodies.

Interpreting Your Assessment

  • Total score 20-25: Highly ready. Focus on execution: discovery sprint → build → scale.
  • Total score 15-19: Ready to start with a targeted initiative. Address your weakest dimension in parallel.
  • Total score 10-14: Foundational work needed. Invest in AI literacy, data organisation, and governance basics. Run a discovery to build the business case for further investment.
  • Total score 5-9: Significant groundwork required. Focus on data accessibility, process documentation, and leadership alignment before AI-specific investment.
Critical rule: No dimension below 2. A single dimension at 1 will block your AI initiative regardless of how strong the other dimensions are.
The Minimum Viable Readiness
You need: Data readiness ≥ 2, Process maturity ≥ 3, Team capability ≥ 2, Governance ≥ 2, Leadership ≥ 3. That's a score of 12, and it's enough to start a meaningful AI initiative. Don't wait for a score of 25.
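The interpretation rules above are mechanical enough to express directly. Here is a minimal sketch in Python, assuming the thresholds and bands exactly as stated in this article; all names (`MINIMUMS`, `interpret`, and the advice strings) are illustrative, not part of any published tool.

```python
# Illustrative sketch of the article's scoring rules; names are hypothetical.

MINIMUMS = {           # "Minimum Viable Readiness" thresholds per dimension
    "data": 2,
    "process": 3,
    "team": 2,
    "governance": 2,
    "leadership": 3,
}

BANDS = [              # total-score interpretation bands, highest first
    (20, "Highly ready: discovery sprint, build, scale"),
    (15, "Ready to start a targeted initiative; address weakest dimension in parallel"),
    (10, "Foundational work needed; run a discovery to build the business case"),
    (5,  "Significant groundwork required before AI-specific investment"),
]


def interpret(scores: dict) -> str:
    """Map five 1-5 dimension scores to the article's guidance."""
    total = sum(scores.values())
    # Critical rule: a single dimension at 1 blocks the initiative
    # regardless of how strong the other dimensions are.
    if min(scores.values()) < 2:
        return "Blocked: raise every dimension to at least 2 first"
    # Check the minimum-viable-readiness thresholds (total of 12).
    if all(scores[d] >= m for d, m in MINIMUMS.items()):
        note = " (meets minimum viable readiness)"
    else:
        note = ""
    for threshold, advice in BANDS:
        if total >= threshold:
            return advice + note
    return "Below assessed range"
```

For example, the minimum viable profile (2, 3, 2, 2, 3) totals 12, landing in the 10-14 band while still flagged as enough to start.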
Should we run this assessment before or after an AI discovery sprint?
Either works. If you want to validate readiness before investing in discovery, run the assessment first (1-2 days). If you'd rather combine them, a good discovery sprint includes a readiness assessment as part of Phase 1. The discovery sprint produces a more detailed, use-case-specific readiness picture.
What if our score is low but leadership is pushing for immediate AI deployment?
Use the assessment to redirect, not resist. "We're committed to AI and we've identified what we need to address to succeed. Here's our 90-day plan to improve data readiness and governance foundations, with a discovery sprint in month 2." This demonstrates commitment while managing expectations.