"Are we ready for AI?" is the wrong question. No organisation is fully ready. The right question is: "Which parts of our organisation are ready enough to start, and what do we need to address as we go?"
What You Need to Know
- AI readiness isn't binary. It's a spectrum across multiple dimensions. Most enterprises are ready in some areas and not others. The goal is to find the intersection of readiness and value.
- The five dimensions: data readiness, process maturity, team capability, governance foundations, and leadership alignment. Weakness in any one dimension can stall an initiative; you don't need perfection in all five.
- This assessment takes 1-2 days with the right people. It produces a clear picture of where to start and what to address, which is far more actionable than a generic "AI maturity model."
- The assessment should be repeated every 6 months as capabilities and readiness evolve.
63% of organisations don't have, or aren't sure they have, the right data management for AI (Source: Gartner, AI Data Management Survey, 2024).
The Five Dimensions
1. Data Readiness
| Score | Level | Description |
|---|---|---|
| 1 | Fragmented | Data lives in silos, largely inaccessible, minimal governance |
| 2 | Structured | Key data is digital and somewhat organised, but not integrated or AI-accessible |
| 3 | Accessible | Data can be queried and extracted programmatically; some quality issues |
| 4 | Managed | Data pipelines exist; quality is monitored; governance is in place |
| 5 | Optimised | Data is AI-ready: clean, integrated, well-governed, continuously updated |
What matters most: Accessibility beats quality. AI can work with messy data it can access far better than clean data it can't reach. If you score 2+, you can start an AI initiative. The initiative itself will drive data improvement.
2. Process Maturity
| Score | Level | Description |
|---|---|---|
| 1 | Ad hoc | Processes are informal, inconsistent, undocumented |
| 2 | Documented | Key processes are written down; some variation between teams |
| 3 | Standardised | Processes are consistent, with clear inputs, outputs, and decision criteria |
| 4 | Measured | Process performance is tracked with KPIs; bottlenecks are known |
| 5 | Optimised | Processes are continuously improved based on data; ready for AI redesign |
What matters most: Process clarity. Can an expert explain the "right answer" for a given input? AI needs clear decision criteria, even if the current process is imperfect.
3. Team Capability
| Score | Level | Description |
|---|---|---|
| 1 | Unaware | No AI literacy; no technical AI capability |
| 2 | Curious | Some individuals exploring AI tools; no formal capability |
| 3 | Literate | Leadership understands AI opportunity; some team members skilled with AI tools |
| 4 | Capable | Cross-functional AI team exists; domain experts + technical skills |
| 5 | Expert | Deep internal AI capability; can lead AI initiatives independently |
What matters most: Domain expertise. You can partner for AI technical capability. You can't partner for deep understanding of your business processes and data.
4. Governance Foundations
| Score | Level | Description |
|---|---|---|
| 1 | None | No AI-specific policies or governance |
| 2 | Basic | AI usage policy exists; basic data classification for AI |
| 3 | Developing | Governance framework defined; risk classification for AI applications; monitoring in place |
| 4 | Mature | Full governance aligned with ISO 42001; regular review cycle |
| 5 | Embedded | Governance is integrated into AI engineering practice; automated compliance checks |
What matters most: Having something rather than nothing. A basic AI usage policy and data classification are enough to start. Governance matures alongside your AI capabilities.
5. Leadership Alignment
| Score | Level | Description |
|---|---|---|
| 1 | Disconnected | No board/leadership engagement on AI |
| 2 | Curious | Leadership is interested but hasn't committed resources |
| 3 | Committed | Executive sponsor identified; budget allocated; expectations set |
| 4 | Aligned | Leadership understands AI as business transformation, not just technology; asks the right questions |
| 5 | Driving | Leadership actively champions AI; organisation-wide mandate and resources |
What matters most: Executive sponsorship. One senior leader who understands the opportunity, commits resources, and shields the initiative from organisational antibodies.
Interpreting Your Assessment
Total score 20-25: Highly ready. Focus on execution: discovery sprint → build → scale.
Total score 15-19: Ready to start with targeted initiative. Address weakest dimension in parallel.
Total score 10-14: Foundational work needed. Invest in AI literacy, data organisation, and governance basics. Run a discovery to build the business case for further investment.
Total score 5-9: Significant groundwork required. Focus on data accessibility, process documentation, and leadership alignment before AI-specific investment.
Critical rule: No dimension below 2. A single dimension at 1 will block your AI initiative regardless of how strong the other dimensions are.
The Minimum Viable Readiness
You need: Data readiness ≥ 2, Process maturity ≥ 3, Team capability ≥ 2, Governance ≥ 2, Leadership ≥ 3. That's a score of 12, and it's enough to start a meaningful AI initiative. Don't wait for a score of 25.
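The interpretation rules above (score bands, the "no dimension below 2" blocking rule, and the minimum viable readiness thresholds) can be sketched as a small script. This is an illustrative helper, not part of any official assessment tool; the function and dictionary names are made up for the example.

```python
# Hypothetical sketch of the readiness interpretation rules.
# Thresholds and bands are taken from the assessment text above;
# the names (MINIMUM_VIABLE, interpret) are illustrative only.

MINIMUM_VIABLE = {   # "Minimum Viable Readiness" thresholds per dimension
    "data": 2,
    "process": 3,
    "team": 2,
    "governance": 2,
    "leadership": 3,
}

def interpret(scores: dict) -> str:
    """Apply the banding, blocking rule, and MVR check to five 1-5 scores."""
    total = sum(scores.values())

    # Critical rule: any dimension at 1 blocks the initiative outright.
    blocked = [d for d, s in scores.items() if s < 2]
    if blocked:
        return f"Blocked: raise {', '.join(blocked)} above 1 first (total {total})"

    meets_mvr = all(scores[d] >= t for d, t in MINIMUM_VIABLE.items())

    if total >= 20:
        band = "Highly ready: discovery sprint -> build -> scale"
    elif total >= 15:
        band = "Ready to start a targeted initiative; address weakest dimension in parallel"
    elif total >= 10:
        band = "Foundational work needed: AI literacy, data organisation, governance basics"
    else:
        band = "Significant groundwork required before AI-specific investment"

    mvr = "meets" if meets_mvr else "below"
    return f"{band} (total {total}, {mvr} minimum viable readiness)"
```

For example, the minimum-viable profile from the text (2, 3, 2, 2, 3) totals 12: it lands in the "foundational work" band by raw score, yet passes the MVR check, which is exactly why the text says a 12 with the right shape is enough to start.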
- Should we run this assessment before or after an AI discovery sprint?
- Either works. If you want to validate readiness before investing in discovery, run the assessment first (1-2 days). If you'd rather combine them, a good discovery sprint includes a readiness assessment as part of Phase 1. The discovery sprint produces a more detailed, use-case-specific readiness picture.
- What if our score is low but leadership is pushing for immediate AI deployment?
- Use the assessment to redirect, not resist. "We're committed to AI and we've identified what we need to address to succeed. Here's our 90-day plan to improve data readiness and governance foundations, with a discovery sprint in month 2." This demonstrates commitment while managing expectations.

