
The AI Literacy Gap in NZ Boardrooms

NZ boards are making multi-million dollar AI investment decisions without AI literacy. The gap is real, the consequences are expensive, and the fix isn't another briefing paper.
20 August 2024·9 min read
Tim Hatherley-Greene
Chief Operating Officer
Isaac Rolfe
Managing Director
Here's a pattern we've seen across a dozen NZ board presentations this year: a management team presents an AI business case. The board asks questions. The questions reveal a fundamental misunderstanding of what AI is, what it costs, what it risks, and what it can deliver. The business case gets approved, deferred, or rejected for the wrong reasons. And nobody in the room realises.
The AI literacy gap in NZ boardrooms isn't about intelligence. These are sharp, experienced directors. It's about exposure. Most NZ directors built their careers in an era when technology decisions were operational, not strategic. AI has changed that equation, and boardroom capability is still catching up.

The Size of the Gap

We don't have comprehensive NZ survey data on board AI literacy (someone should fix that). But based on our direct experience presenting to and advising NZ boards across multiple sectors, the pattern is consistent:
  • ~85% of NZ board members we've engaged can define AI only in general terms (e.g. "computers that learn")
  • ~15% can distinguish between different AI approaches (e.g. generative AI vs traditional ML) and their implications
  • ~5% can evaluate an AI business case on technical merit, not just financial projections
Source: RIVER Group, based on direct board engagement across 12 NZ enterprises, 2023-2024
These aren't failure statistics. They're a description of a market where AI governance capability hasn't caught up with AI investment velocity.
[Chart: NZ Board AI Literacy Levels. Source: RIVER Group, board engagement across 12 NZ enterprises, 2023-2024]

What the Gap Looks Like in Practice

The "It's Just IT" Mistake

The most common board-level error: treating AI as an IT project. Delegating it to the CIO. Evaluating it with the same framework used for ERP upgrades or cloud migrations.
AI is not an IT project. It's a business capability with IT components. The difference matters because:
  • AI ROI depends on business process redesign, not just technology deployment
  • AI risk includes reputational, ethical, and regulatory dimensions beyond IT risk
  • AI success requires organisational change, not just system implementation
  • AI governance needs business leadership, not just technical oversight
When boards delegate AI to IT, they get technically competent deployments that don't change how the business operates. The technology works. The value doesn't materialise.

The Vendor Trust Problem

Boards that lack AI literacy default to trusting the vendor. The vendor says it'll take six months and cost $500K. The vendor says accuracy will be 95%. The vendor says ROI will be 3x in year one.
A board with AI literacy would ask:
  • "What does 95% accuracy mean for our specific use case? What happens with the other 5%?"
  • "Is the $500K implementation cost or total cost including API consumption, infrastructure, and ongoing support?"
  • "What assumptions underpin the 3x ROI? Are those assumptions validated against our data and processes?"
  • "What's the vendor's track record with organisations similar to ours in scale and sector?"
Without literacy to ask these questions, boards approve business cases built on vendor optimism. The inevitable gap between promise and delivery erodes board confidence in AI, making the next business case harder to approve, even when it's well-constructed.
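The cost question above is the easiest one to make concrete. A minimal sketch, using entirely hypothetical figures (none of these numbers come from a real vendor or engagement), shows how a headline implementation quote diverges from total cost of ownership once recurring spend is counted:

```python
# Hypothetical three-year total cost of ownership for an AI deployment.
# Every figure below is an illustrative assumption, not real pricing data.

def total_cost_of_ownership(
    implementation: float,      # one-off vendor implementation fee
    monthly_api_spend: float,   # API / model consumption per month
    monthly_infra: float,       # hosting, monitoring, data pipelines
    annual_support: float,      # ongoing support and retraining
    years: int = 3,
) -> float:
    # Recurring costs accumulate every month for the whole period.
    recurring = (monthly_api_spend + monthly_infra) * 12 * years
    return implementation + recurring + annual_support * years

# The "$500K" headline versus a fuller three-year picture:
headline = 500_000
full = total_cost_of_ownership(
    implementation=500_000,
    monthly_api_spend=8_000,
    monthly_infra=4_000,
    annual_support=60_000,
)
print(f"Headline cost:   ${headline:,.0f}")
print(f"Three-year TCO:  ${full:,.0f}")
```

Under these assumed inputs the three-year figure is more than double the headline quote. The point isn't the specific numbers; it's that a board with enough literacy to ask for the recurring-cost lines can do this arithmetic in the meeting.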

The Risk Miscalibration

Boards tend to either dramatically overestimate or dramatically underestimate AI risk. Both are consequences of the literacy gap.
Overestimation: "AI will make decisions that expose us to liability." This leads to over-governance - committees, policies, and review processes so heavy that AI initiatives suffocate before delivering value. The board is managing perceived risk, not actual risk.
Underestimation: "It's just a tool, like any other software." This leads to under-governance - no AI-specific risk framework, no ethical review, no monitoring of AI decision quality. The board doesn't know what it doesn't know.
The right calibration is somewhere between these extremes, and it requires enough literacy to understand what AI can do, how it fails, and what governance is proportionate.
The goal isn't to turn every director into an AI engineer. It's to build enough literacy to govern well, and that has a practical solution.
Tim Hatherley-Greene
Chief Operating Officer

Why Standard Board Education Doesn't Work

Most attempts to close the AI literacy gap follow a familiar pattern: hire a consultant to deliver a board briefing. Two hours. Slides about transformative potential. Maybe a live demo of ChatGPT. Directors nod, ask a few questions, and leave knowing slightly more vocabulary but no more capability.
This doesn't work because:
It's too abstract. Generic AI briefings don't connect to the board's actual decisions. "AI can transform insurance claims" is less useful than "here's how AI would change our specific claims process, with our data, for our customers."
It's one-directional. Briefings deliver information. Literacy requires interaction - asking questions, challenging assumptions, making decisions and seeing consequences.
It's one-off. AI moves fast. A briefing from six months ago is already outdated. Literacy needs to be maintained, not acquired once.
It doesn't address the real barrier. The real barrier isn't knowledge. It's confidence. Directors don't ask probing questions about AI because they're afraid of revealing ignorance. The educational format needs to create psychological safety, not just transfer knowledge.

What Actually Works

Based on what we've seen produce genuine capability improvement in NZ boards:

1. Hands-On Workshops (Not Briefings)

Put the AI in front of directors. Let them use it. Give them a realistic business scenario and let them interact with an AI system, see what it does well, see where it fails, and form their own judgement. Two hours of hands-on experience builds more literacy than ten hours of presentations.

2. AI-Specific Board Committee

A standing committee (not a one-off working group) with AI governance as its mandate. Regular updates, regular education, regular engagement with management on AI strategy. This creates a sustained learning environment and signals that AI governance is a board-level priority.

3. Board Composition Review

The hardest conversation. Does the board have any directors with genuine technology depth? Not just "used to work in tech" - current, relevant understanding of AI and its implications. Most NZ boards don't. The skills matrix needs updating.
This doesn't mean replacing directors. It means adding AI-literate directors, appointing technology advisors, or creating formal advisory relationships with people who can translate AI reality for the board.

4. Management AI Reporting Standards

Require management to report on AI in a structured format that forces clarity:
  • What the AI does (in plain language, no jargon)
  • What data it uses (and the governance around that data)
  • How it performs (with specific, measurable metrics)
  • What risks it creates (specific, not generic)
  • What governance is in place (specific controls, not frameworks)
This structure forces management to think clearly about AI, and gives the board a consistent framework for evaluation.
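The reporting structure above can be sketched as a simple template. This is our own illustration, not a standard: the field names are assumptions, and a real board pack would adapt them to its own conventions. The useful part is the completeness check, which rejects a report that leaves any required section empty:

```python
from dataclasses import dataclass

# Illustrative sketch of the structured AI reporting format described above.
# Field names are hypothetical; adapt them to your board's conventions.

@dataclass
class AIBoardReport:
    system_name: str
    plain_language_description: str        # what the AI does, no jargon
    data_sources: list[str]                # what data it uses
    data_governance: str                   # governance around that data
    performance_metrics: dict[str, float]  # specific, measurable metrics
    specific_risks: list[str]              # specific, not generic
    controls_in_place: list[str]           # specific controls, not frameworks

    def is_complete(self) -> bool:
        """Return False if any required section is empty."""
        return all([
            self.plain_language_description,
            self.data_sources,
            self.data_governance,
            self.performance_metrics,
            self.specific_risks,
            self.controls_in_place,
        ])
```

Encoding the template this way makes the board's expectation mechanical: management either fills every section with specifics or the report visibly fails the check, which is exactly the clarity the format is meant to force.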

The Urgency

This isn't a "nice to have" governance improvement. It's urgent for three reasons:
Investment scale. NZ enterprises are moving from pilot-scale AI investment (tens of thousands of dollars) to production-scale (hundreds of thousands to millions). Board governance of these investments requires AI literacy.
Regulatory trajectory. AI regulation is coming - from Australia, from the EU, eventually from NZ. Boards will be expected to demonstrate governance competence. Starting now is preparation, not premature.
Competitive gap. Organisations with AI-literate boards make better AI investment decisions, faster. The compound effect over 2-3 years is significant.

Actionable Takeaways

  • Conduct an honest literacy assessment. Not a test - a confidential self-assessment that identifies where the gaps are. You can't close a gap you haven't measured.
  • Schedule hands-on AI sessions, not briefings. Quarterly, with real AI tools and real business scenarios. Budget two hours per session.
  • Review board composition for technology depth. If no director has current AI understanding, address that through appointment, advisory, or formal education.
  • Standardise AI reporting to the board. Consistent format, specific metrics, clear risk assessment. Don't accept generic "AI strategy" updates.
  • Start now. The literacy gap widens with every month of AI advancement. The sooner you begin, the smaller the gap you need to close.