New Zealand released "Investing with Confidence" in July 2025, the country's first national AI strategy. We were the last OECD country to do so. That delay tells you something. What you think it tells you probably depends on whether you see caution as wisdom or hesitation.
The Strategy
The approach is deliberately light-touch. No new AI-specific legislation. Instead, alignment with the existing Privacy Act, Human Rights Act, and public sector frameworks. Treaty of Waitangi considerations are threaded throughout, with explicit acknowledgement that AI systems must account for Māori data sovereignty and cultural values.
Two flagship programmes anchor the practical side:
AI Activator provides funding and support for small-to-medium enterprises exploring AI adoption. The programme targets businesses that know they should be doing something with AI but don't know where to start.
GovGPT deploys AI assistants for local councils, starting with routine citizen queries: building consent status, rates information, service requests. The goal is to demonstrate public sector AI use that is practical, bounded, and transparent.
Last: OECD country to release a national AI strategy (July 2025). Source: OECD AI Policy Observatory, 2025.
The Spectrum
To understand NZ's position, look at what sits on either side.
The EU AI Act is the most prescriptive framework globally. It classifies AI systems by risk level, bans certain uses outright (social scoring, real-time biometric surveillance in public spaces), and imposes mandatory conformity assessments for high-risk applications. It is thorough, expensive to comply with, and slow to implement.
Australia sits between the EU and NZ. They built "Matilda," a sovereign large language model trained on Australian data, initially targeting the Australian Tax Office. They quietly dropped the "sovereign AI" label after it attracted more political attention than they wanted, but the investment is real: a government-funded AI capability designed to process Australian data on Australian infrastructure under Australian law.
NZ chose a third path. No prescriptive regulation. No sovereign model. A light framework that trusts existing legislation and focuses investment on adoption support rather than compliance infrastructure.
The Case for Light-Touch
Small economies can move faster with less regulation. The EU AI Act's mandatory conformity assessments for high-risk AI systems create a compliance overhead that favours large corporations with legal departments and penalises startups and SMEs. NZ enterprises don't carry that burden.
The Privacy Act already covers most AI-related data concerns. Personal information, consent, purpose limitation, and cross-border data transfer rules all apply to AI systems without needing new legislation. Rather than building a parallel framework, NZ chose to enforce what already exists.
Treaty of Waitangi integration gives NZ a genuinely distinctive angle. The conversation about AI governance in most countries is purely techno-economic. NZ's strategy explicitly addresses indigenous data sovereignty, cultural considerations in algorithmic decision-making, and equitable access. That framework, if executed well, offers something no other OECD country has attempted at strategy level.
The Case Against
Being last has consequences. While NZ deliberated, other countries built infrastructure, attracted talent, and established regulatory clarity that gives their enterprises confidence to invest. Light-touch regulation means NZ companies operating internationally may still need to comply with the EU AI Act, Australian standards, or both, without domestic guidance on how to do so.
The absence of a sovereign AI capability is a real gap. Australia's investment in Matilda reflects a strategic bet: that AI systems processing government data should run on domestic infrastructure with domestic oversight. NZ's approach relies on international cloud providers and their compliance frameworks. For routine applications, that's fine. For sensitive government or health data, the dependency is worth questioning.
2: flagship programmes in NZ's AI strategy: AI Activator (SME adoption) and GovGPT (council AI assistants). Source: NZ Government, "Investing with Confidence," 2025.
Adoption support without governance infrastructure creates a different risk. AI Activator helps businesses adopt AI tools. But which tools? Under what standards? With what obligations around transparency, bias testing, or impact assessment? Light-touch works until something goes wrong. The first high-profile AI incident involving an NZ company or government agency will test whether existing legislation is genuinely sufficient or just untested.
Our View
We work with NZ enterprises adopting AI every day. The light-touch approach creates real opportunity for organisations willing to move. There is less regulatory friction, less compliance overhead, and more room to experiment than in almost any other developed economy.
But opportunity without infrastructure is fragile. NZ needs investment in three areas the current strategy underweights:
AI governance tooling. Practical frameworks, not legislation, that help organisations assess risk, test for bias, and document decisions. The government doesn't need to mandate these. But it should fund their development and make them freely available.
Data infrastructure. NZ organisations consistently cite data quality as their biggest barrier to AI adoption. The strategy acknowledges this but underinvests in practical support for data modernisation, particularly in the public sector.
Workforce transition. AI Activator addresses adoption. Nobody is addressing the skills displacement that follows. Training programmes, career transition support, and honest communication about which roles will change are all missing from the current strategy.
Light-touch regulation is an advantage if NZ enterprises move fast enough to capitalise on it. The risk is that we confuse the absence of friction with the presence of strategy.
Isaac Rolfe
Managing Director
The strategy is pragmatic, and pragmatism suits NZ. But pragmatism only works if it's backed by investment, infrastructure, and honesty about the gaps. The next 18 months will determine which version of "light-touch" we get: the bold kind or the behind kind.

