Māori health inequities in Aotearoa are well documented, persistent, and unacceptable. AI is being positioned as a tool to address them. It could be. But only if the people deploying AI in health understand that the technology alone is not neutral, the data it runs on is not neutral, and the institutions using it have a history that demands deliberate, sovereignty-centred design.
What You Need to Know
- Māori experience worse health outcomes across nearly every measure in Aotearoa. AI has genuine potential to improve screening, diagnosis, resource allocation, and care coordination. The opportunity is not hypothetical.
- Health data about Māori is not just "health data." It carries whakapapa, it reflects whānau, and it is subject to data sovereignty obligations articulated by Te Mana Raraunga and grounded in Te Tiriti o Waitangi.
- AI systems trained on general population data will reproduce existing inequities. If Māori are underrepresented in training data, the models will underperform for Māori. This is not a bug. It is a structural feature of how these systems work.
- The path forward requires Māori governance of Māori health AI, not consultation, not co-design as a checkbox, but actual governance authority over how data is collected, how models are built, and how outputs are used.
The Opportunity
Louise has spent years working within health systems across the Pacific and Aotearoa, often at the intersection of population health data and community outcomes. The patterns are visible from that vantage point. AI could meaningfully change three areas of Māori health:
Earlier identification. Many Māori health conditions are diagnosed later than they should be. AI-assisted screening tools, particularly in primary care settings where access barriers already exist, could identify risk factors and trigger earlier intervention. Cardiovascular disease, diabetes, and certain cancers all have better outcomes with earlier detection.
Resource allocation. Health funding in Aotearoa is distributed through models that may not reflect the actual needs of Māori communities. AI-driven analysis of health utilisation data, demographic patterns, and outcome metrics could identify where resources are needed most. But this only works if the data accurately represents Māori health needs, and that requires Māori input into what gets measured.
Care coordination. Māori health is best understood through models like Te Whare Tapa Whā, which recognises taha wairua (spiritual), taha hinengaro (mental), taha tinana (physical), and taha whānau (family) as interconnected dimensions of wellbeing. AI systems that can integrate data across these dimensions, connecting primary care, mental health, social services, and community support, could enable genuinely holistic care coordination.
7 years: the life expectancy gap between Māori and non-Māori in Aotearoa, a gap that has not meaningfully closed in two decades. (Source: Ministry of Health, New Zealand Health Survey, 2023)
The Risk
The risk is not that AI will be deployed badly. The risk is that it will be deployed in ways that look good on paper while reproducing the structural dynamics that created the inequities in the first place.
Data That Reflects the Problem
Health datasets in Aotearoa encode the consequences of colonisation, institutional racism, and unequal access. When an AI model is trained on this data, it learns the patterns embedded in it. If Māori access primary care at lower rates, the model learns that Māori have fewer primary care interactions. It does not learn that this reflects access barriers, not lower need.
This is not a technical failure. It is a data governance failure. And it cannot be solved by better algorithms. It can only be solved by ensuring that the data used to train health AI systems is understood in its full context, including the structural factors that shaped it.
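Better algorithms cannot fix this, but one narrow technical practice does support the scrutiny the section calls for: disaggregated evaluation, measuring a model's performance separately for each group instead of reporting a single aggregate that hides subgroup gaps. A minimal sketch, with entirely hypothetical records, field names, and values:

```python
# Sketch of disaggregated evaluation. The records, field names, and
# group labels below are hypothetical, for illustration only.
from collections import defaultdict

def accuracy_by_group(records, group_key="ethnicity"):
    """Return per-group accuracy and each group's gap to the best-performing group."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        g = r[group_key]
        total[g] += 1
        correct[g] += int(r["prediction"] == r["outcome"])
    acc = {g: correct[g] / total[g] for g in total}
    best = max(acc.values())
    gaps = {g: best - a for g, a in acc.items()}
    return acc, gaps

# An aggregate accuracy of 4/5 = 0.8 would hide the subgroup difference below.
records = [
    {"ethnicity": "Māori", "prediction": 1, "outcome": 0},
    {"ethnicity": "Māori", "prediction": 1, "outcome": 1},
    {"ethnicity": "non-Māori", "prediction": 0, "outcome": 0},
    {"ethnicity": "non-Māori", "prediction": 1, "outcome": 1},
    {"ethnicity": "non-Māori", "prediction": 1, "outcome": 1},
]
acc, gaps = accuracy_by_group(records)
```

Disaggregation only surfaces a gap; interpreting it, and deciding what to do about it, is exactly the governance work the section describes.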
Consultation Without Authority
Many health AI initiatives in New Zealand include a "Māori engagement" component. This typically involves consulting with Māori stakeholders, incorporating feedback, and documenting the process. It is well-intentioned. It is insufficient.
Consultation without decision-making authority is not governance. If a Māori advisory group identifies a concern but has no mechanism to require changes, the consultation is performative. Genuine data sovereignty means Māori governance structures have the authority to approve, modify, or reject how Māori health data is used in AI systems.
Models of Wellbeing
Western biomedical models and Māori models of wellbeing are not interchangeable. They do not merely use different terms for the same concepts. They represent fundamentally different understandings of what health means, how it is maintained, and how it is restored.
An AI system designed around a biomedical model will produce biomedical outputs. For Māori patients and whānau, these outputs may be technically correct but culturally incomplete. A system that recommends a medication without considering whānau context, spiritual wellbeing, or the patient's relationship with whenua is providing partial care.
"You cannot build AI that serves Māori health by adding a cultural layer to a Western system. All of it."
Dr Tania Wolfgramm, Chief Research Officer
What Good Looks Like
Māori Governance From the Start
Not consultation. Governance. Māori health AI initiatives should be governed by structures that have the authority to determine how Māori data is collected, stored, processed, and interpreted. This means Māori representation on steering committees, data governance boards, and technical review panels, with decision-making authority, not advisory roles.
Training Data That Tells the Full Story
If your training data reflects inequitable access patterns, acknowledge that and adjust for it. This requires collaboration between data scientists and people who understand the structural context of the data. Louise's years of work with population health data reinforce the same point: data without context is dangerous, and in health it can be harmful.
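One narrow form such an adjustment can take, offered only as an illustration, is inverse-representation sample weighting, so an underrepresented group contributes proportionally to the training loss rather than being drowned out. The group labels and shares below are hypothetical, and this addresses numerical underrepresentation only; it does nothing about the structural context of the data, and any target shares would themselves be a governance decision, not a data-science one.

```python
# Sketch of inverse-representation sample weighting. Group labels and
# target shares are hypothetical; choosing them is a governance decision.
from collections import Counter

def inverse_representation_weights(groups, target_share=None):
    """Weight each sample by desired_share / observed_share for its group.

    If target_share is None, weight toward equal representation across groups.
    """
    n = len(groups)
    counts = Counter(groups)
    k = len(counts)
    weights = []
    for g in groups:
        observed = counts[g] / n
        desired = target_share[g] if target_share else 1 / k
        weights.append(desired / observed)
    return weights

# 2 of 10 samples are Māori; equal representation is the (assumed) target.
groups = ["Māori"] * 2 + ["non-Māori"] * 8
w = inverse_representation_weights(groups)
```

Most training libraries accept per-sample weights of this shape, so the adjustment is visible and auditable rather than hidden inside a model.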
Culturally Grounded Evaluation
How do you measure whether a health AI system is working for Māori? Not just by clinical accuracy metrics. By outcomes that reflect Māori models of wellbeing. Are whānau involved in care decisions? Is the system accessible in te reo Māori? Does it support holistic care coordination? These are evaluation criteria that standard AI assessments don't include.
Transparency About Limitations
Every AI system has limitations. In Māori health, those limitations have equity implications. If a model performs less accurately for Māori patients, that must be disclosed, explained, and addressed. Not buried in a technical appendix.
Where We Stand
We are at the beginning of this conversation in Aotearoa. AI in health is still largely experimental, and Māori health AI is even more nascent. That is actually good news. It means there is time to build this properly, to establish governance structures, to create data frameworks, to develop evaluation criteria that reflect Māori values.
The worst outcome would be rushing to deploy AI in Māori health without doing this foundational work. The second-worst outcome would be deciding it is too hard and doing nothing.
The opportunity is real. The risk is real. The sovereignty requirement is non-negotiable. Within those constraints, there is meaningful work to be done.