
Whānau-First AI Design

AI design that puts whānau first. Not individual users but families, communities, and collectives. A different design paradigm for a different worldview.
20 October 2025·8 min read
Dr Tania Wolfgramm
Chief Research Officer
Every AI system I encounter is designed for an individual user. One person, one interface, one set of preferences, one decision. This is so deeply embedded in technology design that it feels like a natural law. It is not. It is a cultural assumption. And for a significant portion of the world's population, it is the wrong assumption.

The Individual Default

Modern technology design assumes the individual as the atomic unit. User accounts are individual. Preferences are personal. Recommendations are personalised. Decision support is directed at a single decision-maker.
This maps well to Western, individualist cultural contexts where autonomy and individual choice are primary values. It maps poorly to collectivist contexts where decisions are made within relationships, where identity is communal, and where the wellbeing of the group shapes individual action.
In te ao Māori, the concept of whānau extends beyond the nuclear family. It encompasses extended family, community, and the network of relationships that define a person's place in the world. Decisions about health, education, finances, and wellbeing are not individual decisions. They are whānau decisions.
An AI system designed for an individual user, deployed in a whānau context, is not just culturally insensitive. It is functionally incomplete. It is solving for the wrong unit.

What Whānau-First Design Means

Whānau-first AI design starts with a different question. Not "what does this user need?" but "what does this whānau need, and how does this person's interaction serve the collective?"

Collective Decision Support

In a whānau context, a health AI should not just provide information to a patient. It should support the conversation that the patient has with their whānau about health decisions. This means:
  • Information presented in a way that can be shared and discussed, not just consumed individually
  • Decision frameworks that account for collective values, not just individual preferences
  • Outputs that acknowledge the role of elders, caregivers, and community leaders in decision-making
This is not about dumbing down the AI or adding a "share" button. It is about fundamentally redesigning the information architecture to serve collective decision-making.

Relational Data Models

Individual-centred systems store data in user profiles. Whānau-centred systems need relational data models that represent the connections between people and the collective contexts they belong to.
A whānau health dashboard would not just show one person's health data. It would show patterns across the whānau, with appropriate consent and governance, that help the collective make better decisions. Are the tamariki up to date on vaccinations? Is anyone in the whānau showing early signs of a condition that runs in the family? What community health resources are available?
The data model shifts from individual records to relational networks. This is technically more complex, but it reflects the reality of how health decisions are made in many communities.
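The shift from individual records to relational networks can be pictured in a minimal sketch. Everything here is illustrative: the names, fields, and the `kaitiaki_of` role are assumptions for the example, not a real schema or product.

```python
from dataclasses import dataclass, field


@dataclass
class Person:
    """An individual record: the atomic unit of conventional design."""
    name: str
    health_flags: list[str] = field(default_factory=list)


@dataclass
class Whanau:
    """A relational record: people plus the ties between them."""
    members: list[Person] = field(default_factory=list)
    # Edges such as ("Mere", "kaitiaki_of", "Tama") capture roles
    # within the collective, not just membership.
    relationships: list[tuple[str, str, str]] = field(default_factory=list)

    def shared_flags(self) -> set[str]:
        """Patterns visible only at the collective level, e.g. a
        condition that appears in more than one member."""
        counts: dict[str, int] = {}
        for person in self.members:
            for flag in person.health_flags:
                counts[flag] = counts.get(flag, 0) + 1
        return {flag for flag, n in counts.items() if n > 1}


# A condition that runs in the family shows up only in the relational view;
# neither individual profile alone reveals the pattern.
w = Whanau(
    members=[Person("Mere", ["prediabetes"]), Person("Tama", ["prediabetes"])],
    relationships=[("Mere", "kaitiaki_of", "Tama")],
)
print(w.shared_flags())  # {'prediabetes'}
```

The point of the sketch is the query surface: `shared_flags` has no meaning on a single `Person`, which is exactly the kind of question individual-centred profiles cannot answer.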

Shared Governance

Who controls the AI? In individual-centred design, the user controls their data and their experience. In whānau-centred design, governance is shared.
This raises questions that individual-centred design does not have to answer: Who can see whose data? Who makes decisions about the AI's role? How are disagreements resolved? Who speaks for the collective?
These are not technical questions. They are governance questions rooted in tikanga Māori. The technology must support whatever governance structure the whānau determines, not impose its own.
The hardest part of whānau-first design is not the technology. It is convincing technology designers that the individual user is not the only valid starting point.

Design Principles

1. The Unit Is the Relationship

Design for the relationship between people, not for people in isolation. An AI health assistant serves the relationship between a patient and their whānau. An AI education tool serves the relationship between a student and their learning community. The interface, the data model, and the decision support all centre the relationship.

2. Consent Is Collective

Data governance in whānau-centred design requires collective consent mechanisms. Not just "does this individual consent to data collection?" but "does the whānau consent to how this data will be used in the context of their collective?" This is consistent with Te Mana Raraunga (Māori Data Sovereignty) principles.
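One way layered consent could look in code, as a rough sketch only: the field names and the single kaitiaki layer are assumptions for illustration, not a mechanism endorsed by Te Mana Raraunga, and a real system would record whatever structure the whānau itself chooses.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    """Consent layered as the principle describes: individual consent
    is necessary but not sufficient on its own."""
    individual_ok: bool = False
    # Approvals from whoever the whanau designates as kaitiaki
    # (guardians) of the data. The dict is empty until the whanau
    # names someone; the system does not pick for them.
    kaitiaki_approvals: dict[str, bool] = field(default_factory=dict)

    def granted(self) -> bool:
        # Use requires the individual AND at least one named kaitiaki,
        # with every named kaitiaki in agreement.
        return (
            self.individual_ok
            and bool(self.kaitiaki_approvals)
            and all(self.kaitiaki_approvals.values())
        )


record = ConsentRecord(individual_ok=True)
print(record.granted())  # False: no collective approval yet
record.kaitiaki_approvals["Aunty Hine"] = True
print(record.granted())  # True
```

The design choice worth noticing is the default: with no collective approval recorded, `granted()` is `False` even when the individual has said yes.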

3. Outputs Serve Conversation

AI outputs in a whānau context should be designed to support conversation, not replace it. A health recommendation is a conversation starter, not a directive. An educational assessment is a topic for family discussion, not a verdict. The AI provides information and framing. The whānau provides wisdom and decision-making.

4. Time Operates Differently

Whānau decision-making operates on a different timescale than individual decision-making. Decisions may involve consultation with people who are not immediately available. They may require hui (meetings). They may draw on ancestral knowledge and long-term thinking.
AI systems designed for rapid individual decisions do not work in this context. Whānau-first AI needs to support extended decision timelines, asynchronous input from multiple people, and iterative refinement as the conversation develops.
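An extended, asynchronous timeline can be sketched as a decision object that simply refuses to close early. The class and member names below are hypothetical, and the close condition (everyone consulted has spoken) is one possible rule among many the whānau might set.

```python
from dataclasses import dataclass, field


@dataclass
class WhanauDecision:
    """A decision that stays open until everyone consulted has
    contributed, rather than resolving on one user's click."""
    question: str
    consulted: set[str] = field(default_factory=set)
    responses: dict[str, str] = field(default_factory=dict)

    def record(self, member: str, view: str) -> None:
        # Input arrives asynchronously, in any order, over days or
        # weeks; a member can revise their view as the korero develops.
        self.responses[member] = view

    def ready_to_close(self) -> bool:
        # No outcome is produced until every consulted member has spoken.
        return self.consulted <= set(self.responses)


decision = WhanauDecision(
    question="Should we change the treatment plan?",
    consulted={"Mere", "Tama", "Aunty Hine"},
)
decision.record("Mere", "support")
print(decision.ready_to_close())  # False: two voices still to come
```

Because `record` overwrites earlier entries, the structure also supports the iterative refinement the text describes: views can change before the decision closes.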

5. Mana Is Preserved

Every person in a whānau has mana. The AI system must not diminish anyone's mana by design. This means not privileging one person's input over another's based on technical literacy. Not making people feel inadequate for asking questions. Not replacing the wisdom of elders with algorithmic recommendations.

Practical Application

This is not theoretical. There are immediate applications:
  • Whānau Ora programmes already take a whānau-centred approach to health and social services. AI tools designed for Whānau Ora should reflect this approach in their information architecture, not bolt collective features onto an individual-centred system.
  • Pacific health services serving communities where collective decision-making is the norm, with AI diagnostic support that presents information for family discussion, not just individual consumption.
  • Education in kura kaupapa (Māori-medium schools), where learning is communal and assessment serves the collective, not just the individual learner.
  • Community organisations that make decisions through hui and consensus rather than individual authority.

The Industry Challenge

The global AI industry is not designed for this. Model training is based on individual interactions. Evaluation metrics assume individual users. UX patterns centre the individual. Building whānau-first AI requires working against the grain of the entire technology ecosystem.
This is hard. But the alternative, forcing collectivist communities into individualist technology, is worse. It does not just produce poor technology. It produces technology that actively undermines the cultural values of the people it claims to serve.
The opportunity for Aotearoa is to lead. No other country in the developed world is better positioned to design AI for collective contexts. We have the cultural knowledge. We have the growing AI capability. We have the communities ready to co-design the solutions.
What we need is the will to build differently.