We talk a lot about AI capabilities. Model performance, integration architecture, data pipelines. But the most consistent predictor of whether an AI initiative succeeds or fails has nothing to do with technology. It's whether the leaders driving the initiative can see it from their people's perspective.
What You Need to Know
- Perspective-taking is the deliberate practice of understanding how others experience a situation. It is not the same as empathy (feeling what they feel) or sympathy (feeling for them). It is a cognitive skill that can be developed.
- Leaders who practise perspective-taking during AI rollouts see significantly higher adoption rates. The research is clear: when people feel understood, resistance decreases.
- Most AI adoption strategies are designed from the leader's perspective, not the user's. This is the root cause of the "great demo, no adoption" pattern.
- Perspective-taking is a learnable skill, not a personality trait. It can be taught, practised, and integrated into standard change management.
The Research
Gerson's work on perspective-taking in organisational change draws on three decades of social psychology research, adapted for the specific context of technology adoption. The core finding is consistent across studies: when leaders demonstrate genuine understanding of how change affects the people experiencing it, those people are more likely to engage constructively.
This is not about being "nice." It is about being accurate. A leader who understands that their claims team sees AI as a threat to their expertise will make different decisions than a leader who assumes the team is simply resistant to change. The first leader addresses a real concern. The second leader dismisses one.
2.4x
higher adoption rates in AI initiatives where leaders demonstrated active perspective-taking behaviours
Source: Tuazon, G., Perspective-Taking in Organisational Technology Adoption, working paper, 2023
[Chart: Impact of Perspective-Taking on AI Adoption. Source: Tuazon, G., working paper, 2023]
The mechanism is straightforward. Perspective-taking improves three things that matter for adoption:
Communication accuracy. When you understand how someone experiences a change, you can explain it in terms that make sense to them. "This AI will process claims faster" means something different to a claims handler (who hears "my job is being automated") than to a CFO (who hears "lower operating costs"). The same fact, framed differently, produces opposite emotional responses.
Design quality. AI systems designed with genuine understanding of the user's workflow, concerns, and mental models are better products. They fit into existing patterns rather than disrupting them. They address real friction rather than imagined friction.
Trust formation. Trust is the currency of change management. People extend trust to leaders who demonstrate understanding. Not agreement. Understanding. You can acknowledge someone's concern without agreeing that the concern should prevent the change.
What This Looks Like in Practice
Before the Rollout
Most AI adoption strategies start with a business case and a project plan. The perspective-taking approach adds a step: understanding the current experience of the people who will use the system.
This is not a survey. Surveys aggregate. Perspective-taking requires specificity. What does a claims handler's Tuesday look like? What are the frustrations they already manage? What are they good at, and proud of? What would they lose if their workflow changed?
We use structured conversations, not focus groups. One-on-one or small groups, with open questions and genuine listening. The goal is not to validate the AI initiative. The goal is to understand the landscape it's entering.
During Design
The insights from perspective-taking conversations directly inform AI design decisions. If handlers are proud of their ability to spot unusual claims, the AI should augment that skill, not replace it. If they're frustrated by repetitive data entry, the AI should start there. If they're worried about being judged by the AI's output, the system should be framed as a tool they control, not a monitor they're subject to.
These are design decisions that technology teams rarely make on their own, because they require understanding a context they haven't lived in.
During Deployment
The deployment phase is where perspective-taking pays the most visible dividends. Leaders who understand their people's concerns can anticipate resistance and address it before it calcifies. They can identify early adopters, not by enthusiasm for technology, but by readiness for the specific change this technology represents.
They can also spot the difference between reasonable feedback ("the extraction misses handwritten notes") and resistance disguised as feedback ("we need to test this for another six months"). Both deserve responses, but different ones.
The leaders who get AI adoption right aren't the ones with the best strategy decks. They're the ones who can sit in a room with their claims team and genuinely hear what the team is telling them.
Tim Hatherley-Greene
Chief Operating Officer
The Practical Framework
Gerson and I have been developing a framework that integrates perspective-taking into standard AI change management. It's built around four practices:
1. Map before you move. Before announcing an AI initiative, map the perspectives of every stakeholder group. Not their "buy-in level." Their actual experience, concerns, and motivations.
2. Design from the edge. Design the AI system starting from the perspective of the person furthest from the decision to adopt it. If the frontline user's experience works, everything above it usually follows.
3. Communicate in their language. Translate every communication about the AI initiative into the terms that matter to each audience. This isn't spin. It's accuracy. The same initiative genuinely means different things to different people.
4. Measure what they measure. Track adoption metrics that reflect the user's experience, not just the organisation's objectives. Time-to-task. Confidence in outputs. Frequency of override. These tell you whether the system is working for the people using it.
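As a rough illustration of the fourth practice, the three user-experience metrics above could be summarised from a simple usage log. This is a minimal sketch only; the event schema and field names (`seconds_to_complete`, `confidence_rating`, `overridden`) are hypothetical, not part of any framework described here.

```python
from dataclasses import dataclass

@dataclass
class TaskEvent:
    """One completed task in the AI-assisted workflow (illustrative schema)."""
    seconds_to_complete: float   # time-to-task
    confidence_rating: int       # user's 1-5 confidence in the AI's output
    overridden: bool             # did the user override the AI's suggestion?

def adoption_metrics(events: list[TaskEvent]) -> dict[str, float]:
    """Summarise the three user-facing metrics named in the text."""
    n = len(events)
    return {
        "avg_time_to_task_s": sum(e.seconds_to_complete for e in events) / n,
        "avg_confidence": sum(e.confidence_rating for e in events) / n,
        "override_rate": sum(e.overridden for e in events) / n,
    }

# Example: three tasks drawn from a week's (invented) usage log
events = [
    TaskEvent(120.0, 4, False),
    TaskEvent(95.0, 5, False),
    TaskEvent(210.0, 2, True),
]
print(adoption_metrics(events))
```

A rising override rate or falling confidence rating flags a system that is failing its users even while organisation-level objectives look healthy, which is the point of measuring from the user's side.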
Why This Matters Now
Enterprise AI is moving from pilot to production across New Zealand. The initiatives that succeed will not be the ones with the best models or the biggest budgets. They will be the ones led by people who understand, genuinely understand, how the change will land.
Perspective-taking is not soft. It is a strategic capability with measurable impact on adoption, satisfaction, and ROI. The research supports this. Our practice confirms it.
If you're planning an AI rollout, start by listening. Not to validate your plan. To understand the world your plan is entering.

