I've spent years working in health IT across New Zealand, from primary health organisations to Crown Research Institutes. The digital health projects that actually work have one thing in common, and it's not the technology. It's how closely they track to the way clinicians and administrators already operate.
## The Short Version
- Health IT implementations succeed when they're built around existing clinical workflows, not when they ask clinicians to adapt to new ones.
- The biggest risk in primary care IT isn't choosing the wrong platform. It's underestimating how fragmented and context-dependent the daily workflow is.
- Vendor-led roadmaps rarely account for the realities of New Zealand primary care, where a single practice might serve urban professionals, rural farmers, and Māori health programmes simultaneously.
- The systems that stick are the ones that make existing processes faster, not the ones that replace them with something "better."
## What I've Seen Work
At Rotorua Area Primary Health Services, I managed information systems across a network that served diverse communities with genuinely different needs. That experience taught me something vendors don't like to hear: the best system is the one that disappears into the workflow.
> **~60%** of health IT implementations fail to achieve intended outcomes within the first three years.
> *Source: HIMSS Analytics, Digital Health Indicator Survey, 2021*
GPs in New Zealand don't have the luxury of long, uninterrupted sessions with software. They're running 15-minute consultations back to back. They're switching between patient management systems, lab results, referral portals, and community health records. Any tool that adds steps to that process, even well-intentioned steps, will be abandoned within weeks.
The implementations I've seen work share three traits.
### They start with observation, not requirements
Before we configured anything at RAPHS, we spent time watching how staff actually used their systems. Not how the training manual said they should use them. How they actually did. The gap between those two things is where most implementations fail.
A practice nurse managing chronic conditions has a mental model of her workflow that doesn't map to any system diagram. She knows which fields matter. She knows which ones she skips. She knows the workarounds she's built over years. If your implementation ignores that knowledge, you're fighting the people you're trying to help.
### They solve for the constraint, not the feature
Primary care in New Zealand operates under real constraints. Funding models, workforce shortages, geographic spread, ageing infrastructure. The winning implementations don't try to solve everything. They identify the single biggest bottleneck and address it.
At one practice, the bottleneck was referral tracking. GPs were losing visibility of patients once they referred them to secondary care. We didn't need a full platform overhaul. We needed a reliable way to close the loop. That specific, targeted solution stuck. The "comprehensive digital transformation" another vendor proposed the same year did not.
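To make "closing the loop" concrete, here is a minimal sketch of the idea, not the system we actually deployed. The data model and field names are hypothetical: each outbound referral carries a status, and anything unacknowledged past a threshold gets surfaced for follow-up.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Referral:
    patient_id: str
    referred_to: str            # secondary-care service
    sent: date
    acknowledged: bool = False  # has secondary care responded?

def overdue(referrals, today, threshold_days=14):
    """Return referrals still unacknowledged after the threshold."""
    cutoff = today - timedelta(days=threshold_days)
    return [r for r in referrals if not r.acknowledged and r.sent <= cutoff]

# Example: one referral sent 20 days ago, never acknowledged
refs = [
    Referral("NHI-001", "Cardiology", date(2024, 1, 2)),
    Referral("NHI-002", "Radiology", date(2024, 1, 18), acknowledged=True),
]
late = overdue(refs, today=date(2024, 1, 22))  # flags the Cardiology referral
```

The point is the narrowness of the solution: a single status flag and a daily overdue list, not a platform.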
### They plan for the long tail
The first 80% of any health IT project is relatively straightforward. Standard workflows, common scenarios, predictable data. The last 20% is where it gets hard. Edge cases. Practices that operate differently from the reference model. Community health programmes with their own reporting requirements.
> **18 months**: average time for NZ primary care practices to fully adopt new clinical software.
> *Source: NZIER, Health IT Adoption Study, 2020*
Most vendors budget for the first 80%. The implementations that actually work budget for the full 100%, and they accept that the last 20% will take longer and cost more than anyone estimated.
## The Vendor Problem
I don't blame vendors for this. They're building products for a market, and the New Zealand primary care market is small. But the consequence is that most health IT products are designed for a generic practice that doesn't exist. They assume standardised workflows, consistent data quality, and staff who have time for training.
In reality, NZ primary care is extraordinarily diverse. A practice in central Auckland operates nothing like one in Murupara. Their patient populations are different. Their funding models are different. Their staffing models are different. And yet they're often expected to use the same system configured the same way.
The organisations that bridge this gap successfully are the ones that invest in local configuration and support. Not just a national rollout plan, but practice-by-practice adaptation. It's slower. It's more expensive. And it's the only approach I've seen consistently work.
## What I'd Tell a CIO
If you're leading health IT for a primary care organisation, here's what I'd suggest based on what I've seen succeed and fail.
Spend more time in practices than in vendor demos. The demo will always look good. The question is whether it will look good at 8am on a Monday when the practice nurse has 30 patients to process and the locum GP can't find the referral form.
Budget for change management, not just implementation. The technology is maybe 40% of the cost. The other 60% is training, workflow redesign, and ongoing support. If your budget is 100% technology, your implementation will fail.
Measure adoption, not deployment. Deploying a system is not the same as people using it. Track actual usage patterns. If staff are reverting to workarounds within three months, the implementation hasn't succeeded no matter what the project report says.
Accept that "good enough" often beats "best in class." The perfect system that takes three years to implement will be overtaken by the adequate system that's working in six months. In primary care, where the environment changes constantly, speed of adoption matters more than feature completeness.
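The "measure adoption, not deployment" point can be made operational with a very simple metric, assuming you can export per-user activity logs from the system. The log format and names here are hypothetical: compare staff actively using the system each week against the number of licensed accounts.

```python
from collections import defaultdict
from datetime import date

# Hypothetical activity log: (user_id, date of any meaningful action)
events = [
    ("nurse-a", date(2024, 3, 4)), ("nurse-a", date(2024, 3, 11)),
    ("gp-b",    date(2024, 3, 4)),
    ("admin-c", date(2024, 3, 11)),
]
licensed_users = {"nurse-a", "gp-b", "admin-c", "locum-d"}

def weekly_adoption(events, licensed):
    """Fraction of licensed users active in each ISO week."""
    active = defaultdict(set)
    for user, day in events:
        active[day.isocalendar()[:2]].add(user)  # key: (year, week)
    return {week: len(users & licensed) / len(licensed)
            for week, users in sorted(active.items())}

rates = weekly_adoption(events, licensed_users)
# Here only half the licensed staff are active in any given week:
# a deployment report would say 100%, the adoption number says 50%.
```

A flat or falling curve three months in is the early-warning sign that staff are reverting to workarounds.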
The technology is never the hard part. The hard part is making it work for the people who have to use it every day, in conditions that no vendor demo will ever replicate.
