Every health IT vendor says their platform is "clinician-friendly." After two decades of watching implementations, I can tell you what that usually means: they've added a dashboard. The underlying assumption, that clinicians will adapt their workflows to the software, hasn't changed. And it's still wrong.
What You Need to Know
- Clinical workflows are shaped by years of practice, regulation, and patient safety requirements. They don't bend to accommodate software design choices.
- The most successful health IT systems are the ones that observe existing workflows and build around them, rather than proposing idealised workflows that look good in a demo.
- AI creates a new opportunity here. Instead of forcing clinicians into rigid software paths, AI can adapt to the clinician's natural workflow in real time. But only if the system is designed that way from the start.
- The cost of getting this wrong isn't just wasted IT budget. It's clinician burnout, degraded patient care, and institutional resistance to future technology adoption.
The Workflow Problem
I've seen this pattern repeat across every health IT role I've held. A vendor presents a system. It has a logical flow: patient registration, clinical notes, orders, referrals, billing. It makes perfect sense on a screen.
Then it meets a GP who sees 40 patients a day in 15-minute slots.
That GP doesn't follow a linear workflow. She checks lab results while the patient is talking. She dictates notes between appointments. She writes a referral, gets interrupted by a phone call about another patient, comes back, and finishes the referral from memory. She processes prescriptions in batches at the end of the day because it's faster.
77 separate tasks per hour performed by a GP during patient consultations (Source: Annals of Family Medicine, Physician Time Study, 2022)
The vendor's logical flow requires her to complete each step before moving to the next. It times out her session if she's inactive for five minutes. It requires three clicks to switch between patients. Each friction point is small. Together, they turn a manageable day into an exhausting one.
This isn't a training problem. You can train someone on a system and they'll still reject it if it fights their natural rhythm. Clinical workflows are shaped by constraints that software designers don't always see: time pressure, interruption frequency, cognitive load, patient safety protocols, and the physical environment of the practice.
Why Vendor-Led Design Fails
Most health IT vendors design from the data model up. They start with what the system needs to capture, then build interfaces around those data requirements. The result is software that's optimised for data collection, not for clinical work.
I've sat in enough vendor demonstrations to recognise the pattern. The demo shows a clean, logical sequence. The presenter clicks through screens smoothly. Everything flows. Nobody interrupts the presenter mid-task. Nobody hands them a phone message while they're writing a note. Nobody asks them to squeeze in an urgent patient while they're processing a referral.
The demo environment and the clinical environment have almost nothing in common.
The vendors that break this pattern do something different. They put their designers in practices for weeks, not hours. They watch. They count clicks. They time tasks. They ask clinicians what they hate about their current system and, more importantly, what they've built workarounds for. The workarounds tell you everything about where the system fails.
Where AI Changes the Equation
This is where I think AI genuinely helps, provided it's applied correctly.
Traditional software enforces a path. You must enter data in this order, on this screen, in this format. AI can be different. It can observe the clinician's natural behaviour and handle the structure in the background.
A GP who dictates notes in her own style shouldn't need to restructure those notes for the system. AI can do that. A clinician who writes half a referral, gets interrupted, and comes back to it later shouldn't lose their work or restart a process. The system should hold context across interruptions.
The best AI in health IT is invisible. It handles the administrative burden without adding to the cognitive load. The moment a clinician has to think about the AI tool instead of the patient, you've failed.
Isaac Rolfe, Managing Director
Isaac and I have discussed this at length while working through how River approaches health sector projects. His instinct as a platform builder aligns with what I've seen operationally: the technology should conform to the user, not the other way around.
But here's the critical point. AI that adapts to clinical workflows requires a different design philosophy from the start. You can't bolt adaptive AI onto a rigid system. The underlying architecture needs to support flexible input, contextual awareness, and workflow interruption as a normal state, not an error condition.
The Burnout Connection
This isn't just about efficiency. It's about sustainability.
50% of NZ GPs reported symptoms of burnout in 2023, with administrative burden cited as a primary contributor (Source: RNZCGP Workforce Survey, 2023)
Clinician burnout is a workforce crisis in New Zealand. And while software isn't the primary cause, poorly designed software is a consistent accelerator. Every additional click, every workflow interruption, every piece of data entry that could be automated but isn't, adds to the cumulative load.
When I talk to GPs about their systems, the frustration isn't about the technology itself. It's about the disrespect for their time. They trained for years to provide clinical care. They spend hours on data entry. The system was supposed to help, and instead it became another thing to manage.
Software that respects clinical workflows, that adapts rather than dictates, directly addresses this. Not as a complete solution to burnout, but as one less thing making a hard job harder.
Building Software That Bends
For teams building health IT, whether traditional platforms or AI-enabled tools, the principles are straightforward.
Observe before you design. Spend real time in clinical environments. Watch multiple clinicians, because they all work differently. Map the real workflow, not the official one.
Design for interruption. Clinical work is constantly interrupted. Your software needs to save state gracefully, allow task switching without penalty, and let users pick up where they left off without restarting processes.
Measure cognitive load, not just task completion. A workflow might take the same number of minutes in your system as in the old one. But if it requires more concentration, more decision points, more context switching, it's worse. Time isn't the only metric that matters.
Use AI to absorb complexity, not add it. AI should handle the gap between how clinicians naturally work and what the system needs. It should translate, not constrain. If your AI feature requires training sessions and user manuals, reconsider the design.
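"Translate, not constrain" can be shown in miniature. This is an illustration only, with a hypothetical function name (`structure_dictated_note`): a real system would use a clinical NLP model, and the regexes below merely stand in for that model so the shape of the idea is visible. The clinician dictates naturally; the system fills the structured record in the background and never discards the original words.

```python
import re


def structure_dictated_note(raw: str) -> dict:
    """Pull structured fields out of a free-text dictation.

    Illustration only: the regexes are placeholders for a proper
    clinical NLP model. The contract is what matters: the clinician's
    input is accepted as-is, and structure is derived from it.
    """
    fields = {"raw_note": raw}  # never discard the clinician's own words

    # Blood pressure, e.g. "BP 150/95" or "blood pressure of 150/95".
    bp = re.search(r"(?:BP|blood pressure)[^\d]*(\d{2,3}\s*/\s*\d{2,3})", raw, re.I)
    if bp:
        fields["blood_pressure"] = bp.group(1).replace(" ", "")

    # A prescription phrased as "start <drug> <dose> mg".
    rx = re.search(r"(?:start|prescrib\w*)\s+([A-Za-z]+\s+\d+\s*mg)", raw, re.I)
    if rx:
        fields["prescription"] = rx.group(1)

    return fields


note = "Patient reports headaches, BP 150/95, start amlodipine 5 mg, review in two weeks."
record = structure_dictated_note(note)
```

Note the direction of adaptation: the extraction code bends to the dictation style, not the other way around. If the model misses a field, the raw note is still intact for a human to read.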
Test with tired clinicians. Not fresh ones in a training room. Test at 4pm on a Friday with someone who's seen 35 patients. If the system is still usable under those conditions, you've designed it right.
Clinical workflows exist for good reasons. They've been refined by practice, constrained by regulation, and shaped by patient safety. Software that tries to replace those workflows will be rejected. Software that learns from them and makes them faster has a chance of actually being used.

