We're working on a health technology platform right now. Genomic reporting for clinicians. The kind of work where getting the interface wrong doesn't just mean a frustrated user. It means a clinician spending hours on admin that could be spent with patients. The experience has reinforced something I've suspected for a while: health tech is building for the wrong audience.
The Misalignment
Most health technology is designed to satisfy three groups: regulators, IT procurement, and executives. Clinicians, the people who actually use the system, are consulted last, if they're consulted at all.
The result is software that's technically compliant, architecturally sound, approved by the steering committee, and absolutely miserable to use at 7am on a Tuesday when a clinician has 40 patients to see and each one needs a report reviewed.
If a clinician has to click six times to do something they used to do in two, you haven't improved their workflow. You've added to it.
Rainui Teihotua
Chief Creative Officer
Design for the Workflow, Not the Feature List
The instinct in health tech is to lead with features. "Our platform has genomic variant classification, automated report generation, clinical decision support, and integration with seven lab systems." That's a feature list. It tells you what the system does. It says nothing about how the clinician experiences it.
Here's what we learned from sitting with clinicians and watching them work (contextual inquiry, the technique I wrote about recently):
Their day is fragmented. A clinician doesn't sit down and process reports for four hours. They have 10 minutes between patients. They're interrupted. They context-switch constantly. Software that assumes long, uninterrupted focus doesn't match reality.
They're experts at their job, not at your software. A geneticist trains for a decade in clinical genetics, not in software navigation. If the interface requires them to remember that "Report Configuration" lives under "Administration > System Settings > Templates," you've lost them. Not because they're not smart, but because their expertise is elsewhere.
Speed is kindness. Every second a clinician spends navigating software is a second not spent with a patient. When we measured the existing manual workflow our platform is replacing, a single complex report took up to eight hours. That's not a software efficiency problem. That's a care delivery problem.
They've been burned before. Most clinicians have lived through at least one failed technology implementation. They're sceptical, and they should be. Earning their trust requires showing that you've designed for their reality, not for a demo.
What We Changed
When we started this project, the initial brief was feature-heavy. Integration requirements. Data models. Compliance specifications. All necessary. But the brief said almost nothing about the clinician's experience.
We pushed for user research before any design work started. We sat with clinicians. We watched them process reports manually. We timed each step. We asked them what frustrated them and what they wished they could do.
What emerged was different from the feature list:
Priority one wasn't a feature. It was speed. The clinician's single biggest pain point was time. A complex genomic report took hours of manual work. If the platform could reduce that to minutes, everything else was secondary.
Priority two was confidence. Clinicians need to trust the output. They need to understand what the system did and verify it easily. A black box that produces a report faster but can't be verified is worse than the manual process.
Priority three was fit. The system needed to fit into their existing workflow, not replace it. Clinicians have established routines, preferred formats, specific ways of communicating with patients. The platform needed to support those patterns, not impose new ones.
The Design Principles We Adopted
Minimum clicks to maximum value. Every screen was designed around one question: what is the smallest number of steps between opening this screen and finishing the task? We cut navigation layers. We surfaced the most common actions on the first screen. We reduced a multi-tab workflow to a single progressive flow, sketched below.
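To make "single progressive flow" concrete, here's a minimal sketch in TypeScript. The step names and the ReportFlow shape are hypothetical illustrations, not our implementation; the point is that the task is modelled as a flat, ordered sequence with one primary action per screen, rather than a tree of tabs and menus the clinician has to navigate.

```typescript
// Hypothetical sketch: the task as a flat, ordered list of steps with
// one primary action per screen, instead of nested tabs and menus.
interface FlowStep {
  id: string;
  title: string;     // what the clinician sees as the screen heading
  complete: boolean;
}

class ReportFlow {
  private index = 0;
  constructor(private steps: FlowStep[]) {}

  current(): FlowStep {
    return this.steps[this.index];
  }

  // The single primary action: finish this step, move to the next.
  advance(): FlowStep | null {
    this.steps[this.index].complete = true;
    if (this.index + 1 >= this.steps.length) return null; // task finished
    this.index += 1;
    return this.current();
  }
}

// The whole task reads as one linear sequence, visible up front:
const flow = new ReportFlow([
  { id: "review-variants", title: "Review variants", complete: false },
  { id: "confirm-classification", title: "Confirm classification", complete: false },
  { id: "approve-report", title: "Approve report", complete: false },
]);
console.log(flow.current().title); // "Review variants"
```

The design choice this encodes: the clinician never chooses *where to go next*, only whether this step is done. Navigation disappears as a category of work.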
Show the work. Rather than hiding the processing behind a loading screen, we designed the interface to show clinicians what was happening at each stage. Variant identified. Classification applied. Source referenced. This builds trust. It also catches errors early, because the clinician can verify each step rather than just checking the final output.
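Here's a minimal sketch of that pattern, assuming a staged pipeline that emits an event as each stage starts and completes. The stage names, stub functions, and onStage callback are all hypothetical, not our platform's actual API; what matters is that every intermediate result is surfaced where the clinician can check it.

```typescript
type StageStatus = "running" | "done";

interface StageEvent {
  stage: string;   // e.g. "Apply classification"
  status: StageStatus;
  detail?: string; // the intermediate result the clinician can verify
}

// Stubs standing in for real pipeline steps (hypothetical).
const identifyVariant = async (id: string) => ({ name: `variant ${id}` });
const classifyVariant = async (_v: { name: string }) => ({ label: "pathogenic (stub)" });
const referenceSources = async () => ["ClinVar (stub)"];

async function processReport(
  variantId: string,
  onStage: (e: StageEvent) => void, // the UI renders each event as it arrives
): Promise<void> {
  onStage({ stage: "Identify variant", status: "running" });
  const variant = await identifyVariant(variantId);
  onStage({ stage: "Identify variant", status: "done", detail: variant.name });

  onStage({ stage: "Apply classification", status: "running" });
  const cls = await classifyVariant(variant);
  onStage({ stage: "Apply classification", status: "done", detail: cls.label });

  onStage({ stage: "Reference sources", status: "running" });
  const sources = await referenceSources();
  onStage({ stage: "Reference sources", status: "done", detail: `${sources.length} source(s)` });
}

// Example: render stages as they happen (here, just logged).
processReport("example-id", (e) =>
  console.log(`${e.stage} [${e.status}]${e.detail ? ` -> ${e.detail}` : ""}`),
);
```

Because each stage's output is visible, a misclassification gets caught at the classification step, not after the final report has already gone out.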
Respect the expertise. The interface uses clinical terminology, not software terminology. The layout follows the structure of a clinical report, not the structure of a database. When a clinician opens the platform, it should feel like a clinical tool, not an enterprise application.
The Broader Pattern
This isn't just a health tech problem. It's an enterprise software problem. The people who buy the software and the people who use the software have different priorities. Procurement wants compliance, integration, and vendor stability. Users want speed, simplicity, and a system that respects their expertise.
The best enterprise software satisfies both. But you only get there by putting users in the room from the beginning, not as a validation step at the end.
