The loudest voice wins. That's how most enterprise decisions actually get made, regardless of what the governance framework says. The person with the strongest opinion, the most senior title, or the best presentation skills drives the direction. Evidence exists, but it's used to justify decisions that have already been made, not to inform decisions that haven't.
The Decision Theatre
Tania calls it "decision theatre." The rituals of evidence-based decision-making without the substance.
The steering committee reviews a business case. The business case was written to support a conclusion that was reached weeks ago. The numbers are real but selectively presented. The risks section exists but is optimistically framed. The committee approves because the political groundwork was done before the document was written.
The post-mortem analyses a failure. The analysis is thorough. The root causes are identified. The recommendations are documented. Nothing changes, because the same dynamics that caused the failure are still in place and the post-mortem has no enforcement mechanism.
The data dashboard is impressive. Twelve charts, real-time updates, beautiful visualisation. Nobody can articulate what decisions the dashboard informs. It exists because someone said "we need to be more data-driven" and a dashboard felt like progress.
If your evidence always supports the decision that was already made, you're not doing evidence-based decision-making. You're doing evidence-decorated decision-making.
Dr Tania Wolfgramm
Chief Research Officer
What Evidence-Based Actually Means
Evidence-based decision-making has a specific definition, borrowed from evidence-based medicine, where the concept originated. It means making decisions based on the best available evidence, combined with professional judgement, while being explicit about uncertainty.
Three components. All three are necessary.
Best available evidence. Not perfect evidence. Not comprehensive evidence. The best you can get within the time and resources available. Sometimes that's rigorous data analysis. Sometimes it's five customer interviews. Sometimes it's a literature review. The standard isn't perfection. It's "better than opinion alone."
Professional judgement. Evidence doesn't replace expertise. A dataset might show that customer satisfaction is high, but an experienced account manager who senses a client is about to leave brings context that data alone doesn't capture. The evidence and the judgement work together.
Explicit uncertainty. This is the part most organisations skip. Every decision involves uncertainty. Evidence-based practice requires naming that uncertainty. "We're 70% confident this approach will work because of X, Y, Z. The 30% risk is A and B." This feels vulnerable. It's actually rigorous.
68% of senior leaders say their organisation's decisions are driven more by hierarchy than by evidence
Source: McKinsey Decision-Making Survey, 2021
The Framework
Here's what we use internally and recommend to clients. It's deliberately simple because complicated frameworks don't get used.
Step 1: Frame the Decision
Write it down. Not "what should we do about X?" but "should we pursue option A, option B, or option C?" Framing the decision as a choice between specific options forces clarity.
If you can't frame the decision clearly, you're not ready to make it. Spend more time on framing before gathering evidence.
Step 2: Identify What Would Change Your Mind
Before looking at any evidence, each person involved in the decision writes down, for each option, what evidence would make them choose it. This is pre-commitment. It prevents the common pattern of interpreting all evidence to support your existing preference.
If the answer is "nothing would change my mind," be honest about that. The decision has already been made. Don't waste time gathering evidence.
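The pre-commitment in Step 2 can be sketched as a simple record written before any evidence is reviewed. This is a minimal illustration, not part of the framework as stated; the names, options, and evidence thresholds are invented for the example:

```python
from dataclasses import dataclass, field


@dataclass
class PreCommitment:
    """What one participant writes down before seeing any evidence."""
    participant: str
    # Option -> the evidence that would make this person choose it
    would_change_my_mind: dict[str, str] = field(default_factory=dict)

    def is_open_minded(self) -> bool:
        # If no option has a stated trigger, the decision is already made
        return any(v.strip() for v in self.would_change_my_mind.values())


# Hypothetical example: a participant commits before the evidence review
pc = PreCommitment(
    participant="A. Reviewer",
    would_change_my_mind={
        "Option A": "Churn data shows >5% quarterly loss on the current vendor",
        "Option B": "Pilot NPS above 40 across three customer segments",
        "Option C": "",  # nothing stated: be honest about that
    },
)
print(pc.is_open_minded())  # True: at least one option has a trigger
```

Writing the triggers down per option, rather than as one general statement, makes it obvious when a participant has no trigger for any option at all.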
Step 3: Gather the Evidence
Match the evidence to the decision's significance. A $5,000 decision doesn't need a six-week research project. A $5 million decision probably does.
Sources, roughly in order of reliability:
- Direct measurement (your own data, properly analysed)
- Structured user/customer research (interviews, surveys with proper methodology)
- Industry benchmarks and published research
- Expert opinion (named experts with relevant experience)
- Analogies from similar situations
Be explicit about the quality of your evidence. "Three customer interviews" is different from "a statistically significant survey of 500 customers." Both are useful. Neither should be presented as the other.
Evidence Reliability by Source Type
Source: Barends, Rousseau, and Briner, Evidence-Based Management, 2014
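Being explicit about evidence quality is straightforward to operationalise: tag each piece of evidence with its source type and review the strongest first. A sketch under the rough reliability ordering above; the claims and details are illustrative, not real data:

```python
from dataclasses import dataclass
from enum import IntEnum


class SourceType(IntEnum):
    # Lower value = roughly more reliable, mirroring the list above
    DIRECT_MEASUREMENT = 1
    STRUCTURED_RESEARCH = 2
    INDUSTRY_BENCHMARK = 3
    EXPERT_OPINION = 4
    ANALOGY = 5


@dataclass
class Evidence:
    claim: str
    source: SourceType
    detail: str  # e.g. sample size, so "3 interviews" is never dressed up as a survey


items = [
    Evidence("Customers prefer option B", SourceType.EXPERT_OPINION, "One account manager's view"),
    Evidence("Churn fell 2% after the pilot", SourceType.DIRECT_MEASUREMENT, "Own data, Q3"),
    Evidence("B scores higher on usability", SourceType.STRUCTURED_RESEARCH, "3 interviews"),
]

# Review the strongest evidence first
for e in sorted(items, key=lambda e: e.source):
    print(f"{e.source.name}: {e.claim} ({e.detail})")
```

The `detail` field is the point: it forces "three customer interviews" and "a survey of 500 customers" to be labelled as what they are.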
Step 4: Make the Decision Explicitly
State the decision, the evidence that supports it, the evidence that argues against it, and the uncertainty that remains. Document it.
This documentation isn't bureaucracy. It's accountability. When you revisit the decision in six months, you can assess whether the uncertainty materialised, whether the evidence held up, and what you'd do differently.
Step 5: Review and Learn
Set a review date at the time of the decision. Not "we'll review later." A specific date. On that date, assess: did the expected outcomes materialise? Was the evidence reliable? What would we do differently?
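Steps 4 and 5 together amount to a decision record with a built-in review date. A minimal sketch, with field names and example values of my own invention rather than anything prescribed by the framework:

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class DecisionRecord:
    """What Step 4 documents and Step 5 reviews."""
    decision: str
    evidence_for: list[str]
    evidence_against: list[str]
    confidence: float           # e.g. 0.7 = "70% confident"
    remaining_risks: list[str]
    review_date: date           # a specific date, set now, not "later"
    outcome_notes: str = ""     # filled in at the review

    def is_due_for_review(self, today: date) -> bool:
        return today >= self.review_date


# Hypothetical decision record
record = DecisionRecord(
    decision="Pursue option B (migrate to the new vendor)",
    evidence_for=["Benchmark shows 20% lower unit cost", "Five customer interviews favoured B"],
    evidence_against=["Migration risk flagged by the platform team"],
    confidence=0.7,
    remaining_risks=["Migration overruns", "Key-customer disruption"],
    review_date=date(2025, 6, 30),
)
print(record.is_due_for_review(date(2025, 7, 1)))  # True: the review is due
```

Because the review date is a required field, a decision cannot be recorded without committing to when it will be revisited, which is exactly the discipline Step 5 asks for.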
This is where organisational learning happens. Not from the successes, but from the gaps between prediction and reality.
Common Failure Modes
Confirmation bias. Seeking evidence that supports your preferred option and dismissing evidence that doesn't. The Step 2 pre-commitment mitigates this, but it requires genuine honesty.
Analysis paralysis. Gathering evidence indefinitely because the decision feels risky. Set a time limit. The cost of a delayed decision is real and usually underestimated.
Anchoring. The first piece of evidence disproportionately influences the decision. Present evidence simultaneously rather than sequentially. Let people review all the evidence before discussing.
Authority bias. The most senior person's opinion carries more weight than the evidence. This is the hardest to address because it's structural. The framework helps by making evidence explicit, so authority has to argue against documented data rather than just assert preference.
Making It Stick
The framework works only if leadership models it. If the CEO makes evidence-based decisions and expects their team to do the same, it cascades. If the CEO makes intuition-based decisions and expects the team to justify them with evidence afterwards, the framework is theatre.
Start small. Pick one recurring decision - quarterly planning, vendor selection, hiring - and apply the framework to that decision for six months. Measure whether the outcomes improve. If they do, expand. If they don't, refine the approach.
Evidence-based decision-making isn't about removing human judgement. It's about making human judgement better by combining it with the best information available and being honest about what you don't know.