
AI Discovery Sprints: How We Start Every AI Engagement

A 2-4 week structured sprint that validates the problem, assesses the data, and recommends a solution before you commit to building.
15 January 2025 · 7 min read
Isaac Rolfe
Managing Director
Every AI engagement we run starts the same way: a discovery sprint. Not a workshop. Not a strategy document. A structured 2-4 week sprint that produces a clear recommendation on whether to build, what to build, and how to build it. It's the single most important thing we've learned about delivering enterprise AI.

What You Need to Know

  • Discovery sprints eliminate the most expensive risk in AI projects: building the wrong thing. Two weeks of structured investigation is cheaper than six months of building something nobody uses.
  • The output isn't a report. It's a decision. Go, no-go, or pivot. With enough evidence to make the decision with confidence.
  • We've run discovery sprints that recommended not building. Those are some of the most valuable outcomes. They save the client six figures and redirect investment to where AI actually adds value.
  • Discovery is not a sales exercise. It's an honest assessment. If the data isn't there, the problem isn't real, or the ROI doesn't stack up, we say so.
40%
of our discovery sprints recommend a significantly different approach from the one in the client's initial brief
Source: RIVER, discovery sprint data, 2023-2024

Why Discovery Before Delivery

Traditional software projects can often start with a clear specification. The requirements are knowable. The solution is bounded. You can scope with reasonable confidence before writing code.
AI projects are different. Three unknowns make traditional scoping unreliable:
Data uncertainty. You don't know the state of the data until you look at it. "We have all the data" usually means "we have data, but it's in 12 systems, 3 formats, and hasn't been validated in 2 years."
Solution uncertainty. The right approach for an AI problem isn't always obvious. Should this be a RAG system, a classification pipeline, an agentic workflow, or something simpler? The answer depends on the data, the accuracy requirements, and the operational context.
Value uncertainty. The expected ROI might not survive contact with reality. The process you're trying to automate might have edge cases that make full automation impossible. The users might not want what the sponsor is asking for.
Discovery resolves all three uncertainties before you commit significant budget.

The Sprint Structure

Week 1: Problem Validation

  • Stakeholder interviews with the people who experience the problem and the people who sponsor the investment. These are often different people with different expectations.
  • Process mapping of the current workflow. How does work flow today? Where are the bottlenecks? What takes the most time? What causes the most errors?
  • Success criteria definition. What does "good enough" look like? What business metric needs to change, and by how much, for this investment to be worthwhile?

Week 2: Data Assessment

  • Data source inventory. What data exists? Where does it live? What format is it in? Who owns it?
  • Quality assessment. Sample the actual data. Assess completeness, consistency, accuracy, and recency. This is where most surprises happen.
  • Access validation. Can we technically access the data? Are there legal or compliance constraints? What approvals are needed?
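The quality-assessment step above can be sketched as a short script. This is an illustrative sketch, not a deliverable from an actual sprint: the record fields, sample data, and one-year staleness threshold are all hypothetical.

```python
from datetime import date

# Hypothetical sample of CRM records; in a real sprint this would be
# a random sample pulled from each source system.
records = [
    {"email": "a@example.com", "segment": "smb", "updated": date(2024, 11, 2)},
    {"email": None,            "segment": "ent", "updated": date(2022, 3, 9)},
    {"email": "c@example.com", "segment": None,  "updated": date(2024, 12, 18)},
]

def completeness(records, field):
    """Share of records where the field is populated."""
    return sum(1 for r in records if r[field] is not None) / len(records)

def stale_share(records, field, today, max_age_days=365):
    """Share of records not updated within max_age_days."""
    return sum(1 for r in records if (today - r[field]).days > max_age_days) / len(records)

today = date(2025, 1, 15)
report = {
    "email_completeness": completeness(records, "email"),
    "segment_completeness": completeness(records, "segment"),
    "stale_share": stale_share(records, "updated", today),
}
print(report)
```

Even a check this simple, run against a real sample rather than a data dictionary, is what surfaces the "40% incomplete, not cleaned in 3 years" surprises before they become build-phase surprises.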

Week 3: Solution Design

  • Architecture options. Based on the problem and the data, what are the viable approaches? What are the trade-offs?
  • Proof of concept. A focused technical test against real data. Not a demo. A test of the hardest part of the problem. If the hard part works, the easy parts will follow.
  • Risk assessment. What could go wrong? What assumptions are we making? What would change the recommendation?

Week 4: Recommendation

  • Go/no-go/pivot recommendation with evidence. Not opinion. Evidence from the data, the process analysis, and the proof of concept.
  • If go: Detailed scope, timeline, team requirements, and budget estimate for the build phase. This scope is based on what we learned, not what we assumed.
  • If no-go: Clear explanation of why, and usually a redirect to where AI investment would deliver more value.
  • If pivot: A different approach than originally envisioned, supported by what discovery revealed.
The 50% Credit
We credit 50% of the discovery sprint investment against a build engagement that starts within 90 days. This isn't a discount. It's alignment. Discovery produces artefacts (data assessments, architecture designs, process maps) that directly accelerate the build.

What Discovery Actually Catches

Real examples from discovery sprints (anonymised):
The data wasn't there. A client wanted AI-powered customer segmentation. Discovery found that their CRM data was 40% incomplete and hadn't been cleaned in 3 years. Recommendation: invest in data quality first, then return to AI. Saved 6 months and $200K+ of building on a broken foundation.
The problem was different. A client wanted an AI chatbot for customer support. Discovery found that 70% of support volume was caused by confusing product documentation, not by slow response times. Recommendation: fix the docs first (simpler, cheaper, faster impact), then add AI for the remaining 30%. Better outcome, lower investment.
The approach was wrong. A client specified a complex ML model for fraud detection. Discovery found that rule-based logic would catch 85% of fraud patterns, with AI needed only for the remaining 15%. Recommendation: build the rule engine first (cheaper, faster, more explainable), layer AI on top for the edge cases. Delivered faster at lower cost.
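The rules-first pattern in that last example can be sketched in a few lines. The rules and thresholds below are hypothetical, not the client's actual logic: deterministic checks handle the common patterns, and only transactions they cannot classify are escalated to a model.

```python
# Hypothetical rules-first fraud triage: explainable rules catch the
# common patterns; anything they can't decide goes to a model or review.
RULES = [
    ("amount_over_limit", lambda t: t["amount"] > 10_000),
    ("new_account_high_value", lambda t: t["account_age_days"] < 7 and t["amount"] > 500),
    ("mismatched_country", lambda t: t["card_country"] != t["ip_country"]),
]

def triage(txn):
    """Return (decision, reason). Rules fire first; unmatched escalates."""
    for name, rule in RULES:
        if rule(txn):
            return ("flag", name)
    return ("escalate_to_model", None)

txn = {"amount": 12_500, "account_age_days": 400,
       "card_country": "GB", "ip_country": "GB"}
print(triage(txn))
```

The design win is explainability: every flag carries the name of the rule that fired, which a fraud analyst can audit directly, while the model only has to cover the residual cases the rules miss.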

When to Skip Discovery

Almost never. But if you insist:
  • You've already run a thorough data assessment in the past 6 months
  • The problem is well-defined and you've built similar AI systems before
  • You're extending an existing AI platform with a well-understood capability
Even then, a compressed 1-week discovery sprint is usually worth the investment.
The most valuable discovery sprints are the ones that say "don't build this." Every client who's heard that from us has thanked us later. That's the job: honest assessment, not just enthusiastic delivery.