Your procurement team is about to kill your AI initiative. Not deliberately. They're following the same process that works perfectly well for SaaS, hardware, and professional services. The problem is that AI doesn't fit that process, and nobody's told procurement.
I've watched this play out across multiple enterprise engagements now. The pattern is predictable. An AI initiative gets executive sponsorship, budget approval, and genuine organisational energy. Then it hits procurement, and everything stalls. Three months later, the initiative is either dead or so diluted by procurement requirements that it can't deliver what was promised.
Why SaaS Procurement Doesn't Work for AI
Enterprise procurement evolved around a simple model: define requirements, evaluate vendors against those requirements, select the best fit, negotiate terms, deploy. It works brilliantly for well-understood categories. CRM systems. Cloud infrastructure. Office productivity suites.
AI breaks this model in four fundamental ways:
1. Requirements Can't Be Fully Specified Upfront
SaaS procurement starts with a requirements document. Functional requirements, non-functional requirements, integration requirements. The vendor either meets them or doesn't.
AI capabilities are emergent. You don't fully know what the system can do until you've built it with your data, in your context, for your users. A requirement like "the system should accurately extract key terms from insurance contracts" sounds precise. But what counts as "accurately"? What counts as "key terms"? The answer depends on your specific contracts, your specific policies, and your specific operational context. You can't spec this in advance. You discover it through iteration.
The trap: Procurement forces premature specification. Vendors respond with ambitious capability claims to win the contract. The gap between what was specified, what was promised, and what's achievable only becomes apparent after the contract is signed.
2. Pricing Models Are Unpredictable
SaaS pricing is simple: per-user, per-month, per-year. You can forecast costs accurately. AI pricing is consumption-based and volatile. Per-token, per-API-call, per-compute-hour. Usage depends on adoption, query complexity, data volume, and model choice - all of which change as the system evolves.
Deloitte's Enterprise AI Cost Analysis (2023) found a 2.5x average variance between initial AI cost estimates and actual first-year spend in enterprise deployments.
The trap: Procurement requires fixed pricing commitments. Vendors either pad significantly (you overpay) or lowball to win the deal (you get surprise costs later). Neither outcome serves the enterprise.
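The volatility above is easy to see with simple arithmetic. This is a minimal sketch, not a forecasting tool - the per-token rate, query volumes, and token counts are all hypothetical assumptions chosen for illustration:

```python
# Sketch: why consumption-based AI costs are hard to forecast.
# All rates and volumes below are hypothetical, not real vendor pricing.

def monthly_cost(queries_per_day, tokens_per_query, price_per_1k_tokens):
    """Estimated monthly spend for a token-priced AI service (30-day month)."""
    return queries_per_day * 30 * tokens_per_query / 1000 * price_per_1k_tokens

# Initial estimate: modest adoption, short queries.
initial = monthly_cost(queries_per_day=500, tokens_per_query=2000,
                       price_per_1k_tokens=0.01)

# After rollout: adoption doubles and users write longer, context-heavy prompts.
actual = monthly_cost(queries_per_day=1000, tokens_per_query=2500,
                      price_per_1k_tokens=0.01)

print(f"Initial estimate: ${initial:,.0f}/month")
print(f"Actual spend:     ${actual:,.0f}/month ({actual / initial:.1f}x the estimate)")
```

Note that the price per token never changed - adoption and query complexity alone multiplied the bill, which is exactly why a fixed-price commitment signed before rollout is guesswork.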
3. Vendor Evaluation Criteria Are Wrong
Standard vendor evaluation weights factors like years in business, number of enterprise clients, financial stability, compliance certifications, and reference customers. These matter, but they're insufficient for AI.
What actually determines AI vendor quality:
- Can they demonstrate working solutions with data similar to yours?
- Do they have technical depth, or are they a wrapper around OpenAI's API?
- Can they explain their architecture and why they made specific technical choices?
- What's their approach when the AI makes mistakes?
- How do they handle model updates and performance regression?
Most enterprise RFPs don't ask these questions. They ask about uptime SLAs and data centre locations.
4. The Timeline Is Wrong
Procurement timelines assume a known deployment pattern: select vendor, sign contract, implement, go live. AI deployment is iterative. You prototype, test with real data, discover issues, iterate, expand scope, iterate again. The concept of a fixed "go live" date is misleading for AI. It's more like a continuous improvement curve.
The trap: Procurement imposes waterfall timelines on an inherently iterative process. The result: either the AI team cuts corners to hit dates, or the project is labelled "delayed" when it's actually progressing normally for an AI initiative.
What Needs to Change
Adopt Phased Procurement
Instead of one big procurement for "an AI solution," structure it in phases:
Phase 1: Discovery (4-6 weeks, small contract). Pay a vendor to prove feasibility with your data. Fixed price, fixed scope, clear deliverables. This is cheap insurance against committing to the wrong vendor or the wrong approach.
Phase 2: Pilot (8-12 weeks, medium contract). Build a working system for a specific use case. Measurable outcomes, real users, real data. This is where you learn what the system can actually do.
Phase 3: Production (ongoing, structured contract). Scale what works. By this point, you have empirical evidence of capability, realistic cost projections, and a proven working relationship.
Each phase has its own procurement event. You can change vendors between phases. You never commit to a large, long-term contract based on promises.
Use Proof-Based Evaluation
Replace RFP-based evaluation with proof-based evaluation. Give shortlisted vendors a sample of your data and a defined task. Evaluate the results, not the proposal. This takes more effort upfront but prevents the far more expensive mistake of selecting a vendor based on presentation skills rather than technical capability.
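In practice, proof-based evaluation means defining the task and the answer key before vendors submit anything, then scoring outputs mechanically. A minimal sketch, using the insurance-contract extraction example from earlier - the vendor names, sample terms, and answer key are all hypothetical:

```python
# Sketch: scoring vendor outputs on a defined extraction task.
# The answer key and vendor outputs below are hypothetical examples.

def score_extraction(predicted, expected):
    """Precision/recall of a vendor's extracted terms against an answer key."""
    predicted, expected = set(predicted), set(expected)
    true_positives = len(predicted & expected)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(expected) if expected else 0.0
    return precision, recall

# Answer key for one sample contract, agreed with your legal team in advance.
expected = {"policy period", "limit of liability", "exclusions", "deductible"}

vendor_outputs = {
    "vendor_a": {"policy period", "limit of liability", "deductible"},
    "vendor_b": {"policy period", "premium", "broker", "exclusions"},
}

for vendor, terms in vendor_outputs.items():
    p, r = score_extraction(terms, expected)
    print(f"{vendor}: precision={p:.2f} recall={r:.2f}")
```

The point isn't the metric itself - it's that agreeing the answer key before evaluation forces you to define "accurately" and "key terms" concretely, and gives every shortlisted vendor the same bar to clear.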
Build AI-Literate Procurement Teams
Procurement teams need enough AI literacy to ask the right questions. Not technical depth - but enough understanding to distinguish genuine capability from marketing. At minimum:
- Understanding the difference between a model vendor, a platform vendor, and a wrapper
- Ability to evaluate whether a vendor's architecture is appropriate for the use case
- Knowledge of AI-specific cost drivers and pricing models
- Awareness of AI governance and data sovereignty requirements
Restructure Commercial Terms
AI contracts need different commercial structures:
- Consumption caps with visibility. Set a maximum monthly spend with real-time cost dashboards. No surprises.
- Performance-linked payments. Tie a portion of the fee to measurable outcomes, not just delivery milestones.
- Model flexibility clauses. The AI landscape moves fast. Your contract should allow model changes without renegotiation.
- Data ownership clarity. Explicit terms on who owns the data, the fine-tuned models, and the intellectual property generated during the engagement.
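The first of those terms, a consumption cap with visibility, can be sketched in a few lines. This is illustrative only - the cap, threshold, and spend figures are hypothetical, and a real deployment would pull month-to-date spend from the provider's billing or usage API rather than hard-coding it:

```python
# Sketch: enforcing a contractual consumption cap with early warning.
# Cap, threshold, and spend figures are hypothetical assumptions.

MONTHLY_CAP = 10_000.00  # maximum agreed monthly spend, in dollars
ALERT_THRESHOLD = 0.8    # warn once 80% of the cap is consumed

def check_spend(month_to_date_spend):
    """Return a status message and whether further usage should be blocked."""
    used = month_to_date_spend / MONTHLY_CAP
    if used >= 1.0:
        return "cap reached: block non-essential usage", True
    if used >= ALERT_THRESHOLD:
        return f"warning: {used:.0%} of monthly cap consumed", False
    return f"ok: {used:.0%} of monthly cap consumed", False

status, blocked = check_spend(8_500)
print(status, blocked)
```

The mechanism is trivial; the commercial point is that the cap and the dashboard are written into the contract, so neither side is surprised by the bill.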
Actionable Takeaways
- Brief your procurement team on AI-specific dynamics before the process starts. They're not the problem - the process is. Give them the context to adapt it.
- Start with a small discovery contract. Prove feasibility before committing to scale. This is the single highest-value change you can make.
- Evaluate on demonstrated capability, not proposals. Any vendor can write a compelling response. Require proof with your data.
- Structure contracts for iteration. Phased commitments, consumption-based pricing with caps, and exit clauses that don't trap you.
- Include AI-specific evaluation criteria. Technical architecture, model flexibility, governance approach, and error handling - not just uptime and certifications.
The enterprises that adapt their procurement for AI will move faster and waste less. The ones that force AI through SaaS procurement will keep wondering why their AI initiatives stall.
