Contract review sits at the intersection of obligation and oversight: large volumes of text, high stakes, time pressure, and a need for consistency that human reviewers struggle to maintain across hundreds of documents. AI can carry the weight of volume, but the judgement, the understanding of what these obligations truly mean for the parties involved, remains with people. The combination is powerful when implemented with the governance it demands.
What You Need to Know
- AI contract review is not a replacement for legal judgement. It is a screening and flagging tool that ensures consistency, catches common risks, and frees legal professionals to focus on the clauses that actually require expertise.
- The highest-value application is risk flagging, not contract drafting. Identifying problematic clauses, missing protections, and unusual terms across a portfolio of contracts is where AI delivers the most reliable enterprise value.
- Accuracy depends heavily on domain tuning. An off-the-shelf model will miss industry-specific risks. A model tuned on your contract types, your risk appetite, and your jurisdictional requirements will catch what matters.
- Governance is not optional here. Contract review intersects with data sovereignty, cultural obligations, and regulatory compliance in ways that a purely technical implementation will miss.
What AI Catches
Standard Risk Clauses
AI is excellent at identifying standard risk patterns: uncapped liability, automatic renewal, one-sided termination rights, broad indemnification, restrictive IP assignment, and problematic limitation of liability clauses. These are well-defined patterns that models can learn reliably.
For a procurement team reviewing 50 vendor contracts per quarter, automated scanning for these patterns saves days of legal time and, more importantly, ensures nothing gets missed. The third contract reviewed on a Friday afternoon gets the same scrutiny as the first one reviewed on a Monday morning.
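At its simplest, this kind of screening is pattern matching over clause text. The sketch below uses keyword rules purely for illustration; the pattern phrases and category names are assumptions, and a production system would use a tuned classifier rather than regexes, but the flagging interface looks much the same.

```python
import re
from typing import List, Tuple

# Hypothetical phrase patterns for illustration only. A real system would
# use a model tuned on annotated clauses, not keyword rules.
RISK_PATTERNS = {
    "automatic_renewal": [r"automatically\s+renews?", r"auto-?renew(?:al)?"],
    "uncapped_liability": [r"unlimited\s+liability",
                           r"liability\s+(?:is|shall\s+be)\s+unlimited"],
    "broad_indemnification": [r"indemnify\s+and\s+hold\s+harmless"],
    "one_sided_termination": [r"may\s+terminate\s+.{0,40}sole\s+discretion"],
}

def flag_clauses(contract_text: str) -> List[Tuple[str, str]]:
    """Return (risk_category, matched_text) pairs found in the contract."""
    flags = []
    for category, patterns in RISK_PATTERNS.items():
        for pattern in patterns:
            for match in re.finditer(pattern, contract_text, re.IGNORECASE):
                flags.append((category, match.group(0)))
    return flags
```

The value is not in the regexes themselves but in applying the same checks to contract one and contract fifty with identical rigour.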
Missing Protections
Equally valuable is identifying what is absent. A contract without a data processing agreement. A services agreement missing an SLA. A vendor contract without limitation of liability. These omissions are harder for humans to catch consistently because they require remembering what should be there, not just reading what is.
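Absence checking is mechanically simple once the expectations are written down: hold a checklist of required protections and report anything with no supporting language. The phrase lists below are illustrative assumptions, not a legal checklist.

```python
# For each protection the organisation expects, search for indicative
# phrases and report what appears to be missing. Phrase lists here are
# illustrative assumptions only.
REQUIRED_PROTECTIONS = {
    "data_processing_agreement": ["data processing agreement",
                                  "data processing addendum"],
    "service_level_agreement": ["service level", "uptime commitment"],
    "limitation_of_liability": ["limitation of liability",
                                "aggregate liability"],
}

def missing_protections(contract_text: str) -> list:
    """Return the names of expected protections with no matching phrase."""
    text = contract_text.lower()
    return [name for name, phrases in REQUIRED_PROTECTIONS.items()
            if not any(phrase in text for phrase in phrases)]
```

Encoding "what should be there" in a checklist is precisely the remembering that humans do inconsistently.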
Deviation From Standards
Organisations with standard contract templates benefit from AI comparison: how does this executed contract deviate from the standard template? Which clauses were modified, removed, or added? This is particularly valuable for organisations managing large portfolios of similar agreements.
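Template comparison can be sketched as clause-by-clause similarity against the standard text. This is a simplified sketch: it assumes contracts are already segmented into clauses keyed by heading, and the 0.9 similarity threshold is an arbitrary illustrative cut-off.

```python
import difflib

def classify_deviations(template: dict, executed: dict,
                        threshold: float = 0.9) -> dict:
    """Compare an executed contract's clauses against a standard template.

    Both arguments map clause headings to clause text. A clause whose text
    similarity falls below `threshold` is treated as modified; headings
    present on only one side are removed or added.
    """
    modified = [h for h, text in template.items()
                if h in executed
                and difflib.SequenceMatcher(None, text,
                                            executed[h]).ratio() < threshold]
    removed = [h for h in template if h not in executed]
    added = [h for h in executed if h not in template]
    return {"modified": modified, "removed": removed, "added": added}
```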
40-60% reduction in initial contract review time with AI-assisted screening (Source: RIVER, enterprise engagement data, 2025)
What AI Misses
Honesty about limitations is more important than enthusiasm about capabilities.
Strategic context. AI cannot assess whether a contract serves the organisation's strategic interests. A perfectly structured agreement that commits the organisation to the wrong partnership is not something AI will flag.
Relationship dynamics. Contract terms exist in the context of a business relationship. A clause that looks unfavourable in isolation might be acceptable given the overall value of the partnership. AI does not understand relationship context.
Novel risk. AI is pattern-based. It catches risks it has been trained on. Novel or unusual risks that do not match known patterns may pass through. This is why human review of AI-screened contracts remains essential.
Cultural and sovereignty considerations. This is where I see the most significant gaps in standard legal AI. Contracts involving Māori data, Indigenous intellectual property, or culturally significant information carry obligations that go beyond standard commercial law. AI can be trained to flag these contexts, but the assessment requires human expertise grounded in those obligations.
Implementation Guide
Phase 1: Risk Taxonomy (2 weeks)
Define your risk categories. What does your organisation consider high risk, medium risk, and low risk? This taxonomy becomes the foundation for everything the AI does. Work with legal, procurement, and compliance to build it.
Standard categories include:
- Liability and indemnification
- Termination and renewal
- Data protection and sovereignty
- Intellectual property
- Service levels and remedies
- Insurance and guarantees
- Dispute resolution and jurisdiction
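Once agreed, the taxonomy should live as a small, reviewable data structure that screening, reporting, and feedback tooling all share. The severity assignments below are placeholders for illustration; the real values come out of the workshop with legal, procurement, and compliance.

```python
from enum import Enum

class Severity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Severity assignments here are illustrative placeholders only; each
# organisation sets its own according to its risk appetite.
RISK_TAXONOMY = {
    "liability_and_indemnification": Severity.HIGH,
    "termination_and_renewal": Severity.MEDIUM,
    "data_protection_and_sovereignty": Severity.HIGH,
    "intellectual_property": Severity.HIGH,
    "service_levels_and_remedies": Severity.MEDIUM,
    "insurance_and_guarantees": Severity.MEDIUM,
    "dispute_resolution_and_jurisdiction": Severity.LOW,
}
```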
Phase 2: Model Training (3-4 weeks)
Train the model on your contract types. This requires a corpus of reviewed contracts with risk annotations. The more representative the training data, the better the model performs on your specific contract patterns.
For NZ enterprises, jurisdictional tuning matters. NZ contract law, the Consumer Guarantees Act, the Fair Trading Act, and Privacy Act implications are different from US or UK law. A model trained primarily on US contracts will miss NZ-specific risks.
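One lightweight way to hold the annotated corpus is JSON Lines, one reviewed clause per line, validated on ingest so gaps surface early. The field names below are assumptions for illustration, not a standard; any schema works as long as it is applied consistently.

```python
import json

# Field names are illustrative assumptions; a jurisdiction field makes
# the NZ-specific tuning explicit in the training data itself.
REQUIRED_FIELDS = {"clause_text", "risk_category", "severity", "jurisdiction"}

def validate_annotation(jsonl_line: str) -> dict:
    """Parse one training record and fail loudly if fields are missing."""
    record = json.loads(jsonl_line)
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"annotation missing fields: {sorted(missing)}")
    return record
```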
Phase 3: Integration (2-3 weeks)
Integrate the review system into your contract workflow. This is typically a document upload interface that produces a structured risk report: flagged clauses, severity ratings, missing protections, and template deviations.
The output goes to the legal reviewer, not to the contract signatory. The AI screens; the human decides.
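The structured risk report described above can be sketched as a plain data type, which keeps the AI-to-reviewer handoff explicit and easy to log. The class and field names are assumptions for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FlaggedClause:
    heading: str
    risk_category: str
    severity: str  # "low", "medium", or "high"

@dataclass
class RiskReport:
    """The structured output handed to the legal reviewer."""
    contract_id: str
    model_version: str
    flagged_clauses: List[FlaggedClause] = field(default_factory=list)
    missing_protections: List[str] = field(default_factory=list)
    template_deviations: List[str] = field(default_factory=list)

    def headline_severity(self) -> str:
        """Worst severity across all flags, used for triage ordering."""
        order = {"low": 0, "medium": 1, "high": 2}
        severities = [c.severity for c in self.flagged_clauses] or ["low"]
        return max(severities, key=order.__getitem__)
```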
Phase 4: Feedback and Refinement (ongoing)
Every review where the human disagrees with the AI assessment is a training signal. Build the feedback loop from day one. Track false positives (AI flags something that is not actually a risk) and false negatives (AI misses something the human catches). Over time, the model learns your organisation's risk appetite and judgement patterns.
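The disagreement tracking reduces to standard precision and recall over reviewer decisions. A minimal sketch, assuming each clause review is recorded as a pair of booleans:

```python
def review_metrics(reviews) -> dict:
    """Summarise reviewer agreement with AI flags.

    `reviews` is a list of (ai_flagged, human_confirmed) pairs, one per
    clause. False positives are flags the reviewer rejected; false
    negatives are risks the reviewer caught that the AI missed.
    """
    tp = sum(1 for ai, human in reviews if ai and human)
    fp = sum(1 for ai, human in reviews if ai and not human)
    fn = sum(1 for ai, human in reviews if not ai and human)
    return {
        "false_positives": fp,
        "false_negatives": fn,
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }
```

Trending these two numbers over time is the simplest evidence that the model is converging on the organisation's judgement patterns.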
Governance Requirements
AI contract review creates its own governance obligations:
- Transparency. Counterparties should know if AI is involved in contract review. This is both an ethical obligation and, in some jurisdictions, a regulatory one.
- Data handling. Contracts contain sensitive commercial information. The AI system must handle this data with appropriate security, access controls, and retention policies.
- Audit trail. Every AI assessment should be logged, including the model version, the risk flags generated, and the human reviewer's decisions. This is essential for compliance and dispute resolution.
- Sovereignty. Contract data should be processed within appropriate jurisdictional boundaries. For NZ enterprises, this typically means NZ-hosted infrastructure.
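The audit trail requirement can be met with append-only structured log entries. In this sketch the field names are illustrative assumptions; hashing the flags gives a tamper-evidence check without copying sensitive clause text into the log itself.

```python
import datetime
import hashlib
import json

def audit_record(contract_id: str, model_version: str,
                 risk_flags: list, reviewer: str, decision: str) -> str:
    """Build one append-only audit log entry for an AI-assisted review."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "contract_id": contract_id,
        "model_version": model_version,
        # Hash rather than store the flags: tamper-evident, and keeps
        # commercial detail out of the log.
        "risk_flag_hash": hashlib.sha256(
            json.dumps(risk_flags, sort_keys=True).encode()).hexdigest(),
        "reviewer": reviewer,
        "decision": decision,
    }
    return json.dumps(entry)
```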
The Compound Effect
Contract review is a gateway to broader legal AI. The clause extraction capability serves compliance monitoring. The risk taxonomy informs policy development. The document processing infrastructure supports due diligence, regulatory analysis, and knowledge management.
Build the contract review capability well, and you have built the foundation for a comprehensive legal AI practice.
