Enterprise AI training has a shelf-life problem. The workshop you ran three months ago taught people GPT-4 prompting techniques. The tool has been updated twice since then. The workflows have changed. Half the team has forgotten what it learned. The other half has developed its own approaches, which may or may not be effective. Josiah and I have been thinking about this differently: instead of training events, build learning systems.
What You Need to Know
- One-off AI training events decay to roughly 20% retention within three months. Learning systems maintain and build capability over time.
- The key difference: training teaches a skill at a point in time. A learning system continuously develops capability as the technology and use cases evolve.
- Effective AI learning systems have three layers: foundational literacy (everyone), applied practice (practitioners), and strategic evaluation (leaders).
- Peer learning is the highest-leverage component. People learn AI skills faster from a colleague who solved a similar problem than from a trainer who demonstrates capabilities.
Why Training Events Fail
The Forgetting Curve
Ebbinghaus's forgetting curve applies to AI training just as it applies to everything else. Without reinforcement, people retain about 40% of new material after one day and roughly 20% after one month. A two-hour AI workshop is a momentary spike in knowledge that rapidly decays.
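The contrast between a one-off workshop and a reinforced learning system can be sketched numerically. The model below is illustrative only, not from the article: it assumes exponential forgetting (retention = e^(-t/S), where S is a "stability" in days) and assumes each review resets recall and multiplies stability, a simplification borrowed from spaced-repetition systems. The specific parameter values are arbitrary.

```python
import math

def retention(days_since_review: float, stability: float) -> float:
    # Exponential forgetting curve: R = exp(-t / S).
    # Larger stability S means slower decay.
    return math.exp(-days_since_review / stability)

def simulate(days: int, review_days: set[int],
             base_stability: float = 2.0, growth: float = 2.0) -> list[float]:
    """Daily retention over `days`. Each review resets recall to 100%
    and multiplies stability by `growth` (the spaced-repetition effect).
    All parameters are illustrative assumptions."""
    stability = base_stability
    last_review = 0
    history = []
    for d in range(days + 1):
        if d > 0 and d in review_days:
            stability *= growth
            last_review = d
        history.append(retention(d - last_review, stability))
    return history

# One-off workshop: no follow-up at all.
one_off = simulate(90, review_days=set())
# Learning system: structured practice at expanding intervals.
system = simulate(90, review_days={7, 21, 49})
```

Under these assumptions, the one-off group's retention at day 90 is effectively zero, while the reinforced group retains a meaningful fraction; the exact numbers matter far less than the shape of the two curves.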
One-off sessions don't work. Sustained, structured learning with practice opportunities and peer support works. It's not more complicated than that, but it is harder to implement.
Dr Josiah Koh
Education & AI Innovation
The Context Problem
AI training conducted in a classroom or virtual session lacks the most important learning context: the user's actual work. Demonstrating prompt engineering on generic examples builds generic knowledge. The user still has to figure out how to apply it to their specific claims processing, contract review, or reporting workflow.
The Moving Target
AI tools and capabilities change faster than training can keep up. A curriculum built for GPT-4 needs updating for GPT-4o. A workflow designed for one RAG implementation needs adapting when the retrieval approach changes. Static training cannot keep pace with dynamic technology.
Building Learning Systems
Layer 1: Foundational Literacy (Everyone)
A baseline understanding of what AI can and can't do, how to evaluate AI output, and how to communicate AI needs. This layer is relatively stable, because foundational concepts change slowly even as specific tools change fast.
Format: Short, self-paced modules (30-45 minutes each) with embedded practice. Updated quarterly to reflect significant technology shifts.
Measure: Can every team member identify one AI opportunity in their workflow and evaluate an AI-generated output critically?
Layer 2: Applied Practice (Practitioners)
Hands-on skill building with the specific AI tools and workflows relevant to each team's work. This layer changes frequently and must be designed for rapid updates.
Format: Team-based workshops (2-3 hours) focused on real tasks from the team's actual workflow, followed by a structured two-week practice period with peer support and coaching.
Measure: Can practitioners design an AI-assisted workflow for their team and iterate on it based on results?
Layer 3: Strategic Evaluation (Leaders)
The ability to evaluate AI investments, understand programme risks, and make informed decisions. This layer connects AI capability to business outcomes.
Format: Half-day executive sessions, quarterly. Each session covers one strategic AI topic in depth with case studies and decision frameworks.
Measure: Can leaders evaluate an AI business case critically and articulate what success looks like for their organisation?
The Peer Learning Engine
The highest-leverage component of any AI learning system is structured peer learning. When a team member figures out a clever use of an AI tool for their specific workflow, that knowledge is immediately relevant to their colleagues.
Building a peer learning engine:
- Share sessions. Monthly 30-minute sessions where team members demonstrate what they've figured out. Low ceremony. Real examples from real work.
- Internal library. A simple repository of "here's what I tried and here's what worked." Wiki, shared doc, Slack channel, whatever the team already uses.
- Champions network. Designated AI champions in each team who support their colleagues and escalate issues. These are not trainers. They are peers who happen to be further along the learning curve.
The organisations that build AI capability fastest are not the ones that train the hardest. They're the ones that build systems for continuous learning, peer support, and practical application. Training is an event. Learning is a system. Build the system.

