
Five Dollars to Become Anyone

A synthetic identity kit costs less than a coffee. AI-generated face, cloned voice, supporting documents. The deepfake-as-a-service economy has arrived and most enterprises aren't ready.
20 January 2026·4 min read
Mak Khan
Chief AI Officer
Five dollars. That's what a synthetic identity kit costs on the dark web. AI-generated face. Cloned voice. Supporting documents. A complete person who never existed, ready to pass a KYC check.
$5
cost of a complete synthetic identity kit on the dark web

The Marketplace

Dark web vendors now sell identity packages the way legitimate businesses sell SaaS subscriptions. A basic kit (face and documents) runs about $5. A premium package with voice cloning and video capability costs a bit more. A subscription to a dark LLM (no safety guardrails, optimised for fraud) runs roughly $30 per month.
AI-powered tools now account for over 60% of dark web cyber listings. This is an industrialised supply chain.

Voice Cloning Crossed the Line

The technical barrier for voice cloning has collapsed. A few seconds of audio pulled from a conference talk, a podcast appearance, or a LinkedIn video is enough to generate a near-perfect clone. Not "sounds roughly similar." Indistinguishable.
This changes the threat model for every organisation that uses voice as a verification factor. Phone callbacks, voice-authorised transactions, verbal confirmation of identity. All of these assumed that a person's voice was difficult to replicate. That assumption is dead.
60%
of dark web cyber tool listings are now AI-powered

What Breaks

KYC and identity verification. Most identity checks rely on a combination of documents and a face match. Both are trivially generated. Liveness detection helps, but the arms race between generation and detection is tilting toward the generators.
Remote hiring. Video interviews are no longer proof of identity. Organisations hiring remotely, which is most organisations, need verification that goes beyond "I saw them on camera and they matched their CV photo."
Multi-factor authentication. Voice-based MFA, face-based MFA, and even video-based verification are all vulnerable. The remaining strong factors are physical tokens and cryptographic keys. Everything biometric is now replicable.

An Engineer's Perspective

I work with AI systems daily. The generation quality is genuinely impressive from a technical standpoint. It is also genuinely terrifying from a security standpoint.
The defence cannot be "detect fakes better." Generation will always outpace detection. The defence has to be systemic: multi-channel verification, cryptographic identity, and the organisational discipline to slow down when something feels urgent.
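"Cryptographic identity" in practice means proving possession of a key rather than resemblance to a face or voice. A minimal sketch, using a shared-key HMAC challenge-response (function names are illustrative, not from any particular product): the verifier issues a fresh nonce, the caller signs it, and a cloned voice on the phone line contributes nothing.

```python
import hashlib
import hmac
import secrets


def issue_challenge() -> bytes:
    # Verifier sends a fresh random nonce; replaying an old response is useless.
    return secrets.token_bytes(32)


def sign_challenge(shared_key: bytes, challenge: bytes) -> bytes:
    # Caller proves possession of the key. A perfect deepfake of their
    # face and voice cannot compute this value.
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()


def verify(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)  # constant-time compare


key = secrets.token_bytes(32)  # provisioned out-of-band, once
challenge = issue_challenge()
assert verify(key, challenge, sign_challenge(key, challenge))
```

Production systems would use asymmetric signatures (so the verifier never holds the secret), but the design point is the same: verification binds to key possession, which generative AI cannot fake.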
A $5 identity kit won't fool a well-designed system. It will absolutely fool a system that relies on humans looking at faces and listening to voices.