On September 8, 2025, FinCEN issued a strong warning about a disturbing trend: a surge in financially motivated sextortion. The main victims? Teenage boys.
Criminals are now using generative AI and deepfakes to create realistic, explicit images of minors. Once the images are made, the victim is pressured to pay up or face exposure. In 2024, the FBI logged nearly 55,000 reports of sextortion and extortion, with losses topping $33.5 million — a 59% jump from the year before. Law enforcement has linked many teen suicides directly to this kind of crime.
From Deepfakes to Extortion
The playbook is evolving. Fraudsters create fake but convincing images or videos, then demand payment — often through P2P transfers, crypto wallets, or prepaid cards. Victims, terrified of exposure, send money quickly and quietly.
FinCEN’s Six Red Flags to Watch For
FinCEN wants financial institutions to step in. They've outlined six key warning signs AML teams should be looking for, including:
- Minors making repeated P2P transfers to risky jurisdictions.
- A series of small, round-dollar transfers (think $10–$50) to unknown counterparties, quickly moved onward.
- Odd payment memos like “delete the pictures” or “please stop,” often sent late at night.
- Teens buying crypto on P2P apps and sending it to unhosted wallets flagged for illicit use.
- Sudden prepaid card purchases, redeemed far away from where they were bought.
If these show up, it’s time to take a closer look — and potentially file a SAR. FinCEN specifically asks filers to include the term “SEXTORTION” in narratives.
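For teams wiring these indicators into transaction monitoring, a simple rules-based screen is a reasonable starting point. The sketch below is a minimal, hypothetical example — the field names, keyword list, and thresholds are assumptions for illustration, not FinCEN specifications, and any real deployment would need tuning and broader coverage of the alert's typologies:

```python
import re
from dataclasses import dataclass

@dataclass
class Transfer:
    amount: float
    memo: str
    hour: int      # local hour of day, 0-23
    channel: str   # e.g. "p2p", "crypto", "prepaid" (hypothetical labels)

# Hypothetical coercion-language patterns; a production list would be tuned
MEMO_PATTERNS = re.compile(r"delete the pictures|please stop", re.IGNORECASE)

def red_flags(t: Transfer) -> list[str]:
    """Return the sextortion-style indicators a single transfer triggers."""
    hits = []
    # Small, round-dollar P2P transfer in the $10-$50 range
    if t.channel == "p2p" and 10 <= t.amount <= 50 and t.amount % 5 == 0:
        hits.append("small_round_dollar_p2p")
    # Coercive language in the payment memo
    if MEMO_PATTERNS.search(t.memo):
        hits.append("coercive_memo")
        # Late-night timing strengthens the signal
        if t.hour >= 22 or t.hour < 6:
            hits.append("late_night")
    return hits
```

A transfer like `Transfer(25.0, "please delete the pictures", 23, "p2p")` would trip all three indicators, which could then route the account for analyst review and a possible SAR.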
Why It Matters for AML
This isn’t just a “cybercrime problem.” It’s a financial crime with real human costs. Every suspicious transfer is part of a chain that fuels blackmail, mental health crises, and in some cases, tragedy. For AML professionals, spotting these patterns is more than compliance — it’s protection.
This alert also builds on FinCEN’s 2024 notice about deepfakes in fraud, which warned banks about AI-generated IDs and documents being used in onboarding scams. AI is changing the fraud landscape, and financial institutions are in the front row.
What AML Teams Can Do
- Update monitoring systems with FinCEN’s sextortion red flags.
- Pay extra attention to minors’ accounts, especially joint accounts with parents.
- Educate frontline staff — those weird late-night memos could be lifesaving clues.
- File SARs clearly and consistently, using the right keywords so law enforcement can connect the dots.
Final Thoughts
Deepfake sextortion is a chilling mix of technology abuse and financial crime. For AML teams, it’s a reminder that our work doesn’t just tick regulatory boxes — it can directly protect vulnerable people. Staying alert to new patterns, especially those flagged by FinCEN, is more important than ever.
Check out the full AML Essentials Course, where we cover real-world typologies, case studies, and compliance best practices.