
NEWS: FBI alert warns criminals are using AI to commit fraud ‘on a larger scale’

Federal Bureau of Investigation [FBI] - FILE PHOTO, REUTERS/Yuri Gripas

BY PAUL O’DONOGHUE, Senior Correspondent

THE FBI has issued a public alert warning that criminals are increasingly exploiting artificial intelligence (AI) tools to conduct financial fraud, making scams more believable and harder to detect.

The advisory from the U.S. Federal Bureau of Investigation outlined how generative AI is being used in schemes involving social engineering, spear phishing, and identity fraud. Criminals have deployed AI-generated text, images, videos, and audio to deceive victims.

“The FBI is warning the public that criminals exploit generative artificial intelligence (AI) to commit fraud on a larger scale, increasing the believability of their schemes,” the agency stated in its announcement.

Attackers reportedly use AI-generated text to create fake social media profiles, fraudulent websites, and phishing emails. AI-generated images enhance the credibility of these profiles or are shared privately to convince targets they are communicating with real individuals.

“Criminals generate fraudulent identification documents, such as fake driver’s licenses or credentials (law enforcement, government, or banking) for identity fraud and impersonation schemes,” the FBI said.

Criminals are also using voice cloning and deepfake videos. AI-generated audio can mimic public figures or acquaintances to manipulate victims into transferring funds. Meanwhile, fake videos are being deployed in investment frauds and on live video calls, where criminals pose as company executives or authority figures.

The FBI highlighted an incident involving North Korean cybercriminals who used a deepfake video to secure a job on an AI team at KnowBe4, leveraging their position to access sensitive information. Another case involved Russian threat actors creating fake videos as part of misinformation campaigns targeting the U.S. 2024 elections.

To counter these threats, the FBI advises individuals to agree on a secret word or phrase with trusted contacts to verify identities, limit the personal images and voice recordings they share online, and stay alert to subtle inconsistencies in suspicious content.
