“The internet never forgets.” This warning from the U.S. Army Reserves is a stark reminder that online activities create an indelible trail of information.
In a digital age where most Americans spend countless hours on social media, the sheer availability of Personally Identifiable Information (PII) on the internet provides an unprecedented opportunity for illicit activities that exploit your customers’ likenesses. With deepfake technology becoming a significant concern, one such scam, virtual kidnapping, is growing increasingly refined and difficult to identify. Understanding the scope of the challenge and ensuring your institution has robust controls to protect your customers from extortion is paramount.
Virtual Kidnapping, Real Consequences
“GenAI-rendered content that is highly realistic is commonly referred to as ‘deepfake’ content, or ‘deepfakes.’ Deepfakes can manufacture what appear to be real events, such as a person doing or saying something they did not actually do or say.”
Virtual kidnapping is an unsettling financial crime in which a victim is led to believe that a loved one has been taken hostage. In a phone call, the criminal threatens violence against their captive and ultimately coerces the victim into sending a ransom payment, usually by wire transfer. The entire scenario is socially engineered, capitalizing on the victim’s panic before they realize their loved one was never in harm’s way.
Once limited to U.S. border states, virtual kidnapping proliferated across the country in the late 2010s. While the scam originally relied on persuasion and the power of suggestion, it has evolved in the 2020s alongside generative AI. In these advanced typologies, clips of a grandchild or other loved one are harvested from social media and transformed into deepfake audio or video of the individual in distress. These deepfakes are ultra-realistic and can make the situation feel authentic — maximizing the scammer’s ability to extort funds from their target.
Protecting Payments in AI-Enabled Extortion
Recently, FinCEN issued a warning to financial institutions on the significant threat of fraud schemes involving deepfake media, noting its potential for use in impersonating a victim’s family members, friends, and trusted persons. As AI and deepfakes grow in prominence, your customers may be increasingly susceptible to scams leveraging this technology, such as virtual kidnapping. A robust payments fraud solution is essential to detect and interdict wires or other suspicious payments before losses occur.
According to FinCEN’s Alert, a financial red flag of deepfake fraud is when “a newly opened account or an account with little prior transaction history has a pattern of rapid transactions; high payment volumes to potentially risky payees… or high volumes of chargebacks or rejected payments.”
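To make the red flag concrete, the FinCEN pattern above can be sketched as a simple rule-based check. This is a minimal illustration only, not FinCEN’s methodology or any vendor’s detection logic; the `Account` structure, field names, and thresholds (30-day account age, 24-hour rapid-transaction window, 20% rejection ratio) are all assumptions chosen for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Account:
    opened: datetime
    # each transaction: (timestamp, amount, payee_risk, was_rejected)
    transactions: list = field(default_factory=list)

def shows_deepfake_red_flag(account: Account, now: datetime,
                            new_account_days: int = 30,
                            rapid_window_hours: int = 24,
                            rapid_count: int = 5,
                            reject_ratio: float = 0.2) -> bool:
    """Illustrative check for FinCEN's pattern: a newly opened account or one
    with little prior history showing rapid transactions, payments to risky
    payees, or a high share of rejected payments. Thresholds are hypothetical."""
    # Only newly opened or low-history accounts match the pattern.
    low_history = ((now - account.opened).days <= new_account_days
                   or len(account.transactions) < 10)
    if not low_history:
        return False

    # Rapid transactions: many payments inside a short window.
    window_start = now - timedelta(hours=rapid_window_hours)
    rapid = sum(1 for t in account.transactions if t[0] >= window_start) >= rapid_count

    # Payments to potentially risky payees.
    risky_payees = any(t[2] == "high" for t in account.transactions)

    # High volume of rejected payments.
    total = len(account.transactions)
    rejected = sum(1 for t in account.transactions if t[3])
    high_rejects = total > 0 and rejected / total >= reject_ratio

    return rapid or risky_payees or high_rejects
```

In practice, production systems weigh many more signals than this, but the sketch shows how the quoted indicators translate into checkable conditions.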
By offering insight into payors and payees who do not bank at your institution, a consortium approach can help protect against these and other payments fraud scams. Analyzing hundreds of millions of counterparties, this approach provides a complete picture of risk across both sides of a payment. Payments involving known accounts with a history of legitimate activity across the consortium are analyzed as lower risk, while transactions involving unrecognized accounts are analyzed as higher risk, representing potential mules or other accounts opened to facilitate fraud. When strengthened with cross-channel analytics that consider customer activity across multiple payment rails, along with real-time analysis and interdiction, your institution can shut down suspicious transfers with confidence.
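The known-versus-unrecognized counterparty logic described above can be sketched as a toy scoring function. This is purely an assumed illustration of the general idea, not Verafin’s actual consortium model; the `consortium_history` structure, field names, and the 25-transaction threshold are hypothetical.

```python
def consortium_risk_score(payee_id: str, consortium_history: dict) -> str:
    """Toy counterparty scoring: accounts known to the consortium with a
    clean, established history score lower; unrecognized accounts score
    higher as potential mules. All names and thresholds are illustrative."""
    record = consortium_history.get(payee_id)
    if record is None:
        # Never seen across the consortium: potential mule or fraud account.
        return "high"
    if record["legit_txn_count"] >= 25 and record["fraud_reports"] == 0:
        # Established history of legitimate activity across member institutions.
        return "low"
    # Known but thin or blemished history.
    return "medium"
```

A real system would combine a score like this with cross-channel and real-time signals before deciding whether to interdict a payment.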
Ready for the Future
The intersection of deepfake technology and virtual kidnapping highlights the growing threat of criminal activity in our digital age. More than ever, financial institutions must remain vigilant and adopt robust fraud detection solutions to protect their customers. Embracing innovation in fraud prevention is essential to safeguard payments and your customers’ funds in the years ahead.