As Valentine’s Day nears, Tenable, an exposure management company, warns that romance scams remain a major threat, with fraudsters now using generative AI (GenAI) to deceive victims more convincingly.
A recent case in Hong Kong, cited by Tenable, highlights the growing sophistication of these scams. Fraudsters used artificial intelligence (AI) to create deepfake videos and audio, tricking victims across Taiwan, Singapore, and India into believing they were in genuine relationships. The scheme led to more than $46 million in losses.
“Many of these scammers operate from overseas and don’t speak fluent English,” said Satnam Narang, senior staff research engineer at Tenable. “AI helps them craft sophisticated, emotionally compelling messages that make their scams more believable and harder to detect.”
Romance scams affect people of all ages, but elderly individuals, former military personnel, and those seeking financial arrangements are particularly vulnerable. Scammers employ a range of tactics, including impersonating service members with stolen photos and setting up fake “sugar mummy and daddy” schemes to lure victims into fraudulent financial transactions. Others persuade victims to join adult video chats that require paid registrations, generating illicit profits.
Pig butchering
The most damaging type of romance scam today is “romance baiting,” previously known as pig butchering. In these long-term schemes, fraudsters build trust over time before convincing victims to invest in fake cryptocurrency or stock platforms.
“People have lost their life savings to romance scams, and it’s heartbreaking,” said Narang. “Victims are often blamed for falling for these schemes, but these scams are highly manipulative and exploit vulnerabilities that anyone could have.”
Recovering lost funds is difficult, especially when cryptocurrency is involved. Scammers also target victims again by posing as recovery agents and offering to retrieve lost money for a fee.
Authorities urge caution: If someone you’ve never met asks for money, consider it a red flag and report it immediately.