Cybersecurity solutions company Tenable warned that romance scams are becoming more dangerous as criminals use advanced artificial intelligence (AI) to deceive victims and extract more money from them.

According to the Federal Trade Commission (FTC), investment scams, often linked to romance fraud, caused $5.7 billion in losses in 2024. Experts believe the actual amount could be higher.

“2026 marks our entry into a dark age of romance scams,” said Satnam Narang, senior staff research engineer at Tenable. “The availability of powerful frontier AI models has provided digital gold for scammers. For the price of a cup of coffee, predators can now generate perfect, emotionally convincing messages to lure victims worldwide.”

Narang outlined four main ways AI is changing romance scams:

  1. The AI Frontier – Scammers use AI language models to write grammatically polished messages while keeping dozens of fake conversations going at once.
  2. The AI Room – Some operations use “AI Rooms” with deepfake video calls, showing victims a fake face to make the scam seem real.
  3. The Investment Pivot – Romance is now mostly a hook. Scammers build trust and show fake financial success before convincing victims to invest, then take their savings.
  4. Open-Source Tools – Free AI models like DeepSeek and Qwen let scammers operate without ethical limits, making sophisticated fraud easier and cheaper.
Satnam Narang, senior staff research engineer at Tenable

“These scams are the engine of a multi-billion-dollar industry, often built on the backs of trafficking victims,” Narang said. “Inside these compounds, people are forced to work like on a sales floor, with quotas and celebrations when a victim loses money. The technology is new, but the psychological tricks are old, just scaled up.”

Consumers are advised to stay alert. Screenshots of earnings, claims of insider knowledge, or sudden talk about investments are strong warning signs. If a match brings up money, the safest response is to cut contact, unmatch, and report the account.

“Even as AI improves at generating audio, video, and images, basic caution still works,” Narang said. “If something sounds too good to be true, it probably is. Don’t fall for staged investments or fake success stories.”

Authorities have tracked scam operations for years, but firsthand accounts now reveal how AI has made them more sophisticated, with dedicated rooms and deepfake tools that make fraud harder to spot.
