Scams are not new, but the tools behind them have changed dramatically in recent years. With artificial intelligence becoming more accessible, fraudsters are finding new ways to trick people, often by exploiting fear, urgency, and trust.
The numbers tell a worrying story. According to data from the Sift Global Data Network, blocked scam content jumped 50% in the first quarter of 2025 compared with the same period in 2024.
This rise mirrors what users are experiencing online. In a survey conducted by Sift, 74% of respondents said they have noticed an increase in spam and scams. Experts attribute much of this surge to criminals' growing use of generative AI tools.
When a familiar voice is not real
One recent incident shared on X (formerly Twitter) shows just how convincing these scams have become. Dustin Burnham described how he narrowly avoided losing money to an AI-powered scam that sounded deeply personal.
[Post by Dustin Burnham (@ModernDad), February 11, 2026: https://t.co/Gbo9r96VsX]
“My wife calls me, panicked. The call is from her number, and her voice is unmistakable: that’s my wife,” he wrote.
On the call, the voice claimed their son had been injured in a bike accident and needed urgent medical help. The caller insisted that the hospital would not accept insurance and demanded $3,000 in cash immediately. “‘Babe, our son is hurt. He got in a bike wreck. I’m at the emergency room but they won’t take our insurance and I need cash to get him help. Please send me 3000 dollars as soon as you can, he’s really not doing well,’” the voice said.
A simple question saved the day
Instead of panicking, Burnham paused and asked for something only his real wife would know. “‘Wow, that’s scary. Tell me our passphrase and then I’ll send the money,’” he replied. That is when the story began to fall apart. “‘What? What passphrase? This is your wife, our son is hurt. Send the money now!!’” the caller said. Burnham trusted his instinct and ended the call.
“‘I’ll call you back. I don’t believe that this is my wife. If it is, I’m sorry, but we discussed this,’” he wrote. When he called his wife back directly, the truth was clear. The number had been spoofed, and the voice was not real.
How scammers pull this off
Burnham later explained how easy it has become to fake both phone numbers and voices. Phone number spoofing can make a call appear as if it is coming from a trusted contact, and there is no reliable way to detect it while the call is happening. The voice, he said, was AI-generated. With just a few seconds of audio, often pulled from social media, scammers can now create realistic voice deepfakes that sound exactly like a spouse, parent, or child.
How to tackle such scams
Burnham believes this incident shows a larger problem that will only grow worse. “Cognitive security is an essential skill in 2026,” he wrote, urging people to assume that images, videos, and even voices could be fake unless proven otherwise.
He also stressed the importance of planning ahead with family members. Having a shared passphrase, discussed offline and never written down, can act as a last line of defence when emotions are running high. He suggested pairing the passphrase with a harmless trigger sentence so it does not raise suspicion during a real call.
Equally important, he said, is having backup ways to communicate with close family members. Relying on a single phone number is no longer enough in a world where identities can be copied so easily. “I already get dozens of increasingly realistic spam calls and texts daily; it’s only going to get more annoying,” he wrote. “Have a plan to keep your family and your finances safe.”
