‘Car crash victim’ calls mother for help and $15K bail money. But it’s an AI voice scam
A woman in Florida was tricked out of thousands of dollars by scammers who used an AI-generated clone of her daughter’s voice.
Sharon Brightwell says she received a call from someone who sounded just like her daughter. The caller was sobbing, telling her that she had caused a car accident in which a pregnant woman had been seriously injured. She said she’d been texting and driving, and that the police had taken her phone.
“There is nobody that could convince me that it wasn’t her. I know my daughter’s cry.”
A man claiming to be her daughter’s attorney then allegedly took over the call. He told Sharon that authorities were detaining her daughter and that she needed to provide $15,000 in cash for bail. He gave very specific instructions, including not to tell the bank what the large withdrawal was for because, he said, it might affect her daughter’s credit rating.
Sharon withdrew the money, placed it in a box, and a driver picked it up. But that wasn’t the end. A second call followed, informing her that the pregnant woman’s unborn child had died in the accident, but that the family had agreed not to sue Sharon’s daughter if she paid them $30,000.
Luckily for Sharon, her grandson didn’t trust the situation and decided to call her daughter’s real number. Her daughter answered: she was at work, unaware of anything that had been going on.
By then it was too late for the $15,000.
“My husband and I are recently retired. That money was our savings.”
Unfortunately, we’re hearing a lot of stories like this one. So, what’s going on, and how can we protect ourselves?
AI voice cloning has improved considerably over the years and is now easily available to anyone, including cybercriminals. Many of our voices are online, in video or audio posted to social media. In Sharon’s case, the scammers are believed to have used videos from Facebook or other social media to create the replica of her daughter’s voice.
AI-powered phone scams can range from brief, scripted robocalls to full conversations. Recent studies have shown that relying on human perception to detect AI-generated voice clones is no longer consistently reliable. I imagine it’s even harder when the voice is made to sound stressed and upset, and you believe it to be your child.
How to stay safe from AI-generated voice scams
- Don’t answer calls from unknown numbers, and be careful about where you post audio and video online that feature your voice. A recording of just a few seconds of your voice is enough to create a convincing clone.
- Agree on a family password that only you and your loved ones know. Never post or message about it anywhere online; decide on it in person and stick to it.
- If you’ve forgotten the password, ask about a long-ago memory that has never featured on social media, so you can be sure it really is your loved one you’re talking to.
- Don’t try to handle situations like these alone. Find a friend, family member, friendly neighbor, or anyone who can sensitively give you their view, or support you if you’ve fallen for the scam. Sometimes a second opinion, like the one Sharon’s grandson offered, can make you think twice before handing over any money.
And if you decide you don’t trust the situation:
- Call the number you have for the relative or use other channels to contact them.
- Whether you’ve fallen for the scam or not, report the incident to local authorities, the FTC, or relevant consumer protection bodies. Every report helps track and prevent future scams, and you may even help catch one of these criminals.
We don’t just report on phone security—we provide it
Cybersecurity risks should never spread beyond a headline. Keep threats off your mobile devices by downloading Malwarebytes for iOS and Malwarebytes for Android today.