A man’s father nearly fell victim to a Rs 25 lakh voice-cloning scam. Jay Shooster, who is running for the Florida State House, shared the alarming incident online. He explained that a scammer used AI to clone his voice, tricking his parents into believing he had been in a car accident and arrested for driving under the influence and that he urgently needed money for bail.
“Today, my dad got a phone call no parent ever wants to get,” wrote Jay Shooster on X (formerly known as Twitter).
He added, “He heard me tell him I was in a serious car accident, injured, and under arrest for a DUI and I needed $30,000 to be bailed out of jail. But it wasn’t me. There was no accident. It was an AI scam.”
— Jay Shooster (@JayShooster) September 28, 2024
Shooster noted that the incident happened shortly after he appeared live on television, adding, “As a consumer protection lawyer, I’ve literally given presentations about this exact sort of scam, posted online about it, and I’ve talked to my family about it, but they still almost fell for it. That’s how effective these scams are.”
The lawyer added that his father grew suspicious after the scammer refused to accept payment by credit card: “Seems like the main thing that tipped them off was that they wouldn’t accept payment via card. Then other things started to seem fishy (e.g., I claimed that the random public defender assigned to me was a great lawyer).”
He stressed that world leaders need to prioritise regulating the AI industry to contain such scams, adding, “A very sad side-effect of this voice-cloning tech is that now people in *real* emergencies will have to prove their identities to their loved ones with passwords etc. Can you imagine your parent doubting whether they’re talking to you when you need help?”
Here’s how people reacted to this AI scam:
“My son’s grandmother got the same call,” an individual claimed.
Another added, “My dad had us memorise a password since the 70s for communication like this.”
“My dad would have told me good luck and hung up even if he thought it was me,” joked a third.
A fourth asked, “As a parent… What do you suggest we do when receiving a call like this? Text them while they are supposedly on the line with me? ‘Are you calling me?’”
To this, Shooster replied, “Tell them you’re going to hang up and call them back on their phone. Ideally, you would have agreed to use some kind of password in advance, or you can ask them questions that only they would know the answer to and that would be very hard for someone to find online (and that you have never used before in an online form (e.g., do not use the name of your first pet or any of those standard password questions)).”