AI voicemail scams are on the rise. Here’s how to avoid them

The most powerful people on the planet don’t quite know what to do with AI as it quickly becomes one of the most significant new technologies in history.
But criminals do.
In the six months since OpenAI first unleashed ChatGPT on the masses and set off an artificial intelligence arms race with the potential to reshape history, a new breed of cybercriminal has been among the first to cash in.
These next-generation bandits come armed with sophisticated new tools and techniques to steal hundreds of thousands of dollars from people like you and me.
“I see a very worrying increase in criminals using advanced technology (AI-generated deepfakes and cloned voices) to carry out very sophisticated schemes that are almost impossible to detect,” Haywood Talcove, managing director of LexisNexis Risk Solutions’ Government Group, a multinational information and analytics company based in Atlanta, told me over Zoom.

“If you get a call in the middle of the night and it sounds exactly like your panicked child or grandchild saying ‘help, I was in a car accident, the police found drugs in the car, and I need money to post bail (or a retainer for a lawyer),’ it’s a scam,” Talcove explained.
Earlier this year, law enforcement officials in Canada said a man used AI-generated voices he likely cloned from social media profiles to con at least eight seniors out of $200,000 in just three days.
Similar scams preying on parents and grandparents are also popping up in almost every state in America. This month, several school districts in Oregon warned parents about a wave of fake kidnapping calls.
The calls come in from an unknown caller ID (although cell phone numbers are easy to spoof these days). A voice that sounds just like your loved one says they’re in trouble. Then the line cuts out, you hear a scream, and another voice comes on demanding a ransom.
The FBI, FTC and even the NIH are warning of similar scams targeting parents and grandparents across the United States. In recent weeks, it has happened in Arizona, Illinois, New York, New Jersey, California, Washington, Florida, Texas, Ohio, Virginia and many other states.
An FBI special agent in Chicago told CNN that families in America lose an average of $11,000 to each fake kidnapping scam.
Here’s what to do if you get that call
Talcove recommends having a family password that only you and your closest inner circle share. Don’t make it something that’s easy to discover online either – no pet names, favorite bands, etc. Better yet, make it two or three words that you discuss and remember. If you get a call that sounds like someone you love, immediately ask them for the code word or phrase.
If the caller pretends to be the police, tell them you have a bad connection and will call them back. Ask for the name of the facility they’re calling from (campus security, local jail, FBI) and hang up (although scammers will say just about anything to get you to stay on the line). If you cannot reach your loved one, look up the phone number of that facility or call your local police and tell them what is happening.
Remember, these criminals use fear, panic and other tried-and-tested tactics to get you to share personal information or send money. Usually, the caller wants you to wire money, send it directly through Zelle or Venmo, send cryptocurrency, or buy gift cards and give them the card numbers and PINs. These are all giant red flags.
Also, be more careful than ever about what information you send out into the world.
An FTC notice also suggests calling the person who supposedly contacted you to verify the story, “using a phone number you know to be theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends,” it says on its website.

Watching it all unfold
“A criminal only needs three seconds of audio of your voice to ‘clone’ it,” warns Talcove. “Be very careful with social media. Consider making your accounts private. Don’t reveal the names of your family or even your dog. This is all information that a criminal armed with deep-fake technology can use to trick you or your loved ones into a scam.”
Talcove shared half a dozen “how-to” videos he says he pulled from the dark web showing these scams in action. He explained that criminals often sell information on how to make these deep fakes to other fraudsters.
“I keep my eyes on criminal networks and new tactics. We literally monitor social media and the dark web and infiltrate criminal groups,” he added. “It’s getting scary. For example, filters can be used over Zoom to change someone’s voice and appearance. A criminal who grabs just a few seconds of audio from your [social media feeds], for example, can clone your voice and tone.”
Fooling my relatives with a clone of my husband’s voice
I skipped all the organized crime parts and just googled “AI voice clone.” I won’t say exactly what tool I used, but it took me less than ten minutes to upload 30 seconds of my husband’s voice from a video stored on my smartphone to an online AI voice generator, for free. I typed in some funny lines I wanted “him” to say, saved the result to my laptop and sent it to our family. The most challenging part was converting the original clip from a .mov to a .wav file (and even that’s easy, as the sketch below shows).
Jennifer Jolly’s AI voice generator example.
It fooled his mother, my parents and our children.
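For anyone curious about that conversion step, here’s a minimal sketch of one way to do it, assuming the free ffmpeg tool is installed on your computer; the filename clip.mov is a hypothetical stand-in for the video on your phone:

```python
# Minimal sketch: extract the audio track from a .mov clip and save it as
# a .wav file. Assumes the free ffmpeg tool is installed and on your PATH;
# "clip.mov" is a hypothetical filename standing in for your own video.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "clip.mov",        # input video clip
        "-vn",                   # drop the video stream, keep audio only
        "-acodec", "pcm_s16le",  # standard 16-bit WAV encoding
        "clip.wav",              # output audio file
    ],
    check=True,  # raise an error if ffmpeg exits with a failure code
)
```

Plenty of free apps and websites do the same conversion without any typing; the point is simply that this step is no barrier to anyone determined to clone a voice.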
“We are all vulnerable, but the most vulnerable among us are our parents and grandparents,” says Talcove. “Ninety-nine out of 100 people couldn’t spot a deepfake video or voice clone. But our parents and grandparents, categorically speaking, are less familiar with this technology. They would never suspect that the voice on the phone, which sounds exactly like their child screaming for help during a kidnapping, could be completely artificial.”
Jennifer Jolly is an Emmy Award-winning consumer technology columnist. The views and opinions expressed in this column are those of the author and do not necessarily reflect those of USA TODAY.