All the scammer needs now is a small audio clip of your voice and they can create an AI robocall.
Tech experts created an AI generated voice of our own ABC7 Consumer Investigative Reporter Jason Knowles, asking, "Can you send me $500?" But he never actually said those words.
"It's Jason, I've been in a car accident here on my vacation in Mexico," the AI generated voice continues. "I'm okay for the most part, but I am at some hospital and they say I need to pay money for x-rays and treatment. My credit card isn't working and I don't have enough cash."
"Anybody receiving that call would think it's you and you're in trouble," said Greag Bohl, chief data officer with Transaction Network Services.
The company, with a Chicago office, helps wireless carriers stop robocalls. Bohl made Knowles' AI call using cloning software and a clip of his voice from an ABC7 broadcast. He said scammers can do the same thing to you simply by trawling social media.
"They can go on TikTok, they can go on any of these media platforms and go ahead and extract that voice," he said. "Actually, what can take place is a bad actor can take your voicemail from your cell phone and replicate that from your voice for as little as three seconds."
The Federal Trade Commission said it's making AI robocalls a top priority, and educating people on the issue. The FTC said the new technology is making what's known as the "imposter scam" even more convincing.
"The number one scam for the FTC for the last several years has been imposter scams and voice clone calls are a form of imposter scams," said Todd Kossow, director of FTC Midwest Region Office in Chicago. "Imposter scam is any scam where somebody pretends to be somebody they're not in order to gain your trust so a government agency, a private company like Microsoft or your friend or family member for the grandparents scam."
"It's just the start," explained Bohl. "It's early in the year, and we're going to see more and more activity as we roll through and get closer to elections."
In January, there was an AI recording of President Biden meant to discourage New Hampshire voters from showing up to the primary.
And it's not just recordings; Bohl said it's possible for scammers to fake voices on a live talk-back call, making it seem more real.
"Everybody has to know that this is technology that's here today, it's not in the future it's here it's very low cost," he said.
You can prevent scammers from using your voice by thinking twice before posting videos on social media. Also, use the automated voicemail greeting on your phone instead of your own voice. You can also outsmart scammers pretending to be your loved ones.
"Even if it sounds exactly like your grandchild, one thing you can do is you can ask the caller you know a personal question that only you would know so even if it sounds like you," recommended Kossow. "They won't be able to answer that question."
Something that simple can help you avoid these high-tech fakes.
The FTC recently made it illegal for scammers to impersonate government agencies or businesses, and a pending law could make it illegal to impersonate an individual.
If you get an impersonation call that you think is from a family member saying they are injured or kidnapped, you should contact that family member directly.