True Crime Thursday – Vicious Vishing by Voice Cloning

Photo credit: Jason Rosewell, Unsplash

By Debbie Burke

@burke_writer

Believe half of what you see and none of what you hear.

That saying has been attributed to sources like Benjamin Franklin and Edgar Allan Poe. It’s appeared in song lyrics like Leon Haywood’s “Believe Half of What You See (and None of What You Hear)” and the third verse of the immortal Marvin Gaye classic, “I Heard It Through the Grapevine.”

Today, those wise words are even truer because of Voice Cloning, a new tool that Artificial Intelligence (AI) has added to the cyber-scammer’s toolkit.

Phishing and Vishing are scams in which criminals contact victims, often posing as a bank, reputable business, government agency, hospital, law enforcement, or other entity, in order to gain access to their personal and/or financial information.

Phishing reaches victims by email, urging them to open an infected attachment or click on a link that downloads malware. Phishing attacks are generally sophisticated and massive in scale, using automation to target thousands of businesses and individuals at a time.

Reportedly, 74% of organizations in the US have been successfully targeted by phishing attacks.

Vishing (AKA voice phishing) is when a scammer contacts the victim by phone, impersonating a law officer, banker, charity worker, etc., and convinces them to verbally share confidential information. The caller ID is spoofed, making the call appear to come from a legitimate source like a bank, the Social Security Administration, or a credit card company.

Vishing requires a scammer to contact one victim at a time to persuade them to give up sensitive personal and financial information on the phone, making it less efficient than phishing.

However, vishing can still be devastatingly effective, especially now thanks to Voice Cloning. AI can take a sample of someone’s voice and generate speech that’s nearly impossible to tell from the real person’s.

Voice cloning is a boon for scams like Family Emergency, Friend Stranded Overseas, or Grandchild in Trouble. Scammers harvest voice samples of your loved one from YouTube, TikTok, and other online sources.

They can also call the person to record their voice or even use their outgoing voicemail message.

In a few seconds, criminals can capture enough of a person’s voice to create a convincing imitation.

Now when “Johnny” calls Grandma saying he was in a car accident and needs bail money, the voice is identical to the real Johnny. That triggers panic, and the victim is more likely to act without thinking. And lose money in the process.

“Johnny” will ask for gift cards, cryptocurrency, or a wire transfer. Once you pay, the funds are instantly transferred to the scammer, and the transaction can’t be reversed.

Your money is gone.

How do you protect yourself against a voice that sounds exactly like your loved one?

The FTC advises:

“Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.”

A simple low-tech safeguard is to have a password or code that only you and your family know. If something about a call with a loved one sounds suspicious, ask for the password.

But be careful how you select a code. Criminals often scour social media accounts for clues to possible passwords.

If you post a photo of “my dog Spot” and choose that for your password, it could be guessed.

Speaking of Spot, to wrap up this post on a light note, does anyone remember the old Cal Worthington TV commercials that always featured “My Dog Spot”?

The last Worthington family car dealership was sold in February 2023. End of an era, but Cal’s jingle lives on.

~~~

TKZers: Has someone you know been taken in by voice cloning?

Can you think of ways to use Voice Cloning in mystery, suspense, or thriller fiction?

~~~


Investigator Tawny Lindholm plunges into an alternate reality where video is fake but death is real.

Please check out my new thriller Deep Fake Double Down.

This entry was posted in #truecrimethursday, Writing by Debbie Burke. Bookmark the permalink.

About Debbie Burke

Debbie writes the Tawny Lindholm series, Montana thrillers infused with psychological suspense. Her books have won the Kindle Scout contest, the Zebulon Award, and were finalists for the Eric Hoffer Book Award and BestThrillers.com. Her articles received journalism awards in international publications. She is a founding member of Authors of the Flathead and helps to plan the annual Flathead River Writers Conference in Kalispell, Montana. Her greatest joy is mentoring young writers. http://www.debbieburkewriter.com

24 thoughts on “True Crime Thursday – Vicious Vishing by Voice Cloning”

  1. Voice cloning is a scary step forward in scamming, Debbie. Brrr….

    If you do video marketing, be sure to add music, even if the volume is at zero. Same for TikTok and YouTube. That way, your voice isn’t as easily stolen.

    As an aside, I hate how Cal used wildlife to sell cars. Imagine how that poor elephant was treated? Heartbreaking.

    • How interesting about adding music, Sue. Thanks for the tip.

      Cal’s ads were entertaining when I was a kid. So were circuses. Back then, we didn’t know the underside.

      Sometimes the animals swiped back–like the tiger lying on the hood of a car with Cal in front. The tiger reached between his legs and grabbed a handful.

  2. Great subject for discussion today, Debbie.

    I don’t know of any family or friends who have been taken in by Vishing, but we get tons of spam calls on our landline. Our caller ID and phone company are pretty good at identifying “high risk for spam call.” We usually don’t even pick up if we don’t recognize the caller. And when my wife answers, she loves to disguise her voice in different ways, or we just stay silent on the line to listen and see what country the spam is coming from.

    Interesting how writers get excited about new types of crime, e.g., voice cloning, as ways to make their stories more interesting. Now, I’ll be thinking all day about ways to include voice cloning in my WIP.

    Hope your day is free of vishing and phishing.

    • Thanks, Steve. Yes, for writers, everything is grist for the story mill.

      I don’t answer the phone unless I recognize the name/number. Even then, caller ID can be spoofed. A few days ago, I received an MMS text message from a friend. It said it had to be downloaded by an expiration date a few days later. Something smelled. I called my friend. Sure enough, her phone had been hacked. Deleted the message.

  3. Thank you, Deb. Almost makes you yearn for the good old days of the Nigerian princess scammers and their poorly written letters.

    Have a great day!

    • I know what you mean, Joe. Thankfully, there are still all those handsome, widowed, military officer-doctors pining for me. Otherwise, I’d be so lonely.

    • Jim, I knew as an Angeleno, you’d remember Cal. I wanted to return the favor after your Sunday post with that trip down the branding memory lane.

  4. My day job includes IT security. Five or six years ago I saw a demo of phone scamming. Didn’t need voice copying, just three programs stacked one on the other. Caller ID spoofing: ridiculously easy with the right software. In about ten keystrokes, I could call you from 202-456-1414. Next, a voice modulator. Bass to alto in a minute. The icing on the cake: background noise. The speaker called someone’s boss from “a local bar” and the boss never knew the difference.

    • Alan, I knew you’d have additional input on this subject. Thanks for chiming in with that example.

      As you say, these techniques are “ridiculously easy.” One no longer needs to be a brain surgeon to take advantage of them. Scary.

  5. I just heard Glenn Beck interview a mom in Arizona who had an AI scam call about her daughter being kidnapped. The imitation of her daughter’s way of crying had her convinced that it was her daughter. It took talking to her husband and daughter to convince her that her daughter was ok.

  6. Another informative and cautionary True Crime Thursday post, Debbie. Voice cloning is only going to get better and better. I like the idea of a password to be used among family members. I think I mentioned it before, but a couple of years ago my father-in-law received a scam call about 6 or 7 A.M. claiming to be from his then-21-year-old grandson, saying that he needed money to get his car out of a ditch or some such. My FiL said, “I know it’s not my grandson; he never gets up before 10 A.M.,” and hung up.

    Recently, concerned about significant advances in voice cloning, my wife talked with her dad about how good AI has become at impersonating voices and urged him not to take any action based on a call asking for help without first calling another family member. I pointed out that his grandson (our nephew), for example, would likely call his parents or a friend first before bothering his grandparents.

    One idea for a mystery or thriller is to use voice cloning to threaten or provoke someone. I could see it being used to frame someone—it probably won’t pass police analysis, but if you wanted to finger someone for a hit on a rival gang or criminal organization, that might be an effective way to do it. Scary thought.

  7. “Believe half of what you see and none of what you hear.” Now we can’t believe what we hear *or* what we see.

    We have friends who moved to the U.S. about twenty years ago. They’re both physicians and very intelligent, but they were victims of a phone scam. Someone called pretending to be a member of the FBI or the IRS (can’t remember which) and told our friends there was some problem that needed to be corrected. The scammers got all the info they needed to essentially steal our friends’ IDs and get their tax refund.

    If they ever catch any of these people, I envision a special cell in a jail where the sound of phone scammers is piped in all day and night.

  8. All that energy and brain power put into something illegal…too bad they don’t use it for good…

    BTW, I’m very upset with you… I started Deep Fake Double Down last night when I couldn’t go to sleep, thinking I’d shut my brain down. I finished it at 3:15 this morning… one of the best books I’ve read this year. Going as soon as I get my second cup of coffee to leave a review.

Comments are closed.