By Debbie Burke
Believe half of what you see and none of what you hear.
That saying has been attributed to sources like Benjamin Franklin and Edgar Allan Poe. It’s appeared in song lyrics like Leon Haywood’s “Believe Half of What You See (and None of What You Hear)” and the third verse of the immortal Marvin Gaye classic, “I Heard It Through The Grapevine.”
Today, those wise words are truer than ever because of Voice Cloning, a new Artificial Intelligence (AI) tool in the cyber-scammer’s toolkit.
Phishing and Vishing are scams where criminals contact victims, often posing as a bank, reputable business, government agency, hospital, law enforcement, or other entity, in order to gain access to the victim’s personal and/or financial information.
Phishing contacts victims by email, urging them to open an infected attachment or click on a link that downloads malware. Phishing attacks are generally sophisticated and massive in scale, targeting thousands of businesses and individuals at a time using automation.
Reportedly, 74% of organizations in the US have been successfully targeted by phishing attacks.
Vishing (AKA voice phishing) is when a scammer contacts the victim by phone, impersonating a law officer, banker, charity worker, etc., and convinces them to verbally share confidential information. The caller ID is spoofed, making the call appear to come from a legitimate source like a bank, the Social Security Administration, or a credit card company.
Vishing requires a scammer to contact one victim at a time and persuade them to give up sensitive personal and financial information over the phone, making it less efficient than phishing.
However, vishing can still be devastatingly effective, especially now thanks to Voice Cloning. AI can take a sample of someone’s voice and generate speech that’s virtually indistinguishable from the real person.
Voice cloning is a boon for scams like Family Emergency, Friend Stranded Overseas, or Grandchild in Trouble. Scammers harvest voice samples of your loved one from YouTube, TikTok, and other online sources.
They can also call the person to record their voice or even use their outgoing voicemail message.
A few seconds of audio is all criminals need to create a convincing imitation of a person’s voice.
Now when “Johnny” calls Grandma saying he was in a car accident and needs bail money, the voice is identical to the real Johnny. That triggers panic, and the victim is more likely to act without thinking. And lose money in the process.
“Johnny” will ask for gift cards or cryptocurrency, or want you to wire money. Once you comply, the funds are instantly transferred to the scammer and the transaction can’t be reversed.
Your money is gone.
How do you protect yourself against a voice that sounds exactly like your loved one?
The FTC advises:
A simple low-tech safeguard is to have a password or code that only you and your family know. If something about a call with a loved one sounds suspicious, ask them for the password.
But be careful how you select a code. Criminals often scour social media accounts for clues to possible passwords.
If you post a photo of “my dog Spot” and choose that for your password, it could be guessed.
Speaking of Spot, to wrap up this post on a light note, does anyone remember the old Cal Worthington TV commercials that always featured “My Dog Spot”?
The last Worthington family car dealership was sold in February 2023. It’s the end of an era, but Cal’s jingle lives on.
TKZers: Has someone you know been taken in by voice cloning?
Can you think of ways to use Voice Cloning in mystery, suspense, or thriller fiction?
Investigator Tawny Lindholm plunges into an alternate reality where video is fake but death is real.
Please check out my new thriller Deep Fake Double Down.