True Crime Thursday – DEEPFAKES

By Debbie Burke

@burke_writer

 

Believe none of what you hear and half of what you see.

This saying has been around for centuries, variously attributed to Benjamin Franklin and Edgar Allan Poe.

Today, thanks to Artificial Intelligence (AI) and Machine Learning (ML), you can no longer believe anything you hear or see.

That’s because what your ears hear and what your eyes see could be a DEEPFAKE.

What is a deepfake? Wikipedia says:

…synthetic media in which a person in an existing image or video is replaced with someone else’s likeness. While the act of faking content is not new, deepfakes leverage powerful techniques from machine learning and artificial intelligence to manipulate or generate visual and audio content with a high potential to deceive. The main machine learning methods used to create deepfakes are based on deep learning and involve training generative neural network architectures, such as autoencoders or generative adversarial networks (GANs).

Deepfakes have garnered widespread attention for their uses in creating child sexual abuse material, celebrity pornographic videos, revenge porn, fake news, hoaxes, bullying, and financial fraud. This has elicited responses from both industry and government to detect and limit their use.

 

I wrote about AI three years ago. Since then, technology has progressed at warp speed.

The first recognized crime to use deepfake technology occurred in 2019 and involved voice impersonation.

The CEO of an energy business in the UK received an urgent call from his boss, an executive at the firm’s German parent company. The CEO recognized his boss’s voice…or so he thought. He was instructed to immediately transfer $243,000 to pay a Hungarian supplier. He followed orders and transferred the money.

The funds went into a Hungarian account but then disappeared to Mexico. According to the company’s insurer, Euler Hermes, the money was never recovered.

To pull off the heist, cybercriminals used AI voice-spoofing software that perfectly mimicked the boss’s tone, speech inflections, and slight German accent.

Such spoofing extends to chilling video deceptions. The accuracy of movement and gesture renders the imposter clone indistinguishable from the real person. Some research shows a fake face can be more believable than the real one.

Security safeguards like voice authentication and facial recognition are no longer reliable.

A November 2020 study by Trend Micro, Europol, and the United Nations Interregional Crime and Justice Research Institute concludes:

The Crime-as-a-Service (CaaS) business model, which allows non-technologically savvy criminals to procure technical tools and services in the digital underground that allow them to extend their attack capacity and sophistication, further increases the potential for new technologies such as AI to be abused by criminals and become a driver of crime.

We believe that on the basis of technological trends and developments, future uses or abuses could become present realities in the not-too-distant future.

The not-too-distant future they mentioned in 2020 is here today. A person no longer needs to be a sophisticated expert to create fake video and audio recordings of real people that defy detection.

In the following YouTube video, a man created a fake image of himself to fool coworkers into believing they were video-chatting with the real person. It’s long—more than 18 minutes—but watching even a few minutes demonstrates how simple the process is.

Consider the implications:

What if you could appear to be in one place but actually be somewhere else? Criminals can create their own convincing alibis.

If corrupt law enforcement, government entities, or political enemies want to frame or discredit someone, they can manufacture video evidence that shows the person engaged in criminal or abhorrent behavior.

Imagine the mischief terrorists could cause by putting words in the mouths of world leaders. Here are some examples: https://www.cnn.com/interactive/2019/01/business/pentagons-race-against-deepfakes/

Deepfakes could change history, creating events that never actually happened. Check out this example made at MIT of a fake Richard Nixon delivering a fake 1969 speech to mourn astronauts who supposedly perished on the moon. Fast forward to 4:18.

How was this software developed?

It arose from Machine Learning (ML), specifically the generative adversarial networks (GANs) mentioned in the Wikipedia definition above. The process pits computer models against one another to see which most accurately reproduces expressions, gestures, and voices from real people. The more they compete with each other, the better they learn, and the more authentic their fakes become.

A fanciful imagining of a contest might sound like this.

Computer A: “Hey, look at this Jack Nicholson eyebrow quirk I mastered.”

Computer B: “That’s nothing. Samuel L. Jackson’s nostril flare is much harder. Bet yours can’t top mine.”

Computer A: “Oh yeah? Check out how I made Margaret Thatcher cross her legs just like Sharon Stone.”
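
For the technically curious, here is a rough sketch of what that contest looks like in code. Formally, one network (the generator) manufactures fakes while a second network (the discriminator) tries to catch them, and each improves by trying to outdo the other. This example uses Python with the PyTorch library; the toy data, layer sizes, and training settings are illustrative assumptions only, not an actual deepfake tool.

# A minimal sketch of the adversarial "contest" described above, using PyTorch.
# The data here is a toy stand-in (random 64-value samples), not real faces or voices.
import torch
import torch.nn as nn

LATENT, FEATURES = 16, 64  # size of the random noise input and of each fake sample

# "Computer A": the generator, which tries to turn noise into convincing samples
generator = nn.Sequential(
    nn.Linear(LATENT, 128), nn.ReLU(),
    nn.Linear(128, FEATURES), nn.Tanh(),
)

# "Computer B": the discriminator, which tries to tell real samples from fakes
discriminator = nn.Sequential(
    nn.Linear(FEATURES, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.randn(32, FEATURES)   # placeholder for a batch of real training data
    noise = torch.randn(32, LATENT)
    fake = generator(noise)

    # Discriminator's turn: learn to score real data high and fakes low
    opt_d.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    opt_d.step()

    # Generator's turn: learn to produce fakes the discriminator mistakes for real
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    opt_g.step()

In a real deepfake system, the random placeholder data would be huge collections of images or voice recordings of the target person, and the networks would be far larger, but the back-and-forth competition is the same.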

The Europol study further outlined ways that deepfakes could be used for malicious or criminal purposes:

Destroying the image and credibility of an individual,

Harassing or humiliating individuals online,

Perpetrating extortion and fraud,

Facilitating document fraud,

Falsifying online identities and fooling KYC [Know Your Customer] mechanisms,

Falsifying or manipulating electronic evidence for criminal justice investigations,

Disrupting financial markets,

Distributing disinformation and manipulating public opinion,

Inciting acts of violence toward minority groups,

Supporting the narratives of extremist or even terrorist groups, and

Stoking social unrest and political polarization.

 

In the era of deepfakes, can video/audio evidence ever be trusted again?

~~~

A big Thank You to TKZ regular K.S. Ferguson who suggested the idea for this post and provided sources.

~~~

TKZers: Can you name books, short stories, or films that incorporate deepfakes in the plot? Feel free to include sci-fi/fantasy from the past where the concept is used before it existed in real life.

Please put on your criminal hat and suggest fictional ways a bad guy could take advantage of deepfakes.

Or put on your detective hat and offer solutions to thwart the evil-doer.

~~~

 

Debbie Burke’s characters are not created by Artificial Intelligence but rather by her real imagination. 

Please check out her latest Tawny Lindholm Thriller. 

Until Proven Guilty is for sale at major booksellers here.