Another Voice Deepfake Fraud Scam

Time for another voice deepfake scam.

This one’s in Schwyz, Switzerland, which makes reading the original story somewhat difficult. But we can safely say that “Eine unbekannte Täterschaft hat zur Täuschung künstliche Intelligenz eingesetzt und so mehrere Millionen Franken erbeutet” (“An unknown perpetrator used artificial intelligence to deceive and made off with several million francs”) is NOT a good thing.

And that’s millions of Swiss francs, not millions of Al Frankens.

Millions of Al Frankens.

Luckily, someone at Biometric Update speaks German well enough to get the gist of the story.

“Deploying audio manipulated to sound like a trusted business partner, fraudsters bamboozled an entrepreneur from the canton of Schwyz into transferring ‘several million Swiss francs’ to a bank account in Asia.”

And what do the canton police recommend? (Google Translated)

“Be wary of payment requests via telephone or voice message, even if the voice sounds familiar.”

Singer/songwriters…and Deepfakes

I was just talking about singers, songwriters, and one singer who pretended to be a songwriter.

Of course, some musicians can be both.

Willie Nelson has written songs for others, sung songs written by others, and sung his own songs.

But despite the Grok deepfake I shared last October, Willie is not known as a rapper.

This is fake. Grok.

Oh Yeah, That Biometric Stuff

Bredemarket works with a number of technologies, but it’s no secret that my primary focus is biometrics. After all, I call myself the “biometric product marketing expert,” having worked with friction ridge (fingerprint, palm print), face, iris, voice, and rapid DNA.

The biometric product marketing expert in the desert.

If I can help your biometric firm with your content, proposal, or analysis needs, schedule a free meeting with me to discuss how I can help.

Grok’s Not-so-deepfake Willie Nelson, Rapper

While the deepfake video generators that fraudsters use can be persuasive, the 6-second videos created by the free version of Grok haven’t reached that level of fakery. Yet.

In my experience, Grok is better at re-creating well-known people with more distinctive appearances. Good at Gene Simmons and Taylor Swift. Bad at Ace Frehley and Gerald Ford.

So I present…Willie Nelson. 

Grok.

Willie with two turntables and a microphone, and one of his buds watching.

  • If you thought “Stardust” was odd for him, listen to this. 
  • Once Grok created the video, I customized it to have Willie rap about bud. 
  • Unfortunately, or perhaps fortunately, it doesn’t sound like the real Willie.

And for the, um, record, Nelson appeared in Snoop’s “My Medicine” video.

As an added bonus, here’s Grok’s version of Cher, without audio customization. It doesn’t make me believe…

Grok.

Reminder to marketing leaders: if you need Bredemarket’s content, proposal, or analysis help, book a meeting at https://bredemarket.com/mark/

Deepfake Voices Have Been Around Since the 1980s

(Part of the biometric product marketing expert series)

Inland Empire locals know why THIS infamous song is stuck in my head today.

“Blame It On The Rain,” (not) sung by Milli Vanilli.

For those who don’t know the story, Rob Pilatus and Fab Morvan performed as the duo Milli Vanilli and released an extremely successful album produced by Frank Farian. The title? “Girl You Know It’s True.”

But while we were listening to and watching Pilatus and Morvan sing, we were actually hearing the voices of Charles Shaw, John Davis, and Brad Howell. So technically this wasn’t a modern deepfake: rather than imitating the voice of a known person, Shaw et al. provided the voices of unknown people. But the purpose was still deception.

Anyway, the ruse was revealed, Pilatus and Morvan were sacrificed, and things got worse.

“Pilatus, in particular, found it hard to cope, battling substance abuse and legal troubles. His tragic death in 1998 from a suspected overdose marked a sad epilogue to the Milli Vanilli saga.”

But there were certainly other examples of voice deepfakes in the 20th century…take Rich Little.

So deepfake voices aren’t a new problem. It’s just that they’re a lot easier to create today…which means that a lot of fraudsters can use them easily.

And if you are an identity/biometric marketing leader who needs Bredemarket’s help to market your anti-deepfake product, schedule a free meeting with me at https://bredemarket.com/mark/.

Communicate with the Words of Authority

Biometric marketing leaders, do your firm’s product marketing publications require the words of authority?

John E. Bredehoft of Bredemarket, the biometric product marketing expert.

Can John E. Bredehoft of Bredemarket—the biometric product marketing expert—contribute words of authority to your content, proposal, and analysis materials?

I offer:

  • 30 years of biometric experience, 10 years of product marketing expertise, and complementary proposal and product management talents.
  • Success with numerous biometric firms, including Incode, IDEMIA, MorphoTrak, Motorola, Printrak, and over a dozen biometric consulting clients.
  • Mastery of multiple biometric modalities: friction ridge (fingerprint, palm print), face, iris, voice, DNA.
  • Compelling CONTENT creation: blog posts, case studies and testimonials, LinkedIn articles and posts, white papers.
  • Winning PROPOSAL development: managing, writing, editing for millions of dollars of business for my firms.
  • Actionable ANALYSIS: strategic, market, product, competitive.

To embed Bredemarket’s biometric product marketing expertise within your firm, schedule a free meeting with me.

Make an impact.

In the PLoS One Voice Deepfake Detection Test, the Key Word is “Participants”

(Part of the biometric product marketing expert series)

A recent PYMNTS article entitled “AI Voices Are Now Indistinguishable From Humans, Experts Say” includes the following about voice deepfakes:

“A new PLoS One study found that artificial intelligence has reached a point where cloned voices are indistinguishable from genuine ones. In the experiment, participants were asked to tell human voices from AI-generated ones across 80 samples. Cloned voices were mistaken for real in 58% of cases, while human voices were correctly identified only 62% of the time.”
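To put those two percentages in perspective, a little back-of-the-envelope arithmetic (assuming, for simplicity, equal numbers of cloned and human samples, which the article doesn’t confirm) shows how close the participants were to coin-flip territory:

```python
# Rates as reported in the PYMNTS summary of the PLoS One experiment.
clone_mistaken_for_real = 0.58  # cloned voices judged "human"
human_correct = 0.62            # human voices correctly judged "human"

# If 58% of clones were mistaken for real, only 42% were caught.
clone_correct = 1 - clone_mistaken_for_real

# Assuming equal numbers of cloned and human samples:
overall_accuracy = (clone_correct + human_correct) / 2
print(round(overall_accuracy, 2))  # 0.52 — barely above 50% chance
```

In other words, the participants’ overall accuracy works out to roughly 52 percent, just a hair better than guessing.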

What the study didn’t measure

Since you already read the title of this post, you know that I’m concentrating on the word “participants.”

The PLoS One experiment used PEOPLE to try to distinguish real voices from deepfake ones.

And people aren’t all that accurate. Never have been.

Picture from Google Gemini.

Before you decide that people can’t detect fake voices…

…why not have an ALGORITHM give it a try?
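What would that look like? As a toy illustration only (this is NOT any production deepfake detector), an algorithm can score measurable acoustic properties instead of relying on perception. The sketch below uses one deliberately simplistic feature, spectral flatness, on synthetic signals; real voice anti-spoofing systems use learned models over far richer features.

```python
import numpy as np

def spectral_flatness(signal: np.ndarray) -> float:
    """Geometric mean / arithmetic mean of the power spectrum (0..1).

    Harmonic-rich signals (like voiced speech) have a peaked, less flat
    spectrum; broadband artifacts push flatness toward 1.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 + 1e-12  # avoid log(0)
    geometric = np.exp(np.mean(np.log(spectrum)))
    arithmetic = np.mean(spectrum)
    return float(geometric / arithmetic)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000, endpoint=False)  # 1 second at 8 kHz

# Toy "voice": a few harmonics, like a voiced speech sound.
voice = sum(np.sin(2 * np.pi * f * t) for f in (120, 240, 360))
# Toy "artifact-laden fake": the same harmonics buried in broadband noise.
fake = voice + 3.0 * rng.standard_normal(t.size)

print(spectral_flatness(voice) < spectral_flatness(fake))  # the harmonic "voice" is less flat
```

The point isn’t this particular feature; it’s that an algorithm measures the audio consistently, sample after sample, where human participants hover near chance.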

What the study did measure

But to be fair, that wasn’t the goal of the PLoS One study, which specifically focused on human perception.

“Recently, an intriguing effect was reported in AI-generated faces, where such face images were perceived as more human than images of real humans – a “hyperrealism effect.” Here, we tested whether a “hyperrealism effect” also exists for AI-generated voices.”

For the record, the researchers did NOT discover a hyperrealism effect in AI-generated voices.

Do you offer a solution?

But if future deepfake voices sound realer than real, then we will REALLY need the algorithms to spot the fakes.

And if your company has a voice deepfake detection solution, I could have talked about it right now in this post.

Or on your website.

Or on your social media.

Where your prospects can see it…and purchase it.

And money in your pocket is realer than real.

Let’s talk. https://bredemarket.com/mark/


Some Voice Deepfakes Are NOT Fraudulent

(Part of the biometric product marketing expert series)

I’ve spent a ton of time discussing naughty people who use technology to create deepfakes—including voice deepfakes—to defraud people.

But some deepfakes don’t use technology, and some deepfakes are not intended to defraud.

Take Mark Hamill’s impersonation of fellow actor Harrison Ford.

Mark Hamill as Harrison Ford, and Harrison Ford reacting to Mark Hamill.

And then there was a case that I guess could be classified as fraud…at least to Don Pardo’s sister-in-law.

Don Pardo was originally known as an announcer on NBC game shows, and his distinctive voice could be heard on many of them, including (non-embeddable) parodies of them.

With his well-known voice, NBC jumped at the chance to employ him as the announcer for the decidedly non-game television show Saturday Night Live, where he traded dialogue with the likes of Frank Zappa.

“I’m the Slime.”

Except for a brief period after he ran afoul of Michael O’Donoghue, Pardo was a fixture on SNL for decades, through the reigns of various producers and executive producers.

Until one night in 1999 when laryngitis got the best of Don Pardo, and the show had to turn to Bill Clinton.

No, not the real Bill Clinton.

I’m talking about the SNL cast member who did a voice impression of Bill Clinton (and Jeopardy loser Sean Connery), Darrell Hammond. Who proceeded to perform an impression of Don Pardo.

An impression that even fooled Don Pardo’s sister-in-law.

This is Don Pardo saying this is Don Pardo…

Pardo continued to be Saturday Night Live’s announcer for years after that, sometimes live from New York, sometimes on tape from his home in Arizona.

And when Pardo passed away in 2014, he was succeeded as SNL’s announcer by former cast member Darrell Hammond.

Who used his own voice.