Grok’s Not-so-deepfake Willie Nelson, Rapper

While the deepfake video generators that fraudsters use can be persuasive, the 6-second videos created by the free version of Grok haven’t reached that level of fakery. Yet.

In my experience, Grok is better at re-creating well-known people with more distinctive appearances. Good at Gene Simmons and Taylor Swift. Bad at Ace Frehley and Gerald Ford.

So I present…Willie Nelson. 

Grok.

Willie with two turntables and a microphone, and one of his buds watching.

  • If you thought “Stardust” was odd for him, listen to this. 
  • Once Grok created the video, I customized it to have Willie rap about bud. 
  • Unfortunately, or perhaps fortunately, it doesn’t sound like the real Willie.

And for the, um, record, Nelson appeared in Snoop’s “My Medicine” video.

As an added bonus, here’s Grok’s version of Cher, without audio customization. It doesn’t make me believe…

Grok.

Reminder to marketing leaders: if you need Bredemarket’s content, proposal, and analysis help, book a meeting at https://bredemarket.com/mark/

Deepfake Voices Have Been Around Since the 1980s

(Part of the biometric product marketing expert series)

Inland Empire locals know why THIS infamous song is stuck in my head today.

“Blame It On The Rain,” (not) sung by Milli Vanilli.

For those who don’t know the story, Rob Pilatus and Fab Morvan performed as the band Milli Vanilli and released an extremely successful album produced by Frank Farian. The title? “Girl You Know It’s True.”

But while we were listening to and watching Pilatus and Morvan sing, we were actually hearing the voices of Charles Shaw, John Davis, and Brad Howell. So technically this wasn’t a modern deepfake: rather than imitating the voice of a known person, Shaw et al. were providing the voices of unknown people. But the purpose was still deception.

Anyway, the ruse was revealed, Pilatus and Morvan were sacrificed, and things got worse.

“Pilatus, in particular, found it hard to cope, battling substance abuse and legal troubles. His tragic death in 1998 from a suspected overdose marked a sad epilogue to the Milli Vanilli saga.”

But there were certainly other examples of voice deepfakes in the 20th century…take Rich Little.

So deepfake voices aren’t a new problem. It’s just that they’re a lot easier to create today…which means that a lot of fraudsters can use them easily.

And if you are an identity/biometric marketing leader who needs Bredemarket’s help to market your anti-deepfake product, schedule a free meeting with me at https://bredemarket.com/mark/.

Communicate with the Words of Authority

Biometric marketing leaders, do your firm’s product marketing publications require the words of authority?

John E. Bredehoft of Bredemarket, the biometric product marketing expert.

Can John E. Bredehoft of Bredemarket—the biometric product marketing expert—contribute words of authority to your content, proposal, and analysis materials?

I offer:

  • 30 years of biometric experience, 10 years of product marketing expertise, and complementary proposal and product management talents.
  • Success with numerous biometric firms, including Incode, IDEMIA, MorphoTrak, Motorola, Printrak, and over a dozen biometric consulting clients.
  • Mastery of multiple biometric modalities: friction ridge (fingerprint, palm print), face, iris, voice, DNA.
  • Compelling CONTENT creation: blog posts, case studies and testimonials, LinkedIn articles and posts, white papers.
  • Winning PROPOSAL development: managing, writing, editing for millions of dollars of business for my firms.
  • Actionable ANALYSIS: strategic, market, product, competitive.

To embed Bredemarket’s biometric product marketing expertise within your firm, schedule a free meeting with me.

Make an impact.

In the PLoS One Voice Deepfake Detection Test, the Key Word is “Participants”

(Part of the biometric product marketing expert series)

A recent PYMNTS article entitled “AI Voices Are Now Indistinguishable From Humans, Experts Say” includes the following about voice deepfakes:

“A new PLoS One study found that artificial intelligence has reached a point where cloned voices are indistinguishable from genuine ones. In the experiment, participants were asked to tell human voices from AI-generated ones across 80 samples. Cloned voices were mistaken for real in 58% of cases, while human voices were correctly identified only 62% of the time.”

What the study didn’t measure

Since you already read the title of this post, you know that I’m concentrating on the word “participants.”

The PLoS One experiment used PEOPLE to try to distinguish real voices from deepfake ones.

And people aren’t all that accurate. Never have been.
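To see just how inaccurate, here’s a back-of-envelope calculation using the rates quoted above. Note one assumption the article doesn’t spell out: I’m assuming the 80 samples split evenly into 40 AI-generated and 40 human voices.

```python
# Back-of-envelope illustration of the PLoS One detection rates quoted above.
# ASSUMPTION (not stated in the article): an even 40/40 split of samples.
ai_samples, human_samples = 40, 40
ai_mistaken_for_real = 0.58        # cloned voices judged "human" (i.e., missed)
human_correctly_identified = 0.62  # real voices correctly judged "human"

ai_correct = ai_samples * (1 - ai_mistaken_for_real)      # cloned voices caught
human_correct = human_samples * human_correctly_identified

overall_accuracy = (ai_correct + human_correct) / (ai_samples + human_samples)
print(f"Expected overall accuracy: {overall_accuracy:.0%}")  # prints 52%
```

Under that assumption, participants land at about 52% overall: barely better than flipping a coin.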

Picture from Google Gemini.

Before you decide that people can’t detect fake voices…

…why not have an ALGORITHM give it a try?

What the study did measure

But to be fair, that wasn’t the goal of the PLoS One study, which specifically focused on human perception.

“Recently, an intriguing effect was reported in AI-generated faces, where such face images were perceived as more human than images of real humans – a “hyperrealism effect.” Here, we tested whether a “hyperrealism effect” also exists for AI-generated voices.”

For the record, the researchers did NOT discover a hyperrealism effect in AI-generated voices.

Do you offer a solution?

But if future deepfake voices sound realer than real, then we will REALLY need the algorithms to spot the fakes.

And if your company has a voice deepfake detection solution, I could have talked about it right now in this post.

Or on your website.

Or on your social media.

Where your prospects can see it…and purchase it.

And money in your pocket is realer than real.

Let’s talk. https://bredemarket.com/mark/

Picture from Google Gemini.

Some Voice Deepfakes Are NOT Fraudulent

(Part of the biometric product marketing expert series)

I’ve spent a ton of time discussing naughty people who use technology to create deepfakes—including voice deepfakes—to defraud people.

But some deepfakes don’t use technology, and some deepfakes are not intended to defraud.

Take Mark Hamill’s impersonation of fellow actor Harrison Ford.

Mark Hamill as Harrison Ford, and Harrison Ford reacting to Mark Hamill.

And then there was a case that I guess could be classified as fraud…at least to Don Pardo’s sister-in-law.

Don Pardo was originally known as an announcer on NBC game shows, and his distinctive voice could be heard on many of them, including (non-embeddable) parodies of them.

With his well-known voice, NBC jumped at the chance to employ him as the announcer for the decidedly non-game television show Saturday Night Live, where he traded dialogue with the likes of Frank Zappa.

“I’m the Slime.”

Except for a brief period after he ran afoul of Michael O’Donoghue, Pardo was a fixture on SNL for decades, through the reigns of various producers and executive producers.

Until one night in 1999 when laryngitis got the best of Don Pardo, and the show had to turn to Bill Clinton.

No, not the real Bill Clinton.

I’m talking about the SNL cast member who did a voice impression of Bill Clinton (and Jeopardy loser Sean Connery), Darrell Hammond. Who proceeded to perform an impression of Don Pardo.

An impression that even fooled Don Pardo’s sister-in-law.

This is Don Pardo saying this is Don Pardo…

Pardo continued to be Saturday Night Live’s announcer for years after that, sometimes live from New York, sometimes on tape from his home in Arizona.

And when Pardo passed away in 2014, he was succeeded as SNL’s announcer by former cast member Darrell Hammond.

Who used his own voice.

The Best Deepfake Defense is NOT Technological

I think about deepfakes a lot. As the identity/biometric product marketing consultant at Bredemarket, it comes with the territory.

When I’m not researching how fraudsters perpetrate deepfake faces, deepfake voices, and other deepfake modalities via presentation attacks and injection attacks—countered by presentation attack detection (liveness detection) and injection attack detection—

…I’m researching and describing how Bredemarket’s clients and prospects develop innovative technologies to expose these deepfake fraudsters.

You can spend good money on deepfake-fighting industry solutions, and you can often realize a positive return on investment when purchasing these technologies.

But the best defense against these deepfakes isn’t some whiz-bang technology.

It’s common sense.

  • Would your CEO really call you at midnight to expedite an urgent financial transaction?
  • Would that Amazon recruiter want to schedule a Zoom call right now?

If you receive an out-of-the-ordinary request, the first and most important thing to do is to take a deep breath.

A real CEO or recruiter would understand.

And…

…if your company offers a fraud-fighting solution to detect and defeat deepfakes, Bredemarket can help you market your solution. My content, proposal, and analysis offerings are at your service. Let’s talk: https://bredemarket.com/cpa/

CPA (Imagen 4)

Frictionless Friction Ridges and Other Biometric Modalities

I wanted to write a list of the biometric modalities in which I have experience.

So I started my usual list from memory: fingerprint, face, iris, voice, and DNA.

Then I stopped myself.

My experience with skin goes way beyond fingerprints, since I’ve spent over two decades working with palm prints.

(Can you say “Cambridgeshire method”? I knew you could. It was a 1990s method to use the 10 standard rolled fingerprint boxes to input palm prints into an automated fingerprint identification system. Because Cambridgeshire had a bias to action and didn’t want to wait for the standards folks to figure out how to enter palm prints. But I digress.)

So instead of saying fingerprints, I thought about saying friction ridges.

But there are two problems with this.

First, many people don’t know what “friction ridges” are. They’re the ridges that form on a person’s fingers, palms, toes, and feet, all of which can conceivably identify individuals.

But there’s a second problem. The word “friction” has two meanings: the one mentioned above, and a meaning that describes how biometric data is captured.

No, there is not a friction method to capture faces.
From https://www.youtube.com/watch?v=4XhWFHKWCSE.


  • If you have to do something to provide your biometric data, such as press your fingers against a platen, that’s friction.
  • If you don’t have to do anything other than wave your fingers, hold your fingers in the air, or show your face as you stand near or walk by a camera, that’s frictionless.

More and more people capture friction ridges with frictionless methods. I did this years ago using MorphoWAVE at MorphoTrak facilities, and I did it today at Whole Foods Market.

So I could list my biometric modalities as friction ridge (fingerprint and palm print via both friction and frictionless capture methods), face, iris, voice, and DNA.

But I won’t.

Anyway, if you need content, proposal, or analysis assistance with any of these modalities, Bredemarket can help you. Book a meeting at https://bredemarket.com/cpa/

The “Biometric Digital Identity Deepfake and Synthetic Identity Prism Report” is Coming

As you may have noticed, I have talked about both deepfakes and synthetic identity ad nauseam.

But perhaps you would prefer to hear from someone who knows what they’re talking about.

On a webcast this morning, C. Maxine Most of The Prism Project reminded us that the “Biometric Digital Identity Deepfake and Synthetic Identity Prism Report” is scheduled for publication in May 2025, just a little over a month from now.

As with all other Prism Project publications, I expect a report that details the identity industry’s solutions to battle deepfakes and synthetic identities, and the vendors who provide them.

And the report is coming from one of the few industry researchers who knows the industry. Max doesn’t write synthetic identity reports one week and refrigerator reports the next, if you know what I mean.

At this point The Prism Project is soliciting sponsorships. Quality work doesn’t come for free, you know. If your company is interested in sponsoring the report, visit this link.

While waiting for Max, here are the Five Tops

And while you’re waiting for Max’s authoritative report on deepfakes and synthetic identity, you may want to take a look at Min’s (my) views, such as they are. Here are my current “five tops” posts on deepfakes and synthetic identity.