Stefan Gladbach’s “A PMM Christmas”

And the Oscar goes to…

Well, probably not. But I enjoyed contributing to Stefan Gladbach’s Christmas video “A PMM Christmas” as the only biometric product marketing expert in the cast.

And if you heard me mutter in the last few weeks that attribution is a myth, now you know why.

As you can see, Gladbach assembled an all-star cast. Credits at the end of the video, and also in the text of Stefan’s LinkedIn post.

Well, one additional credit: Susan Bredehoft was the camerawoman for my contributions. For lighting and background removal purposes, my scenes were taped outside in our back yard. Since my glasses lenses automatically adjust to sunlight, I can, um, attribute my Roy Orbison look to that.

And I did not follow instructions to wear an ugly Christmas sweater for the end credits…because I haven’t got one. (Ugly sweater, yes. Ugly Christmas sweater, no.) I should have stolen one from Talya.

And for those keeping score (only me, to be honest), I appear at 2:15, 4:40, 5:40, and 8:05.

And now I’m wondering if Roy Orbison ever covered a Smiths song. But again, that’s just me.

Merry Christmas.

Step Into Christmas: Deepfake?

Deepfakes are not a 21st century invention. Take this video of “Step Into Christmas.”

But here are the musician credits.

Elton: Piano and vocals

Davey Johnstone: Guitars and backing vocals

Dee Murray: Bass guitar and backing vocals

Nigel Olsson: Drums and backing vocals

Ray Cooper: Percussion

Kiki Dee: Backing vocals (uncredited)

Jo Partridge: Backing vocals (uncredited)

Roger Pope: Tambourine (uncredited)

David Hentschel: ARP 2500 synthesizer (uncredited)

The video doesn’t match this list. According to the video, Elton played more than the piano, and Bernie Taupin performed on the track.

So while we didn’t use the term “deepfake” in 1973, this promotional video meets at least some of the criteria of a deepfake.

And before you protest that everybody knew that Elton John didn’t play guitar…undoubtedly some people saw this video and believed that Elton was a guitarist. After all, they saw it with their own eyes.

Sounds like fraud to me!

Remember this when you watch things.

Privacy: What Happens When You Data Scrape FROM the Identity Vendors?

There is a lot of discussion about data scraping, an activity in which Company 1 takes the information publicly posted by Company 2 and incorporates it into its own records.

In the identity world, this takes the form of a company “scraping” the facial images that were publicly posted by a second company, such as a social media company.
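In code terms, this kind of scraping can be surprisingly simple: fetch a publicly accessible page, then pull out the image URLs. Here is a minimal, purely illustrative sketch using only the Python standard library; the HTML snippet and the example.com URLs are made up, and a real scraper would fetch the page over HTTP (e.g., with urllib.request) rather than using a hard-coded string.

```python
from html.parser import HTMLParser

class ImageScraper(HTMLParser):
    """Collects the src attribute of every <img> tag on a page."""
    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src":
                    self.image_urls.append(value)

# Hypothetical profile-page markup standing in for a fetched page.
sample_html = """
<html><body>
  <img src="https://example.com/photos/face1.jpg" alt="profile photo">
  <img src="https://example.com/photos/face2.jpg" alt="tagged photo">
</body></html>
"""

scraper = ImageScraper()
scraper.feed(sample_html)
print(scraper.image_urls)
```

The ease of writing something like this is exactly why the legal and ethical questions, rather than the technical ones, dominate the scraping debate.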

I think that we all know of one identity company that is well-known (a euphemism for “notorious”) for scraping facial images from multiple sources. These not only include government-posted mugshots, but also content posted by private social media firms.

Needless to say, the social media companies think that data scraping is completely evil and terrible and that identity vendors that do this should be fined and put out of business. The identity vendor in question has a different view, even stating at one point that it had a (U.S.) First Amendment right to scrape data.

But what happens when someone wants to scrape data FROM an identity company?

A Skagit County court case

404 Media links to a Skagit County, Washington court case that addresses this very issue: in this case, data captured by Flock Safety.

The case is CITY OF SEDRO-WOOLLEY and CITY OF STANWOOD, Washington Municipal Corporations vs. JOSE RODRIGUEZ. The following are findings of fact:

“On April 10, 2025, Defendant, Jose Rodriguez made a Public Records Request to the Snohomish Police Department. He requested all of the city’s Flock cameras pictures and data logs between 5 pm and 6 pm on March 30, 2025.”

This particular record does not indicate WHY Rodriguez made this request, but 404 Media provided a clarification from Rodriguez himself.

“I wanted the records to see if they would release them to me, in hopes that if they were public records it would raise awareness to all the communities that have the Flock cameras that they may be public record and could be used by stalkers, or burglars scoping out a house, or other ways someone with bad intentions may use them. My goal was to try getting these cameras taken down by the cities that put them up.”

The City of Stanwood (don’t know its relation to Snohomish) answered Rodriguez in part:

“Stanwood PD is not the holder of the records you’re seeking; you may be able to request the records at FlockSafety.com.”

Incidentally, this is a common issue with identity databases using vendor software: who owns the data? I’ve addressed this before regarding the Milwaukee Police Department.

Now some legal talent may be able to parse what the word “holder” means, especially in regard to data hosted in the cloud. Perhaps Stanwood PD was trying to claim that since the records weren’t on site, it wasn’t the “holder.”

Anyway, the defendant subsequently made a similar request to the City of Sedro-Woolley, but for a different date. Sedro-Woolley didn’t provide the images either.

Then it gets weird.

What happened to the data?

“The Flock records sought by Defendant from Stanwood and Sedro-Woolley have been auto-deleted.”

Well how convenient.

And the listed statements of fact also contain the following:

“The contract between Flock and Stanwood s[t]ates that all Flock images generated off Flock cameras located in Stanwood are the property of Stanwood.

“The contract between Flock and Sedro-Woolley states that all Flock images generated off Flock cameras located in Sedro-Woolley are the property of Sedro-Woolley.”

The judge’s ruling

Fast forward to November 6, when Judge Elizabeth Neidzwski ruled on the cities’ claim that the Flock camera data was not a public record.

“IT IS HEREBY ORDERED, ADJUDGED AND DECREED that Plaintiff’s motion for Declaratory Judgment that the Flock camera records are not public records is DENIED.”

404 Media noted that the cities argued that they resisted the request to…protect privacy.

“In affidavits filed with the court, police argued that ‘if the public could access the Flock Safety System by making Public Records Act requests, it would allow nefarious actors the ability to track private persons and undermine the effectiveness of the system.’ The judge rejected every single one of these arguments.”

Of course, there are those who argue that the police themselves are the “nefarious actors,” and that they shouldn’t be allowed to track private persons either.

But the parties may take the opposite argument

This is not the only example of conflicting claims over WHO has the right to privacy. In fact, if the police filmed protestors and agitators and wanted the public’s help in identifying them, the two sides would take the opposite positions on the privacy issue: the police saying the footage SHOULD be released, and the filmed protestors saying it SHOULD NOT.

Privacy is in the eye of the beholder.

Introducing Bredemarket’s Services, Process, and Pricing All Over the Series of Tubes

I confess that I love my promotional videos. After all, someone has to.

If you haven’t figured it out yet, my current super-sweet saccharine crush is “Bredemarket: Services, Process, and Pricing,” originally shared here on the Bredemarket blog last Wednesday.


Bredemarket: Services, Process, and Pricing.

But I’m forced to admit that there are billions of people who never read the Bredemarket blog, and therefore will never see this post or the original one. Their loss. Thank you to those of you who do stop by; it’s appreciated.

But I can catch a few of them by sharing my video on other social platforms.

If you want to lose 15 minutes of your life, redundantly watch all of them.

So here’s my ask, if you are so inclined. Share this video with your friends on one of the platforms to help me get the word out about how Bredemarket can help technology marketing leaders…um, get the word out.

Or share the direct WordPress video link instead: https://bredemarket.com/wp-content/uploads/2025/11/cftm-serviceprocesspricing-2511a.mp4. Helps with the analytics.

And even if you’d prefer not to share, thank you for watching.

Postscript for those of you unfamiliar with Ted Stevens’ phrase “series of tubes”: watch this video. https://youtu.be/mHpA4dkP1j8?si=gm-3StEgFVBZTyM7

Video Iterations, 10/19/2025 Edition

The perfect is the enemy of the good, and I proved that today by creating a video…and then another one…and then another one.

I planned to write on GoFundMe “helper” scammers, ways to detect scammers, and ways to flush out scammers via a honeypot: a post prominently featuring the word “GoFundMe.”

So I created a video.

Version One. 89 seconds.

After posting that video I decided it was too long and created a shorter version.

Version Two. 44 seconds.

You’ve never seen this before…because just before I was going to post that video I decided it was too long and edited it further.

Version Three. 30 seconds.

I went ahead and posted that third version, leaving the first version active.

And for all I know I will create a fourth version.

But the first and third versions are out there, secret salesperson-ing for me. Now.

And I don’t know whether the first or third video is better. My intuition tells me the third one is better, but maybe the prospects will prefer the first version. Or the second one, which almost never saw the light of day.

Which one do you prefer? Tell me in the comments.

The unavoidable call to action

You know, all this iterating teaches us a lot about B2B sales.

I know some marketing leaders who are afraid to post anything, waiting for the perfect moment.

They’re still waiting.

Don’t let your competitors steal your prospects from you while you delay. Bredemarket can help. Book a free meeting with me: https://bredemarket.com/mark/

Stop losing prospects!

Grok, Celebrities, and Music

As some of you know, my generative AI tool of choice has been Google Gemini, which incorporates guardrails against portraying celebrities. Grok has fewer guardrails.

My main purpose in creating the two Bill and Hillary Clinton videos (at the beginning of this compilation reel) was to see how Grok would handle references to copyrighted music. I didn’t expect to hear actual songs, but would Grok try to approximate the sounds of Lindsey-Stevie-Christine era Mac and the Sex Pistols? You be the judge.

And as for Prince and Johnny…you be the judge of that also.

AI created by Grok.

Removing the Guardrails: President Taylor Swift, Courtesy Grok

Most of my recent generative AI experiments have centered on Google Gemini…which has its limitations:

“Google Gemini imposes severe restrictions against creating pictures of famous figures. You can’t create a picture of President Taylor Swift, for example.”

Why does Google impose such limits? Because it is very sensitive to misleading the public, fearful that the average person would see such a picture and mistakenly assume that Taylor Swift IS the President. In our litigious society, perhaps this is valid.

But we know that other generative AI services don’t have such restrictions.

“One common accusation about Grok is that it lacks the guardrails that other AI services have.”

During a few spare moments this morning, I signed up for a Bredemarket Grok account. I have a personal X (Twitter) account, but haven’t used it in a long time, so this was a fresh sign-up.

And you know the first thing that I tried to do.


Grok.

Grok created it with no problem. Actually, there is a problem, because Grok apparently is not a large multimodal model and cannot precisely generate text in its image generator. But hey, no one will notice “TWIRSHIITE BOUSE,” will they?

But wait, there’s more! After I generated the image, I saw a button to generate a video. I thought that this required the paid service, but apparently the free service allows limited video generation.

Grok.

I may be conducting some video experiments some time soon. But will I maintain my ethics…and my sanity?

A Californian, an Illinoisan, and a Dane Walk Into a Videoconference

I was recently talking with a former colleague, whose name I am not at liberty to reveal, and they posed a question that stymied me.

What happens when multiple people join a videoconference, and they all reside in jurisdictions with different privacy regulations?

An example will illustrate what would happen, and I volunteer to be the evil party in this one.

The videoconference

Let’s say:

On a particular day in April 2026, a Californian (let’s call him Cali John) launches a videoconference on Zoom.

Imagen 4.

The Californian invites an Illinoisan (Abe).

Imagen 4.

And also invites a Dane (Freja).

Imagen 4.

And then—here’s the evil part—records and gathers images from the videoconference without letting the other two know.

The legal violations

The Illinois Biometric Information Privacy Act, or BIPA, requires written consent before acquiring Abe’s facial geometry. And if Cali John doesn’t obtain that written consent, he could lose a lot of money.

And what about Freja? Well, if the Danish Copyright Act takes effect on March 31, 2026 as expected, Cali John can get into a ton of trouble if he uses the video to create a realistic, digitally generated imitation of Freja. Again, consent is required. Again, there can be monetary penalties if you don’t get that consent.

But there’s another question we have to consider.

The vendor responsibility 

Does the videoconference provider bear any responsibility for the violations of Illinois and Danish law?

Since I used Zoom as my example, I looked at Zoom’s Terms of Service.

TL;DR: not our problem, that’s YOUR problem.

“5. USE OF SERVICES AND YOUR RESPONSIBILITIES. You may only use the Services pursuant to the terms of this Agreement. You are solely responsible for Your and Your End Users’ use of the Services and shall abide by, and ensure compliance with, all Laws in connection with Your and each End User’s use of the Services, including but not limited to Laws related to recording, intellectual property, privacy and export control. Use of the Services is void where prohibited.”

But such requirements haven’t stopped BIPA lawyers from filing lawsuits against deep-pocketed software vendors. Remember when Facebook settled for $650 million?

So remember what could happen the next time you participate in a multinational, multi-state, or even multi-city videoconference. Hope your AI note taker isn’t capturing screenshots.

Huff + Puff, the Magic Camera Hardware Failure Correction

It was 8:48, just before an important client meeting this morning, and I was freaking out. I had scheduled the meeting in Google Meet, and I started up the session…and the right third of the camera view was obscured.

Imagen 4 re-creation. I didn’t think to take a screenshot at the time. And no, I don’t have facial hair.

I attempted various fixes:

  • I stopped Google Meet, started it again…and got the same result.
  • I logged off and logged back in again…and got the same result.
  • I restarted my computer (turn it off and turn it back on again)…and got the same result.
  • I tried Zoom…and got the same result.

Which suggested a hardware problem with the camera itself. Which meant a lot of hassle sending the computer in for repair, especially upsetting because this was a new computer.

Bredebot proves useful

So I turned to my buddy Bredebot.

In a huddle space in an office, a smiling robot named Bredebot places his robotic arms on a wildebeest and a wombat, encouraging them to collaborate on a product marketing initiative.
Bredebot is the one in the middle.

And he wasn’t reassuring:

A black section in a laptop camera feed is most often due to a hardware issue, such as a damaged camera sensor or a problem with the ribbon cable that connects the camera to the motherboard. Software issues are less likely to cause a precise, consistent black area like this, but they’re still worth checking.

Then I began working down the checklist that Bredebot provided, beginning with the first item.

The most common and easiest issue to rule out is a physical object blocking the lens. This could be a speck of dust or debris, a stray piece of a sticker, or a misplaced privacy slider. Even a tiny particle on the lens can show up as a large black spot or area in the image.

A speck of dust? Just a simple speck of dust causing that major of an obstruction?

Not having a can of compressed air available, I used my mouth to blow on the top of the laptop screen.

The obstruction partially cleared, and now three-quarters of the camera view was visible.

One more blow, and my “critical hardware failure” was fixed.

What does this mean?

So some computer problems are NOT fixed by turning it off and turning it on again. Sometimes a lot of hot air is necessary.

Imagen 4.

By sheer coincidence, the Just A Band song “Huff + Puff” is on my current Spotify playlist. Nothing to do with computer video hardware, but it’s a good song.