Privacy: What Happens When You Data Scrape FROM the Identity Vendors?

There is a lot of discussion about data scraping, an activity in which Company 1 takes the information publicly posted by Company 2 and incorporates it into its own records.

In the identity world, this takes the form of a company “scraping” the facial images that were publicly posted by a second company, such as a social media company.

I think that we all know of one identity company that is well-known (a euphemism for “notorious”) for scraping facial images from multiple sources. These sources include not only government-posted mugshots, but also content posted on private social media platforms.

Needless to say, the social media companies think that data scraping is completely evil and terrible, and that identity vendors that do this should be fined and put out of business. The identity vendor in question has a different view, even stating at one point that it had a (U.S.) First Amendment right to scrape data.

But what happens when someone wants to scrape data FROM an identity company?

A Skagit County court case

404 Media links to a Skagit County, Washington court case that addresses this very issue: in this case, data captured by Flock Safety.

The case is CITY OF SEDRO-WOOLLEY and CITY OF STANWOOD, Washington Municipal Corporations vs. JOSE RODRIGUEZ. The following are findings of fact:

“On April 10, 2025, Defendant, Jose Rodriguez made a Public Records Request to the Snohomish Police Department. He requested all of the city’s Flock cameras pictures and data logs between 5 pm and 6 pm on March 30, 2025.”

This particular record does not indicate WHY Rodriguez made this request, but 404 Media provided a clarification from Rodriguez himself.

“I wanted the records to see if they would release them to me, in hopes that if they were public records it would raise awareness to all the communities that have the Flock cameras that they may be public record and could be used by stalkers, or burglars scoping out a house, or other ways someone with bad intentions may use them. My goal was to try getting these cameras taken down by the cities that put them up.”

The City of Stanwood (I don’t know its relation to Snohomish) answered Rodriguez in part:

“Stanwood PD is not the holder of the records you’re seeking; you may be able to request the records at FlockSafety.com.”

Incidentally, this is a common issue with identity databases using vendor software: who owns the data? I’ve addressed this before regarding the Milwaukee Police Department.

Now some legal talent may be able to parse what the word “holder” means, especially in regard to data hosted in the cloud. Perhaps Stanwood PD was trying to claim that since the records weren’t on site, it wasn’t the “holder.”

Anyway, the defendant subsequently made a similar request to the City of Sedro-Woolley, but for a different date. Sedro-Woolley didn’t provide the images either.

Then it gets weird.

What happened to the data?

“The Flock records sought by Defendant from Stanwood and Sedro-Woolley have been auto-deleted.”

Well how convenient.

And the listed statements of fact also contain the following:

“The contract between Flock and Stanwood states that all Flock images generated off Flock cameras located in Stanwood are the property of Stanwood.

“The contract between Flock and Sedro-Woolley states that all Flock images generated off Flock cameras located in Sedro-Woolley are the property of Sedro-Woolley.”

The judge’s ruling

Fast forward to November 6, when Judge Elizabeth Neidzwski ruled on the cities’ claim that the Flock camera data was not a public record.

“IT IS HEREBY ORDERED, ADJUDGED AND DECREED that Plaintiff’s motion for Declaratory Judgment that the Flock camera records are not public records is DENIED.”

404 Media noted that the cities argued that they resisted the request to…protect privacy.

“In affidavits filed with the court, police argued that ‘if the public could access the Flock Safety System by making Public Records Act requests, it would allow nefarious actors the ability to track private persons and undermine the effectiveness of the system.’ The judge rejected every single one of these arguments.”

Of course, there are those who argue that the police themselves are the “nefarious actors,” and that they shouldn’t be allowed to track private persons either.

But the parties may take the opposite argument

This is not the only example of conflicting claims over WHO has the right to privacy. In fact, if the police were filming protestors and agitators and wanted the public’s help in identifying them, the police and the protestors would take the opposite positions on the privacy issue: the police saying the footage SHOULD be released, and the protestors who were filmed saying it SHOULD NOT.

Privacy is in the eye of the beholder.

A Fingerprint Identification Story: Bobby Driscoll

In early 1968, two boys found a dead body in New York’s East Village. There was no identification on the man, and no one in the neighborhood knew him. He was fingerprinted and buried in a mass grave, identified by the NYPD nearly two years later.

Potter’s Field monument, Hart Island. From Wikipedia. CC BY-SA 4.0.

In the 1960s, fingerprint identification of deceased persons—a laborious process in those days—often happened because the deceased had a criminal record.

And Bobby Driscoll did

His first arrest was in 1956, but he was not convicted of any crime until 1961.

“On May 1, 1961, he was arrested for attempting to cash a check that had been stolen from a liquor store the previous January, and at the same time was also charged with driving under the influence of drugs. He pled guilty to both charges and was sentenced to six months of treatment for drug addiction at the California Institute for Men at Chino.”

Driscoll reportedly cleaned up (his drug of choice was heroin), went east to New York City, and even achieved some fame.

“[H]e purportedly settled into Andy Warhol’s Greenwich Village art community known as “The Factory.” During this time, he also participated in an underground film entitled Dirt, directed by avant-garde filmmaker Piero Heliczer.”

But this was not Driscoll’s first film. He had been in a few films earlier in life.

From Wikipedia. Fair use in this form.

Here he is (in the upper right corner) playing Johnny in the Disney movie Song of the South.

From Wikipedia. Public domain.

And he provided the voice for the lead character in the later Disney movie Peter Pan.

Yes, Bobby Driscoll was a child star for Disney and other studios before appearing in Dirt.

But right after Driscoll’s voice became famous in Peter Pan, Disney declined to renew his contract. The reason? Acne…and the fact that he wasn’t a cute kid any more.

AI generated by Grok.

This led to his tailspin, which eventually led to his fingerprinting.

And his positive identification after his death.

Is Ancestral Supplements a Drug?

What is a drug? Here’s what the U.S. Food and Drug Administration said to Ancestral Supplements in April 2025.

“This letter is to advise you that the U.S. Food and Drug Administration (FDA) reviewed your website at http://ancestralsupplements.com in March 2025 and has found that you take orders there for Ancestral Grassfed Beef Thyroid. Various claims and statements made on your website and/or in other labeling establish that this product is a drug as defined in 21 U.S.C. § 321(g)(1)(B) because it is intended for the treatment, cure, mitigation, or prevention of disease.  For example, your website recommends or suggests the use of Ancestral Grassfed Beef Thyroid to treat or prevent hypothyroidism and Grave’s disease.  As explained further below, the introduction of this product into interstate commerce for such uses violates the Federal Food, Drug, and Cosmetic Act.”

https://www.fda.gov/inspections-compliance-enforcement-and-criminal-investigations/ancestral-supplements-llc

California AB 566 Web Opt-Out Preference Signal (the California Opt Me Out Act)

A new bill has been enrolled in California, where I live. But how will this affect web browser developers outside of California?

The bill is the California Opt Me Out Act, AB 566. The text of Section 2 of the bill is found at the end of this post. But the two major parts of the bill are as follows:

Google Gemini.
  • Starting in 2027, businesses that create web browsers, regardless of their location, must include “functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses.”
  • Web browser developers that do this “shall not be liable for a violation of this title by a business that receives the opt-out preference signal.”

The bill doesn’t get any more specific than that; the California Privacy Protection Agency will work out the details.
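AB 566 doesn’t define the signal’s wire format, but the existing Global Privacy Control (GPC) convention, already recognized under California privacy regulations, is a likely candidate: a compliant browser sends a `Sec-GPC: 1` request header with each request. As a minimal sketch (assuming the GPC header convention, which AB 566 itself does not mandate), here’s how a business’s server might detect such a signal:

```python
def consumer_opted_out(headers: dict[str, str]) -> bool:
    """Detect an opt-out preference signal on an incoming request.

    Assumes the Global Privacy Control convention (a "Sec-GPC: 1"
    request header); AB 566 leaves the exact signal format to the
    California Privacy Protection Agency's regulations.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# A request from a browser whose user has enabled the opt-out setting
opted_out = consumer_opted_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"})
print(opted_out)  # True: treat this consumer as opted out of sale/sharing
```

Browsers that implement GPC also expose the preference to page scripts via `navigator.globalPrivacyControl`, so the same check can be made client-side.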

The part of interest, of course, is what happens to businesses that develop web browsers WITHOUT the opt-out functionality. What happens to those non-compliant businesses? What is the liability? Is it civil? Criminal? If Safari doesn’t include easy-to-use opt-out functionality, will Tim Cook do time?

This is yet another example of the debate that occurs when one country, or one state, or one county/city enacts a law and expects the rest of the world to comply. In this particular case, the state of California is telling every web browser developer in the entire world how to configure their browsers. The developers have several choices:

  • Comply with California law, while simultaneously complying with laws from all other jurisdictions regarding opt-out, including a theoretical business-friendly jurisdiction that prohibits opt-out entirely.
  • Ignore the California law and see what the California Privacy Protection Agency does, or tries to do. Is Yandex, the Russian developer of the Yandex browser, really going to care about California law?
Google Gemini.
  • Contest the law in court, arguing that it violates the U.S. First Amendment, the U.S. Second Amendment, or whatever.

The ball is now in the hands of the CPPA, which needs to develop the regulations to implement the law, as well as develop the penalties for non-compliant businesses.

Here is the exact text of Section 2.

SEC. 2.

Section 1798.136 is added to the Civil Code, to read:

1798.136.

 (a) (1) A business shall not develop or maintain a browser that does not include functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses with which the consumer interacts through the browser.

(2) The functionality required by paragraph (1) shall be easy for a reasonable person to locate and configure.

(b) A business that develops or maintains a browser shall make clear to a consumer in its public disclosures how the opt-out preference signal works and the intended effect of the opt-out preference signal.

(c) The California Privacy Protection Agency may adopt regulations as necessary to implement and administer this section.

(d) A business that develops or maintains a browser that includes a functionality that enables the browser to send an opt-out preference signal pursuant to this section shall not be liable for a violation of this title by a business that receives the opt-out preference signal.

(e) As used in this section:

(1) “Browser” means an interactive software application that is used by consumers to locate, access, and navigate internet websites.

(2) “Opt-out preference signal” means a signal that complies with this title and that communicates the consumer’s choice to opt out of the sale and sharing of the consumer’s personal information.

(f) This section shall become operative on January 1, 2027.

Jane Says…Nothing

Remember Jane, my Instagram AI influencer?

Well, I received this notification on Instagram:

“Your AI JaneCPAInfluencer is now private because it goes against our AI Studio policies. Please edit it and submit again.”

Naturally I wondered what the violation was. I was directed to the policies at https://aistudio.instagram.com/policies/.

Which part of the policy does Jane violate? That’s a secret…yet another example of “you violated our terms, but we won’t tell you the specifics; YOU figure it out.”

So, since I can still access Jane myself, I asked her. AI is supposed to help you, after all.

“What portion of the Meta AI Studio Policies do you violate, Jane?”

Her response:

“I can’t respond because one or more of my details goes against the AI Studio policies.”

That answer caused me to wonder if Jane would respond to anything.

“Who is Bredemarket?”

“I can’t respond because one or more of my details goes against the AI Studio policies.”

So is it critically important that I spend a lot of time figuring out what the violation is? Um…no.

But I’m curious how this interaction will affect the ads that Meta will present to me later this year.

Is Illinois’ Biometric Information Privacy Act (BIPA) Nullified in Concert Venues?

Illinois music lovers, wanna see a concert? Sounds like you may have to surrender your BIPA protections. 

Specifically, if the concert venue uses Ticketmaster (who doesn’t?), and if the concert venue captures your biometric data without your consent, you may not have legal recourse.

“These Terms of Use (“Terms”) govern your use of Live Nation and Ticketmaster’s websites and applications…

“The Terms contain an arbitration agreement and class action waiver—along with some limited exceptions—in Section 14, below. Specifically, you and we agree that any dispute or claim relating in any way to the Terms, your use of the Marketplace, or products or services sold, distributed, issued, or serviced by us or through us, will be resolved by binding arbitration, rather than in court…

“By agreeing to arbitration, you and we each waive any right to participate in a class action lawsuit or class action arbitration, except those already filed and currently pending as of August 12, 2025.”

See https://legal.ticketmaster.com/terms-of-use/

A Californian, an Illinoisan, and a Dane Walk Into a Videoconference

I was recently talking with a former colleague, whose name I am not at liberty to reveal, and they posed a question that stymied me.

What happens when multiple people join a videoconference, and they all reside in jurisdictions with different privacy regulations?

An example will illustrate what would happen, and I volunteer to be the evil party in this one.

The videoconference

Let’s say:

On a particular day in April 2026, a Californian launches a videoconference on Zoom.

Imagen 4.

The Californian invites an Illinoisan.

Imagen 4.

And also invites a Dane.

Imagen 4.

And then—here’s the evil part—records and gathers images from the videoconference without letting the other two know.

The legal violations

The Illinois Biometric Information Privacy Act, or BIPA, requires written consent before acquiring the Illinoisan’s (let’s call him Abe) facial geometry. And if the Californian (Cali John) doesn’t obtain that written consent, he could lose a lot of money.

And what about Freja? Well, if the Danish Copyright Act takes effect on March 31, 2026 as expected, Cali John can get into a ton of trouble if he uses the video to create a realistic, digitally generated imitation of Freja. Again, consent is required. Again, there can be monetary penalties if you don’t get that consent.

But there’s another question we have to consider.

The vendor responsibility 

Does the videoconference provider bear any responsibility for the violations of Illinois and Danish law?

Since I used Zoom as my example, I looked at Zoom’s Terms of Service.

TL;DR: not our problem, that’s YOUR problem.

“5. USE OF SERVICES AND YOUR RESPONSIBILITIES. You may only use the Services pursuant to the terms of this Agreement. You are solely responsible for Your and Your End Users’ use of the Services and shall abide by, and ensure compliance with, all Laws in connection with Your and each End User’s use of the Services, including but not limited to Laws related to recording, intellectual property, privacy and export control. Use of the Services is void where prohibited.”

But such requirements haven’t stopped BIPA lawyers from filing lawsuits against deep-pocketed software vendors. Remember when Facebook settled for $650 million?

So remember what could happen the next time you participate in a multinational, multi-state, or even multi-city videoconference. Hope your AI note taker isn’t capturing screen shots.

Strategy is not Tactics

I’ve said that strategy is one of four essential elements of product marketing. But you have to know what strategy is…and what it is not.

To illustrate the difference between strategy and tactics, it helps to differentiate between abstract, long term goals and concrete, short term goals.

If your goal is to better the world, that’s a strategy.

If your goal is to excel in a particular industry, that’s a strategy.

Although strategies can change. Those who know of Nokia as a telecommunications company, and those who remember Nokia as a phone supplier, are not old enough to remember Nokia’s beginnings as a pulp mill in 1865.

If your goal is to secure business from a specific prospect, that’s a tactic. Or it should be.

Fleming Companies secured a 10-year contract in 2001 as the main supplier of groceries to Kmart, accounting for 20% of Fleming’s revenue. Kmart cancelled that contract when it declared bankruptcy a year later. Fleming filed a $1.4 billion claim in Kmart’s bankruptcy case…but only got $385 million. Fleming itself ended up in bankruptcy court in 2003.

But Fleming’s strategy was to excel at food wholesaling through acquisition and innovation.

It’s just that one tactical blunder upended that strategy.

Whether or not Bredemarket pivots from biometric content to resume writing (not likely), I am presently equipped to address both your strategic and tactical product marketing needs. If I can help you, talk to me at https://bredemarket.com/mark/.

Differentiating the DNA of Twins?

(Part of the biometric product marketing expert series)

There are certain assumptions that you make in biometrics.

Namely, that certain biometrics are unable to differentiate twins: facial recognition, and DNA analysis.

Now as facial recognition algorithms get better and better, perhaps they will be able to tell twins apart: even identical twins.

But DNA is DNA, right?

Twins and somatic mutations

Mike Bowers (CSIDDS) links to an article in Forensic Magazine which suggests that twins’ DNA can be differentiated.

For the first time in the U.S., an identical twin has been convicted of a crime based on DNA analysis.

The breakthrough came from Parabon Nanolabs, whose scientists used deep whole genome sequencing to identify extremely rare “somatic mutations” that differentiated Russell Marubbio and his twin, John. The results were admitted as evidence in court, making last week’s conviction of Russell in the 1987 rape of a 50-year-old woman a landmark case.

Twin DNA.

Parabon Nanolabs (whom I briefly mentioned in 2024) applied somatic mutations as follows:

Somatic mutations are DNA changes that happen after conception and can cause genetic differences between otherwise identical twins. These mutations can arise during the earliest stages of embryonic development, affecting the split of the zygote, and accumulate throughout life due to errors in cell division. Somatic mutations can be present in only one twin, a subset of cells, or both, potentially leading to differences in health and even developmental disorders—and in this case, DNA.

The science behind somatic mutations is not new, and is well-researched, understood and accepted. It’s just uncommon for DNA to lead to twins, and even more uncommon for somatic mutations to be able to distinguish between twins.

Note that “well-researched, understood and accepted” part (even though it lacks an Oxford comma). Because this isn’t the only recent story that touches upon whole genome sequencing.

Whole genome sequencing and legal admissibility

Bowers also links to a CNN article which references Daubert/Frye-like questions about whether evidence is admissible.

Evidence derived from cutting-edge DNA technology that prosecutors say points directly at Rex Heuermann being the Gilgo Beach serial killer will be admissible at his trial, a Suffolk County judge ruled Wednesday….

Heuermann’s defense attorney Michael Brown had argued the DNA technology, known as whole genome sequencing, has not yet been widely accepted by the scientific community and therefore shouldn’t be permitted. He said he plans to argue the validity of the technology before a jury.

Meanwhile, prosecutors have argued this type of DNA extraction has been used by local law enforcement, the FBI and even defense attorneys elsewhere in the country, according to court records.

Let me point out one important detail: the fact that police agencies are using a particular technology doesn’t mean that said technology is “widely accepted by the scientific community.” I suspect that this same question will be raised in other courts, and other judges may reach a different decision.

And after checking my blog, I realize that I have never written an article about Daubert/Frye. Another assignment for Bredebot, I guess…

If your identity/biometric product marketing needs to assert the facts rather than old lies, Bredemarket can help.

Forget About Milwaukee’s Facial Recognition DATA: We All Want to See Milwaukee’s Facial Recognition POLICY

(Part of the biometric product marketing expert series)

I love how Biometric Update bundles a bunch of stories into a single post. Chris Burt outdid himself on Wednesday, covering a slew of stories regarding use and possible misuse of facial recognition by Texas bounty hunters, the NYPD, and cities ranging from Chicago, Illinois to Houlton, Maine.

But those stories aren’t the ones that I’m focusing on. Before I get to my focus, I want to go off on a tangent and address something else.

Read us any rule, we’ll break it

In a huddle space in an office, a smiling robot named Bredebot places his robotic arms on a wildebeest and a wombat, encouraging them to collaborate on a product marketing initiative.
Bredebot and his pals.

By the time you read this, the first full post by my counterpart “Bredebot” will have published on the Bredemarket blog. This is a completely AI-generated post in which a bot DID write the first draft. More posts are coming.

What I didn’t expect was that competition would arise between me and my bot. I’m writing these words on August 27, two days before the first Bredebot post appears, and I’m already feeling the heat.

What if Bredebot’s posts receive more traffic than the ones I write myself? What does that mean for my own posts…and for the whole premise of hiring Bredemarket to write for others?

I’m treating this as a challenge, vowing to outdo my fast bot counterpart.

And in that spirit, let’s revisit Milwaukee.

Give us any chance, we’ll take it

Access.

When Biometric Update initially visited Milwaukee in its April 28 post, the main concern was the possible agreement for the Milwaukee Police Department to provide “access” to facial data to the company Biometrica in exchange for facial recognition licenses. I subsequently explored the data issue in my own May 6 guest post for Biometric Update.

Vendors must disclose responsible uses of biometric data.

But today the questions addressed to Milwaukee don’t focus on the data, but on the use of facial recognition itself. The Biometric Update article links to a Wisconsin Watch article with more detail. The arguments are familiar to all of you: facial recognition is racist, facial recognition is sometimes relied upon as the sole piece of evidence, facial recognition data can be sent to ICE, and facial recognition can be misused.

However, before Milwaukee’s Common Council can approve facial recognition use, one requirement has to be met.

“Since the passage of Wisconsin Act 12, the only official way to amend or reject MPD policy is by a vote of at least two-thirds of the Common Council, or 10 members.

“However, council members cannot make any decision about it until MPD actually drafts its policy, often referred to as a “standard operating procedure.” 

“Ald. Peter Burgelis – one of four council members who did not sign onto the Common Council letter to Norman – said he is waiting to make a decision until he sees potential policy from MPD or an official piece of legislation considered by the city’s Public Safety and Health Committee.”

The Milwaukee Police Department agrees that such a policy is necessary.

“MPD has consistently stated that a carefully developed policy could help reduce risks associated with facial recognition.

“’Should MPD move forward with acquiring FRT, a policy will be drafted based upon best practices and public input,’ a department spokesperson said.”

An aside from my days at MorphoTrak, when I would load user conference documents into the CrowdCompass mobile app: one year the topic of law enforcement agency facial recognition policies was part of our conference agenda. One agency had such a policy, but the agency would not allow me to upload the policy into the CrowdCompass app. You see, the agency had a policy…but it wasn’t public.

Needless to say, the Milwaukee Police Department’s draft policy WILL be public…and a lot of people will be looking at it.

Although I don’t know if it will make everyone’s dreams come true.