Privacy: What Happens When You Data Scrape FROM the Identity Vendors?

There is a lot of discussion about data scraping, an activity in which Company 1 takes the information publicly posted by Company 2 and incorporates it into its own records.

In the identity world, this takes the form of a company “scraping” the facial images that were publicly posted by a second company, such as a social media company.

I think that we all know of one identity company that is well-known (a euphemism for “notorious”) for scraping facial images from multiple sources. These include not only government-posted mugshots, but also content posted by private social media firms.

Needless to say, the social media companies think that data scraping is completely evil and terrible and that identity vendors that do this should be fined and put out of business. The identity vendor in question has a different view, even stating at one point that it had a (U.S.) First Amendment right to scrape data.

But what happens when someone wants to scrape data FROM an identity company?

A Skagit County court case

404 Media links to a Skagit County, Washington court case that addresses this very issue: in this case, data captured by Flock Safety.

The case is CITY OF SEDRO-WOOLLEY and CITY OF STANWOOD, Washington Municipal Corporations vs. JOSE RODRIGUEZ. The following are findings of fact:

“On April 10, 2025, Defendant, Jose Rodriguez made a Public Records Request to the Snohomish Police Department. He requested all of the city’s Flock cameras pictures and data logs between 5 pm and 6 pm on March 30, 2025.”

This particular record does not indicate WHY Rodriguez made this request, but 404 Media provided a clarification from Rodriguez himself.

“I wanted the records to see if they would release them to me, in hopes that if they were public records it would raise awareness to all the communities that have the Flock cameras that they may be public record and could be used by stalkers, or burglars scoping out a house, or other ways someone with bad intentions may use them. My goal was to try getting these cameras taken down by the cities that put them up.”

The City of Stanwood (I don’t know its relation to Snohomish) answered Rodriguez in part:

“Stanwood PD is not the holder of the records you’re seeking; you may be able to request the records at FlockSafety.com.”

Incidentally, this is a common issue with identity databases that use vendor software: who owns the data? I’ve addressed this before regarding the Milwaukee Police Department.

Now some legal talent may be able to parse what the word “holder” means, especially in regard to data hosted in the cloud. Perhaps Stanwood PD was trying to claim that since the records weren’t on site, it wasn’t the “holder.”

Anyway, the defendant subsequently made a similar request to the City of Sedro-Woolley, but for a different date. Sedro-Woolley didn’t provide the images either.

Then it gets weird.

What happened to the data?

“The Flock records sought by Defendant from Stanwood and Sedro-Woolley have been auto-deleted.”

Well, how convenient.

And the listed statements of fact also contain the following:

“The contract between Flock and Stanwood sates [sic] that all Flock images generated off Flock cameras located in Stanwood are the property of Stanwood.

“The contract between Flock and Sedro-Woolley states that all Flock images generated off Flock cameras located in Sedro-Woolley are the property of Sedro-Woolley.”

The judge’s ruling

Fast forward to November 6, when Judge Elizabeth Neidzwski ruled on the cities’ claim that the Flock camera data was not a public record.

“IT IS HEREBY ORDERED, ADJUDGED AND DECREED that Plaintiff’s motion for Declaratory Judgment that the Flock camera records are not public records is DENIED.”

404 Media noted that the cities argued that they resisted the request to…protect privacy.

“In affidavits filed with the court, police argued that ‘if the public could access the Flock Safety System by making Public Records Act requests, it would allow nefarious actors the ability to track private persons and undermine the effectiveness of the system.’ The judge rejected every single one of these arguments.”

Of course, there are those who argue that the police themselves are the “nefarious actors,” and that they shouldn’t be allowed to track private persons either.

But the parties may take the opposite argument

This is not the only example of conflicting claims over WHO has the right to privacy. In fact, if the police were filming protestors and agitators and wanted the public’s help in identifying them, the police and the protestors would take opposite positions on the privacy issue: the police saying the footage SHOULD be released, and the filmed protestors saying it SHOULD NOT.

Privacy is in the eye of the beholder.

The California Privacy Folks Have Executed a Cool Rebrand

I previously discussed the alphabet soup that infests California privacy efforts.

“Before launching into these regulatory changes, remember that the CCPA is the California Consumer Privacy Act, while the CPPA is the California Privacy Protection Agency. (There’s also a CPRA, the California Privacy Rights Act.)”

Well, one of the entities, the agency (CPPA), is trying to extricate itself and differentiate and be cool and stuff.

“The California Privacy Protection Agency has chosen the new public-facing name of CalPrivacy. The name underscores the agency’s commitment to operationalizing privacy rights and delivering clear, consumer-friendly guidance to all Californians.”

Like…cool.

The Healthy Otter: When AI Transcriptions are HIPAA Compliant

When I remember to transcribe my meetings, and when I CAN transcribe my meetings, my meeting transcriber of choice happens to be otter.ai. And if I’m talking to a healthcare prospect or client, and they grant permission to transcribe, the result is HIPAA compliant.

Otter.ai explains the features that provide this:

Getting HIPAA compliant wasn’t just about checking a box – we’ve implemented some serious security upgrades:

  • Better encryption to keep protected health information (PHI) locked down
  • Tighter access controls so only the right people see sensitive data
  • Team training to make sure everyone knows HIPAA inside and out
  • Regular security audits to stay on top of our game

This builds on our existing SOC 2 Type II certification, so you’re getting enterprise-grade security across the board.

HIPAA privacy protections affect you everywhere.

My Thoughts on the Amazon Ring-Flock Safety Partnership

Amazon didn’t get a lot of good news today, and people focused on the AWS outage probably missed another negative news item.

Anthony Kimery of Biometric Update wrote an article entitled “Ring’s partnership with Flock raises privacy alarms.” I offered the following commentary on LinkedIn’s Bredemarket Identity Firm Services page.

Perhaps I’m industry-embedded, but this seems fine to me. Consent appears to be honored everywhere.

“Under the deal, agencies that use Flock’s Nova or FlockOS investigative platforms will soon be able to post Community Requests through Ring’s Neighbors app, asking nearby residents to share doorbell footage relevant to an investigation.

“Each request includes a case ID, time window, and map of the affected area. Ring says participation is voluntary and that residents can choose whether to respond, and agencies cannot see who declines. Users can also disable the feature entirely in their account settings.”
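
As described, a Community Request is essentially a small data structure plus a voluntary-response rule. Here is a minimal sketch of that model (my own illustration of the properties quoted above, not Ring’s or Flock’s actual data model or API):

```python
# Hypothetical model of a "Community Request": a structured ask
# (case ID, time window, area map), voluntary responses, and no
# record of who declined. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class CommunityRequest:
    case_id: str
    time_window: tuple[str, str]  # e.g., start and end timestamps
    area_map_url: str
    shared_footage: list[str] = field(default_factory=list)

    def respond(self, share: bool, footage_url: str = "") -> None:
        # Declining is a no-op: nothing about the resident is stored,
        # so the agency cannot see who said no.
        if share and footage_url:
            self.shared_footage.append(footage_url)

    def agency_view(self) -> list[str]:
        # The agency sees only the footage that was volunteered.
        return list(self.shared_footage)

request = CommunityRequest("CASE-1234", ("17:00", "18:00"), "https://example.com/map")
request.respond(share=False)                       # leaves no trace
request.respond(share=True, footage_url="clip-1")  # volunteered footage
print(request.agency_view())  # ['clip-1']
```

If the implementation really works this way, the privacy-preserving property lives in the data model itself: a decline never creates a record.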

On the other hand, Senator Ron Wyden doesn’t trust Flock at all and says that “abuse of Flock cameras is inevitable.”

Heck, abuse of citizens by the U.S. Senate is inevitable, but citizens aren’t demanding that the Senate cease operations.

California AB 566 Web Opt-Out Preference Signal (the California Opt Me Out Act)

A new bill has been enrolled in California, where I live. But how will this affect web browser developers outside of California?

The bill is the California Opt Me Out Act, AB 566. The text of Section 2 of the bill is found at the end of this post. But the two major parts of the bill are as follows:

  • Starting in 2027, businesses that create web browsers, regardless of their location, must include “functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses.”
  • Web browser developers that do this “shall not be liable for a violation of this title by a business that receives the opt-out preference signal.”

The bill doesn’t get any more specific than that; the California Privacy Protection Agency will work out the details.
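
The bill doesn’t name the signal, either. But a signal of this kind already exists: the Global Privacy Control (GPC), which participating browsers express as a “Sec-GPC: 1” request header (and as a navigator.globalPrivacyControl property for scripts). Assuming the CPPA lands on something GPC-shaped, here is a minimal sketch of what a receiving business might do; the handler below is my own illustration, not anything mandated by AB 566:

```python
# Minimal sketch: honoring a GPC-style opt-out preference signal on
# the receiving (business) side. Assumes the browser sends the
# "Sec-GPC: 1" header from the Global Privacy Control proposal;
# AB 566 itself leaves the signal format to the CPPA.
from http.server import BaseHTTPRequestHandler, HTTPServer

class OptOutAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser-sent header IS the opt-out preference signal.
        opted_out = self.headers.get("Sec-GPC") == "1"
        # A real business would flag this visitor as "do not sell or
        # share" in its own systems; here we just acknowledge it.
        body = (b"Opt-out signal received; sale/sharing suppressed."
                if opted_out else b"No opt-out signal received.")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), OptOutAwareHandler).serve_forever()
```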

The part of interest, of course, is what happens to businesses that develop web browsers WITHOUT the opt-out functionality. What happens to those non-compliant businesses? What is the liability? Is it civil? Criminal? If Safari doesn’t include easy-to-use opt-out functionality, will Tim Cook do time?

This is yet another example of the debate that occurs when one country, or one state, or one county/city enacts a law and expects the rest of the world to comply. In this particular case, the state of California is telling every web browser developer in the entire world how to configure their browsers. The developers have several choices:

  • Comply with California law, while simultaneously complying with opt-out laws from all other jurisdictions, including a theoretical business-friendly jurisdiction that prohibits opt out entirely.
  • Ignore the California law and see what the California Privacy Protection Agency does, or tries to do. Is Yandex, the Russian developer of the Yandex browser, really going to care about California law?
  • Contest the law in court, arguing that it violates the U.S. First Amendment, the U.S. Second Amendment, or whatever.

The ball is now in the hands of the CPPA, which needs to develop the regulations to implement the law, as well as develop the penalties for non-compliant businesses.

Here is the exact text of Section 2.

SEC. 2.

Section 1798.136 is added to the Civil Code, to read:

1798.136.

(a) (1) A business shall not develop or maintain a browser that does not include functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses with which the consumer interacts through the browser.

(2) The functionality required by paragraph (1) shall be easy for a reasonable person to locate and configure.

(b) A business that develops or maintains a browser shall make clear to a consumer in its public disclosures how the opt-out preference signal works and the intended effect of the opt-out preference signal.

(c) The California Privacy Protection Agency may adopt regulations as necessary to implement and administer this section.

(d) A business that develops or maintains a browser that includes a functionality that enables the browser to send an opt-out preference signal pursuant to this section shall not be liable for a violation of this title by a business that receives the opt-out preference signal.

(e) As used in this section:

(1) “Browser” means an interactive software application that is used by consumers to locate, access, and navigate internet websites.

(2) “Opt-out preference signal” means a signal that complies with this title and that communicates the consumer’s choice to opt out of the sale and sharing of the consumer’s personal information.

(f) This section shall become operative on January 1, 2027.

A Californian, an Illinoisan, and a Dane Walk Into a Videoconference

I was recently talking with a former colleague, whose name I am not at liberty to reveal, and they posed a question that stymied me.

What happens when multiple people join a videoconference, and they all reside in jurisdictions with different privacy regulations?

An example will illustrate what would happen, and I volunteer to be the evil party in this one.

The videoconference

Let’s say:

On a particular day in April 2026, a Californian (call him Cali John) launches a videoconference on Zoom.

The Californian invites an Illinoisan (Abe).

And also invites a Dane (Freja).

And then—here’s the evil part—records and gathers images from the videoconference without letting the other two know.

The legal violations

The Illinois Biometric Information Privacy Act, or BIPA, requires written consent before acquiring Abe’s facial geometry. And if Cali John doesn’t obtain that written consent, he could lose a lot of money.

And what about Freja? Well, if the Danish Copyright Act takes effect on March 31, 2026 as expected, Cali John can get into a ton of trouble if he uses the video to create a realistic, digitally generated imitation of Freja. Again, consent is required. Again, there can be monetary penalties if you don’t get that consent.
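
To make the stakes concrete, here is a minimal sketch (my own thought experiment, not any videoconferencing vendor’s actual feature) of how a recording tool could gate itself on each participant’s consent requirements before capturing anything:

```python
# Hypothetical sketch: gate videoconference recording on each
# participant's jurisdiction. The consent rules below reflect only
# the two laws discussed above (BIPA and the Danish Copyright Act
# amendment); they are illustrative, not legal advice.
from dataclasses import dataclass

WRITTEN_CONSENT_REQUIRED = {"Illinois", "Denmark"}

@dataclass
class Participant:
    name: str
    jurisdiction: str
    written_consent: bool = False

def may_record(participants: list[Participant]) -> bool:
    """Allow recording only if everyone who needs written consent gave it."""
    return all(
        p.written_consent or p.jurisdiction not in WRITTEN_CONSENT_REQUIRED
        for p in participants
    )

call = [
    Participant("Cali John", "California", written_consent=True),
    Participant("Abe", "Illinois"),   # BIPA: written consent required
    Participant("Freja", "Denmark"),  # consent required for imitations
]
print(may_record(call))  # False -- Abe and Freja never consented
```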

But there’s another question we have to consider.

The vendor responsibility 

Does the videoconference provider bear any responsibility for the violations of Illinois and Danish law?

Since I used Zoom as my example, I looked at Zoom’s Terms of Service.

TL;DR: not our problem, that’s YOUR problem.

“5. USE OF SERVICES AND YOUR RESPONSIBILITIES. You may only use the Services pursuant to the terms of this Agreement. You are solely responsible for Your and Your End Users’ use of the Services and shall abide by, and ensure compliance with, all Laws in connection with Your and each End User’s use of the Services, including but not limited to Laws related to recording, intellectual property, privacy and export control. Use of the Services is void where prohibited.”

But such requirements haven’t stopped BIPA lawyers from filing lawsuits against deep-pocketed software vendors. Remember when Facebook settled for $650 million?

So remember what could happen the next time you participate in a multinational, multi-state, or even multi-city videoconference. Hope your AI note taker isn’t capturing screen shots.

Your Product Marketing Must Address as Many Target Audiences as a Las Vegas Buffet

Messaging. It’s what B2B product marketers do. And it’s also what proposal professionals do, as we shall see.

But even the simplest B2B product suffers from one-dimensional messaging.

Why? Because even simple products often require many types of people to get involved in the purchasing cycle.

Marketers often talk about target audiences. I personally believe that term doesn’t describe the concept properly, so I prefer to refer to hungry people.

Which brings us to the Las Vegas buffet.

Variety for hungry people

Las Vegas is a destination visited by over 40 million people per year from all over the world. The casino hotels know these visitors are hungry for food, and they hope the hungry people will stay on property.

So do they serve Caesars Burgers?

Um, no. 40 million people don’t eat the same thing.

This becomes very clear if you visit the Bacchanal Buffet at Caesars Palace, with over 250 items prepared in 10 kitchens.

“From Roman-style pizza to Carne Asada Tacos inspired by the food trucks of L.A., there’s something for everyone. Find a world of flavor at our nine live-action cooking stations. Indulge in originals like slow-cooked prime rib, smoked beef brisket, crab, and wood-fired pizza. Or try something different, like whole Ahi Tuna Poke, roasted duck, or Singaporean Blue Crab and seasonal agua frescas.”

There is literally something for everyone. And the hungry person salivating for Ahi Tuna Poke doesn’t care about the beef brisket.

Which brings us to local police automated fingerprint identification system (AFIS) proposals.

Variety for hungry people

If you had asked me in September 1994 (before I started at Printrak in October) the target audience for local police AFIS, I would have replied, “fingerprint people.”

That answer would be incorrect.

Tenprint and latent people 

Because, even if you limit things to the criminal AFIS world, there are (at least) two types of fingerprint people: tenprint examiners, and latent examiners. I asked my buddy Bredebot to summarize the stereotypical differences between the two. Here is some of what he said:

“‘Assembly line’ comparisons: Because tenprint comparisons use high-quality, known impressions taken under controlled conditions, their work can be automated and is often perceived as a high-volume, less complex task. This is in contrast to the specialized analysis required for latent prints.

“Artistic and subjective: Because latent prints are often smudged, distorted, and incomplete, examiners must make subjective judgments about their suitability for comparison. This has led to the criticism that the process is more of an art than a science.”

Bredebot has never attended an International Association for Identification conference, but I have. Many, many years ago I attended a session on tenprint examiner certification. Latent examiners had this way cool certification, and some people thought that more tenprint examiners should participate in their way cool certification program. As I recall, this meeting was heavily attended…by latent folks. Even today, the number of Certified Latent Print Examiners (CLPEs) is far greater than the number of Certified Tenprint Examiners (CTPEs).

Other people

But you can’t procure an AFIS by talking to tenprint and latent people alone.

As I noted years ago, other people get involved in a local police AFIS procurement, using Ontario, California as an example:

  • The field investigators who run across biometric evidence at the scene of a crime, such as a knife with a fingerprint on it or a video feed showing someone breaking into a liquor store.
  • The information technology (IT) people who are responsible for ensuring that Ontario, California’s biometric data is sent to San Bernardino County, the state of California, perhaps other systems such as the Western Identification Network, and the Federal Bureau of Investigation.
  • The purchasing agent who has to make sure that all of Ontario’s purchases comply with purchasing laws and regulations. 
  • The privacy advocate who needs to ensure that the biometric data complies with state and national privacy laws.
  • The mayor (still Paul Leon as I write this), who has to deal with angry citizens asking why their catalytic converters are being stolen from their vehicles, and demanding to know what the mayor is doing about it. 
  • Probably a dozen other stakeholders that I haven’t talked about yet, but who are influenced by the city’s purchasing decision.

Feeding the hungry people 

So even a relatively simple B2B product has multiple target audiences.

Should product marketers apply the same one-dimensional messaging to all of them?

Um, no.

If you did that, purchasing agents would fall asleep at mentions of “level 3 detail,” while latent examiners would abandon their usual attention to detail when confronted by privacy references to the California Information Practices Act of 1977. (The CCPA, CPRA, and CPPA apply to private entities.)

So, whether you like it or not, you need separate messaging for each of your categories of hungry people.

One time, as part of an account-based marketing effort, I had to construct a multi-variable messaging matrix…for a product that is arguably simpler than an AFIS.

And yes, I used Microsoft Excel.
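
If Excel isn’t your thing, the same structure is a few lines of pandas. Here is an illustrative sketch; the audiences and message themes below are hypothetical placeholders, not my client’s actual matrix:

```python
# Illustrative messaging matrix: rows are the hungry people
# (stakeholder audiences), columns are message themes, and each cell
# says whether to lead with that theme for that audience.
import pandas as pd

audiences = ["Latent examiner", "IT manager", "Purchasing agent", "Mayor"]
themes = ["Level 3 detail", "FBI interoperability",
          "Procurement compliance", "Crime reduction"]

# 1 = lead with this theme for this audience, 0 = skip it.
matrix = pd.DataFrame(
    [[1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]],
    index=audiences, columns=themes,
)
print(matrix)
```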

And I can use my mad Excel skillz for you also, if your company needs content, proposal, or analysis assistance in your technology product marketing operations. Contact Bredemarket at https://bredemarket.com/mark/.

Content for tech marketers.

And proposal professionals, read this.

“Somewhat You Why” and Geolocation Stalkerware

Geolocation and “somewhat you why” (my proposed sixth factor of identity verification and authentication) can not only be used to identify and authenticate people.

They can also be used to learn things about people already authenticated, via the objects they might have in their possession.

Stalkerware

404 Media recently wrote an article about “stalkerware” geolocation tools that vendors claim can secretly determine if your partner is cheating on you.

Before you get excited about them, 404 Media reveals that many of these tools are NOT secret.

“Immediately notifies anyone traveling with it.” (From a review)

Three use cases for geolocation tracking

But let’s get back to the tool, and the intent. Because I maintain that intent makes all the difference. Look at these three use cases for geolocation tracking of objects:

  • Tracking an iPhone (held by a person). Many years ago, an iPhone user had to take a long walk from one location to another after dark. This iPhone user asked me to track their whereabouts while on that walk. Both of us consented to the arrangement.
  • Tracking luggage. Recently, passengers have placed AirTags in their luggage before boarding a flight. This lets the passengers know where their luggage is at any given time. But some airlines were not fans of the practice:

“Lufthansa created all sorts of unnecessary confusion after it initially banned AirTags out of concern that they are powered by a lithium battery and could emit radio signals and potentially interfere with aircraft navigation.

“The FAA put an end to those baseless concerns saying, “Luggage tracking devices powered by lithium metal cells that have 0.3 grams or less of lithium can be used on checked baggage”. The Apple AirTag battery is a third of that size and poses no risk to aircraft operation.”

  • Tracking an automobile. And then there’s the third case, raised by the 404 Media article. 404 Media found countless TikTok advertisements for geolocation trackers with pitches such as “men with cheating wives, you might wanna get one of these.” As mentioned above, the trackers claim to be undetectable, which reinforces the fact that the person whose car is being tracked did NOT consent.

From consent to stalkerware, and the privacy implications

Geolocation technologies are used in every instance. But in one case it’s perfectly acceptable, while it’s less acceptable in the other two cases.

Banning geolocation tracking technology would be heavy-handed since it would prevent legitimate, consent-based uses of the technology.

So how do we set up the business and technical solutions that ensure that any tracking is authorized by all parties?
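
One hypothetical shape of such a solution (a sketch only, not a real product) is a tracker that refuses to report a location unless every registered party has affirmatively consented:

```python
# Hypothetical sketch: a geolocation tracker that reports a position
# only when ALL registered parties have consented. Illustrative only;
# a real solution would also need identity proofing and tamper alerts.
class ConsentGatedTracker:
    def __init__(self, tracked_object: str):
        self.tracked_object = tracked_object
        self.consents: dict[str, bool] = {}  # party -> consented?

    def register_party(self, party: str) -> None:
        self.consents[party] = False  # no consent until explicitly given

    def grant_consent(self, party: str) -> None:
        self.consents[party] = True

    def report_location(self, lat: float, lon: float) -> str | None:
        # Refuse to disclose the location unless everyone has opted in.
        if self.consents and all(self.consents.values()):
            return f"{self.tracked_object}: {lat:.4f}, {lon:.4f}"
        return None  # at least one party has not consented

# The iPhone walk: both parties consent, so tracking works.
walk = ConsentGatedTracker("iPhone")
for party in ("walker", "watcher"):
    walk.register_party(party)
    walk.grant_consent(party)
print(walk.report_location(34.0633, -117.6509))

# The stalkerware case: the driver never consented, so nothing is shared.
car = ConsentGatedTracker("car")
car.register_party("suspicious spouse")
car.grant_consent("suspicious spouse")
car.register_party("driver")  # never grants consent
print(car.report_location(34.0633, -117.6509))  # None
```

The enforcement problem, of course, is that stalkerware vendors have no incentive to build the gate; that is where regulation, app store policies, or detection features come in.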

Does your firm offer a solution that promotes privacy? Do you need Bredemarket’s help to tell prospects about your solution? Contact me.

In-Process Changes to California Privacy Regulations

There are laws, and there are regulations. In California, we are modifying the latter.

Before launching into these regulatory changes, remember that the CCPA is the California Consumer Privacy Act, while the CPPA is the California Privacy Protection Agency. (There’s also a CPRA, the California Privacy Rights Act.)

I have attached the May 2025 version of the “Modified Text of Proposed Regulations,” specifically regarding changes to the California Consumer Privacy Act regulations. They affect automated decision-making, conducting risk assessments, and performing cybersecurity audits.

This is still an in-process document. As OneTrust notes:

The regulations will now head to the California Office of Administrative Law for final review before they can be formally enacted. 

In the meantime, we have this thingie, in which:

The initial proposal (noticed on November 22, 2024) is illustrated by blue underline for proposed additions and red strikethrough for proposed deletions, unless otherwise indicated, as in Articles 9, 10, and 11. Changes made after the 45-day comment period are illustrated by purple double underline for proposed additions and orange double strikethrough for proposed deletions.

When you get to the purple double underline and orange double strikethrough stage, you know things are getting serious.