“Data Privacy Day is marked each year on January 28…. Data Privacy Day began in the United States and Canada in January of 2008 as an extension of its European counterpart. In Europe, Data Protection Day commemorates the January 28, 1981, signing of Convention 108, the first legally binding international treaty on privacy and data protection.”
“This Convention is the first binding international instrument which protects the individual against abuses which may accompany the collection and processing of personal data and which seeks to regulate at the same time the transfrontier flow of personal data.
“In addition to providing guarantees in relation to the collection and processing of personal data, it outlaws the processing of “sensitive” data on a person’s race, politics, health, religion, sexual life, criminal record, etc., in the absence of proper legal safeguards. The Convention also enshrines the individual’s right to know that information is stored on him or her and, if necessary, to have it corrected.”
A lot has happened since 1981, but it all had to start somewhere.
(An aside: I wrote this post on Saturday, November 8, 2025. On that date I asked Google Gemini when the next biometric-related holiday was, and this is what came up.)
Now anyone reading that article over the weekend was probably very confused, since the death of Alex Pretti isn’t exactly a DATA breach.
And, of course, Minnesota doesn’t have a “department of homeland security.”
It does, however, have a Department of Human Services…and THAT was what was breached.
“A single user inappropriately accessed private data within the Minnesota Department of Human Services (DHS) ecosystem, potentially impacting 303,965 individuals, officials report.”
This was not a hack per se, but a case in which an otherwise legitimate user accessed data they shouldn’t have accessed. Certainly a breach, and the user’s access was terminated.
My 2024 offboarding post discussed the short-term aspects of how Bredemarket wraps up business with its clients. But it didn’t cover the long-term aspects.
What I didn’t say in 2024
You’ll recall my description of the end of a particular contract.
In 2023 I signed a contract with a client in which I would bill them at an hourly rate. This was a short-term contract, but it was subsequently renewed.
Recently the client chose not to renew the contract for another extended period.
But there’s one thing I didn’t say.
The client (whom I’ll call Client 1) failed to tell me that it wasn’t renewing my contract. In fact, in my last discussion with the client, I did not perceive that it wasn’t planning to renew.
Surprise! In fact, I learned of the non-renewal from a third party, not the client itself.
Of course, the client had every right to choose not to renew without advance notice.
But read on.
What happened in 2025
Contrast that with my relationship with two other existing clients, both of whom contacted me personally and let me know that they weren’t renewing my contract.
Both took the time to explain why they were not renewing. The reasons had nothing to do with my performance, but with internal issues at each company (which I am not at liberty to discuss).
I went through the aforementioned data scrub process with both clients, and my obligation to both was done.
But read on.
Two little twists
Add these facts.
There was an interesting connection between Client 2 and Client 3. My primary employee contact at Client 2 was previously a consultant at Client 3 until they were let go (again, not because of performance, but because of internal issues).
And a little while later, my employee contact at Client 2 was let go from Client 2 themselves (again, internal issues).
The long term
Bredemarket completed its contractual obligations to all three firms: the one that let me go in 2024 (Client 1), and the two that let me go in 2025 (Client 2 and Client 3).
But what happens after that?
It depends.
If my employee contact at Client 3 requests help, or if I see something of interest to Client 3, I’ll be more than happy to help or share.
If my employee contact formerly of Client 2 requests help, or if I see something of interest to Client 2, I’ll be more than happy to help or share.
If I see something of interest that affects Client 2, I may or may not share.
If I see something of interest that affects Client 1, I probably won’t share…except to Client 1’s competitors.
There is a lot of discussion about data scraping, an activity in which Company 1 takes the information publicly posted by Company 2 and incorporates it into its own records.
In the identity world, this takes the form of a company “scraping” the facial images that were publicly posted by a second company, such as a social media company.
I think that we all know of one identity company that is well-known (a euphemism for “notorious”) for scraping facial images from multiple sources. These not only include government-posted mugshots, but also content posted by private social media firms.
Needless to say, the social media companies think that data scraping is completely evil and terrible and that identity vendors that do this should be fined and put out of business. The identity vendor in question has a different view, even stating at one point that it had a (U.S.) First Amendment right to scrape data.
But what happens when someone wants to scrape data FROM an identity company?
The case is CITY OF SEDRO-WOOLLEY and CITY OF STANWOOD, Washington Municipal Corporations vs. JOSE RODRIGUEZ. The following are findings of fact:
“On April 10, 2025, Defendant, Jose Rodriguez made a Public Records Request to the Snohomish Police Department. He requested all of the city’s Flock cameras pictures and data logs between 5 pm and 6 pm on March 30, 2025.”
This particular record does not indicate WHY Rodriguez made this request, but 404 Media provided a clarification from Rodriguez himself.
“I wanted the records to see if they would release them to me, in hopes that if they were public records it would raise awareness to all the communities that have the Flock cameras that they may be public record and could be used by stalkers, or burglars scoping out a house, or other ways someone with bad intentions may use them. My goal was to try getting these cameras taken down by the cities that put them up.”
The City of Stanwood (I don’t know its relation to Snohomish) answered Rodriguez in part:
“Stanwood PD is not the holder of the records you’re seeking; you may be able to request the records at FlockSafety.com.”
Incidentally, this is a common issue with identity databases that use vendor software: who owns the data? I’ve addressed this before regarding the Milwaukee Police Department.
Now some legal talent may be able to parse what the word “holder” means, especially in regard to data hosted in the cloud. Perhaps Stanwood PD was trying to claim that since the records weren’t on site, it wasn’t the “holder.”
Anyway, the defendant subsequently made a similar request to the City of Sedro-Woolley, but for a different date. Sedro-Woolley didn’t provide the images either.
Then it gets weird.
What happened to the data?
“The Flock records sought by Defendant from Stanwood and Sedro-Woolley have been auto-deleted.”
Well how convenient.
And the listed statements of fact also contain the following:
“The contract between Flock and Stanwood sates [sic] that all Flock images generated off Flock cameras located in Stanwood are the property of Stanwood.
“The contract between Flock and Sedro-Woolley states that all Flock images generated off Flock cameras located in Sedro-Woolley are the property of Sedro-Woolley.”
The judge’s ruling
Fast forward to November 6, when Judge Elizabeth Neidzwski ruled on the cities’ claim that the Flock camera data was not a public record.
“IT IS HEREBY ORDERED, ADJUDGED AND DECREED that Plaintiff’s motion for Declaratory Judgment that the Flock camera records are not public records is DENIED.”
404 Media noted that the cities argued that they resisted the request to…protect privacy.
“In affidavits filed with the court, police argued that ‘if the public could access the Flock Safety System by making Public Records Act requests, it would allow nefarious actors the ability to track private persons and undermine the effectiveness of the system.’ The judge rejected every single one of these arguments.”
Of course, there are those who argue that the police themselves are the “nefarious actors,” and that they shouldn’t be allowed to track private persons either.
But the parties may take the opposite argument
This is not the only example of conflicting claims over WHO has the right to privacy. In fact, if the police were filming protestors and agitators and wanted the public’s help in identifying them, the police and the protestors would take opposite sides of the privacy argument: the police saying the footage SHOULD be released, and the protestors who were filmed saying it SHOULD NOT.
“Before launching into these regulatory changes, remember that the CCPA is the California Consumer Privacy Act, while the CPPA is the California Privacy Protection Agency. (There’s also a CPRA, the California Privacy Rights Act.)”
Well, one of the entities, the agency (CPPA), is trying to extricate itself from the acronym confusion, differentiate itself, and be cool and stuff.
“The California Privacy Protection Agency has chosen the new public-facing name of CalPrivacy. The name underscores the agency’s commitment to operationalizing privacy rights and delivering clear, consumer-friendly guidance to all Californians.”
When I remember to transcribe my meetings, and when I CAN transcribe my meetings, my meeting transcriber of choice happens to be otter.ai. And if I’m talking to a healthcare prospect or client, and when they grant permission to transcribe, the result is HIPAA compliant.
Perhaps I’m industry-embedded, but this seems fine to me. Consent appears to be honored everywhere.
“Under the deal, agencies that use Flock’s Nova or FlockOS investigative platforms will soon be able to post Community Requests through Ring’s Neighbors app, asking nearby residents to share doorbell footage relevant to an investigation.
“Each request includes a case ID, time window, and map of the affected area. Ring says participation is voluntary and that residents can choose whether to respond, and agencies cannot see who declines. Users can also disable the feature entirely in their account settings.”
On the other hand, Senator Ron Wyden doesn’t trust Flock at all and says that “abuse of Flock cameras is inevitable.”
Heck, abuse of citizens by the U.S. Senate is inevitable, but citizens aren’t demanding that the Senate cease operations.
A new bill has been enrolled in California, where I live. But how will this affect web browser developers outside of California?
The bill is the California Opt Me Out Act, AB 566. The text of Section 2 of the bill is found at the end of this post. But the two major parts of the bill are as follows:
Google Gemini.
Starting in 2027, businesses that create web browsers, regardless of their location, must include “functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses.”
Web browser developers that do this “shall not be liable for a violation of this title by a business that receives the opt-out preference signal.”
The bill doesn’t get any more specific than that; the California Privacy Protection Agency will work out the details.
The part of interest, of course, is what happens to businesses that develop web browsers WITHOUT the opt-out functionality. What happens to those non-compliant businesses? What is the liability? Is it civil? Criminal? If Safari doesn’t include easy-to-use opt-out functionality, will Tim Cook do time?
This is yet another example of the debate that occurs when one country, or one state, or one county/city enacts a law and expects the rest of the world to comply. In this particular case, the state of California is telling every web browser developer in the entire world how to configure their browsers. The developers have several choices:
Comply with California law, while simultaneously complying with laws from all other jurisdictions regarding opt out, including a theoretical business-friendly jurisdiction that prohibits opt out entirely.
Ignore the California law and see what the California Privacy Protection Agency does, or tries to do. Is Yandex, the Russian developer of the Yandex browser, going to really care about California law?
Google Gemini.
Contest the law in court, arguing that it violates the U.S. First Amendment, the U.S. Second Amendment, or whatever.
The ball is now in the hands of the CPPA, which needs to develop the regulations to implement the law, as well as develop the penalties for non-compliant businesses.
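For those wondering what an “opt-out preference signal” looks like in practice: the signal the California regulators already recognize under the CCPA is the Global Privacy Control (GPC), which a supporting browser transmits as a `Sec-GPC: 1` HTTP header on every request. Here’s a minimal sketch (in Python, with a hypothetical helper name of my own choosing) of how a receiving business might detect it; this is an illustration of the GPC mechanism, not anything specified in AB 566 itself.

```python
# Sketch: detecting a Global Privacy Control (GPC) opt-out preference
# signal in incoming request headers. A GPC-enabled browser sends the
# header "Sec-GPC: 1". The helper name honors_opt_out is hypothetical.

def honors_opt_out(headers: dict) -> bool:
    """Return True if the request carries an opt-out preference signal."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"

print(honors_opt_out({"Sec-GPC": "1"}))           # True: signal present
print(honors_opt_out({"User-Agent": "Mozilla"}))  # False: no signal
```

Since the California Attorney General has already treated GPC as a valid CCPA opt-out request, something like this signal is the most likely candidate for the functionality AB 566 will require browsers to include.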
Here is the exact text of Section 2.
SEC. 2.
Section 1798.136 is added to the Civil Code, to read:
1798.136.
(a) (1) A business shall not develop or maintain a browser that does not include functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses with which the consumer interacts through the browser.
(2) The functionality required by paragraph (1) shall be easy for a reasonable person to locate and configure.
(b) A business that develops or maintains a browser shall make clear to a consumer in its public disclosures how the opt-out preference signal works and the intended effect of the opt-out preference signal.
(c) The California Privacy Protection Agency may adopt regulations as necessary to implement and administer this section.
(d) A business that develops or maintains a browser that includes a functionality that enables the browser to send an opt-out preference signal pursuant to this section shall not be liable for a violation of this title by a business that receives the opt-out preference signal.
(e) As used in this section:
(1) “Browser” means an interactive software application that is used by consumers to locate, access, and navigate internet websites.
(2) “Opt-out preference signal” means a signal that complies with this title and that communicates the consumer’s choice to opt out of the sale and sharing of the consumer’s personal information.
(f) This section shall become operative on January 1, 2027.