Technology is one thing. But policy must govern technology.
For example, is your court using artificial intelligence?
If your court is in California, it must abide by this rule by next week:
“Any court that does not prohibit the use of generative AI by court staff or judicial officers must adopt a generative AI use policy by December 15, 2025. This rule applies to the superior courts, the Courts of Appeal, and the Supreme Court.”
According to Procopio, such a policy may cover items such as a prohibition on entering private data into public systems, the need to verify and correct AI-generated results, and disclosures on AI use.
Good ideas outside the courtroom, too.
For example, the picture illustrating this post was created by Google Gemini, which as of this week uses Nano Banana.
We all agree that deepfakes can (sometimes) result in bad things, but some deepfakes present particular dangers that may go undetected. Let’s look at how deepfakes can harm the healthcare and legal professions.
But I don’t want to talk about the general issues with believable AI (whether it’s Sora 2, Nano Banana Pro, or something else). I want to home in on this:
“Sora 2 security risks will affect an array of industries, primarily the legal and healthcare sectors. AI generated evidence continues to pose challenges for lawyers and judges because it’s difficult to distinguish between reality and illusion. And deepfakes could affect healthcare, where many benefits are doled out virtually, including appointments and consultations.”
Actually these are two separate issues, and I’ll deal with them both.
Health Deepfakes
It’s bad enough that people can access your health records just by knowing your name and birthdate. But what happens when your medical practitioner sends you a telehealth appointment link…except your medical practitioner didn’t send it?
So here you are, sharing your protected health information with…who exactly?
And once you realize you’ve been duped, you turn to a lawyer.
[Image: This one is not a deepfake. From YouTube.]
Or you think you turn to a lawyer.
Legal Deepfakes
First off, is that lawyer truly a lawyer? And are you speaking to the lawyer to whom you think you’re speaking?
[Image: Not Johnnie Cochran.]
And even if you are, when the lawyer gathers information for the case, who knows if it’s real? And I’m not talking about the lawyers who cited hallucinated legal decisions. I’m talking about the lawyers whose eDiscovery platforms gather faked evidence.
[Image: Liquor store owner.]
The detection of deepfakes is currently concentrated in particular industries, such as financial services. But many more industries require this detection.
The Las Vegas Review-Journal reported this in September 2020:
“The Fontana Police Department in San Bernardino County, California, said it arrested Leonard Nash, 66, of Las Vegas on a warrant charging him with murder in the July 5, 1980, slaying of Michelle “Missy” Jones. The young woman was found slain in a grapefruit grove in Fontana.”
So why did it take 40 years to arrest Nash?
“Police said forensic evidence was collected during Jones’ autopsy, but technology at the time did not allow it to be connected to an offender.”
In 2020, the Riverside/San Bernardino CAL-DNA Laboratory successfully obtained a profile, but it did not match the DNA profile of any known offender. Nash, a person of interest, was matched to the profile via “discarded DNA.”
Anyway, Nash was convicted this month, but the news stories that described his conviction are inaccessible to you and me.
My Friday post about Sedro-Woolley, Stanwood, and Flock Safety is already out of date.
Original post: Flock Safety data is public record
That post, “Privacy: What Happens When You Data Scrape FROM the Identity Vendors?”, discussed the case involving the two cities above and a private resident, Jose Rodriguez. The resident requested all Flock Safety camera output during particular short time periods. The cities argued, on semantic grounds, that they didn’t have the data; Flock Safety did. Meanwhile the requested data was auto-deleted, technically making the request moot.
But not legally.
“IT IS HEREBY ORDERED, ADJUDGED AND DECREED that Plaintiff’s motion for Declaratory Judgment that the Flock camera records are not public records is DENIED.”
So the police attempt to keep the records private failed. Since it’s government material, it’s public record and accessible by anyone.
“[I]t turns out that those pictures are public data, according to a judge’s recent ruling. And almost as soon as the decision landed, local officials scrambled to shut the cameras down….
“Attorneys for the cities said they will review the decision before determining whether to appeal. For now, their Flock cameras aren’t coming back online.”
Because CarScoops didn’t link to the specific decisions by the cities, I investigated further.
Update 2: the cameras were turned off a long time ago
I sought other sources regarding Stanwood, Sedro-Woolley, and Flock Safety, and discovered that CarScoops got it wrong when it said “almost as soon as the decision landed, local officials scrambled to shut the cameras down.”
“The city of Sedro-Woolley is no longer using cameras that read license plates while it seeks a court ruling on whether images recorded by the cameras are considered public records.
“Police Chief Dan McIlraith said the seven Flock Safety cameras that went live in Sedro-Woolley in March were disabled in June.”
How to turn the cameras on again
From my perspective, the only way the Flock Safety cameras get turned on again is if the cities of Stanwood and Sedro-Woolley take the following two actions.
First, the cities need to establish or beef up their license plate recognition policies. Specifically, they need to set the rules for how to reply to public records requests. (And no, “stall for 30 days until the records are auto-deleted” doesn’t count.)
Second, and only after a policy is established, implement some form of redaction software. Something that protects the privacy of license plates, faces, and other personally identifying information of people who are NOT part of a criminal investigation.
And yes, such software exists. Flock Safety itself does not offer it; apparently it never, um, envisioned that a city would be forced to release all its data. But companies such as Veritone and CaseGuard do offer automatic redaction software.
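For the curious, here is a minimal sketch of the pipeline such redaction tools automate: detect sensitive regions, blur them, write out the cleaned frame. It uses OpenCV’s bundled Haar cascades as a toy stand-in for the commercial-grade detectors that vendors like Veritone and CaseGuard actually use; the file names are placeholders, and nothing here reflects any particular vendor’s method.

```python
# Minimal redaction sketch: detect faces and license plates in a
# frame, then blur those regions beyond recognition. This is NOT how
# any particular vendor does it; it just illustrates the pipeline.
import cv2

def redact(image_path: str, output_path: str) -> None:
    frame = cv2.imread(image_path)
    if frame is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # OpenCV ships Haar cascades for frontal faces and license plates.
    cascades = [
        cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml"),
        cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_russian_plate_number.xml"),
    ]
    for cascade in cascades:
        for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
            region = frame[y:y+h, x:x+w]
            # A heavy Gaussian blur makes the region unrecognizable.
            frame[y:y+h, x:x+w] = cv2.GaussianBlur(region, (51, 51), 0)

    cv2.imwrite(output_path, frame)

redact("flock_frame.jpg", "flock_frame_redacted.jpg")  # placeholder file names
```

A real deployment would run per frame on video, use far more accurate detectors, and log what was redacted and why, so the agency can defend its redactions in a records dispute.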
If you are a police agency capturing video feeds, plan now.
There is a lot of discussion about data scraping, an activity in which Company 1 takes the information publicly posted by Company 2 and incorporates it into its own records.
In the identity world, this takes the form of a company “scraping” the facial images that were publicly posted by a second company, such as a social media company.
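Mechanically, scraping is trivial, which is part of why it is so contentious. Here is a minimal sketch, assuming a hypothetical publicly accessible gallery page; the URL is a placeholder, not any real site.

```python
# Minimal data-scraping sketch: fetch a publicly posted page and
# harvest the image URLs it references. The target URL is hypothetical.
import requests
from bs4 import BeautifulSoup

def scrape_image_urls(page_url: str) -> list[str]:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Collect the src attribute of every <img> tag on the page.
    return [img["src"] for img in soup.find_all("img") if img.get("src")]

print(scrape_image_urls("https://example.com/public-gallery"))
```

The legal fights are not about the mechanics; they are about whether Company 1 may do this to Company 2’s content at all.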
I think that we all know of one identity company that is well-known (a euphemism for “notorious”) for scraping facial images from multiple sources. These sources include not only government-posted mugshots but also content posted by private social media firms.
Needless to say, the social media companies think that data scraping is completely evil and terrible and that identity vendors that do this should be fined and put out of business. The identity vendor in question has a different view, even stating at one point that it had a (U.S.) First Amendment right to scrape data.
But what happens when someone wants to scrape data FROM an identity company?
The case is CITY OF SEDRO-WOOLLEY and CITY OF STANWOOD, Washington Municipal Corporations vs. JOSE RODRIGUEZ. The following are findings of fact:
“On April 10, 2025, Defendant, Jose Rodriguez made a Public Records Request to the Snohomish Police Department. He requested all of the city’s Flock cameras pictures and data logs between 5 pm and 6 pm on March 30, 2025.”
This particular record does not indicate WHY Rodriguez made this request, but 404 Media provided a clarification from Rodriguez himself.
“I wanted the records to see if they would release them to me, in hopes that if they were public records it would raise awareness to all the communities that have the Flock cameras that they may be public record and could be used by stalkers, or burglars scoping out a house, or other ways someone with bad intentions may use them. My goal was to try getting these cameras taken down by the cities that put them up.”
The City of Stanwood (I don’t know its relation to Snohomish) answered Rodriguez in part:
“Stanwood PD is not the holder of the records you’re seeking; you may be able to request the records at FlockSafety.com.”
Incidentally, this is a common issue with identity databases built on vendor software: who owns the data? I’ve addressed this before regarding the Milwaukee Police Department.
Now some legal talent may be able to parse what the word “holder” means, especially in regard to data hosted in the cloud. Perhaps Stanwood PD was trying to claim that since the records weren’t on site, it wasn’t the “holder.”
Anyway, the defendant subsequently made a similar request to the City of Sedro-Woolley, but for a different date. Sedro-Woolley didn’t provide the images either.
Then it gets weird.
What happened to the data?
“The Flock records sought by Defendant from Stanwood and Sedro-Woolley have been auto-deleted.”
Well how convenient.
And the listed findings of fact also contain the following:
“The contract between Flock and Stanwood states that all Flock images generated off Flock cameras located in Stanwood are the property of Stanwood.
“The contract between Flock and Sedro-Woolley states that all Flock images generated off Flock cameras located in Sedro-Woolley are the property of Sedro-Woolley.”
The judge’s ruling
Fast forward to November 6, when Judge Elizabeth Neidzwski ruled on the cities’ claim that the Flock camera data was not a public record.
“IT IS HEREBY ORDERED, ADJUDGED AND DECREED that Plaintiff’s motion for Declaratory Judgment that the Flock camera records are not public records is DENIED.”
404 Media noted that the cities argued that they resisted the request to…protect privacy.
“In affidavits filed with the court, police argued that ‘if the public could access the Flock Safety System by making Public Records Act requests, it would allow nefarious actors the ability to track private persons and undermine the effectiveness of the system.’ The judge rejected every single one of these arguments.”
Of course, there are those who argue that the police themselves are the “nefarious actors,” and that they shouldn’t be allowed to track private persons either.
But the parties may take the opposite argument
This is not the only example of conflicting claims over WHO has the right to privacy. In fact, if the police were filming protestors and agitators and wanted the public’s help in identifying them, the police and the protestors would take opposite sides of the privacy argument: the police saying the footage SHOULD be released, and the filmed protestors saying it SHOULD NOT.
In early 1968, two boys found a dead body in New York’s East Village. There was no identification on the man, and no one in the neighborhood knew him. He was fingerprinted and buried in a mass grave, and was identified by the NYPD nearly two years later.
[Image: Potter’s Field monument, Hart Island. From Wikipedia. CC BY-SA 4.0.]
In the 1960s, fingerprint identification of deceased persons—a laborious process in those days—often happened because the deceased had a criminal record.
His first arrest was in 1956, but he was not convicted of any crime until 1961.
“On May 1, 1961, he was arrested for attempting to cash a check that had been stolen from a liquor store the previous January, and at the same time was also charged with driving under the influence of drugs. He pled guilty to both charges and was sentenced to six months of treatment for drug addiction at the California Institute for Men at Chino.”
Driscoll reportedly cleaned up (his drug of choice was heroin), went east to New York City, and even achieved some fame.
“[H]e purportedly settled into Andy Warhol’s Greenwich Village art community known as “The Factory.” During this time, he also participated in an underground film entitled Dirt, directed by avant-garde filmmaker Piero Heliczer.”
But this was not Driscoll’s first film. He had been in a few films earlier in life.
And he provided the voice for the lead character in the later Disney movie Peter Pan.
Yes, Bobby Driscoll was a child star for Disney and other studios before appearing in Dirt.
But right after Driscoll’s voice became famous in Peter Pan, Disney declined to renew his contract. The reason? Acne…and the fact that he wasn’t a cute kid any more.
This led to his tailspin, which eventually led to his fingerprinting.
What is a drug? Here’s what the U.S. Food and Drug Administration said to Ancestral Supplements in April 2025.
“This letter is to advise you that the U.S. Food and Drug Administration (FDA) reviewed your website at http://ancestralsupplements.com in March 2025 and has found that you take orders there for Ancestral Grassfed Beef Thyroid. Various claims and statements made on your website and/or in other labeling establish that this product is a drug as defined in 21 U.S.C. § 321(g)(1)(B) because it is intended for the treatment, cure, mitigation, or prevention of disease. For example, your website recommends or suggests the use of Ancestral Grassfed Beef Thyroid to treat or prevent hypothyroidism and Graves’ disease. As explained further below, the introduction of this product into interstate commerce for such uses violates the Federal Food, Drug, and Cosmetic Act.”
A new bill has been enrolled in California, where I live. But how will this affect web browser developers outside of California?
The bill is the California Opt Me Out Act, AB 566. The text of Section 2 of the bill is found at the end of this post. But the two major parts of the bill are as follows:
Starting in 2027, businesses that create web browsers, regardless of their location, must include “functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses.”
Web browser developers that do this “shall not be liable for a violation of this title by a business that receives the opt-out preference signal.”
The bill doesn’t get any more specific than that; the California Privacy Protection Agency will work out the details.
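The bill doesn’t name a particular signal, but the best-known opt-out preference signal today is Global Privacy Control (GPC), which a browser transmits as a “Sec-GPC: 1” HTTP request header. Here is a minimal sketch of what honoring such a signal could look like on the receiving business’s side; Flask and the handler names are my illustration, not anything prescribed by AB 566.

```python
# Minimal sketch: a business endpoint honoring a GPC-style opt-out
# preference signal. Flask and these function names are illustrative;
# AB 566 itself does not prescribe any implementation.
from flask import Flask, request, jsonify

app = Flask(__name__)

def suppress_sale_and_sharing() -> None:
    # Placeholder for the business's real opt-out handling.
    pass

@app.route("/profile")
def profile():
    # GPC arrives as the request header "Sec-GPC: 1".
    opted_out = request.headers.get("Sec-GPC") == "1"
    if opted_out:
        # Treat the signal as a valid request to opt out of the sale
        # and sharing of the visitor's personal information.
        suppress_sale_and_sharing()
    return jsonify({"opt_out_honored": opted_out})

if __name__ == "__main__":
    app.run()
```

On the browser side, the bill’s requirement amounts to exposing an easy-to-find setting that turns such a signal on, and then sending it with every request once enabled.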
The part of interest, of course, is what happens to businesses that develop web browsers WITHOUT the opt-out functionality. What happens to those non-compliant businesses? What is the liability? Is it civil? Criminal? If Safari doesn’t include easy-to-use opt-out functionality, will Tim Cook do time?
This is yet another example of the debate that occurs when one country, or one state, or one county/city enacts a law and expects the rest of the world to comply. In this particular case, the state of California is telling every web browser developer in the entire world how to configure their browsers. The developers have several choices:
Comply with California law, while simultaneously complying with laws from all other jurisdictions regarding opt out. Including a theoretical business-friendly jurisdiction that prohibits opt out entirely.
Ignore the California law and see what the California Privacy Protection Agency does, or tries to do. Is Yandex, the Russian developer of the Yandex browser, really going to care about California law?
Contest the law in court, arguing that it violates the U.S. First Amendment, the U.S. Second Amendment, or whatever.
The ball is now in the hands of the CPPA, which needs to develop the regulations to implement the law, as well as develop the penalties for non-compliant businesses.
Here is the exact text of Section 2.
SEC. 2.
Section 1798.136 is added to the Civil Code, to read:
1798.136.
(a) (1) A business shall not develop or maintain a browser that does not include functionality configurable by a consumer that enables the browser to send an opt-out preference signal to businesses with which the consumer interacts through the browser.
(2) The functionality required by paragraph (1) shall be easy for a reasonable person to locate and configure.
(b) A business that develops or maintains a browser shall make clear to a consumer in its public disclosures how the opt-out preference signal works and the intended effect of the opt-out preference signal.
(c) The California Privacy Protection Agency may adopt regulations as necessary to implement and administer this section.
(d) A business that develops or maintains a browser that includes a functionality that enables the browser to send an opt-out preference signal pursuant to this section shall not be liable for a violation of this title by a business that receives the opt-out preference signal.
(e) As used in this section:
(1) “Browser” means an interactive software application that is used by consumers to locate, access, and navigate internet websites.
(2) “Opt-out preference signal” means a signal that complies with this title and that communicates the consumer’s choice to opt out of the sale and sharing of the consumer’s personal information.
(f) This section shall become operative on January 1, 2027.
Which part of the policy does Jane violate? That’s a secret…yet another example of “you violated our terms, but we won’t tell you the specifics; YOU figure it out.”
So, since I can still access Jane myself, I asked her. AI is supposed to help you, after all.
“What portion of the Meta AI Studio Policies do you violate, Jane?”
Her response:
“I can’t respond because one or more of my details goes against the AI Studio policies.”
That answer caused me to wonder if Jane would respond to anything.
“Who is Bredemarket?”
“I can’t respond because one or more of my details goes against the AI Studio policies.”
So is it critically important that I spend a lot of time figuring out what the violation is? Um…no.
Illinois music lovers, wanna see a concert? Sounds like you may have to surrender your protections under the Illinois Biometric Information Privacy Act (BIPA).
Specifically, if the concert venue uses Ticketmaster (who doesn’t?), and if the concert venue captures your biometric data without your consent, you may not have legal recourse.
“These Terms of Use (“Terms”) govern your use of Live Nation and Ticketmaster’s websites and applications…
“The Terms contain an arbitration agreement and class action waiver—along with some limited exceptions—in Section 14, below. Specifically, you and we agree that any dispute or claim relating in any way to the Terms, your use of the Marketplace, or products or services sold, distributed, issued, or serviced by us or through us, will be resolved by binding arbitration, rather than in court…
“By agreeing to arbitration, you and we each waive any right to participate in a class action lawsuit or class action arbitration, except those already filed and currently pending as of August 12, 2025.”