Privacy: What Happens When You Data Scrape FROM the Identity Vendors?

There is a lot of discussion about data scraping, an activity in which Company 1 takes the information publicly posted by Company 2 and incorporates it into its own records.

In the identity world, this takes the form of a company “scraping” the facial images that were publicly posted by a second company, such as a social media company.

I think that we all know of one identity company that is well-known (a euphemism for “notorious”) for scraping facial images from multiple sources. These not only include government-posted mugshots, but also content posted by private social media firms.

Needless to say, the social media companies think that data scraping is completely evil and terrible, and that identity vendors that do this should be fined and put out of business. The identity vendor in question has a different view, even stating at one point that it had a (U.S.) First Amendment right to scrape data.

But what happens when someone wants to scrape data FROM an identity company?

A Skagit County court case

404 Media links to a Skagit County, Washington court case that addresses this very issue: in this case, data captured by Flock Safety.

The case is CITY OF SEDRO-WOOLLEY and CITY OF STANWOOD, Washington Municipal Corporations vs. JOSE RODRIGUEZ. The following are findings of fact:

“On April 10, 2025, Defendant, Jose Rodriguez made a Public Records Request to the Snohomish Police Department. He requested all of the city’s Flock cameras pictures and data logs between 5 pm and 6 pm on March 30, 2025.”

This particular record does not indicate WHY Rodriguez made this request, but 404 Media provided a clarification from Rodriguez himself.

“I wanted the records to see if they would release them to me, in hopes that if they were public records it would raise awareness to all the communities that have the Flock cameras that they may be public record and could be used by stalkers, or burglars scoping out a house, or other ways someone with bad intentions may use them. My goal was to try getting these cameras taken down by the cities that put them up.”

The City of Stanwood (I don’t know its relation to Snohomish) answered Rodriguez in part:

“Stanwood PD is not the holder of the records you’re seeking; you may be able to request the records at FlockSafety.com.”

Incidentally, this is a common issue with identity databases that use vendor software: who owns the data? I’ve addressed this before regarding the Milwaukee Police Department.

Now some legal talent may be able to parse what the word “holder” means, especially in regard to data hosted in the cloud. Perhaps Stanwood PD was trying to claim that since the records weren’t on site, it wasn’t the “holder.”

Anyway, the defendant subsequently made a similar request to the City of Sedro-Woolley, but for a different date. Sedro-Woolley didn’t provide the images either.

Then it gets weird.

What happened to the data?

“The Flock records sought by Defendant from Stanwood and Sedro-Woolley have been auto-deleted.”

Well how convenient.

And the listed statements of fact also contain the following:

“The contract between Flock and Stanwood states that all Flock images generated off Flock cameras located in Stanwood are the property of Stanwood.

“The contract between Flock and Sedro-Woolley states that all Flock images generated off Flock cameras located in Sedro-Woolley are the property of Sedro-Woolley.”

The judge’s ruling

Fast forward to November 6, when Judge Elizabeth Neidzwski ruled on the cities’ claim that the Flock camera data was not a public record.

“IT IS HEREBY ORDERED, ADJUDGED AND DECREED that Plaintiff’s motion for Declaratory Judgment that the Flock camera records are not public records is DENIED.”

404 Media noted that the cities argued that they resisted the request to…protect privacy.

“In affidavits filed with the court, police argued that ‘if the public could access the Flock Safety System by making Public Records Act requests, it would allow nefarious actors the ability to track private persons and undermine the effectiveness of the system.’ The judge rejected every single one of these arguments.”

Of course, there are those who argue that the police themselves are the “nefarious actors,” and that they shouldn’t be allowed to track private persons either.

But the parties may take the opposite argument

This is not the only example of conflicting claims over WHO has the right to privacy. In fact, if the police were filming protestors and agitators and wanted the public’s help in identifying them, the police and the protestors would take opposite sides of the privacy argument: the police saying the footage SHOULD be released, and the protestors who were filmed saying it SHOULD NOT.

Privacy is in the eye of the beholder.

LiveView Technologies and Agentic AI-powered Contextual Detection and Behavioral Deterrence

Government Technology shared an article entitled “Talking Agentic AI Cameras: Can They Prevent Crime?” In the article, Nikki Davidson spoke with Steve Lindsey of LiveView Technologies about the surveillance company’s newest capability:

“The technology analyzes footage to detect activity and determine a best course of action. This can include directly speaking to individuals with personalized, AI-generated voice warnings, without human intervention….

“Lindsey explained the newest update with the technology uses contextual detection as well as generative AI behavioral deterrence. He said the new tech doesn’t just automate tasks; it gives AI agents the ability to make smart decisions based on evolving situations — such as how to react to different scenarios.”

But a video is worth 10,000 words, so watch the video.

Lindsey clarifies that the intent of the agentic technology is to handle low-priority situations (such as trespassing on private property), while leaving high-priority situations to human security personnel.

I wonder if LiveView Technologies’ object recognition capabilities are able to detect guns as other video analytic programs do.

Video Analytics is Nothing New or Special

There is nothing new under the sun, despite the MIT Technology Review’s trumpeting of the “new way” to track people. 

The underlying article is gated, but here is what the public summary says:

“Police and federal agencies have found a controversial new way to skirt the growing patchwork of laws that curb how they use facial recognition: an AI model that can track people based on attributes like body size, gender, hair color and style, clothing, and accessories.

“The tool, called Track and built by the video analytics company Veritone, is used by 400 customers….”

Video analytics is nothing new. Viewing a picture of a particular backpack was a critical investigative lead after the Boston Marathon bombing. Two years later, I was adapting Morpho’s video analytics tool (now IDEMIA’s Augmented Vision) to U.S. use.

And it’s important to note that this is not strictly an IDENTIFICATION tool. The mere fact that a tool finds someone with a particular body size, gender, hair color and style, clothing, and accessories means nothing. Hundreds of people may share those same attributes.

But when you combine them with an INDIVIDUALIZATION tool such as facial recognition…only then can you uniquely identify someone. (Augmented Vision can do this.)

And if facial recognition itself is only useful as an investigative lead…then video analytics without facial recognition is also only useful as an investigative lead.
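The lead-vs-identification distinction above can be sketched in a few lines. This is a hypothetical illustration (the `Detection` fields and `attribute_filter` helper are my own inventions, not Veritone’s or IDEMIA’s API): an attribute search narrows a crowd to a set of candidates, but it cannot individualize.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One person detected in a video frame -- attributes only, no identity."""
    body_size: str
    hair_color: str
    clothing: str
    accessories: frozenset

def attribute_filter(detections, **wanted):
    """Narrow detections to those matching the requested attributes.

    The result is an investigative LEAD, not an identification:
    many people can share the same attribute profile.
    """
    return [
        d for d in detections
        if all(getattr(d, key) == value for key, value in wanted.items())
    ]

crowd = [
    Detection("tall", "brown", "red jacket", frozenset({"backpack"})),
    Detection("tall", "brown", "red jacket", frozenset({"backpack"})),
    Detection("short", "black", "blue coat", frozenset()),
]

leads = attribute_filter(crowd, hair_color="brown", clothing="red jacket")
print(len(leads))  # 2 -- two people match, so the attributes alone individualize no one
```

Only a separate individualization step (such as facial recognition run against the two candidates) could take the lead the rest of the way.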

Yawn.

(Imagen 3)

Vision Language Model (VLM)

(Flood image from Indian Navy)

A Vision Language Model (VLM) is a particular type of Large Multimodal Model (LMM).

Hugging Face:

“Vision language models are broadly defined as multimodal models that can learn from images and text. They are a type of generative models that take image and text inputs, and generate text outputs…. There’s a lot of diversity within the existing set of large vision language models, the data they were trained on, how they encode images, and, thus, their capabilities.”

Ipsotek:

“VISense (is) a groundbreaking addition to its VISuite platform that redefines real-time video analytics with Vision Language Models (VLMs). VISense represents a major advancement in Generative AI integration, using VLMs to achieve detailed scene understanding and contextual insights empowering operators to make informed decisions promptly….

“VISense allows users to ask questions like, “Let me know when something unusual is happening in any camera view” and receive a detailed response describing the unusual aspect of the captured behaviour. For instance, it might respond, “Yes, there is a flood; water levels are rising in the northern section, and several vehicles are stranded, causing heavy traffic congestion,” providing actionable insights that enable quick decisions.”
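At the code level, the VLM contract described above is simply "image plus text in, text out." Here is a minimal sketch of that interface; the `ask_vlm` wrapper and the `toy_model` stand-in are hypothetical (a real deployment would plug in an actual vision language model), but the calling pattern is the same one the Hugging Face description outlines.

```python
def ask_vlm(model, image_bytes: bytes, question: str) -> str:
    """Send an image plus a natural-language question to a VLM; get text back.

    `model` is any callable taking (image, text) and returning text --
    the shape shared by the vision language models described above.
    """
    return model(image_bytes, question)

# A stand-in "model" so this sketch runs without downloading model weights.
def toy_model(image, text):
    return ("Yes, there is a flood; water levels are rising in the "
            "northern section, and several vehicles are stranded.")

answer = ask_vlm(
    toy_model,
    b"<jpeg bytes from a camera feed>",
    "Let me know when something unusual is happening in this camera view.",
)
print(answer)
```

The point of the pattern is that the operator asks in plain language and receives a scene description, rather than configuring per-object detection rules by hand.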

Safety vs. Privacy in Montana School Video Surveillance

At the highest level, debates regarding government and enterprise use of biometric technology boil down to a debate about whether to keep people safe, or whether to preserve individual privacy.

In the state of Montana, school safety is winning over school privacy—for now.

The one exception in Montana Senate Bill 397

Biometric Update links to a Helena Independent Record article on how Montana’s far-reaching biometric ban has one significant exception.

The state Legislature earlier this year passed a law barring state and local governments from continuous use of facial recognition technology, typically in the form of cameras capable of reading and collecting a person’s biometric data, like the identifiable features of their face and body. A bipartisan group of legislators went toe-to-toe with software companies and law enforcement in getting Senate Bill 397 over the finish line, contending public safety concerns raised by the technology’s supporters don’t overcome individual privacy rights. 

School districts, however, were specifically carved out of the definition of state and local governments to which the facial recognition technology law applies.

From the Helena Independent Record.

At a minimum, Montana school districts seek to abide by two existing federal laws when installing facial recognition and video surveillance systems.

Without many state-level privacy protection laws in place, school policies typically lean on the Children’s Online Privacy Protection Act (COPPA), a federal law requiring parental consent in order for websites to collect data on their children, or the Family Educational Rights and Privacy Act (FERPA), which protects the privacy of student education records. 

From the Helena Independent Record.

If a vendor doesn’t agree to abide by these laws, then the Montana School Board Association recommends that the school district not do business with the vendor.

Vendors are aware of these laws as well. Here is the statement of one vendor, Verkada (you’ll see them again later), on FERPA:

The Family Educational Rights and Privacy Act was passed by the US federal government to protect the privacy of students’ educational records. This law requires public schools and school districts to give families control over any personally identifiable information about the student.

Verkada provides educational organizations the tools they need to maintain FERPA compliance, such as face blurring for archived footage.

From https://www.verkada.com/security/#compliance
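Face blurring of the kind Verkada describes boils down to smearing the pixels inside a detected face box so that identifying detail is lost. Here is a minimal, dependency-free sketch of the blurring step; the `blur_region` helper is hypothetical (a real system would run something like OpenCV’s `cv2.GaussianBlur` on video frames, with the box supplied by a face detector, which is out of scope here).

```python
def blur_region(frame, box, k=3):
    """Box-blur the pixels inside box = (top, left, bottom, right).

    `frame` is a 2D list of grayscale pixel values. Each pixel in the
    box is replaced by the average of its k-by-k neighborhood, clipped
    to the frame edges, destroying fine (identifying) detail.
    """
    top, left, bottom, right = box
    out = [row[:] for row in frame]
    for y in range(top, bottom):
        for x in range(left, right):
            ys = range(max(0, y - k // 2), min(len(frame), y + k // 2 + 1))
            xs = range(max(0, x - k // 2), min(len(frame[0]), x + k // 2 + 1))
            vals = [frame[yy][xx] for yy in ys for xx in xs]
            out[y][x] = sum(vals) // len(vals)
    return out

frame = [[0] * 6 for _ in range(6)]
frame[2][2] = 255                      # a single bright "identifying" pixel
blurred = blur_region(frame, (1, 1, 4, 4))
print(blurred[2][2] < 255)  # True -- the detail is averaged away
```

Applying this only to archived footage, as Verkada describes, preserves the operational video while keeping stored student records FERPA-friendly.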

Simms High School’s use of the technology

How are the schools using these systems? In ways you may expect.

(The Sun River Valley School District’s) use of the technology is more focused on keeping people who shouldn’t be on school property away, he said, such as a parent who lost custody of their child.

(Simms) High School Principal Luke McKinley said it’s been more frequent to use the facial recognition technology during extra-curricular activities, when football fans get too rowdy for a high school sports event. 

From the Helena Independent Record.

Technology (in this case from Verkada) helps the Sun River School District, especially in its rural setting. Back in 2022, it took law enforcement an estimated 45 minutes to respond to school incidents. The hope is that the technology could identify those who engaged in illegal activity, or at least deter it.

What about other school districts?

When I created my educational identity page, I included the four key words “When permitted by law.” While Montana school districts are currently permitted to use facial recognition and video surveillance, other school districts need to check their local laws before implementing such a system, and also need to ensure that they comply with federal laws such as COPPA and FERPA.

I may be, um, biased in my view, but as long as the school district (or law enforcement agency, or apartment building owner, or whoever) complies with all applicable laws, and implements the technology with a primary purpose of protecting people rather than spying on them, facial recognition is a far better tool for protecting people than manual recognition methods that rely on all-too-fallible human beings.

Educational Identity: Why and How Do Educational Institutions Verify Identities?

Chaffey High School, Ontario, California.

Whether a student is attending a preschool, a graduate school, or something in between, the educational institution needs to know who is accessing their services. This post discusses the types of identity verification and authentication that educational institutions may employ.

Why do educational institutions need to verify and authenticate identities?

Whether little Johnny is taking his blanket to preschool, or Johnny’s mother is taking her research notes to the local university, educational institutions such as schools, colleges, and universities need to know who the attendees are. It doesn’t matter whether the institution has a physical campus, like Chaffey High School’s campus in the video above, or if the institution has a virtual campus in which people attend via their computers, tablets, or phones.

Access boils down to two questions:

  • Who is allowed within the educational institution?
  • Who is blocked from the educational institution?

Who is allowed within the educational institution?

Regardless of the type of institution, there are certain people who are allowed within the physical and/or virtual campus.

  • Students.
  • Instructors, including teachers, teaching assistants/aides, and professors.
  • Administrators.
  • Staff.
  • Parents of minor students (but see below).
  • Others.

All of these people are entitled to access to at least portions of the campus, with different people having access to different portions of the campus. (Students usually can’t enter the teacher’s lounge, and hardly anybody has full access to the computer system where grades are kept.)

Before anyone is granted campus privileges, they have to complete identity verification. This may be really rigorous, but in some cases it can’t be THAT rigorous (how many preschoolers have a government ID?). Often, it’s not rigorous at all (“Can you show me a water bill? Is this your kid? OK then.”).

Once an authorized individual’s identity is verified, they need to be authenticated when they try to enter the campus. This is a relatively new phenomenon, in response to security threats at schools. Again, this could be really rigorous. For example, when students at a University of Rhode Island dining hall want to purchase food from the cafeteria, many of them consent to have their fingerprints scanned.

From https://www.youtube.com/watch?v=JzMDF_LN_LU

Another rigorous example: people whose biometrics are captured when taking exams, to deter cheating.

But some authentication is much less rigorous. In these cases, people merely show an ID (hopefully not a fake ID) to authenticate themselves, or a security guard says “I know Johnny.”

(Again, all this is new. Many years ago, I accompanied a former college classmate to a class at his new college, the College of Marin. If I had kept my mouth shut, the professor wouldn’t have known that an unauthenticated student was in his class.)

Who is blocked from the educational institution?

At the same time, there are people who are clearly NOT allowed within the physical and/or virtual campus. Some of these people can enter campus with special permission, while some are completely blocked.

  • Former students. Once a student graduates, their privileges are usually revoked, and they need special permission if they want to re-enter campus to visit teachers or friends. (Admittedly this isn’t rigorously enforced.)
  • Expelled students. Well, some former students have a harder time returning to campus. If you brought a gun on campus, it’s going to be much harder for you to re-enter.
  • Former instructors, administrators, and staff. Again, people who leave the employ of the institution may not be allowed back, and certain ones definitely won’t be allowed back.
  • Non-custodial parents of minor students. In some cases, a court order prohibits a natural parent from contact with their child. So the educational institutions are responsible for enforcing this court order and ensuring that the minor student leaves campus only with someone who is authorized to take the child.
  • Others.

So how do you keep these people off campus? There are two ways.

  • If they’re not on the allowlist, they can’t enter campus anyway. As part of the identity verification process for authorized individuals, there is a list of people who can enter the campus. By definition, the 8 billion-plus people who are not on that “allowlist” can’t get on campus without special permission.
  • Sometimes they can be put on a blocklist. Or maybe you want to KNOW that certain people can’t enter campus. The inverse of an allowlist, people who are granted access, is a blocklist, people who are prevented from getting access. (You may know “blocklist” by the older term “blacklist,” and “allowlist” by the older term “whitelist.” The Security Industry Association and the National Institute of Standards and Technology recommend updated terminology.)
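The two gatekeeping models above can be sketched as a default-deny check. This is a hypothetical illustration (the `may_enter` function and the ID strings are my own, not any vendor’s API): not being on the allowlist is already enough to keep someone out, while the blocklist exists so the institution can KNOW that a specific person was refused.

```python
def may_enter(person_id, allowlist, blocklist=frozenset()):
    """Default-deny access check combining an allowlist and a blocklist.

    Returns (decision, reason). The blocklist is checked first so an
    explicitly blocked person is refused even if someone mistakenly
    left them on the allowlist.
    """
    if person_id in blocklist:
        return False, "explicitly blocked"
    if person_id in allowlist:
        return True, "on allowlist"
    return False, "not on allowlist"

allowlist = {"student-042", "teacher-007"}
blocklist = {"expelled-013"}

print(may_enter("student-042", allowlist, blocklist))   # (True, 'on allowlist')
print(may_enter("stranger-999", allowlist, blocklist))  # (False, 'not on allowlist')
print(may_enter("expelled-013", allowlist, blocklist))  # (False, 'explicitly blocked')
```

Note that the stranger and the expelled student both stay outside; the difference is purely in what the institution can say about why.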

There’s just one teeny tiny problem with blocklists. Sometimes they’re prohibited by law.

In some cases (but not in others), a person is required to give consent before they are enrolled in a biometric system. If you’re the ex-student who was expelled for bringing a gun on campus, how motivated will you be to allow that educational institution to capture your biometrics to keep you off campus?

And yes, I realize that the expelled student’s biometrics were captured while they were a student, but once they were no longer a student, the institution would have no need to retain those biometrics. Unless they felt like it.

This situation becomes especially sticky for campuses that use video surveillance systems. Like Chaffey High School.

Sign: "To reduce property damage to our facilities, this campus has installed a video surveillance system."
Chaffey High School, Ontario, California.

Now the mere installation of a video surveillance system does not (usually) result in legally prohibited behavior. It just depends upon what is done with the video.

  • If the video is not integrated with a biometric facial recognition system, there may not be an issue.
  • If Chaffey High School has its own biometric facial recognition system, then a whole host of legal factors may come into play.
  • If Chaffey High School does not have a biometric facial recognition system, but it gives the video to a police agency or private entity that does have a biometric facial recognition system, then some legal factors may emerge.

Or may not. Some facial recognition bans allow police use, and if this is true then Chaffey can give the footage to the police to use for authorized purposes. But if the jurisdiction bans police use of facial recognition, then people on the video can only be recognized manually. And you know how I feel about that.

Writing About Educational Identity

As you can see, educational identity is not as clear-cut as financial identity, both because financial institutions are more highly regulated and because blocklists are more controversial in educational identity. Vladimir Putin may not be able to open a financial account at a U.S. bank, but I bet he’d be allowed to enroll in an online course at a U.S. community college.

So if you are an educational institution or an identity firm who serves educational institutions, people who write for you need to know all of these nuances.

You need to provide the right information to your customers, and write it in a way that will motivate your customers to take the action you want them to take.

Speaking of motivating customers, are you with an identity firm or educational institution and need someone to write your marketing text?

  • Someone with 29 years of identity/biometric marketing experience?
  • Someone who understands the technological, organizational, and legal issues surrounding the use of identity solutions?
  • Someone who will explain why your customers should care about these issues, and the benefits a compliant solution provides to them?

If I can help you create your educational identity content, we need to talk.

Livin’ on the edge (improving video analytics efficiency)

During the last few years of my corporate career, I became involved in video analytics. While there is some overlap between video analytics and biometrics, video analytics is somewhat broader: it can not only identify individuals (by incorporating facial recognition), but also count people (for example, to enforce COVID capacity limits) or identify objects (for example, a particular backpack of interest that could contain an explosive device).

Because video analytics involves video rather than still images, there’s much more data that has to move from the cameras to the processing servers. For this reason, some video analytic applications take advantage of edge computing, where the analysis happens right at the edge device, removing the need to clog network bandwidth with complete video feeds.

Perhaps the edge devices only isolate the video of interest and send it off for processing. Or perhaps all of the processing takes place at the edge device.
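The first of those two options, isolating only the video of interest at the edge, can be sketched in a few lines. This is a hypothetical illustration (the `edge_filter` helper and the toy frame format are my own, not NEC’s software): the detector runs on-device, and only frames where it fires are forwarded over the network.

```python
def edge_filter(frames, detect):
    """Run detection on-device and forward only the frames of interest.

    `detect` stands in for the on-camera model; frames it rejects are
    dropped at the edge instead of clogging network bandwidth.
    """
    return [frame for frame in frames if detect(frame)]

# Toy feed: a frame is "interesting" if the on-device model tagged a person.
feed = [
    {"id": 1, "tags": []},
    {"id": 2, "tags": ["person"]},
    {"id": 3, "tags": []},
    {"id": 4, "tags": ["person", "backpack"]},
]

uploaded = edge_filter(feed, lambda frame: "person" in frame["tags"])
print([frame["id"] for frame in uploaded])  # [2, 4] -- half the raw feed's bandwidth
```

The trade-off, as the NEC quote below this sketch’s source material notes, is that the edge device itself must be powerful enough to run `detect` in real time.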

However, as biometric and video analytics provider NEC has noted, there is a cost to edge computing.

[B]ecause cooling is difficult to manage and electricity consumption is restricted in edge devices, high-performance processors such as GPUs used in high-performance servers are not available, and processing capacity is constrained.

From https://www.nec.com/en/press/202112/global_20211201_01.html

NEC is developing a solution to address this processing capacity constraint.

Application of NEC’s newly developed gradual deep learning-based object detection technology enables efficient, high-speed, and high-precision detection of subjects from a large amount of images, even in an edge device with limited processing capacity, and enables simultaneous processing of images from multiple cameras in real time.

From https://www.nec.com/en/press/202112/global_20211201_01.html

One benefit of using software to perform the necessary calculations is that it lessens the need to upgrade hardware. As NEC and other video analytics providers well know, many organizations have already invested a lot of money in their camera systems, and would prefer software that operates with the current hardware, rather than obtaining software that requires a complete hardware replacement.

NEC’s new software isn’t available yet, but the company aims to commercialize it in 2022.

And now for the music video that is at best tangentially related to NEC’s technology advance. (And no, I don’t know if NEC’s facial recognition technology has been tested with masking of one side of the face.)

From https://www.youtube.com/watch?v=7nqcL0mjMjw