The facial recognition brouhaha in southeastern Wisconsin has taken an interesting turn.
According to Urban Milwaukee, the Milwaukee County Sheriff’s Office is pursuing an agreement with Biometrica for facial recognition services.
The, um, benefit? No cost to the county.
“However, the contract would not need to be approved by the Milwaukee County Board of Supervisors, because there would be no cost to the county associated with the contract. Biometrica offers its services to law enforcement agencies in exchange for millions of mugshots.”
“Milwaukee Police Department has also attempted to contract Biometrica’s services, prompting pushback, at least some of which reflected confusion about how the system works….
“The mooted agreement between Biometrica and MPD would have added 2.5 million images to the database.
“In theory, if MCSO signs a contract with Biometrica, it could perform facial recognition searches at the request of MPD.”
See Bredemarket’s previous posts on the city efforts that are now on hold.
No guarantee that the County will approve what the City didn’t. And considering the bad press from the City’s efforts, including using software BEFORE adopting a policy on its use, it’s going to be an uphill struggle.
I missed this announcement in December, but it carries an important message.
“Gatekeeper Systems, a pioneer in intelligent theft prevention solutions, today announced a significant enhancement to its FaceFirst® platform with the integration of technology from ROC.”
That’s the firm formerly known as Rank One Computing.
The important message is deeper in the press release.
““Facial recognition in retail must be fast, accurate, and accountable,” said Robert Harling, CEO of Gatekeeper Systems. “By embedding ROC’s NIST-verified algorithm directly into FaceFirst, we’re giving retailers a system that performs in real time and stands up to public, operational, and legal scrutiny. It’s AI you can trust—and accuracy you can prove.””
The “accountable” and “prove” part comes from ROC’s demonstrated results in NIST FRTE testing. As well as the fact that people using Gatekeeper Systems now know whose facial recognition algorithm they’re using.
It still shocks me when a company says that they’re using an algorithm, but don’t say whose algorithm they’re using.
But I could have chosen another title: “Fact: lack of deadlines sinks behavior.” That’s a mixture of two quotes from Tracy “Trace” Wilkins and Chris Burt, as we will see.
Whether Vanilla Ice and Gordon Lightfoot would agree with the sentiment is not known.
That guest post touched on Milwaukee, Wisconsin, but had nothing to do with ICE.
[Image: Vanilla Ice.]
One of the “responsible uses” questions was one that Biometric Update had raised in the previous month: whether it was proper for the Milwaukee Police Department (MPD) to share information with facial recognition vendor Biometrica.
Milwaukee needed a policy
But the conversation subsequently redirected to another topic, as I noted in August. Before Milwaukee’s “Common Council” could approve any use of facial recognition, with or without Biometrica data sharing, MPD needed to develop a facial recognition policy.
MPD itself agreed, according to a quoted statement:
“Should MPD move forward with acquiring FRT, a policy will be drafted based upon best practices and public input.”
It was clear that the policy would come first, facial recognition use afterward.
“Commissioner Krissie Fung pressed MPD inspector Paul Lao on the department’s past use of facial recognition.
““Just to clarify,” asked Fung, “Is the practice still continuing?”
““As needed right now, we are still using [FRT],” Lao responded.”
It was after 10:00 pm Central time, but the commissioner pressed the issue.
Fung asked Lao if the department was currently still using FRT without an SOP in place.
“As we said that’s correct and we’re trying to work on getting an SOP,” Lao said.
That brought the wolves out, because SOP or no SOP, there are people who hate facial recognition, especially because of other things going on in the city that have nothing to do with MPD. Add the “facial recognition is racist” claims, and MPD was (in Burt’s words) sunk.
Yes, a follow-up meeting will be held, but Burt notes (via WISN) that MPD has imposed its own moratorium on facial recognition technology use.
“Despite our belief that this is useful technology to assist in generating leads for apprehending violent criminals, we recognize that the public trust is far more valuable.”
Milwaukee should have asked, then acted
From Bredemarket’s self-interested perspective this is a content problem.
Back in August 2025, Milwaukee knew that it needed a facial recognition policy.
Several months later, in February 2026, it didn’t have one, and didn’t have a timeframe regarding when a policy would be ready for review.
Now I appreciate that a facial recognition policy is not a short writing job. I’ve worked on policies, and you can’t complete one in a couple of days.
But couldn’t you at least come up with a DRAFT in six months?
We commonly believe that modern people enjoy an abundance of data that historical people did not have. While this is often true, sometimes it isn’t.
Let’s look at the images we use in facial recognition.
ISO/IEC 19794-5 (Face image data) recommends a minimum inter-eye distance of 90 pixels.
But imagine for the moment that facial recognition existed 100 years ago. Could century-old film cameras achieve the necessary resolution to process faces on adding machines or whatever?
The answer is yes. Easily.
[Image generated by Google Gemini.]
Back in the Roaring ‘20s, photographs, of course, were not digital images, but were captured and stored on film. During the 1920s a new film standard, 35mm film, was starting to emerge. And if you translate the “grains” in film to modern pixels, your facial image resolution is more than sufficient.
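Here is a back-of-the-envelope version of that grains-to-pixels translation. Every number below is an illustrative assumption on my part (frame width, a deliberately conservative resolving power for vintage film, portrait framing, inter-pupillary distance), not a measurement; the point is only that even conservative assumptions clear the 90-pixel bar by a wide margin.

```python
# Back-of-the-envelope check: could 1920s 35mm film meet the
# ISO/IEC 19794-5 guideline of >= 90 pixels between the eyes?
# All constants are illustrative assumptions, not measurements.

FRAME_WIDTH_MM = 36.0          # standard 35mm still frame width
RESOLVING_POWER_LP_MM = 30.0   # conservative line pairs/mm for vintage film
PIXELS_PER_LP = 2.0            # ~2 pixels needed to resolve one line pair

INTEREYE_DISTANCE_MM = 63.0    # typical adult inter-pupillary distance
FACE_FIELD_WIDTH_MM = 300.0    # assumed real-world width captured in a portrait

def interocular_pixels(frame_mm=FRAME_WIDTH_MM,
                       lp_mm=RESOLVING_POWER_LP_MM,
                       field_mm=FACE_FIELD_WIDTH_MM,
                       eye_mm=INTEREYE_DISTANCE_MM):
    """Pixel-equivalent inter-eye distance for a film portrait."""
    pixel_equiv_width = frame_mm * lp_mm * PIXELS_PER_LP  # "pixels" across frame
    return pixel_equiv_width * (eye_mm / field_mm)

print(interocular_pixels())  # ~453 "pixels" between the eyes, well above 90
```

Even at 30 line pairs per millimeter, a fraction of what good 35mm film stock can resolve, the inter-eye distance lands around 453 pixel-equivalents, roughly five times the ISO/IEC 19794-5 recommendation.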
An interesting item popped up in SAM.gov. According to a Request for Information (RFI) due February 20, the FBI may have interest in a system for secret biometric searches.
“The FBI intends to identify available software solutions to store and search subjects at the classified level. This solution is not intended to replace the Next Generation Identification System Functionality, which was developed and implemented in collaboration with the FBI’s federal, state, local, tribal, and territorial partners. The solution shall reside at the Secret and/or Top-Secret/SCI level with the ability to support data feeds from external systems. The solution must allow the ability to enroll and search face, fingerprint, palmprint, iris, and latent fingerprints, and associated biographic information with a given set of biometrics.”
Now remember that the Next Generation Identification (NGI) system is protected from public access by requiring all users to adhere to the CJIS Security Policy. But the CJIS Security Policy isn’t Secret or Top Secret. These biometric searches, whatever they are, must REALLY be kept from prying eyes.
The RFI itself is 8 pages long, and is mysteriously numbered as RFI 01302025. I would have expected an RFI number 01152026. I believe this was an editing error, since FBI RFI 01302025 was issued in 2025 for a completely different purpose.
Whatever the real number is, the RFI is labeled “Classified Identity-Based Biometric System.” No acronym was specified, so I’m self-acronyming it as CIBS. Perhaps the system has a real acronym…but it’s secret.
If your company can support such a system from a business, technical, and security perspective, the due date is February 20 and questions are due by February 2. See SAM.gov for details.
Experienced biometric professionals can’t help but notice that the acronym OFIQ is similar to the acronym NFIQ (used in NFIQ 2), but the latter refers to the NIST FINGERPRINT image quality standard. NFIQ is also open source, with contributions from NIST and the German BSI, among others.
But NFIQ and OFIQ, while analyzing different biometric modalities, serve a similar purpose: to distinguish between good and bad biometric images.
But do these open source algorithms meaningfully measure quality?
The study of OFIQ
Biometric Update alerted readers to the November 2025 study “On the Utility of the Open Source Facial Image Quality Tool for Facial Biometric Recognition in DHS Operations” (PDF).
Note the words “in DHS Operations,” which are crucial.
The DHS doesn’t care about how ALL facial recognition algorithms perform.
The DHS only cares about the facial recognition algorithms that it may potentially use.
DHS doesn’t care about algorithms it would never use, such as Chinese or Russian algorithms.
In fact, from the DHS perspective, it probably hopes that the Chinese Cloudwalk algorithm performs very badly. (In NIST tests, it doesn’t.)
So which algorithms did DHS evaluate? We don’t know precisely.
“A total of 16 commercial face recognition systems were used in this evaluation. They are labeled in diagrams as COTS1 through COTS16….Each algorithm in this study was voluntarily submitted to the MdTF as part of on-going biometric performance evaluations by its commercial entity.”
So what did DHS find when it used OFIQ to evaluate images submitted to these 16 algorithms?
“We found that the OFIQ unified quality score provides extremely limited utility in the DHS use cases we investigated. At operationally relevant biometric thresholds, biometric matching performance was high and probe samples that were assessed as having very low quality by OFIQ still successfully matched to references using a variety of face recognition algorithms.”
Or in human words:
Images that yielded a high quality OFIQ score accurately matched faces using the tested algorithms.
Images that yielded a low quality OFIQ score…STILL accurately matched faces using the tested algorithms.
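You can approximate the shape of this analysis yourself: bucket probe images by quality score, compute the match rate per bucket, and see whether low-quality buckets actually match worse. The sketch below uses entirely synthetic toy data (the real study used OFIQ scores and 16 commercial matchers); the bucketing logic is the point, not the numbers.

```python
# Sketch of the kind of analysis in the DHS study: does a unified
# quality score actually predict match failure? Synthetic data only.
from collections import defaultdict

def match_rate_by_quality(samples, bin_width=20):
    """samples: list of (quality_score_0_to_100, matched_bool) pairs.
    Returns {bin_start: match_rate} for each populated quality bin."""
    bins = defaultdict(lambda: [0, 0])  # bin_start -> [matches, total]
    for score, matched in samples:
        b = min(int(score // bin_width) * bin_width, 100 - bin_width)
        bins[b][0] += int(matched)
        bins[b][1] += 1
    return {b: m / t for b, (m, t) in sorted(bins.items())}

# Toy data echoing the study's finding: even "low quality" probes match.
samples = [(5, True), (15, True), (18, False), (45, True),
           (55, True), (85, True), (95, True), (92, True)]
for b, rate in match_rate_by_quality(samples).items():
    print(f"quality {b}-{b + bin_width - 1 if (bin_width := 20) else 0}: match rate {rate:.0%}")
```

If the match rate in the lowest bins is nearly as high as in the highest bins, as the DHS found at operationally relevant thresholds, the unified score isn’t buying you much.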
[Image generated by Google Gemini.]
So, at least in DHS’ case, it makes no sense to use the OFIQ algorithm.
I previously mused about an alternative universe in which a single human body had ten (different) faces.
Facial recognition would be more accurate if biometric systems had ten faces to match. (Kind of like you-know-what.)
Well, now I’m getting ridiculous by musing about a person with one hundred faces for identification.
[Image generated by Grok.]
When I’m not musing about alternative universes with different biometrics, I’m helping identity/biometric firms market their products in this one.
And this frivolous exercise actually illustrates a significant difference between fingerprints and faces, especially in use cases where subjects submit all ten fingerprints but only a single face. The accuracy benefits are…well, they’re ten times more powerful.
Are there underlying benefits in YOUR biometric technology that you want to highlight? Bredemarket can help you do this. Book a free meeting with me, and I’ll ask you some questions to figure out where we can work together.