If the City Fails, Try the County (Milwaukee and Biometrica)

The facial recognition brouhaha in southeastern Wisconsin has taken an interesting turn.

According to Urban Milwaukee, the Milwaukee County Sheriff’s Office is pursuing an agreement with Biometrica for facial recognition services.

The, um, benefit? No cost to the county.

“However, the contract would not need to be approved by the Milwaukee County Board of Supervisors, because there would be no cost to the county associated with the contract. Biometrica offers its services to law enforcement agencies in exchange for millions of mugshots.”

Sound familiar? Chris Burt thinks so.

“Milwaukee Police Department has also attempted to contract Biometrica’s services, prompting pushback, at least some of which reflected confusion about how the system works….

“The mooted agreement between Biometrica and MPD would have added 2.5 million images to the database.

“In theory, if MCSO signs a contract with Biometrica, it could perform facial recognition searches at the request of MPD.”

See Bredemarket’s previous posts on the city efforts that are now on hold, and on county efforts as well.

No guarantee that the County will approve what the City didn’t. And considering the bad press from the City’s efforts, including using software BEFORE adopting a policy on its use, it’s going to be an uphill struggle.

Responsible Retail Artificial Intelligence

I missed this announcement in December, but it carries an important message.

“Gatekeeper Systems, a pioneer in intelligent theft prevention solutions, today announced a significant enhancement to its FaceFirst® platform with the integration of technology from ROC.”

That’s the firm formerly known as Rank One Computing.

The important message is deeper in the press release.

““Facial recognition in retail must be fast, accurate, and accountable,” said Robert Harling, CEO of Gatekeeper Systems. “By embedding ROC’s NIST-verified algorithm directly into FaceFirst, we’re giving retailers a system that performs in real time and stands up to public, operational, and legal scrutiny. It’s AI you can trust—and accuracy you can prove.””

The “accountable” and “prove” parts come from ROC’s demonstrated results in NIST FRTE testing, as well as from the fact that people using Gatekeeper Systems now know whose facial recognition algorithm they’re using.

It still shocks me when a company says that they’re using an algorithm, but don’t say whose algorithm they’re using.

If you want to say the right stuff, Bredemarket can write your biometric company’s product marketing content.

Who Can Write My Biometric Company’s Product Marketing Content?

Someone who is a biometric product marketing expert.

Someone who has three decades of expertise in biometrics.

I remember ANSI/NIST-CSL 1-1993.

Someone who has worked with fingerprints, faces, irises, voices, DNA, and other biometric modalities.

Some modalities. Butts and tongues not included.

Someone who understands the privacy landscape in Europe (GDPR), Illinois (BIPA), California, and elsewhere.

BIPA is a four-letter word.

Oh…and someone who can write.

A slight exaggeration.

So who can write this stuff?

I know someone. Bredemarket.

Some great videos

  • Biometric product marketing expert.
  • Questions.
  • Services, process, and pricing.

Which Biometric Modalities Does NIST Investigate?

I’ve spent a lot of time in the Bredemarket blog looking at a variety of NIST studies of different biometric modalities.

But you can read up on them yourself.

NIST has investigated the following biometric modalities, using both definitions of the word biometrics:

  • Fingerprints
  • Faces
  • Irises
  • Voices
  • DNA

But NIST has not spent taxpayer money researching other biometric modalities, such as tongue identification.

Biometric product marketing expert.

Fact: Cities Must Disclose Responsible Uses of Biometric Data

“Fact: Cities must disclose responsible uses of biometric data” is a parody of the title of my May 2025 guest post for Biometric Update, “Opinion: Vendors must disclose responsible uses of biometric data.”

From Biometric Update.

But I could have chosen another title: “Fact: lack of deadlines sinks behavior.” That’s a mixture of two quotes from Tracy “Trace” Wilkins and Chris Burt, as we will see.

Whether Vanilla Ice and Gordon Lightfoot would agree with the sentiment is not known.

But back to my Biometric Update guest post (expect my next appearance in Biometric Update in 2035).

That guest post touched on Milwaukee, Wisconsin, but had nothing to do with ICE.

Vanilla Ice.

One of the “responsible uses” questions was one that Biometric Update had raised in the previous month: whether it was proper for the Milwaukee Police Department (MPD) to share information with facial recognition vendor Biometrica.

Milwaukee needed a policy

But the conversation subsequently redirected to another topic, as I noted in August. Before Milwaukee’s “Common Council” could approve any use of facial recognition, with or without Biometrica data sharing, MPD needed to develop a facial recognition policy.

According to a quote from MPD, it agreed.

“Should MPD move forward with acquiring FRT, a policy will be drafted based upon best practices and public input.”

It was clear that the policy would come first, facial recognition use afterward.

Google Gemini.

Well, that held until last night, when a revelation prompted Chris Burt to write an article entitled “Milwaukee police sink efforts to contract facial recognition with unsanctioned use.”

Sounds like the biggest wreck since the one Gordon Lightfoot sang about. (A different lake, but bear with me here.)

Gordon Lightfoot.

Milwaukee didn’t get a policy

The details are in an article by WUWM, Milwaukee’s NPR station, which took a break from ICE coverage to report on a Thursday night Fire and Police Commission meeting.

“Commissioner Krissie Fung pressed MPD inspector Paul Lao on the department’s past use of facial recognition.

““Just to clarify,” asked Fung, “Is the practice still continuing?”

““As needed right now, we are still using [FRT],” Lao responded.”

It was after 10:00 pm Central time, but the commissioner pressed the issue.

Fung asked Lao if the department was currently still using FRT without an SOP in place.

“As we said that’s correct and we’re trying to work on getting an SOP,” Lao said.

That brought the wolves out, because SOP or no SOP, there are people who hate facial recognition, especially because of other things going on in the city that have nothing to do with MPD. Add the “facial recognition is racist” claims, and MPD was (in Burt’s words) sunk.

Yes, a follow-up meeting will be held, but Burt notes (via WISN) that MPD has imposed its own moratorium on facial recognition technology use.

“Despite our belief that this is useful technology to assist in generating leads for apprehending violent criminals, we recognize that the public trust is far more valuable.”

Milwaukee should have asked, then acted

From Bredemarket’s self-interested perspective, this is a content problem.

  • Back in August 2025, Milwaukee knew that it needed a facial recognition policy.
  • Several months later, in February 2026, it didn’t have one, and didn’t have a timeframe regarding when a policy would be ready for review.

Now I appreciate that a facial recognition policy is not a short writing job. I’ve worked on policies, and you can’t complete one in a couple of days.

But couldn’t you at least come up with a DRAFT in six months?

To create a policy, you need a process.

Bredemarket asks, then it acts.

Deadlines drive behavior

Coincidentally, I live-blogged a Never Search Alone webinar this morning at which Tracy “Trace” Wilkins made this statement.

“Deadlines drive behavior.”

Frankly, I see this a lot. Companies (or governments) require content, but don’t set a deadline for finalizing that content.

And when you don’t set a deadline, it never gets done.

And no, “as soon as possible” is not a deadline, because “as soon as possible” REALLY means “within a year, if we feel like it.”

Lack of deadlines sinks behavior.

Could 1926 Images Support Facial Recognition?

We commonly believe that modern people enjoy an abundance of data that historical people did not have. While this is often true, sometimes it isn’t.

Let’s look at the images we use in facial recognition.

ISO/IEC 19794-5 (Face image data) recommends a minimum inter-eye distance of 90 pixels.

But imagine for the moment that facial recognition existed 100 years ago. Could century-old film cameras achieve the necessary resolution to process faces on adding machines or whatever?

The answer is yes. Easily.

Google Gemini.

Back in the Roaring ’20s, photographs, of course, were not digital images but were captured and stored on film. During the 1920s a new film standard, 35mm film, was starting to emerge. And if you translate the “grains” in film to modern pixels, your facial image resolution is more than sufficient.

Here is what FilmFix says:

“Thirty-five-millimeter film has a digital resolution equivalent to approximately 5.6K — a digital image size of about 5,600 × 3,620 pixels.”

Yeah, that will work—considering that the Google Gemini image illustrating this post was generated at only 1,024 × 1,024 pixels.
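Here is a hedged back-of-the-envelope check of that claim in Python. The 500 mm scene width (a head-and-shoulders framing) and the 63 mm interpupillary distance are my own illustrative assumptions, not figures from the post or from FilmFix:

```python
# Could a 35mm film portrait meet the ISO/IEC 19794-5 recommendation
# of at least 90 pixels between the eyes?
# Assumptions (illustrative): a head-and-shoulders portrait capturing
# roughly 500 mm of scene across the long edge, and a typical adult
# interpupillary distance of about 63 mm.

FILM_WIDTH_PX = 5_600    # ~5.6K scan of 35mm film (per FilmFix)
SCENE_WIDTH_MM = 500     # assumed scene width captured in the frame
INTER_EYE_MM = 63        # assumed adult interpupillary distance

pixels_per_mm = FILM_WIDTH_PX / SCENE_WIDTH_MM
inter_eye_px = INTER_EYE_MM * pixels_per_mm

print(f"Inter-eye distance: {inter_eye_px:.0f} px")
print("Meets ISO/IEC 19794-5 minimum (90 px):", inter_eye_px >= 90)
```

Under those assumptions the inter-eye distance lands around 700 pixels, nearly eight times the recommended minimum, so the framing would have to be far wider than a portrait before century-old film fell short.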

CIBS: Keeping Secrets From NGI

An interesting item popped up in SAM.gov. According to a Request for Information (RFI) due February 20, the FBI may have interest in a system for secret biometric searches.

“The FBI intends to identify available software solutions to store and search subjects at the classified level.  This solution is not intended to replace the Next Generation Identification System Functionality, which was developed and implemented in collaboration with the FBI’s federal, state, local, tribal, and territorial partners. The solution shall reside at the Secret and/or Top-Secret/SCI level with the ability to support data feeds from external systems.  The solution must allow the ability to enroll and search face, fingerprint, palmprint, iris, and latent fingerprints, and associated biographic information with a given set of biometrics.”

Now remember that the Next Generation Identification (NGI) system is protected from public access by requiring all users to adhere to the CJIS Security Requirements. But the CJIS Security Requirements aren’t Secret or Top Secret. These biometric searches, whatever they are, must REALLY be kept from prying eyes.

The RFI itself is 8 pages long, and is mysteriously numbered as RFI 01302025. I would have expected a number like 01152026. I believe this was an editing error, since FBI RFI 01302025 was issued in 2025 for a completely different purpose.

Whatever the real number is, the RFI is labeled “Classified Identity-Based Biometric System.” No acronym was specified, so I’m self-acronyming it as CIBS. Perhaps the system has a real acronym…but it’s secret.

If your company can support such a system from a business, technical, and security perspective, the due date is February 20 and questions are due by February 2. See SAM.gov for details.

You Can Measure Quality, But is the Measure Meaningful? (OFIQ)

The purpose of measuring quality should not be for measurement’s own sake. The purpose should be to inform people to make useful decisions.

In Germany, the Bundesamt für Sicherheit in der Informationstechnik (Federal Office for Information Security) has developed the Open Source Face Image Quality (OFIQ) standard.

Experienced biometric professionals can’t help but notice that the acronym OFIQ is similar to the acronym NFIQ (used in NFIQ 2), but the latter refers to the NIST FINGERPRINT image quality standard. NFIQ is also open source, with contributions from NIST and the German BSI, among others.

But NFIQ and OFIQ, while analyzing different biometric modalities, serve a similar purpose: to distinguish between good and bad biometric images.

But do these open source algorithms meaningfully measure quality?

The study of OFIQ

Biometric Update alerted readers to the November 2025 study “On the Utility of the Open Source Facial Image Quality Tool for Facial Biometric Recognition in DHS Operations” (PDF).

Note the words “in DHS Operations,” which are crucial.

  • The DHS doesn’t care about how ALL facial recognition algorithms perform.
  • The DHS only cares about the facial recognition algorithms that it may potentially use.
  • DHS doesn’t care about algorithms it would never use, such as Chinese or Russian algorithms.
  • In fact, from the DHS perspective, it probably hopes that the Chinese Cloudwalk algorithm performs very badly. (In NIST tests, it doesn’t.)

So which algorithms did DHS evaluate? We don’t know precisely.

“A total of 16 commercial face recognition systems were used in this evaluation. They are labeled in diagrams as COTS1 through COTS16….Each algorithm in this study was voluntarily submitted to the MdTF as part of on-going biometric performance evaluations by its commercial entity.”

Usually Maryland Test Facility (MdTF) rally participants aren’t disclosed, unless a participant discloses itself, like Paravision did after the 2022 Biometric Technology Rally.

“Paravision’s matching system alias in the test was ‘Miami.'”

Welcome to Miami, bienvenidos a Miami. Google Gemini.

So what did DHS find when it used OFIQ to evaluate images submitted to these 16 algorithms?

“We found that the OFIQ unified quality score provides extremely limited utility in the DHS use cases we investigated. At operationally relevant biometric thresholds, biometric matching performance was high and probe samples that were assessed as having very low quality by OFIQ still successfully matched to references using a variety of face recognition algorithms.”

Or in human words:

  • Images that yielded a high quality OFIQ score accurately matched faces using the tested algorithms.
  • Images that yielded a low quality OFIQ score…STILL accurately matched faces using the tested algorithms.

Google Gemini.

So, at least in DHS’ case, it makes no sense to use the OFIQ algorithm.
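The shape of the study’s analysis can be sketched in a few lines of Python. The data below is entirely hypothetical (not the DHS study’s), and the 0.5 quality cutoff is my own illustrative choice: bucket probes by quality score, then compare match rates per bucket. When low-quality probes match at the same rate as high-quality ones, the score adds little operational utility.

```python
# Toy sketch: does a quality score predict match failure?
# Hypothetical (quality_score, matched) pairs, made up to mirror
# the study's finding that low-scoring probes still matched.
from collections import defaultdict

probes = [(0.1, True), (0.2, True), (0.3, True), (0.2, False),
          (0.8, True), (0.9, True), (0.7, True), (0.8, False)]

buckets = defaultdict(list)
for quality, matched in probes:
    label = "low" if quality < 0.5 else "high"  # assumed cutoff
    buckets[label].append(matched)

for label, outcomes in buckets.items():
    rate = sum(outcomes) / len(outcomes)
    print(f"{label}-quality probes: match rate {rate:.0%}")
```

With this made-up data, both buckets match at 75%, which is the “extremely limited utility” pattern in miniature: the score and the outcome are uncorrelated.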

Your mileage may vary.

If you have questions, consult a biometric product marketing expert.

Or Will Smith. Just don’t make a joke about his wife.

Hyper-accuracy: One Hundred Faces

(Part of the biometric product marketing expert series)

I previously mused about an alternative universe in which a single human body had ten (different) faces.

Facial recognition would be more accurate if biometric systems had ten faces to match. (Kind of like you-know-what.)

Well, now I’m getting ridiculous by musing about a person with one hundred faces for identification.

Grok.

When I’m not musing about alternative universes with different biometrics, I’m helping identity/biometric firms market their products in this one.

And this frivolous exercise actually illustrates a significant difference between fingerprints and faces, especially in use cases where subjects submit all ten fingerprints but only a single face. The accuracy benefits are…well, they’re ten times more powerful.
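One hedged way to see why extra samples per person matter is a toy AND-rule fusion model. The single-sample false match rate (1e-3) and the independence assumption are illustrative only; real fingers (and hypothetical extra faces) are correlated, and requiring all samples to match also raises the false non-match rate.

```python
# Toy model: if one comparison has false match rate FMR, and a fused
# decision requires all N samples to match (AND-rule fusion), then
# under an idealized independence assumption the fused false match
# rate is FMR ** N -- it shrinks geometrically with each extra sample.

FMR = 1e-3  # assumed single-sample false match rate (illustrative)

for n in (1, 2, 10):
    fused_fmr = FMR ** n
    print(f"{n:2d} sample(s): fused FMR ~ {fused_fmr:.0e}")
```

So ten samples don’t just make matching “ten times” better in this model; under the (idealized) independence assumption the false match rate drops by many orders of magnitude, which is the real reason tenprint submissions are so powerful relative to a single face.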

Are there underlying benefits in YOUR biometric technology that you want to highlight? Bredemarket can help you do this. Book a free meeting with me, and I’ll ask you some questions to figure out where we can work together.