Who or What Requires Authorization?

There are many definitions of authorization, but the one in RFC 4949 has the benefit of brevity.

“An approval that is granted to a system entity to access a system resource.”

Non-person Entities Require Authorization

Note that it uses the word “entity.” It does NOT use the word “person,” because the entity requiring authorization may be a non-person entity.

I made this point in a previous post about attribute-based access control (ABAC), when I quoted from the 2014 version of NIST Special Publication 800-162. Incidentally, if you wonder why I use the acronym NPE (non-person entity) rather than the acronym NHI (non-human identity), this is why.

“A subject is a human user or NPE, such as a device that issues access requests to perform operations on objects. Subjects are assigned one or more attributes.”
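The quoted idea, a subject (human or NPE) with attributes issuing access requests on objects, can be sketched as a minimal attribute-based policy check. This is an illustrative sketch only; the attribute names and rules are invented, not drawn from NIST SP 800-162 or any real product:

```python
from dataclasses import dataclass, field

@dataclass
class Subject:
    """A human user or NPE; access decisions use attributes, not identity type."""
    name: str
    attributes: dict = field(default_factory=dict)

def authorize(subject: Subject, action: str, resource: str) -> bool:
    # Illustrative ABAC rules: auditors may read anything, and a backup
    # bot (an NPE) may read the archive. Everything else is denied.
    if subject.attributes.get("role") == "auditor" and action == "read":
        return True
    if (subject.attributes.get("kind") == "npe"
            and subject.attributes.get("purpose") == "backup"
            and action == "read" and resource == "archive"):
        return True
    return False

alice = Subject("alice", {"role": "auditor"})
backup_bot = Subject("nightly-backup", {"kind": "npe", "purpose": "backup"})

print(authorize(alice, "read", "payroll"))        # True
print(authorize(backup_bot, "read", "archive"))   # True
print(authorize(backup_bot, "write", "payroll"))  # False
```

Note that the same `authorize` function governs both Alice and the bot; the policy never asks whether the subject is a person.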

If you have a process to authorize people, but don’t have a process to authorize bots, you have a problem. Matthew Romero, formerly of Veza, has written about the lack of authorization for non-human identities.

“Unlike human users, NHIs operate without direct oversight or interactive authentication. Some run continuously, using static credentials without safeguards like multi-factor authentication (MFA). Because most NHIs are assigned elevated permissions automatically, they’re often more vulnerable than human accounts—and more attractive targets for attackers. 

“When organizations fail to monitor or decommission them, however, these identities can linger unnoticed, creating easy entry points for cyber threats.”

Veza recommends that people use a product that monitors authorizations for both human and non-human identities. And by the most amazing coincidence, Veza offers such a product.

People Require Authorization

And of course people require authorization also.

It’s not enough to identify or authenticate a person or NPE. Once that is done, you need to confirm that this particular person has the authorization to…launch a nuclear bomb. Or whatever.

Your Customers Require Information on Your Authorization Solution

If your company offers an authorization solution, and you need Bredemarket’s content, proposal, or analysis consulting help, talk to me.

You Can Measure Quality, But is the Measure Meaningful? (OFIQ)

The purpose of measuring quality should not be measurement for its own sake. The purpose should be to help people make useful decisions.

In Germany, the Bundesamt für Sicherheit in der Informationstechnik (Federal Office for Information Security) has developed the Open Source Face Image Quality (OFIQ) standard.

Experienced biometric professionals can’t help but notice that the acronym OFIQ is similar to the acronym NFIQ (used in NFIQ 2), but the latter refers to the NIST FINGERPRINT image quality standard. NFIQ is also open source, with contributions from NIST and the German BSI, among others.

But NFIQ and OFIQ, while analyzing different biometric modalities, serve a similar purpose: to distinguish between good and bad biometric images.

But do these open source algorithms meaningfully measure quality?

The study of OFIQ

Biometric Update alerted readers to the November 2025 study “On the Utility of the Open Source Facial Image Quality Tool for Facial Biometric Recognition in DHS Operations” (PDF).

Note the words “in DHS Operations,” which are crucial.

  • The DHS doesn’t care about how ALL facial recognition algorithms perform.
  • The DHS only cares about the facial recognition algorithms that it may potentially use.
  • DHS doesn’t care about algorithms it would never use, such as Chinese or Russian algorithms.
  • In fact, from the DHS perspective, it probably hopes that the Chinese Cloudwalk algorithm performs very badly. (In NIST tests, it doesn’t.)

So which algorithms did DHS evaluate? We don’t know precisely.

“A total of 16 commercial face recognition systems were used in this evaluation. They are labeled in diagrams as COTS1 through COTS16….Each algorithm in this study was voluntarily submitted to the MdTF as part of on-going biometric performance evaluations by its commercial entity.”

Usually MdTF rally participants aren’t disclosed, unless a participant discloses itself, as Paravision did after the 2022 Biometric Technology Rally.

“Paravision’s matching system alias in the test was ‘Miami.'”

Welcome to Miami, bienvenidos a Miami.

So what did DHS find when it used OFIQ to evaluate images submitted to these 16 algorithms?

“We found that the OFIQ unified quality score provides extremely limited utility in the DHS use cases we investigated. At operationally relevant biometric thresholds, biometric matching performance was high and probe samples that were assessed as having very low quality by OFIQ still successfully matched to references using a variety of face recognition algorithms.”

Or in human words:

  • Images that yielded a high quality OFIQ score accurately matched faces using the tested algorithms.
  • Images that yielded a low quality OFIQ score…STILL accurately matched faces using the tested algorithms.
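The shape of DHS’s finding can be illustrated with a toy tabulation: bin probe images by quality score and compute the match rate per bin. The numbers below are synthetic, invented purely to show the shape of the analysis; they are not DHS data:

```python
from collections import defaultdict

# Synthetic (invented) probes: (quality_score, matched_at_operational_threshold).
# Real OFIQ scores and DHS match outcomes are not reproduced here.
probes = [
    (12, True), (18, True), (25, True), (33, True), (41, True),
    (55, True), (62, True), (70, False), (81, True), (95, True),
]

def match_rate_by_band(samples, band_width=50):
    """Bin samples by quality score; return the match rate in each band."""
    bands = defaultdict(list)
    for score, matched in samples:
        bands[score // band_width].append(matched)
    return {band: sum(hits) / len(hits) for band, hits in sorted(bands.items())}

rates = match_rate_by_band(probes)
# If the low-quality band matches nearly as well as the high-quality band,
# the unified quality score adds little operational value.
print(rates)  # {0: 1.0, 1: 0.8}
```

In this invented data, the “low quality” band matches just as well as the “high quality” band, which is the pattern the DHS study reported for OFIQ’s unified score.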

So, at least in DHS’ case, it makes no sense to use the OFIQ algorithm.

Your mileage may vary.

If you have questions, consult a biometric product marketing expert.

Or Will Smith. Just don’t make a joke about his wife.

When You Can’t Tell Them Apart

Will facial recognition ever become precise enough to distinguish between identical twins?

NIST investigated this in 2023 but did not continue the research. From the report:

“These results show that identical twins and same-sex fraternal twins give outcomes that are inconsistent with the intended or expected behaviour from a face recognition algorithm.”

Omnigarde Peter Lo Biography: I Need to Steal This Idea

As we approach 2026, advanced biometric firm Omnigarde has released new marketing materials. One of these is a video biography of Omnigarde’s principal, Dr. Peter Lo.


Of all the videos I’ve created, I’ve never created a “Who I Am” video. Not that I have the industry recognition that Dr. Lo has…

KeyData Cyber Sums Up The Most Visible Change in NIST SP 800-63-4

As we all transition from version 3 of NIST SP 800-63 to the new version 4 (SP 800-63, 63A, 63B, and 63C), Biometric Update has published an article authored by Dustin Hoff of KeyData Cyber, “Navigating the crossroads of identity: leveraging NIST SP 800-63-4 for business advantage.”

So what has changed?

“Perhaps the most visible change is the push for phishing-resistant authentication—methods like passkeys, hardware-backed authenticators, and device binding….This shift signals that yesterday’s non-phishing-resistant MFA (SMS codes, security questions, and email OTPs) is no longer enough because they are easily compromised through man-in-the-middle or social engineering attacks like SIM swapping.”


Hoff says a lot more about version 4, including tips on transitioning to the new NIST standard. Read Hoff’s piece here on Biometric Update.

OK, How Does Orchestration REDUCE Complexity?

I’m stealing work from my bot.

I just asked Google Gemini to conceive an illustration of the benefits of orchestration. You can see my original prompt and the resulting illustration, credited to Bredebot, in the blog post “Orchestration: Harmonizing the Tech Universe.” (Not “Harmonzing.” Oh well.)


Note the second of the two benefits listed in Bredebot’s AI-generated illustration: “Reduced Complexity.”

On the surface, this sounds like generative AI getting the answer wrong…again.

  • After all, the reason that software companies offer a single-vendor solution is because when everything comes from the same source, it’s easier to get everything to work together.
  • When you have an orchestrated solution incorporating elements from multiple vendors, common sense tells you that the resulting solution is MORE complex, not less complex.

When I reviewed the image, I was initially tempted to ask Bredebot to write a response explaining how orchestrated solutions reduce complexity. But then I decided that I should write this myself.

Because I had an idea.

The discipline from orchestration

When you orchestrate solutions from multiple vendors, it’s extremely important that the vendor solutions have ways to talk to each other. This is the essence of orchestration, after all.

Because of this need, you HAVE to create rules that govern how the software packages talk to each other.

Let me cite an example from one of my former employers, Incode. As part of its identity verification process, Incode is capable of interfacing to selected government systems and processing government validations. After all, I may have something that looks like a Mexican ID, but is it really a Mexican ID?

“Mexico – INE Validation. When government face validation is enabled this method compares the user’s selfie against the image in the INE database. The method should be called after add-face is over and one of (process-id or document-id) is over.”

So Incode needs a standard way to interface with Mexico’s electoral registry database for this whole thing to work. Once that’s defined, you just follow the rules and everything should work.
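The call-ordering rule from the quoted documentation can be expressed as a one-line precondition: government face validation may run only after add-face has completed, plus at least one of process-id or document-id. The function and the session representation below are my sketch, not Incode’s SDK:

```python
# Sketch of the ordering rule: the step names come from the quoted
# documentation, but this checker is illustrative, not Incode's API.
def can_run_ine_validation(completed_steps: set[str]) -> bool:
    """True if the INE face validation step may be called for this session."""
    return ("add-face" in completed_steps
            and ("process-id" in completed_steps
                 or "document-id" in completed_steps))

print(can_run_ine_validation({"add-face", "document-id"}))  # True
print(can_run_ine_validation({"add-face"}))                 # False
print(can_run_ine_validation({"process-id"}))               # False
```

This is exactly the kind of rule that orchestration forces you to write down explicitly, instead of leaving it implicit in whoever happened to wire the systems together.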

The lack of discipline from single-vendor solutions

Contrast this with a situation in which all the data comes from a single vendor.

Now ideally the interfaces between single-vendor systems should be defined in the same way as interfaces between multi-vendor systems. That way everything is neatly organized and future adaptations are easy.

Sounds great…until you have a deadline to meet and you need to do it quick and dirty.


In the same way that computer hardware server rooms can become a tangle of spaghetti cables, computer software can become a tangle of spaghetti interfaces. All because you have to get it done NOW. Someone else can deal with the problems later.

So that’s my idea on how orchestration reduces complexity. But what about those who really know what they’re talking about?

Chris White on orchestration

In a 2024 article, Chris White of Prefect explains how orchestration can be done wrong, and how it can be done correctly.

“I’ve seen teams struggle to justify the adoption of a first-class orchestrator, often falling back on the age-old engineer’s temptation: “We’ll just build it ourselves.” It’s a siren song I know well, having been lured by it myself many times. The idea seems simple enough – string together a few scripts, add some error handling, and voilà! An orchestrator is born. But here’s the rub: those homegrown solutions have a habit of growing into unwieldy systems of their own, transforming the nature of one’s role from getting something done to maintaining a grab bag of glue code.

“Orchestration is about bringing order to this complexity.”

So how do you implement ordered orchestration? By following this high-level statement of purpose:

“Think of orchestration as a self-documenting expert system designed to accomplish well-defined objectives (which in my world are often data-centric objectives). It knows the goal, understands the path to achieve it, and – crucially – keeps a detailed log of its journey.”
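White’s three points (knows the goal, understands the path, keeps a detailed log) can be sketched as a minimal orchestrator. This is a toy, not Prefect or any real orchestration product; the step names and pipeline are invented:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("orchestrator")

def orchestrate(goal: str, steps):
    """Run ordered, named steps toward a goal, logging the whole journey.

    A minimal sketch: the orchestrator knows the goal, the path
    (the ordered steps), and records everything it does.
    """
    log.info("goal: %s", goal)
    results = {}
    for name, fn in steps:
        log.info("running step: %s", name)
        try:
            results[name] = fn(results)  # each step can read prior results
            log.info("step %s succeeded", name)
        except Exception as exc:
            log.error("step %s failed: %s", name, exc)
            raise
    return results

# Illustrative pipeline: extract -> transform -> load.
results = orchestrate("nightly ETL", [
    ("extract", lambda r: [3, 1, 2]),
    ("transform", lambda r: sorted(r["extract"])),
    ("load", lambda r: len(r["transform"])),
])
print(results["load"])  # 3
```

Even this toy shows the contrast with glue code: the path is declared in one place, and every success or failure leaves a log entry.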

Read White’s article for a deeper dive into these three items.

Now think of a layer

The concept of a layer permeates information technology. There are all sorts of models that describe layers and how they work with each other.

Enter the concept of an orchestration layer:

“In modern IT systems, an orchestration layer is a software layer that links the different components of a software system and assists with data transformation, server management, authentication, and integration. The orchestration layer acts as a sophisticated mediator between various components of a system, enabling them to work together harmoniously. In technical terms, the orchestration layer is responsible for automating complex workflows, managing communication, and coordinating tasks between diverse services, applications, and infrastructure components.”

Here’s an example from NIST:

Figure 7 in NIST SP 500-292.

Once you visualize an orchestration layer, and how this layer interacts with the other layers, things become…simple.

So maybe Bredebot does know what he’s talking about.

What is the NIST FIPS 204 Module-Lattice-Based Digital Signature Standard?

In this edition of The Repurposeful Life, I’m revisiting a prior post (“Is the Quantum Security Threat Solved Before It Arrives? Probably Not.“) and extracting just the part that deals with the National Institute of Standards and Technology (NIST) Federal Information Processing Standard (FIPS) 204.

Thales used the NIST “FIPS 204 standard to define a digital signature algorithm for a new quantum-resistant smartcard: MultiApp 5.2 Premium PQC.”

The NIST FIPS 204 standard, “Module-Lattice-Based Digital Signature Standard,” can be found here. This is the abstract:

“Digital signatures are used to detect unauthorized modifications to data and to authenticate the identity of the signatory. In addition, the recipient of signed data can use a digital signature as evidence in demonstrating to a third party that the signature was, in fact, generated by the claimed signatory. This is known as non-repudiation since the signatory cannot easily repudiate the signature at a later time. This standard specifies ML-DSA, a set of algorithms that can be used to generate and verify digital signatures. ML-DSA is believed to be secure, even against adversaries in possession of a large-scale quantum computer.”
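The sign/verify/non-repudiation flow in that abstract has the same shape for ML-DSA as for older schemes. Since no ML-DSA implementation ships in Python’s standard library, here is the flow with a deliberately tiny, insecure textbook-RSA signature, purely to show the roles of the private and public keys. A real deployment would use a vetted, certified ML-DSA implementation, never these toy numbers:

```python
import hashlib

# Toy textbook-RSA parameters (INSECURE, illustration only).
p, q = 61, 53
n = p * q              # public modulus (3233)
e = 17                 # public exponent
d = 2753               # private exponent (e * d = 1 mod 3120)

def digest(message: bytes) -> int:
    """Hash the message, reduced into the toy modulus range."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    # Only the private-key holder can compute this: non-repudiation.
    return pow(digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone with the public key (e, n) can check the signature and
    # thereby detect unauthorized modification of the data.
    return pow(signature, e, n) == digest(message)

msg = b"health card record"
sig = sign(msg)
print(verify(msg, sig))                 # True
print(verify(b"tampered record", sig))  # almost certainly False: digest differs
```

ML-DSA replaces the number theory here with module-lattice mathematics believed to resist quantum attack, but the interface (key pair, sign, verify) is the same.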

ML-DSA stands for “Module-Lattice-Based Digital Signature Algorithm.”

Now I’ll admit I don’t know a lattice from a vertical fence post, especially when it comes to quantum computing, so I’ll have to take NIST’s word for it that modules and lattices are super-good security.


But wait, there’s more!

Since I wrote my original post in October, I’ve read NordVPN’s definition of a lattice on its lattice-based access control (LBAC) page.

“A lattice is a hierarchical structure that consists of levels, each representing a set of access rights. The levels are ordered based on the level of access they grant, from more restrictive to more permissive.”

You can see how this fits into an access control mechanism, whether you’re talking about a multi-tenant cloud (NordVPN’s example) or a smartcard (Thales’ example).
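One common reading of NordVPN’s definition can be sketched with a totally ordered lattice of levels: a subject may read an object whose level is at or below its own. Real lattices may be only partially ordered (for example, with compartments), and these level names are invented for illustration:

```python
from enum import IntEnum

class Level(IntEnum):
    """Ordered access levels, most permissive data at the bottom."""
    PUBLIC = 0
    INTERNAL = 1
    SECRET = 2

def may_read(subject_level: Level, object_level: Level) -> bool:
    # In a totally ordered lattice, reading is allowed at or below
    # the subject's own level ("no read up").
    return subject_level >= object_level

print(may_read(Level.SECRET, Level.INTERNAL))  # True
print(may_read(Level.PUBLIC, Level.SECRET))    # False
```

Whether the levels live in a multi-tenant cloud policy or on a smartcard, the ordering does the work: the comparison decides access, not a per-resource rule list.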

Because there are some things that Tom Sawyer can access, but Injun Joe must not access.


Is the Quantum Security Threat Solved Before It Arrives? Probably Not.

I’ll confess: there is a cybersecurity threat so…um…threatening that I didn’t even want to think about it.

You know the drill. The bad people use technology to come up with some security threat, and then the good people use technology to thwart it.

That’s what happens with antivirus. That’s what happens with deepfakes.

But I kept on hearing rumblings about a threat that would make all this obsolete.

The quantum threat and the possible 2029 “Q Day”

Today’s Q word is “quantum.”

But with great power comes great irresponsibility. Gartner said it:

“By 2029, ‘advances in quantum computing will make conventional asymmetric cryptography unsafe to use,’ Gartner said in a study.”

Frankly, this frightened me. Think of the possibilities that come from calculation superpowers. Brute force generation of passcodes, passwords, fingerprints, faces, ID cards, or whatever is necessary to hack into a security system. A billion different combinations? No problem.

So much for your unbreakable security system.

Thales implementation of NIST FIPS 204

Unless Thales has started to solve the problem. This is what Thales said:

“The good news is that technology companies, governments and standards agencies are well aware of the deadline. They are working on defensive strategies to meet the challenge — inventing cryptographic algorithms that run not just on quantum computers but on today’s conventional components.

“This technology has a name: post-quantum cryptography.

“There have already been notable breakthroughs. In the last few days, Thales launched a quantum-resistant smartcard: MultiApp 5.2 Premium PQC. It is the first smartcard to be certified by ANSSI, France’s national cybersecurity agency.

“The product uses new generation cryptographic signatures to protect electronic ID cards, health cards, driving licences and more from attacks by quantum computers.”

So what’s so special about the technology in the MultiApp 5.2 Premium PQC?

Thales used the NIST “FIPS 204 standard to define a digital signature algorithm for a new quantum-resistant smartcard: MultiApp 5.2 Premium PQC.”


The NIST FIPS 204 standard, “Module-Lattice-Based Digital Signature Standard,” can be found here. This is the abstract:

“Digital signatures are used to detect unauthorized modifications to data and to authenticate the identity of the signatory. In addition, the recipient of signed data can use a digital signature as evidence in demonstrating to a third party that the signature was, in fact, generated by the claimed signatory. This is known as non-repudiation since the signatory cannot easily repudiate the signature at a later time. This standard specifies ML-DSA, a set of algorithms that can be used to generate and verify digital signatures. ML-DSA is believed to be secure, even against adversaries in possession of a large-scale quantum computer.”

ML-DSA stands for “Module-Lattice-Based Digital Signature Algorithm.”


Now I’ll admit I don’t know a lattice from a vertical fence post, especially when it comes to quantum computing, so I’ll have to take NIST’s word for it that modules and lattices are super-good security.

Certification, schmertification

The Thales technology was then tested by researchers to determine its Evaluation Assurance Level (EAL). The result? “Thales’ product won EAL6+ certification (the highest is EAL7).” (TechTarget explains the 7 evaluation assurance levels here.)

France’s national cybersecurity agency (ANSSI) then certified it.

However…

…remember that certifications mean squat.

For all we know, the fraudsters have already broken the protections in the FIPS 204 standard.


And the merry-go-round between fraudsters and fraud fighters continues.

If you need help spreading the word about YOUR anti-fraud solution, quantum or otherwise, schedule a free meeting with Bredemarket.

Reducing Biometric Marketing Internal Bias By Using Bredemarket

Identity/biometric marketing leaders continuously talk about how their companies have reduced bias in their products. But have they reduced bias in their own marketing to ensure it resonates with prospects?

I recently talked about the problem of internal bias:

“Marketers are driven to accentuate the positive about their companies. Perhaps the company has a charismatic founder who repeatedly emphasizes how ‘insanely great’ his company is and who talked about ‘bozos.’ (Yeah, there was a guy who did both of those.) 

“And since marketers are often mandated to create both external and internal sales enablement content, their view of their own company and their own product is colored.”

Let’s look at two examples of biometric marketing internal bias…and how to overcome it.


Internal bias at Company A

  • Company A does not participate in the U.S. National Institute of Standards and Technology (NIST) Face Recognition Technology Evaluation (FRTE) for technical reasons. 
  • As a result, the company’s marketing machine constantly discredits NIST FRTE, and the company culture is permeated with a “NIST is stupid” mentality. 
  • All well and good…until it runs into that one prospect who asks, “Why are you scared to measure yourself against the competition? Does your algorithm suck that bad?”

Internal bias at Company B

  • Company B, on the other hand, participates in FRTE, FATE, FRIF (previously FpVTE), and every other NIST test imaginable. 
  • This company’s marketing machine declares its superiority as a top tier biometric vendor, supported by outside independent evidence. 
  • All well and good…until it runs into that one prospect who declares, “That’s just federal government test data. How will you perform in our benchmark using our real data and real computers?”

Internal bias at Bredemarket 

Well, I have my admittedly biased solution to prevent companies from tumbling into groupthink, Kool-Aid drinking, and market irrelevance.

Contract with an outside biometric product marketing expert. (I just happen to know one…me.)


I haven’t spent 30 years immersed in your insular culture. I’ve heard all the marketing-speak from different companies, and I’ve written the marketing-speak for nearly two dozen of them. I can ensure that your content resonates with your external customers and prospects, not only with your employees.

All well and good…until…

Reducing internal bias at Bredemarket 

“But John, what about your own biases? IDEMIA, Motorola, Incode, and other employers paid you for 25 years! You probably have an established process that you use to prepare andouillette at home, based upon a recipe from 2019!”


I don’t…but point taken. So how do I minimize my own biases?

My breadth of experience lessens the biases from my past. Look at my market-speak from 1994 to 2023, in order:

  • We are Printrak, a nimble private company that will dominate AFIS with our client-server solution.
  • We are Printrak (stock symbol AFIS), a well-funded public company that will dominate AFIS, mugshot, computer aided dispatch, and microfiche.
  • We are Motorolans, and our multi-tier Digital Justice Solution has a superior architecture to that of Sagem Morpho and others.
  • We are MorphoTrak, bringing together the best technologies from MetaMorpho and Printrak BIS, plus superior French technology for secure credentials and road safety…unencumbered by the baggage that weighs down MorphoTrust.
  • We are IDEMIA North America, bringing together the best technologies from MorphoTrust and MorphoTrak for ABIS, driver’s licenses, and enrollment, coupled with the resources from the rest of IDEMIA, a combined unbreakable force.
  • We are Incode, not weighed down with the baggage of the old dinosaurs, and certainly not a participant in the surveillance market.

Add all the different messaging of Bredemarket’s clients, plus my continuous improvement (hello MOTO) of my capabilities, and I will ensure that my content, proposals, and analysis do not trap you in a dead end.

Reducing internal bias at your company 

Are you ready to elevate your company with the outside perspective of a biometric product marketing expert?

Let’s talk (a free meeting). You explain, I ask questions, we agree on a plan, and then I act.

Schedule a meeting at https://bredemarket.com/mark/

I’m a Barbie Girl

So I just finished writing some technical content for a blog post, and for other purposes.

The content relates to a publication (the 2017 version of Special Publication 800-63A) from the National Institute of Standards and Technology, or NIST.

(Note to self: gotta check the new version.)

I figure that after the work day is done, the NISTies turn to less strenuous tasks.

And so shall I.