On Melanin

If you’re examining a person’s fingerprints, palm prints, face, and irises, you need to understand melanin.

The Cleveland Clinic goes into great detail on melanin, but for now I’m going to concentrate on one item.

There are three types of melanin, two of which affect the skin, eyes, and hair.

Eumelanin. There are two types of eumelanin: black and brown. Eumelanin is responsible for dark colors in skin, eyes and hair. People with brown or black hair have varying amounts of brown and black eumelanin. When there’s no black eumelanin and a small amount of brown eumelanin, it results in blonde hair.

Pheomelanin. This type of melanin pigments your lips, nipples and other pinkish parts of your body. People who have equal parts eumelanin and pheomelanin have red hair.

Melanin obviously affects the coloration of your skin, although some parts of your body (such as your fingertips) may have less melanin than other parts (such as your face).

Concentrating on fingertips and faces (and ignoring irises for the moment), let’s look at a situation where we use an optical mechanism (such as an optical fingerprint reader or a camera), along with available illumination, to photograph fingers and faces of people with varying skin tones.

But what if your entire photographic system is based upon reference materials optimized for light melanin levels? As late as the 1970s, Kodak’s reference materials, called “Shirley cards” after the first model, featured exclusively white people.

In the 1970s, photographer Jim Lyon joined Kodak’s photo technology division and research laboratories. He says the company recognized there was a problem with the all-white Shirley cards.

“I started incorporating black models pretty heavily in our testing, and it caught on very quickly,” he says. “It wasn’t a big deal, it just seemed like this is the right thing to do. I wasn’t attempting to be politically correct. I was just trying to give us a chance of making a better film, one that reproduced everybody’s skin tone in an appropriate way.”

So hopefully today optical devices are properly capturing fingers, faces, and irises of people at all melanin levels.

Or is this wishful thinking?

Who Can Write My Biometric Company’s Product Marketing Content?

Someone who is a biometric product marketing expert.

Someone who has three decades of expertise in biometrics.

I remember ANSI/NIST-CSL 1-1993.

Someone who has worked with fingerprints, faces, irises, voices, DNA, and other biometric modalities.

Some modalities. Butts and tongues not included.

Someone who understands the privacy landscape in Europe (GDPR), Illinois (BIPA), California, and elsewhere.

BIPA is a four-letter word.

Oh…and someone who can write.

A slight exaggeration.

So who can write this stuff?

I know someone. Bredemarket.

Some great videos


  • Biometric product marketing expert.
  • Questions.
  • Services, process, and pricing.

Which Biometric Modalities Does NIST Investigate?

I’ve spent a lot of time in the Bredemarket blog looking at a variety of NIST studies of different biometric modalities.

But you can read up on them yourself.

NIST has investigated the following biometric modalities, using both definitions of the word biometrics:

But NIST has not spent taxpayer money researching other biometric modalities, such as tongue identification.


Yoti iBeta Confirmation of Presentation Attack Detection Level 3

We’ve talked about Levels 1 and 2 of iBeta’s confirmation that particular biometric implementations meet the requirements of ISO 30107-3. But now with Yoti’s confirmation, we can talk about iBeta Level 3.

From iBeta:

“The test method was to apply 1 bona fide subject presentation that alternated with 3 artefact presentations such that the presentation of each species consisted of 150 Presentation Attacks (PAs) and 50 bona fide presentations, or until 56 hours had passed per species. The results were displayed for the tester on the device as “Liveness check: Passed” for a successful attempt or “Liveness check: Failed” for an unsuccessful attempt.

“iBeta was not able to gain a liveness classification with the presentation attacks (PAs) on the Apple iPhone 16 Pro. With 150 PAs for each of 3 species, the total number of attacks was 450, and the overall Attack Presentation Classification Error Rate (APCER) was 0%. The Bona Fide Presentation Classification Error Rate (BPCER) was also calculated and may be found in the final report.

“Yoti Limited’s myface12122025 application and supporting backend components were tested by iBeta to the ISO 30107-3 Biometric Presentation Attack Detection Standard and found to be in compliance with Level 3.”
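The error rates iBeta cites come from ISO/IEC 30107-3, and both are simple fractions: APCER is the share of presentation attacks wrongly accepted as bona fide, and BPCER is the share of bona fide presentations wrongly rejected. A minimal sketch (function and variable names are mine, not from the standard):

```python
def apcer(attack_results):
    """Attack Presentation Classification Error Rate: fraction of
    presentation attacks wrongly classified as bona fide."""
    if not attack_results:
        raise ValueError("no attack presentations recorded")
    errors = sum(1 for classified_bona_fide in attack_results if classified_bona_fide)
    return errors / len(attack_results)

def bpcer(bona_fide_results):
    """Bona Fide Presentation Classification Error Rate: fraction of
    bona fide presentations wrongly rejected as attacks."""
    if not bona_fide_results:
        raise ValueError("no bona fide presentations recorded")
    errors = sum(1 for rejected in bona_fide_results if rejected)
    return errors / len(bona_fide_results)

# In the test iBeta describes, all 450 attacks (150 per species x 3 species)
# failed the liveness check, so none was classified as bona fide:
attacks_classified_bona_fide = [False] * 450
print(apcer(attacks_classified_bona_fide))  # 0.0
```

An APCER of 0% across 450 attacks is exactly the result quoted above; the BPCER calculation works the same way over the bona fide presentations.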

More from Yoti itself.

“Yoti’s MyFace is the first passive, single-selfie liveness technology in the world to conform to iBeta’s Level 3 testing under ISO/IEC 30107-3 – their highest level for liveness checks.”

Also see Biometric Update and UK Tech.

After all, facial age estimation is meaningless if the face is fake, so it was important for Yoti to receive this confirmation.

Federal Trade Commission Age Verification (and estimation?) Workshop January 28

A dizzying array of federal government agencies is interested in biometric verification and biometric classification, for example by age (either age verification or age estimation). As Biometric Update announced, we can add the Federal Trade Commission (FTC) to the list with an upcoming age verification workshop.

Rejecting age estimation in 2024

The FTC has a history with this, having rejected a proposed age estimation scheme in 2024.

“Re: Request from Entertainment Software Rating Board, Yoti Ltd., Yoti (USA) Inc., and Kids Web Services Ltd. for Commission Approval of Children’s Online Privacy Protection Rule Parental Consent Method (FTC Matter No. P235402)

“This letter is to inform you that the Federal Trade Commission has reviewed your group’s (“the ESRB group”) application for approval of a proposed verifiable parental consent (“VPC”) method under the Children’s Online Privacy Protection Rule (“COPPA” or “the Rule”). At this time, the Commission declines to approve the method, without prejudice to your refiling the application in the future….

“The ESRB group submitted a proposed VPC method for approval on June 2, 2023. The method involves the use of “Privacy-Protective Facial Age Estimation” technology, which analyzes the geometry of a user’s face to confirm that the user is an adult….The Commission received 354 comments regarding the application. Commenters opposed to the application raised concerns about privacy protections, accuracy, and deepfakes. Those in support of the application wrote that the VPC method is similar to those approved previously and that it had sufficient privacy guardrails….

“The Commission is aware that Yoti submitted a facial age estimation model to the National Institute of Standards and Technology (“NIST”) in September 2023, and Yoti has stated that it anticipates that a report reflecting NIST’s evaluation of the model is forthcoming. The Commission expects that this report will materially assist the Commission, and the public, in better understanding age verification technologies and the ESRB group’s application.”

You can see the current NIST age estimation results on NIST’s “Face Analysis Technology Evaluation (FATE) Age Estimation & Verification” page, not only for Yoti, but for many other vendors including my former employers IDEMIA and Incode.

But the FTC rejection was in 2024. Things may be different now.

Grok.

Revisiting age verification and age estimation in 2026?

The FTC has scheduled an in-person and online age verification workshop on January 28.

  • The in-person event will be at the Constitution Center at 400 7th St SW in Washington DC.
  • Details regarding online attendance will be published on this page in the coming weeks.

“The Age Verification Workshop will bring together a diverse group of stakeholders, including researchers, academics, industry representatives, consumer advocates, and government regulators, to discuss topics including: why age verification matters, age verification and estimation tools, navigating the regulatory contours of age verification, how to deploy age verification more widely, and interplay between age verification technologies and the Children’s Online Privacy Protection Act (COPPA Rule).”

Will the participants reconsider age estimation in light of recent test results?

We Know All About You, Music Lover

This is the week that we celebrate how much companies in Sweden and elsewhere know about us.

Including estimated ages.

Which may or may not (I’m not telling) be as accurate as software that analyzes your face for age estimation.

And the companies gathering the data can then sell it to advertisers and others who use it in all sorts of ways.

It will be interesting to see the corporate messaging that I and other Spotify users will receive over the next few days.

“If you listen to Depeche Mode, perhaps our Medicare plans may interest you.”

If Only Job Applicant Deepfake Detection Were This Easy

In reality, job applicant deepfake detection is (so far) unable to determine who the fraudster really is, but it can determine who the fraudster is NOT.

Something to remember when hiring people for sensitive positions. You don’t want to unknowingly hire a North Korean spy.

Face Product Marketing Expert (27 posts)

To ensure that my social media followers don’t have all the fun with my “biometric product marketing expert” shares, here are links to some Bredemarket blog posts on facial recognition (identification) and facial analysis (classification).

Facial recognition:

Facial analysis:

Using Grok For Evil: Deepfake Celebrity Endorsement

Using Grok for evil: a deepfake celebrity endorsement of Bredemarket?

Although in the video the fake Taylor Swift ends up looking a little like a fake Drew Barrymore.

Needless to say, I’m taking great care to fully disclose that this is a deepfake.

But some people don’t.

Why is Morph Detection Important?

We’re all familiar with the morphing of faces from subject 1 to subject 2, in which there is an intermediate subject 1.5 that combines the features of both of them. But did you know that this simple trick can form the basis for fraudulent activity?
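At the pixel level, the simplest possible “subject 1.5” is just a weighted average of two aligned face images. Production morphing tools (including those NIST evaluates) first warp facial landmarks so the features line up, but a toy sketch of the blending step looks like this (function and variable names are mine for illustration):

```python
def morph_pixels(face_a, face_b, alpha=0.5):
    """Naively blend two equal-size grayscale images, each represented
    as a flat list of pixel intensities. alpha=0.5 yields the midpoint
    "subject 1.5" combining features of both subjects."""
    if len(face_a) != len(face_b):
        raise ValueError("images must be the same size")
    return [round((1 - alpha) * a + alpha * b) for a, b in zip(face_a, face_b)]

# Tiny 3-pixel "images" standing in for subject 1 and subject 2:
subject_1 = [10, 200, 30]
subject_2 = [50, 100, 70]
print(morph_pixels(subject_1, subject_2))  # [30, 150, 50]
```

Sliding `alpha` from 0 to 1 moves the blend from subject 1 to subject 2, which is exactly why a mid-range morph can resemble both people at once.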

Back in the 20th century, morphing was primarily used for entertainment purposes. Nothing that would make you cry, even though there were shades of gray in the black or white representations of the morphed people.

Godley and Creme, “Cry.”
Michael Jackson, “Black or White.” (The full version with the grabbing.) The morphing begins about 5 1/2 minutes into the video.

But Godley, Creme, and Jackson weren’t trying to commit fraud. As I’ve previously noted, a morphed picture can be used for fraudulent activity. Let me illustrate this with a visual example. Take a look at the guy below.

From NISTIR 8584.

Does this guy look familiar to you? Some of you may think he kinda sorta looks like one person, while others may think he kinda sorta looks like a different person.

The truth is, the person above does not exist. This is actually a face morph of two different people.

From NISTIR 8584.

Now imagine a scenario in which a security camera is monitoring the entrance to the Bush ranch in Crawford, Texas. But instead of having Bush’s facial image in the database, someone has tampered with the database and inserted the “Obushama” image instead…and that image is similar enough to Barack Obama to allow Obama to fraudulently enter Bush’s ranch.

Or alternatively, the “Obushama” image is used to create a new synthetic identity, unconnected to either of the two men.

But what if you could detect that a particular facial image is not a true image of a person, but some type of morph attempt? NIST has a report on this:

“To address this issue, the National Institute of Standards and Technology (NIST) has released guidelines that can help organizations deploy and use modern detection methods designed to catch morph attacks before they succeed.”

The report, “NIST Interagency Report NISTIR 8584, Face Analysis Technology Evaluation (FATE) MORPH Part 4B: Considerations for Implementing Morph Detection in Operations,” is available in PDF form at https://doi.org/10.6028/NIST.IR.8584.

And a personal aside to anyone who worked for Safran in the early 2010s: we’re talking about MORPH detection, not MORPHO detection. I kept on mistyping the name as I wrote this.