Let me start by admitting to my, um, bias.
For the last twenty-five plus years, I have been involved in the identification of individuals.
- Who is the person who is going through the arrest/booking process?
- Who is the person who claims to be entitled to welfare benefits?
- Who is the person who wants to enter the country?
- Who is the person who is exiting the country? (Yes, I remember the visa overstay issue.)
- Who is the person who wants to enter the computer room in the office building?
- Who is the person who is applying for a driver’s license or passport?
- Who is the person who wants to enter the sports stadium or concert arena?
These are just a few of the identity problems that I have worked on solving over those twenty-five-plus years, all of which are tied to individual identity.
From that perspective, I really don’t care if the person entering the stadium/computer room/country/whatever is female, mixed race, Muslim, left-handed, or anything else. I just want to know if this is the individual that he/she/they claims to be.
If you’ve never seen the list of potential candidates generated by a top-tier facial recognition program, you may be shocked when you see it. That list of candidates may include white men, Asian women, and everything in between. “Well, that’s wrong,” you may say to yourself. “How can the results include people of multiple races and genders?” It’s because the algorithm doesn’t care about race and gender. Think about it – what if a victim THINKS that he was attacked by a white male, but the attacker was really an Asian female? Identify the individual, not the race or gender.

So when Gender Shades came out, stating that the IBM, Microsoft, and Face++ AI services had problems recognizing the gender of people, especially those with darker skin, my reaction was “so what?”
(Note that this is a different question than the question of how an algorithm identifies individuals of different genders, races, and ages, which has been addressed by NIST.)
But some people persist in addressing biometrics’ “failure” to properly identify genders and races, ignoring the fact that both gender and race have become social rather than biological constructs. Is the Olympian Jenner male, female, or something else? What are your personal pronouns? What happens when a mixed race person identifies with one race rather than another? And aren’t we all mixed race anyway?
The latest study from AlBdairi et al. on computational methods for ethnicity identification
But there’s still a great interest in “race recognition.”
As Jim Nash of Biometric Update notes, a team of scientists has published an open access paper entitled “Face Recognition Based on Deep Learning and FPGA for Ethnicity Identification.”
The authors claim that their study is “the first image collection gathered specifically to address the ethnicity identification problem.”
“But what of the NIST demographic study cited above?” you may ask. The NIST study did NOT have the races of the individuals, but used the individuals’ country of origin as a proxy for race. Then again, it is possible that this study may have done the same thing.
Despite the fact that there are several large-scale face image databases accessible online, none of these databases are acceptable for the purpose of the conducted study in our research. Furthermore, 3141 photographs were gathered from a variety of sources. Specifically, 1081, 1021, and 1039 Chinese, Pakistani, and Russian face photos were gathered, respectively.
From https://www.mdpi.com/2076-3417/12/5/2605/htm
There was no mention of whether any of the Chinese face photos were Caucasian…or how the researchers could tell whether they were.
Anyway, if you’re interested in the science behind using Deep Convolutional Neural Network (DCNN) models and field-programmable gate arrays (FPGAs) to identify ethnicity, read the paper. Or skip to the results.
The experimental results reported that our model outperformed all the methods of state-of-the-art, achieving an accuracy and F1 score value of 96.9 percent and 94.6 percent, respectively.
From https://www.mdpi.com/2076-3417/12/5/2605/htm
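For readers unfamiliar with the metrics quoted above, accuracy and F1 score are standard ways of summarizing a classifier’s performance. The following is a minimal sketch of how these metrics are conventionally computed from a confusion matrix; the counts in the usage example are illustrative and are not drawn from the paper’s data.

```python
def accuracy(tp, tn, fp, fn):
    """Fraction of all predictions that were correct."""
    return (tp + tn) / (tp + tn + fp + fn)

def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall.

    Precision: of the samples predicted positive, how many were right.
    Recall: of the truly positive samples, how many were found.
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative counts only (NOT the paper's data):
# 90 true positives, 90 true negatives, 10 false positives, 10 false negatives.
print(accuracy(90, 90, 10, 10))  # 0.9
print(f1_score(90, 10, 10))      # 0.9
```

Note that a high accuracy can coexist with a lower F1 when classes are imbalanced, which is one reason the paper reports both numbers.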
But this doesn’t answer the question I raised earlier.
Three possible use cases for race recognition, two of which are problematic
Why would anyone want to identify ethnicity or engage in race recognition? Jim Nash of Biometric Update summarizes three possible use cases for doing this, which I will address one by one. TL;DR: two of the use cases are problematic.
The code…could find a role in the growing field of race-targeted medical treatments and pharmacogenomics, where accurately ascertaining race could provide better care.
From https://www.biometricupdate.com/202203/identifying-ethnicity-problematic-so-scientists-write-race-recognition-code
Note that in this case race IS a biological construct, so perhaps its use is valid here. Regardless of how Nkechi Amare Diallo (formerly Rachel Dolezal) self-identifies, she’s not a targeted candidate for sickle cell treatment.
It could be helpful to some employers. Such a system could “use racial information to offer employers ethnically convenient services, then preventing the offending risk present in many cultural taboos.”
From https://www.biometricupdate.com/202203/identifying-ethnicity-problematic-so-scientists-write-race-recognition-code
This is where things start to get problematic. Using Diallo as an example, race recognition software based upon her biological race would see no problem in offering her fried chicken and watermelon at a corporate function, but Diallo might have some different feelings about this. And it’s not guaranteed that ALL members of a particular race are affected by particular cultural taboos. (The text below, from 1965, was slightly edited.)

People used to think of (blacks) as going around with fried chicken in a paper bag, (Godfrey) Cambridge says. But things have changed. “Now,” he says, “we carry an attache case—with fried chicken in it. We ain’t going to give up everything just to get along with you people.”
From http://content.time.com/time/subscriber/article/0,33009,839260,00.html. Yes, http.
While some employees may be pleased that they receive a particular type of treatment because of their biological race, others may not be pleased at all.
So let’s move on to Nash’s third use case for race recognition. Hold on to your seats.
Ultimately, however, the broadest potential mission for race recognition would be in security — at border stations and deployed in public-access areas, according to the report.
From https://www.biometricupdate.com/202203/identifying-ethnicity-problematic-so-scientists-write-race-recognition-code
I thought we had settled this over 20 years ago, although we really didn’t.
While President Bush was primarily speaking about religious affiliation, he also made the point that we should not judge individuals based upon the color of their skin.
Yet we do.
If I may again return to our current sad reality, there have been allegations that Africans encountered segregation and substandard treatment when trying to flee Ukraine. (When speaking of “African,” note that concerns were raised by officials from Gabon, Ghana, and Kenya – not from Egypt, Libya, or Tunisia. Then again, Indian students also complained of substandard treatment.)
Many people in the United States and western Europe would find it totally unacceptable to treat people at borders and public areas differently by race.
Do we want to encourage this use case?
And if you feel that we should, please provide your picture. I want to see if your concerns are worthy of consideration.