Here’s the second of two videos that I filmed at the grand reopening of the Anthony Muñoz Community Center in Ontario, California. If you don’t like politician speeches, skip the first 15 minutes and go right to the facility tour.
Another reason to repurpose old content
Earlier this week I was asked about one of the posts that I wrote in the Bredemarket blog. I had to confess that I hadn’t thought about the topic much recently.
After this conversation, I realized that the referenced post was written back in July.
Because I’ve written over 200 posts in the Bredemarket blog over the past year-plus, some of them kind of get merged together in my mind.
And in this particular case, my thoughts on the original topic have evolved since the summer.
So if you see a future post that revises and updates something I wrote about four months ago, now you know why.
I hope that the new post won’t be dramatically different from the old one.
The dangers of removing facial recognition and artificial intelligence from DHS solutions (DHS ICR part four)
And here’s the fourth and final part of my repurposing exercise. See parts one, two, and three if you missed them.
This post is adapted from Bredemarket’s November 10, 2021 submitted comments on DHS-2021-0015-0005, Information Collection Request, Public Perceptions of Emerging Technology. As I concluded my comments, I stated the following.
Of course, even the best efforts of the Department of Homeland Security (DHS) will not satisfy some members of the public. I anticipate that many of the respondents to this ICR will question the need to use biometrics to identify individuals, or even the need to identify individuals at all, believing that the societal costs outweigh the benefits.

But before undertaking such drastic action, the consequences of following these alternative paths must be considered.
To take an example outside of DHS’s non-criminal travel interests, some people prefer to use human eyewitness identification rather than computerized facial recognition.

However, eyewitness identification itself has clear issues of bias. The Innocence Project has documented many cases in which eyewitness (mis)identification has resulted in wrongful criminal convictions which were later overturned by biometric evidence.

“Mistaken eyewitness identifications contributed to approximately 69% of the more than 375 wrongful convictions in the United States overturned by post-conviction DNA evidence.
Inaccurate eyewitness identifications can confound investigations from the earliest stages. Critical time is lost while police are distracted from the real perpetrator, focusing instead on building the case against an innocent person.
Despite solid and growing proof of the inaccuracy of traditional eyewitness ID procedures – and the availability of simple measures to reform them – traditional eyewitness identifications remain among the most commonly used and compelling evidence brought against criminal defendants.”
Innocence Project, Eyewitness Identification Reform, https://innocenceproject.org/eyewitness-identification-reform/
For more information on eyewitness misidentification, see my November 24, 2020 post on Archie Williams (pictured above) and Uriah Courtney.
Do we really want to dump computerized artificial intelligence and facial recognition, only to end up with manual identification processes that are proven to be even worse?
Biometrics enhances accuracy without adversely impacting timeliness (DHS ICR part three)
This post is adapted from Bredemarket’s November 10, 2021 submitted comments on DHS-2021-0015-0005, Information Collection Request, Public Perceptions of Emerging Technology. See my first and second posts on the topic.
DHS asked respondents to address five questions, including this one:
(2) will this information be processed and used in a timely manner;
Here is part of my response.
I am answering this question from the perspective of a person crossing the border or boarding a plane.

From this perspective, you can ask whether the use of biometric technologies makes the entire process faster or slower.
Before biometric technologies became available, a person would cross a border or board a plane either by conducting no security check at all, or by having a human conduct a manual security check using the document(s) provided by an individual.
- Unless a person was diverted to a secondary inspection process, this manual identification of the person (excluding questions such as “What is your purpose for entering the United States?”) could be accomplished in a few seconds.
- However, manual security checks are much less accurate than technological solutions, as will be illustrated in a future post.
With biometric technologies, it is necessary to measure both the time to acquire the biometric data (in this case a facial image) and the time to compare the acquired data against the known data for the person (from a passport, passenger manifest, or database).
- The time to acquire biometric data continues to improve. In some cases, the biometric data can be acquired “on the move” as the person is walking toward a gate or other entry area, thus requiring no additional time from the person’s perspective.
- The time to compare biometric data can vary. If the source of the known data (such as the passport) is with the person, then comparison can be instantaneous from the person’s perspective. If the source of the known data is a database in a remote location, then the speed of comparison depends upon many factors, including network connections and server computation times. Naturally, DHS designs its systems to minimize this time, ensuring minimal or no delay from the person’s perspective. Of course, a network or system failure can adversely affect this.
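The two timing components above (acquisition and comparison) can be combined into a simple model of the delay the traveler actually experiences. This is an illustrative sketch, not DHS’s actual timing model; the numbers in the usage example are hypothetical.

```python
def perceived_check_seconds(acquire_s: float, compare_s: float,
                            on_the_move: bool = False) -> float:
    """Return the delay a traveler experiences, in seconds.

    "On the move" acquisition (capturing the face while the person walks
    toward the gate) contributes no additional time from the traveler's
    perspective, so only the comparison time remains.
    """
    effective_acquire = 0.0 if on_the_move else acquire_s
    return effective_acquire + compare_s

# Hypothetical numbers, for illustration only.
kiosk_delay = perceived_check_seconds(acquire_s=2.0, compare_s=0.5)
walkthrough_delay = perceived_check_seconds(acquire_s=2.0, compare_s=0.5,
                                            on_the_move=True)
```

Note that a remote-database comparison would raise `compare_s` (network and server time), which is exactly why the person-perspective total can vary even when acquisition is instantaneous.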
In short, biometric evaluation is as fast as, if not faster than, manual processes (provided no network or system failure occurs), and is more accurate than human processes.

Photo caption (beginning truncated): “…located at international airports across the nation streamline the passenger’s entry into the United States.” Photo credit: James Tourtellotte. From https://www.cbp.gov/travel/us-citizens/apc
A world without biometric collection is a world with increased bias and less security and privacy (DHS ICR part two)
This post is adapted from Bredemarket’s November 10, 2021 submitted comments on DHS-2021-0015-0005, Information Collection Request, Public Perceptions of Emerging Technology. See yesterday’s post for additional thoughts on bias, security, and privacy.

Because of many factors, including the 9/11 tragedy that spurred the organization of the Department of Homeland Security (DHS) itself, DHS has been charged to identify individuals as a part of its oversight of customs and border protection, transportation security, and investigations. There are many ways to identify individuals, including:
- What you know, such as a password.
- What you have, such as a passport or token.
- What you are, such as your individual face, fingers, voice, or DNA.
- Where you are.
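The four-factor taxonomy above can be sketched as a small data model. This is a minimal illustration of how the factors combine, with hypothetical field names; it is not any DHS system’s actual logic.

```python
from dataclasses import dataclass


@dataclass
class IdentityClaim:
    """One check result per identification factor listed above."""
    password_ok: bool   # what you know
    document_ok: bool   # what you have
    biometric_ok: bool  # what you are
    location_ok: bool   # where you are


def factors_satisfied(claim: IdentityClaim) -> int:
    """Count how many independent factors support the claimed identity."""
    return sum([claim.password_ok, claim.document_ok,
                claim.biometric_ok, claim.location_ok])
```

Framed this way, the question below is whether a claim with `biometric_ok` removed can still satisfy enough independent factors.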
Is it possible to identify an individual without use of computerized facial recognition or other biometric or AI technologies? In other words, can the “what you are” test be eliminated from DHS operations?
Some may claim that the “what you have” test is sufficient. Present a driver’s license or a passport and you’re identified.
- However, secure documents are themselves secured by the use of biometrics, primarily facial recognition.
- Before a passport is issued, many countries, including the U.S., conduct some type of biometric test to ensure that a single person does not obtain two or more passports.
- Similar tests are conducted before driver’s licenses and other secure documents are issued.
In addition, people attempt to forge secure documents by creating fake driver’s licenses and fake passports. Thus, all secure documents need to be evaluated, in part by confirming that the biometrics on the document match the biometrics of the person presenting the document.
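The confirmation step just described (matching the biometrics on the document against the biometrics of the person presenting it) is a 1:1 comparison. A minimal sketch, assuming templates are feature vectors compared by cosine similarity against a threshold (both the measure and the 0.8 threshold are assumptions for illustration, not any vendor’s actual algorithm):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two biometric template vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def document_matches_presenter(doc_template: list[float],
                               live_template: list[float],
                               threshold: float = 0.8) -> bool:
    """1:1 verification: does the document's stored template match the
    template captured from the person presenting the document?"""
    return cosine_similarity(doc_template, live_template) >= threshold
```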
In short, there is no way to remove biometric identification from the DHS identification operation. And if you did, who knows how each individual officer would judge whether a person is who they claim to be?
Thoughts on bias, security, and privacy (DHS ICR part one)
This post is adapted from Bredemarket’s November 10, 2021 submitted comments on DHS-2021-0015-0005, Information Collection Request, Public Perceptions of Emerging Technology.

The original DHS request included the following sentence in the introductory section:
AI in general and facial recognition in particular are not without public controversy, including concerns about bias, security, and privacy.
Even though this was outside of the topics specifically requiring a response, I had to respond anyway. Here’s (in part) what I said.
The topics of bias, security, and privacy deserve attention. Public misunderstandings on these topics could scuttle all of DHS’ efforts in customs and border protection, transportation security, and investigations.
Regarding bias, it is incumbent upon government agencies, biometric vendors, and other interested parties (including myself as a biometric consultant) to educate and inform the public about issues relating to bias. In the interests of brevity, I will confine myself to two critical points.
- There is a difference between identification of individuals and classification of groups of individuals.
- Most of the public was introduced to the topic of biometric bias via the 2018 study Gender Shades (at the http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf URL). Unfortunately, the popular descriptions of this study confused several issues.
- The summary at the top of the Gender Shades website http://gendershades.org/ clearly frames the question asked by the study: “How well do IBM, Microsoft, and Face++ AI services guess the gender of a face?” As the study title and its summary clearly state, the study only attempted to classify the genders of faces.
- This is a different problem than the problem addressed in customs and border protection, transportation security, and investigations applications: namely, the identification of an individual. If someone purporting to be me attempts to board a plane, DHS does not care whether I am male, female, gender fluid, or anything else related to gender. DHS only cares about my individual identity.
- It is imperative that any discussion of bias as related to DHS purposes confine itself to the DHS use case of identification of individuals.
- Different algorithms exhibit different levels of bias (and different types of bias) when identifying individuals.
- While Gender Shades did not directly address this issue, it turns out that it is possible to identify differences in individual identification between different genders, races, and ages.
- The National Institute of Standards and Technology (NIST) has conducted ongoing studies of the accuracy and performance of face recognition algorithms. In one of these tests, the FRVT 1:1 Verification Test (at the https://pages.nist.gov/frvt/html/frvt11.html URL), each tested algorithm is examined for its performance among different genders, races (with nationality used as a proxy for race), and ages.
- While neither IBM nor Microsoft (two of the three algorithm providers studied in Gender Shades) has submitted algorithms to the FRVT 1:1 Verification Test, over 360 1:1 algorithms have been tested by NIST.
- In a 2019 report issued by NIST on demographic effects (at the https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf URL), NIST concluded that the tested algorithms “show a wide range in accuracy across developers, with the most accurate algorithms producing many fewer errors.”
- It is possible to look at the data for each individual algorithm to see detailed information on the algorithm’s performance. Click on each 1:1 algorithm to see its “report card,” including demographic results.
However, even NIST tests are just that – tests. Performance of a research algorithm on a NIST test with NIST data does not guarantee the same performance of an operational algorithm in a DHS system with DHS data.
As DHS implements biometric systems for its purposes of customs and border protection, transportation security, and investigations, DHS not only needs to internally measure the overall accuracy of these systems using DHS algorithms and data, but also needs to internally measure accuracy when these demographic factors are taken into account. While even highly accurate results may not be perceived as such by the public (the anecdotal tale of a single inaccurate result may outweigh stellar statistical accuracy in the public’s mind), such accuracy measurements are essential for the DHS to ensure that it is fulfilling its mission.
Regarding security and privacy, which are intertwined in many ways, there are legitimate questions regarding how the use of biometric technologies can detract or enhance the security and privacy of individual information. (I will confine myself to technology issues, and will not comment on the societal questions regarding knowledge of an individual’s whereabouts.)
- Data, including facial recognition vectors or templates, is stored in systems that may themselves be compromised. This is the same issue that is faced by other types of data that may be compromised, including passwords. In this regard, the security of facial recognition data is no different than the security of other data.
- In some of the DHS use cases, it is not only necessary to store facial recognition vectors or templates, but it is also necessary to store the original facial images. These are not needed by the facial recognition algorithms themselves, but by the humans who review the results of facial algorithm comparisons. As long as we continue to place facial images on driver’s licenses, passports, visas, and other secure identity documents, the need to store these facial images will continue and cannot be avoided.
- However, one must ensure that the storage of any personally identifiable information (including Social Security Numbers and other non-biometric data) is secure, and that the PII is only available on a need-to-know basis.
- In some cases, the use of facial recognition technologies can actually enhance privacy. For example, take the moves by various U.S. states to replace their existing physical driver’s licenses with smartphone-based mobile driver’s licenses (mDLs). These mDL applications can be designed to only provide necessary information to those viewing the mDL.
- When a purchaser uses a physical driver’s license to buy age-restricted items such as alcohol, the store clerk viewing the license is able to see a vast amount of PII, including the purchaser’s birthdate, full name, residence address, and even height and weight. A dishonest store clerk can easily misuse this data.
- When a purchaser uses a mobile driver’s license to buy age-restricted items, most of this information is not exposed to the store clerk viewing the license. Even the purchaser’s birthdate is not exposed; all that the store clerk sees is whether or not the purchaser is old enough to buy the restricted item (for example, over the age of 21).
- Therefore, use of these technologies can actually enhance privacy.
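The selective-disclosure idea behind mDLs (real mDLs are specified by ISO/IEC 18013-5) can be sketched in a few lines: the verifier receives only the yes/no answer it needs, never the birthdate itself. This is an illustrative sketch, not an actual mDL implementation.

```python
from datetime import date


def is_over(birthdate: date, years: int, today: date) -> bool:
    """Return only the boolean the store clerk needs ("is this person
    at least `years` old?"), without ever disclosing the birthdate."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    return age >= years
```

A clerk’s terminal calling `is_over(..., 21, date.today())` learns one bit of information, in contrast to the full PII printed on a physical license.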
I’ll be repurposing other portions of my response as new blog posts over the next several days.
When the old Dollar General (that is really new) meets the new DoorDash (that is really old)
I briefly alluded to Dollar General in a post that I wrote back in June.

The picture above is what comes to mind when I think of Dollar General. In fact, it looks similar to a Dollar General that I’ve seen outside of Huntsville, Alabama: just a building with a parking lot out in a field by a major road. You can hear the crickets chirping at night.
Not the kind of place where you’d expect to see a lot of futurists connecting a spectrum of innovation where human and biological system designs interact together seamlessly.
But as I noted in June:
Yes, even Dollar General is embracing technology, but as far as I can tell it is concentrating on consumer-facing technology and hasn’t adopted blockchain yet. But I could be wrong.
From https://bredemarket.com/2021/06/16/yes-walmart-is-a-technology-company/
I failed to quote from the linked article at the time, which dates from 2019.
Digital is becoming a “big part” of customers’ lives, Dollar General chief executive Todd Vasos said last year.
Dollar General is also building a digital strategy because customers who redeem digital savings coupons and use the new Dollar General app, released last year (2018), spend about twice as much on average as regular shoppers….
It’s not a surprise that Dollar General has been slow to embrace digital. The company’s core customers make about $40,000 a year per household, more than $20,000 below the national average.
Because of the income gap, Dollar General’s main customers are often “behind the curve” on new technology….But smartphones are ubiquitous now, and about 85% of Dollar General’s customers use one, in line with the national average.
From https://www.cnn.com/2019/05/14/business/dollar-general-app/index.html
Well, now Dollar General customers have a new way to use their smartphones.
Dollar General (NYSE: DG) today announced a partnership with DoorDash (NYSE: DASH), the nation’s leading last-mile logistics platform, to offer on-demand delivery of household essential items, including food, snacks, cleaning supplies, and more, at everyday low prices customers trust Dollar General to provide….
On-demand delivery from DoorDash is currently available from more than 9,000 Dollar General stores with plans to expand to more than 10,000 locations by December 2021. Dollar General and DoorDash initially piloted a program in summer 2021 with approximately 600 stores in rural and metropolitan communities.
From https://newscenter.dollargeneral.com/news/dollar-general-and-doordash-announce-partnership-to-offer-on-demand-delivery-of-everyday-essentials.htm
In the minds of some, Dollar General seems very old school while DoorDash seems very cutting-edge. But behind the scenes, Dollar General provides as much tech innovation as another rural success story, Cracker Barrel. And when you think about it, DoorDash is just a warmed-over techie delivery service.

And delivery services have been around forever.
Louie, take a look at this! Free FTR FST Friday in Ontario
Another Friday, another all-day event.

FTR FST (“future fest”), sponsored by 4th Sector Innovations, SwoopIn, and several other organizations, will be held on Friday, November 12 in downtown Ontario, California. While I’m primarily going for the “professional development” part, FTR FST also features creative expression (including food trucks, which appropriately fall into the “creative expression” category), collaboration, and a tech showcase.
The professional development schedule includes the following sessions, among others:
- A keynote presentation from Colin Mangham entitled “Days of Future Past.” According to FTR FST, the topic will be biomimicry.
Biomimicry is the practice of learning from and emulating life’s genius to create more efficient, elegant, and sustainable designs. It’s a problem-solving method, a sustainability ethos, an innovation approach, a change movement, and a new way of viewing and valuing nature. In practice it’s dedicated to reconnecting people with nature, and aligning human systems with biological systems.
As such, our aim is to connect a spectrum of innovation where human and biological system designs interact together seamlessly. Our team offers education and consulting to apply biological insights to systematic sustainability challenges. Our collaborative partnerships and services support interdisciplinary dialogue across industry sectors and regions, while reconnecting all of us to the local ecosystem that supports us.
OK, at this point some of you are saying to yourselves, “THAT kind of conference.”

But frankly, there’s just as much value in approaching problems from a futuristic sustainability view as there is in approaching problems from a more traditional program management process (or Shipley process, or whatever), or even from a more old school sustainability view as espoused by Broguiere’s and the late Huell Howser.

See, it all ties together. After all, the new school 4th Sector Innovations is less than a mile from the decidedly old school Graber Olive House (featured in one of Howser’s “Louie, take a look at this!” TV shows).
I seem to have strayed from my original topic…
Anyway, let’s refocus and return to some of the other professional development sessions at this Friday’s FTR FST.
- The workshop “Navigating Cashflow” by Gilbert Wenseslao, Chase Bank.
- The workshop “Accessing Grants for Growth” by Pershetta Slack, The Funded Intensive. (NOW they hold this workshop. One of the previous presenters at the Ontario IDEA Exchange just finished submitting a grant application, and probably could have used this session.)
- The workshop “Next Gen Cyber Security” by Erik Delgadillo, SecLex.
- The workshop “The Evolution of Mobility” by Maritza Berger at Piaggio Fast Forward.
- A panel (participants unidentified) on equalizing opportunity.
- Vendor spotlights.
After 3:00 pm, FTR FST transitions to less intensive sessions. Bring the kids! The complete schedule can be found here.
You can register for FTR FST here. Oh, and one new wrinkle: attendance at the professional development sessions is now FREE, thanks to the event sponsors.
FTR FST will be at 4th Sector Innovations, 404 N Euclid Avenue in Ontario.

Perhaps our privacy is REALLY threatened
Technology often advances more quickly than our society’s ability to deal with the ramifications of technology.
For example, President Eisenhower’s effort to improve our national defense via construction of a high-speed interstate highway system led to a number of unintended consequences, including the devastation of city downtown areas that were now being bypassed by travelers.
There are numerous other examples.
The previously unknown consequences of biometric technology
One way in which technology has outpaced society is by developing tools that unintentionally threaten individual privacy. For Bredemarket clients and potential clients, one relevant example of this is the ability to apply biometric technologies to previously recorded photographic, video, and audio content. (I won’t deal with real-time here.)
Hey, remember that time in 1969 that you were walking around in a Ku Klux Klan costume and one of your fraternity buddies took a picture of you? Back then you and your buddy had no idea that in future decades someone could capture a digital copy of that picture and share it with millions of people, and that one of those millions of people could use facial recognition software to compare the face in the picture with a known image of your face, and positively determine that you were the person parading around like a Grand Wizard.
Of course, there are also positive applications of biometric technology on older material. Perhaps biometrics could be used to identify an adoptee’s natural birth mother from an old picture. Or biometrics could be used to identify that a missing person was present in a train station on September 8, 2021 in the company of another (identified) person.
But regardless of the positive or negative use case, biometric identification provides us with unequalled capability to identify people who were previously recorded. Something that couldn’t have been imagined years and years ago.
Well, it couldn’t have been imagined by most of us, anyway.
Enter Carl Sagan (courtesy Elena’s Short Wisdom)
As a WordPress user (this blog and the Bredemarket website are hosted on WordPress), I subscribe to a number of other WordPress blogs. One of these blogs is Short Wisdom, authored by Elena. Her purpose is to collect short quotes from others that succinctly encapsulate essential truths.
Normally these quotes are of the inspirational variety, but Elena posted something today that applies to those of us concerned with technology and privacy.
This is a quote from Carl Sagan.
“Might it be possible at some future time, when neurophysiology has advanced substantially, to reconstruct the memories or insight of someone long dead?…It would be the ultimate breach of privacy.”
The quote is taken from Broca’s Brain: Reflections on the Romance of Science, originally published in 1979.

The future is not now…yet
Obviously such technology did not exist in 1979, and doesn’t exist in 2021 either.
Even biometric identification of living people via “brain wave” biometrics hasn’t been validated at any meaningful scale; last month’s study only included 15 people. Big whoop.
But it’s certainly possible that this ability to reconstruct the memories and insights of the deceased could exist at some future date. Some preliminary work has already been done in this area.
If this technology ever becomes viable and the memories of the dead can be accessed, then the privacy advocates will REALLY howl.
And the already-deceased privacy advocates will be able to contribute to the conversation. Perhaps Carl Sagan himself will posthumously share some thoughts on the ongoing NIST FRVT results.
He can even use technology to sing about the results.
Pangiam is flying high on its acquisitions
Pangiam, the company that acquired both the Metropolitan Washington Airports Authority product veriScan and the Trueface company and product, is continuing to establish itself in the airport market (while pursuing other markets).

Forbes recently published this article:
Delta Airlines, the Transportation Security Administration (TSA), and a travel tech company called Pangiam have partnered up to bring facial recognition technology to the Hartsfield–Jackson Atlanta International Airport (ATL).
As of next month, Delta SkyMiles members who use the Fly Delta app and have a TSA PreCheck membership will be able to simply look at a camera to present their “digital ID” and navigate the airport with greater ease. In this program, a customer’s identity is made up of a SkyMiles member number, passport number and Known Traveler Number.
Of course, TSA PreCheck enrollment is provided by three other companies…but I digress. (I’ll digress again in a minute.)
Forbes goes on to say that this navigation will be available at pre-airport check in (on the Fly Delta app), bag drop (via TSA PreCheck), security (again via TSA PreCheck), and the gate.
Incidentally, this illustrates how security systems from different providers build upon each other. Since I was an IDEMIA employee at the time that IDEMIA was the only company that performed TSA PreCheck enrollment, I was well aware (in my super-secret competitive intelligence role) how CLEAR touted the complementary features of TSA PreCheck in its own marketing.
Now I have no visibility into Pangiam’s internal discussions, but the company obviously has a long-term growth plan that it is executing.
So what will its next step be?