You’re Not Lost in the Supermarket. The Supermarket Knows Exactly Who and Where You Are.

I’m all lost in the supermarket
I can no longer shop happily

Facial recognition laws and regulations vary from jurisdiction to jurisdiction, and organizations applying facial recognition can’t just assume those laws are the same as other privacy laws.

Caution urged as UK supermarkets check out facial recognition

This is the point that UK professor Fraser Sampson makes in a Biometric Update article. Among other things, Sampson (former UK Biometrics & Surveillance Camera Commissioner) notes the following:

This is not just any data processing, this is biometric processing. Major retailers have deep and wide experience handling customer data at macro level, but biometrics are elementally different. Using a biometric recognition system in the UK means they are processing ‘special category data’ and biometric data differs even from other types of special categories. This brings a number of significant risks, obligations and restrictions, some technological, some legal, some societal. The opportunities for missteps are many and the consequences profound. An early decision for the supermarket would be whether they want to be the controller, joint controller or processor; an early mistake would be to think it doesn’t matter.

Data controllers and data processors

For those who don’t inhabit the world of GDPR, the UK GDPR, and other privacy laws, here is DataGrail’s definition of a data controller:

A data controller is a service provider or organization determining the purposes and means of processing personal data. In simpler terms, a data controller decides why and how personal data collection, storage, and use occurs. They have the ultimate responsibility of ensuring data processing activities comply with applicable privacy laws and regulations. Data controllers bear the legal obligations associated with data protection, including providing transparency, obtaining consent, and safeguarding the personal data of data subjects.

Contrast that with a data processor:

Data processors are entities or organizations that process personal data on behalf of data controllers. They act under the authority and instruction of data controllers and handle personal data for the specified purposes defined by the data controller. Data processors are contractually bound to ensure data security and confidentiality. They don’t have the same decision-making power as data controllers and must adhere to the instructions provided by the data controller.

If you’re a supermarket in the United Kingdom, and you’re collecting facial biometric (and other) data, do you want to be a data controller or a data processor? And how will you manage the privacy aspects of your data collection?

Enter the facial recognition vendor

And if you’re a vendor of facial recognition software selling to UK supermarkets, how will you advise them?

And…you should have known this was coming…how will you provide content for your prospects and customers that educates them on the nuances of facial recognition privacy regulations?

If you need help with your facial recognition product marketing, Bredemarket has an opening for a facial recognition client. I can offer:

  • compelling content creation
  • winning proposal development
  • actionable analysis

If Bredemarket can help your stretched staff, book a free meeting with me: https://bredemarket.com/cpa/

Bredemarket has an opening for a facial recognition client.

(All images from Imagen 3)

Facial Recognition Marketing Leaders, Riding on the Metro

I just read a story about a young man who went to the Metro, was identified by a facial recognition system, and was snatched up by authorities.

Who wanted him to fight in Ukraine.

Now some of you are puzzled and wondering why Trump wants to send U.S. troops to fight in Ukraine. That…um…doesn’t sound like him.

I forgot to clarify something. This wasn’t the Washington DC Metro. This was the MOSCOW Metro.

“Timofey Vaskin, a lawyer with the nonprofit human rights project Shkola Prizyvnika, told independent Russian TV channel Dozhd that the illegal detention of those potentially liable for conscription had become a massive problem this year, with young males most at risk of being snatched while using the Moscow metro, which has an advanced facial recognition system in place and police officers on duty at every station.”

For the record, use of facial recognition for this purpose is legal in Russia. In the same way that use of facial recognition for national security purposes is legal in the U.S.A. Because when national security is at stake—or when government agencies say national security is at stake—most notions of INFORMED consent go out the window.

Know your use cases…or get someone who does

Facial recognition isn’t only used for national security, or for after-the-fact analysis of a crime such as the Boston Marathon bombings. It’s also used for less lethal purposes, such as familiar face detection on doorbell cameras…except in Illinois.

If you are marketing a facial recognition product, you need to understand all the different use cases for facial recognition, which use cases your product marketing should address, and which it should not.

And if you need help with your facial recognition product marketing, Bredemarket has an opening for a facial recognition client. I can offer:

  • compelling content creation
  • winning proposal development
  • actionable analysis

If Bredemarket can help your stretched staff, book a free meeting with me: https://bredemarket.com/cpa/

Imagen 3. Bredemarket has client openings.

How Can You Maximize Your Facial Recognition Or Cybersecurity Marketing Impact?

(This news was originally supposed to be embargoed until Monday April 21, but…well…things happen.)

Facial recognition and cybersecurity marketing leaders,

Stretched?

Is a stretched team holding you back from creating stellar marketing materials? Are competitors taking your prospects from you while you remain silent?

I’m John Bredehoft from Bredemarket, and I currently have TWO openings to act as your on-demand marketing muscle for facial recognition or cybersecurity:

  • compelling content creation
  • winning proposal development
  • actionable analysis

CPA?

That is, content, proposal, analysis.

Bias?

Bias can be good when it’s a bias to action.

Satisfy your immediate needs and book a call: https://bredemarket.com/cpa/

Zoom Scam With Faces

An interesting variant on fraudulent deepfake scams.

Kenny Li of Manta fame was sucked into a scam attempt, but spotted the scam before any damage was done.

Li responded to a message from a known contact, which resulted in a Telegram conversation, which resulted in a Zoom call.

“In the call, there were team members who had their cameras on, and [the] Manta founder could see their faces. He mentioned that “Everything looked very real. But I couldn’t hear them.” Then came the “Zoom update required” prompt…”

Li didn’t fall for it.

(Imagen 3)

And one more thing…

The formal announcement is embargoed until Monday, but Bredemarket has TWO openings to act as your on-demand marketing muscle for facial recognition or cybersecurity:

  • compelling content creation
  • winning proposal development
  • actionable analysis

Book a call: https://bredemarket.com/cpa/ 

The Facial Recognition Vendor Is Not At Fault If You Don’t Upgrade Your Software

This is the second time that I’ve seen something like this, so I thought I’d bring attention to it.

Biometric Update recently published a story about an Australian agency that is no longer using Cognitec facial recognition software.

Why? Because the facial recognition software the agency has is not accurate enough.

Note “the facial recognition software the agency has.” There’s a story here.

Police and Counter-terrorism Minister Yasmin Catley clarifies that Cognitec has released numerous updates to the product since its deployment, but the police did not purchase them. As with other developers, Cognitec’s legacy algorithms have higher error rates for various demographic groups.

Important clarification.

Now perhaps the agency had its reasons for not upgrading the Cognitec software, and for using other software instead.

But governments and enterprises should not use old facial recognition software. Unless they have to run the software on computers running PC-DOS. Then they have other problems.

(A little aside: when I prompted Google Gemini to create the Imagen 3 image for this post, I asked it to create an image of a 1980s IBM PC running MS-DOS. Those in the know realize my prompt was incorrect. I should have requested a 1980s IBM PC running PC-DOS, not MS-DOS. PC-DOS was the version of MS-DOS that IBM licensed for its own computers, leaving Microsoft able to provide MS-DOS to the “clone computers” that eventually eclipsed IBM’s own offering.)

The Chinese Version: How to Recognize People From Quite a Long Way Away

Remember in January when OpenAI announced some great achievement, and then a few days later we learned that the Chinese firm DeepSeek could boast the same performance, only much better?

These Chinese leapfrogs don’t only happen in artificial intelligence.

One kilometer facial capture

In February, I wrote about something that I initially heard of via Biometric Update. My post, “How to Recognize People From Quite a Long Way Away,” told of an effort at Heriot-Watt University in Edinburgh, Scotland, in which the researchers used light detection and ranging (LiDAR) to capture and evaluate faces from as far as a kilometer away.

In normal circumstances, we capture faces from a distance of mere meters. So one kilometer facial capture is impressive.

Or is it?

One hundred kilometer facial capture

Some Chinese researchers replied, “Hold my Tsingtao,” according to a Chinese Journal of Lasers paper (in Chinese) that was reported on by Live Science (in English). (And again, I learned of this via Biometric Update.)

Scientists in China have created a satellite with laser-imaging technology powerful enough to capture human facial details from more than 60 miles (100 kilometers) away….

According to the South China Morning Post, the scientists conducted a test across Qinghai Lake in the northwest of the country with a new system based on synthetic aperture lidar (SAL), a type of laser radar capable of constructing two-dimensional or three-dimensional images.

Qinghai Lake, from Google Maps.

Writers will note that the acronym SAL incorporates the L from the acronym LiDAR. This is APO, or acronym piling on.

Since I cannot read the original report, I don’t know if the researchers actually performed tests with actual faces. But supposedly SAL “detected details as small as 0.07 inches (1.7 millimeters),” based in part upon the benefits of its technology:

[T]his new system operates at optical wavelengths, which have much shorter wavelengths than microwaves and produce clearer images (though microwaves are better for penetrating into materials, because their longer wavelengths aren’t scattered or absorbed as easily).

All the cited articles make a big deal about the 100 kilometer distance’s equivalence to the boundaries of space. But before you get too excited, remember that a space-hosted SAL will be ABOVE any human subjects, and therefore will NOT capture the face at an optimal angle…

Can you identify Bart Everson’s face from this picture? For all I know it could be Moby. CC-BY-2.0, https://www.flickr.com/photos/editor/158206278.

…unless you’re lying on the beach sunbathing and therefore facing TOWARD space where all the Chinese satellites can see you.

Oh, and one more thing. The Chinese tests were conducted in optimal weather conditions, and obviously you can’t get the same results in bad weather.

But in the ideal conditions, perhaps you CAN be identified remotely.

(Snowman from Imagen 3)

Examples of Biometric Technology Misuse

If I become known for anything in biometrics, I want to be known for my extremely frequent use of the words “investigative lead.” 

Whether you are talking about DNA or facial recognition, these types of biometric evidence should not be the sole evidence used to arrest a person.

For an example of why DNA shouldn’t be your only evidence, see my recent post about Amanda Knox.

Facial recognition misuse in law enforcement

Regarding facial recognition, I wrote this in a social media conversation earlier today:

“Facial recognition CAN be used as a crowd checking tool…with proper governance, including strict adherence to a policy of only using FR as an investigative lead, and requiring review of potential criminal matches by a forensic face investigator. Even then, investigative lead ONLY. Same with DNA.”

I received this reply:

“It’s true but in my experience cops rarely follow any rules.”

Now I could have claimed that this view was exaggerated, but there are enough examples of cops who DON’T follow the rules to tarnish all of them. 

Revisiting Robert Williams’ Detroit arrest

I’ve already addressed the sad story of Robert Williams, who was “wrongfully arrested based upon faulty facial recognition results.”

At the time, I did not explicitly share the circumstances behind Williams’ arrest:

“The complaint alleges that the surveillance footage is poorly lit, the shoplifter never looks directly into the camera and still a Detroit Police Department detective ran a grainy photo made from the footage through the facial recognition technology.”

There’s so much that isn’t said here, such as whether a forensic face examiner made a definitive conclusion, or if the detective just took the first candidate from the list and ran with it.

But I am willing to bet that there was no independent evidence placing Williams at the shop location.

Why this matters

The thing that concerns me about all this? It just provides ammo to the people who want to ban facial recognition entirely.

Not realizing that the alternative—manual witness (mis)identification—is far more inaccurate and far more racist.

But the controversy would pretty much go away if criminal investigators only used facial recognition and DNA as investigative leads.

How Much Does Synthetic Identity Fraud Cost?

Identity firms really hope that prospects understand the threat posed by synthetic identity fraud, or SIF.

I’m here to help.

(Synthetic identity AI image from Imagen 3.)

Estimated SIF costs in 2020

In an early synthetic identity fraud post in 2020, I referenced a Thomson Reuters (not Thomas Reuters) article from that year which quoted synthetic identity fraud figures all over the map.

  • My own post referenced the Auriemma Group estimate of a $6 billion cost to U.S. lenders.
  • McKinsey preferred to use a percentage estimate of “10–15% of charge offs in a typical unsecured lending portfolio.” However, this may not be restricted to synthetic identity fraud, but may include other types of fraud.
  • Thomson Reuters quoted Socure’s Johnny Ayers, who estimated that “20% of credit losses stem from synthetic identity fraud.”

Oh, and a later post that I wrote quoted a $20 billion figure for synthetic identity fraud losses in 2020. Plus this is where I learned the cool acronym “SIF” to refer to synthetic identity fraud. As far as I know, there is no government agency with the acronym SIF, which would of course cause confusion. (There was a Social Innovation Fund, but that may no longer exist in 2025.)

Never Search Alone, not National Security Agency. AI image from Imagen 3.

Back to synthetic identity fraud, which reportedly resulted in between $6 billion and $20 billion in losses in 2020.

Estimated SIF costs in 2025

But that was 2020.

What about now? Let’s visit Socure again:

The financial toll of AI-driven fraud is staggering, with projected global losses reaching $40 billion by 2027, up from US$12.3 billion in 2023 (CAGR 32%), driven by sophisticated fraud techniques and automation, such as synthetic identities created with AI tools.

Again, this includes non-synthetic fraud, but it’s a good number for the high end. While my FTC fraud post didn’t break out synthetic identity fraud figures, Plaid cited a $1.8 billion figure for the auto industry alone in 2023, and Mastercard cited a $5 billion figure.
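Incidentally, the quoted growth figures hang together arithmetically. Here is a quick sanity check of the $12.3 billion-to-$40 billion projection using standard compound-growth math (my own calculation, not part of Socure’s report):

```python
# Sanity-check the quoted fraud-loss figures: $12.3B (2023) growing to ~$40B (2027).
start_b = 12.3        # USD billions, 2023
target_b = 40.0       # USD billions, projected for 2027
years = 2027 - 2023   # four compounding periods

# Implied compound annual growth rate: (target / start) ** (1 / years) - 1
implied_cagr = (target_b / start_b) ** (1 / years) - 1
print(f"Implied CAGR: {implied_cagr:.1%}")  # ~34.3%

# Conversely, compounding $12.3B at the quoted 32% for four years:
at_32_pct = start_b * 1.32 ** years
print(f"$12.3B at 32% for {years} years: ${at_32_pct:.1f}B")  # ~$37.3B
```

So the quoted 32% CAGR and the $40 billion projection are roughly, though not exactly, consistent with each other.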

But everyone agrees on a figure of billions and billions.

The real Carl Sagan.
The deepfake Carl Sagan.

(I had to stop writing this post for a minute because I received a phone call from “JP Morgan Chase,” but the person didn’t know who they were talking to, merely asking for the owner of the phone number. Back to fraud.)

Reducing SIF in 2025

In a 2023 post, I cataloged four ways to fight synthetic identity fraud:

  1. Private databases.
  2. Government documents.
  3. Government databases.
  4. A “who you are” test with facial recognition and liveness detection (presentation attack detection).

Ideally an identity verification solution should use multiple methods, and not just one. It doesn’t do you any good to forge a driver’s license if AAMVA doesn’t know about the license in any state or provincial database.
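The layering idea can be sketched in code. In this illustrative sketch, every function name and check is a hypothetical stub standing in for a real vendor or government API; the point is simply that a forged identity must clear every layer, not just one:

```python
# Hypothetical sketch of layered synthetic-identity checks.
# All function names and check logic are invented for illustration;
# real solutions call vendor and government services for each layer.

def check_private_database(applicant):
    """Stub: does a credit bureau or consortium database know this identity?"""
    return applicant.get("in_private_db", False)

def check_government_document(applicant):
    """Stub: does the presented driver's license pass document authentication?"""
    return applicant.get("document_valid", False)

def check_government_database(applicant):
    """Stub: does the issuing authority confirm the license (e.g., via AAMVA)?"""
    return applicant.get("dmv_match", False)

def check_face_liveness(applicant):
    """Stub: does a live selfie match the document photo, with presentation
    attack detection confirming a real person is present?"""
    return applicant.get("selfie_match", False)

def verify_identity(applicant):
    """Require every layer to pass: a well-forged license that fools document
    authentication still fails when no DMV record exists."""
    checks = [
        check_private_database,
        check_government_document,
        check_government_database,
        check_face_liveness,
    ]
    return all(check(applicant) for check in checks)

# A plausible forgery passes three layers but fails the DMV lookup.
forged = {"in_private_db": True, "document_valid": True,
          "dmv_match": False, "selfie_match": True}
print(verify_identity(forged))  # False
```

The design choice worth noting is the `all()`: any single failing layer rejects the identity, which is exactly why forging a license alone doesn’t help the fraudster.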

And if you need an identity content marketing expert to communicate how your firm fights synthetic identities, Bredemarket can help with its content-proposal-analysis services.

Find out more about Bredemarket’s “CPA” services.