Amazon’s Take on “Familiar Faces” is Not Available Everywhere

(Part of the biometric product marketing expert series)

Biometric Update reports that Amazon’s Ring products are offering a feature called “Familiar Faces.”

“In September, Amazon revealed a revamped Ring camera lineup featuring two notable AI features, Familiar Faces and Search Party. Familiar Faces uses facial recognition and lets users tag neighbors or friends so future alerts identify them by name rather than generic motion.”

If this sounds, um, familiar, it’s because Google also has a similar feature, called familiar face alerts, in its Nest offerings.

And like Google, Amazon’s Familiar Faces won’t be available to everyone. If you are, um, familiar with the acronym BIPA, you will know why.

“The feature is slated for December, though it will be disabled in places with stricter biometric laws such as Illinois, Texas, and Portland.”
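Feature gating like this usually reduces to a jurisdiction check before biometric enrollment is even offered. Here is a minimal, purely illustrative sketch; the jurisdiction codes and function name are my own assumptions, not Ring’s actual implementation:

```python
# Purely illustrative sketch: gating a facial recognition feature by
# jurisdiction. The codes and list are assumptions, not Ring's code.
RESTRICTED_JURISDICTIONS = {"US-IL", "US-TX", "US-OR-Portland"}

def familiar_faces_available(jurisdiction: str) -> bool:
    """Return True if the feature may be offered in this jurisdiction."""
    return jurisdiction not in RESTRICTED_JURISDICTIONS

print(familiar_faces_available("US-CA"))  # True
print(familiar_faces_available("US-IL"))  # False
```

Of course, a real deployment also needs consent flows and record-keeping, not just geography, but the gate itself can be this simple.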

Is Illinois’ Biometric Information Privacy Act (BIPA) Nullified in Concert Venues?

Illinois music lovers, wanna see a concert? Sounds like you may have to surrender your BIPA protections. 

Specifically, if the concert venue uses Ticketmaster (who doesn’t?), and if the concert venue captures your biometric data without your consent, you may not have legal recourse.

“These Terms of Use (“Terms”) govern your use of Live Nation and Ticketmaster’s websites and applications…

“The Terms contain an arbitration agreement and class action waiver—along with some limited exceptions—in Section 14, below. Specifically, you and we agree that any dispute or claim relating in any way to the Terms, your use of the Marketplace, or products or services sold, distributed, issued, or serviced by us or through us, will be resolved by binding arbitration, rather than in court…

“By agreeing to arbitration, you and we each waive any right to participate in a class action lawsuit or class action arbitration, except those already filed and currently pending as of August 12, 2025.”

See https://legal.ticketmaster.com/terms-of-use/

A Californian, an Illinoisan, and a Dane Walk Into a Videoconference

I was recently talking with a former colleague, whose name I am not at liberty to reveal, and they posed a question that stymied me.

What happens when multiple people join a videoconference, and they all reside in jurisdictions with different privacy regulations?

An example will illustrate what would happen, and I volunteer to be the evil party in this one.

The videoconference

Let’s say:

On a particular day in April 2026, a Californian launches a videoconference on Zoom.

Imagen 4.

The Californian invites an Illinoisan.

Imagen 4.

And also invites a Dane.

Imagen 4.

And then—here’s the evil part—records and gathers images from the videoconference without letting the other two know.

The legal violations

The Illinois Biometric Information Privacy Act, or BIPA, requires written consent before acquiring the facial geometry of Abe, our Illinoisan. And if Cali John, our Californian, doesn’t obtain that written consent, he could lose a lot of money.

And what about Freja? Well, if the amendment to the Danish Copyright Act takes effect on March 31, 2026 as expected, Cali John can get into a ton of trouble if he uses the video to create a realistic, digitally generated imitation of Freja. Again, consent is required. Again, there can be monetary penalties if you don’t get that consent.

But there’s another question we have to consider.

The vendor responsibility 

Does the videoconference provider bear any responsibility for the violations of Illinois and Danish law?

Since I used Zoom as my example, I looked at Zoom’s Terms of Service.

TL;DR: not our problem, that’s YOUR problem.

“5. USE OF SERVICES AND YOUR RESPONSIBILITIES. You may only use the Services pursuant to the terms of this Agreement. You are solely responsible for Your and Your End Users’ use of the Services and shall abide by, and ensure compliance with, all Laws in connection with Your and each End User’s use of the Services, including but not limited to Laws related to recording, intellectual property, privacy and export control. Use of the Services is void where prohibited.”

But such requirements haven’t stopped BIPA lawyers from filing lawsuits against deep-pocketed software vendors. Remember when Facebook settled for $650 million?

So remember what could happen the next time you participate in a multinational, multi-state, or even multi-city videoconference. Hope your AI note taker isn’t capturing screenshots.

Why Does TPRM Fail? Not Because of the TPRM Software Providers.

For years I have maintained that the difficulties in technology are not because of the technology itself.

Technology can do wonderful things.

The difficulties lie with the need for people to agree to use the technology.

And not plead ignorance by saying “I know nothing.”

(Image of actor John Banner as Sgt. Schultz on Hogan’s Heroes is public domain.)

Case in point

I just saw an article with the title “TPRM weaknesses emerge as relationship owners fail to report red flags.”

Unlike some clickbait-like article titles, this one from Communications Today succinctly encapsulates the problem up front.

It’s not that the TPRM software is failing to find the red flags. Oh, it finds them!

But the folks at Gartner discovered something:

“A Gartner survey of approximately 900 third-party relationship owners…revealed that while 95% saw a third-party red flag in the past 12 months, only around half of them escalate it to compliance teams.”

Among other things, the relationship owners worry about “the perceived return on investment (ROI) of sharing information.”

And that’s not a software issue. It’s a process issue.

wildebeests on a stairway, young to old, with the oldest wildebeest possessing a trophy
Wildebeest maturity model via Imagen 3.

No amount of coding or AI can fix that.

And this is not unique to the cybersecurity world. Let’s look at facial recognition.

Another case in point

I’ve said this over and over, but for U.S. criminal purposes, facial recognition results should ONLY be used as investigative leads.

It doesn’t matter whether they’re automated results, or if they have been reviewed by a trained forensic face examiner. 

Facial recognition results should only be used as investigative leads.

Sorry for the repetition, but some people aren’t listening.

But it’s not the facial recognition vendors. Bredemarket has worked with numerous facial recognition vendors over the years, and of those who work with law enforcement, ALL of them have emphatically insisted that their software results should only be used as investigative leads.

All of them. Including…that one.

But the vendors have no way to control the actions of customers who feed poor-quality data into their systems, get a result…and immediately run out and get an arrest warrant without collecting corroborating evidence.

And that’s not a software issue. It’s a process issue.

No amount of coding or AI can fix that.

I hope the TPRM folks don’t mind my detour into biometrics, but there’s a good reason for it.

Product marketing for TPRM and facial recognition

Some product marketers, including myself, believe that it’s not enough to educate prospects and customers about your product. You also need to educate them about proper use of the product, including legal and ethical concerns.

If you don’t, your customers will do dumb things in Europe, Illinois, or elsewhere—and blame you when they are caught.

Illinois, land of BIPA. I mean Lincoln.

Be a leader in your industry by doing or saying the right thing.

And now here’s a word from our sponsor.

Not the “CPA” guy again…

Bredemarket has openings

There’s a reason why this post specifically focused on cybersecurity and facial recognition.

If you need product marketing assistance with your product, Bredemarket has two openings. One for a cybersecurity client, and one for a facial recognition client. 

I can offer

  • compelling content creation
  • winning proposal development
  • actionable analysis

If Bredemarket can help your stretched staff, book a free meeting with me: https://bredemarket.com/cpa/

Bredemarket has openings. Imagen 3 again.

Revisiting Amazon One

Because my local Amazon Fresh post is taking off, it’s a good time to revisit the “one” thing Uplanders will encounter when they get there.

I’ve talked about Amazon One palm/vein biometrics several times in the past.

Meanwhile, Amazon One is available at over 400 U.S. locations, with more on the way.

And it’s also available (or soon will be) on TP-Link door locks. But the How-To Geek writer is confused:

“TP-Link says that these palm vein patterns are so unique that they can even tell the difference between identical twins, making them safer than regular fingerprint or facial recognition methods.”

Um…fingerprints? Must be a Columbia University grad.

And the TP-Link page for the product has no sales restrictions. Even Illinois residents can buy it. Presumably there’s an ironclad consent agreement with every enrollment to prevent BIPA lawsuits.

(Picture from Imagen 3)

Biometric Product Marketers, BIPA Remains Unaltered

(Part of the biometric product marketing expert series)

You may remember the May hoopla regarding amendments to Illinois’ Biometric Information Privacy Act (BIPA). These amendments do not eliminate the long-standing law, but they do lessen the damages it imposes on offending companies.

Back on May 29, Fox Rothschild explained the timeline:

The General Assembly is expected to send the bill to Illinois Governor JB Pritzker within 30 days. Gov. Pritzker will then have 60 days to sign it into law. It will be immediately effective.

According to the Illinois General Assembly website, the Senate sent the bill to the Governor on June 14.

While the BIPA amendment has passed the Illinois House and Senate and was sent to the Governor, there is no indication that he has signed the bill into law within the 60-day timeframe.

So BIPA 1.0 is still in effect.

As Photomyne found out:

A proposed class action claims Photomyne, the developer of several photo-editing apps, has violated an Illinois privacy law by collecting, storing and using residents’ facial scans without authorization….

The lawsuit contends that the app developer has breached the BIPA’s clear requirements by failing to notify Illinois users of its biometric data collection practices and inform them how long and for what purpose the information will be stored and used.

In addition, the suit claims the company has unlawfully failed to establish public guidelines that detail its data retention and destruction policies.

From https://www.instagram.com/p/C7ZWA9NxUur/.

What is Your Biometric Firm’s BIPA Product Marketing Story?

(Part of the biometric product marketing expert series)

If your biometric firm conducts business in the United States, then your biometric firm probably conducts business in Illinois.

(With some exceptions.)

Your firm and your customers are impacted by Illinois’ Biometric Information Privacy Act, or BIPA.

Including requirements for consumer consent for use of biometrics.

And heavy fines (currently VERY heavy fines) if you don’t obtain that consent.

What is your firm telling your customers about BIPA?

Bredemarket has mentioned BIPA several times in the Bredemarket blog.

But what has YOUR firm said about BIPA?

And if your firm has said nothing about BIPA, why not?

Perhaps the biometric product marketing expert can ensure that your product is marketed properly in Illinois.

Contact Bredemarket before it’s too late.

From https://www.instagram.com/p/C7ZWA9NxUur/.

BIPA Remains a Four-Letter Word

(Part of the biometric product marketing expert series)

If you’re a biometric product marketing expert, or even if you’re not, you’re presumably analyzing the possible effects of the proposed changes to the Biometric Information Privacy Act (BIPA) on your identity/biometric product.

From ilga.gov. Link.

As of May 16, the Illinois General Assembly (House and Senate) passed a bill (SB2979) to amend BIPA. It awaits the Governor’s signature.

What is the amendment? Other than defining an “electronic signature,” the main purpose of the bill is to limit damages under BIPA. The new text regarding the “Right of action” codifies the concept of a “single violation.”

(b) For purposes of subsection (b) of Section 15, a private entity that, in more than one instance, collects, captures, purchases, receives through trade, or otherwise obtains the same biometric identifier or biometric information from the same person using the same method of collection in violation of subsection (b) of Section 15 has committed a single violation of subsection (b) of Section 15 for which the aggrieved person is entitled to, at most, one recovery under this Section.

(c) For purposes of subsection (d) of Section 15, a private entity that, in more than one instance, discloses, rediscloses, or otherwise disseminates the same biometric identifier or biometric information from the same person to the same recipient using the same method of collection in violation of subsection (d) of Section 15 has committed a single violation of subsection (d) of Section 15 for which the aggrieved person is entitled to, at most, one recovery under this Section regardless of the number of times the private entity disclosed, redisclosed, or otherwise disseminated the same biometric identifier or biometric information of the same person to the same recipient.
From ilga.gov. Link. Emphasis mine.
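In plain terms, the amendment deduplicates claims: repeated collections of the same biometric from the same person by the same method count once, and repeated disclosures additionally dedupe on the recipient. A toy sketch of that counting rule (the field names are my illustrative assumptions, and this is not legal advice):

```python
# Toy sketch of the amendment's "single violation" counting rule.
# Field names are illustrative; this is not statutory text or legal advice.

def count_collection_violations(events):
    """Each unique (person, identifier, method) triple counts once,
    no matter how many times the same data was collected."""
    return len({(e["person"], e["identifier"], e["method"]) for e in events})

def count_disclosure_violations(events):
    """Disclosures dedupe on the recipient as well."""
    return len({(e["person"], e["identifier"], e["method"], e["recipient"])
                for e in events})

collections = [
    {"person": "Abe", "identifier": "face", "method": "app"},
    {"person": "Abe", "identifier": "face", "method": "app"},    # repeat: same violation
    {"person": "Abe", "identifier": "face", "method": "kiosk"},  # new method: new violation
]
print(count_collection_violations(collections))  # 2
```

Under the pre-amendment reading, each of those three collection events could have supported its own recovery; after the amendment, only two do.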

So does this mean that Google Nest Cam’s “familiar face alert” feature will now be available in Illinois?

Probably not. As Doug “BIPAbuzz” OGorden has noted:

(T)he amended law DOES NOT CHANGE “Private Right of Action” so BIPA LIVES!

Companies who violate the strict requirements of BIPA aren’t off the hook. It’s just that the trial lawyers—whoops, I mean the affected consumers make a lot less money.

Time for the FIRST Iteration of Your Firm’s UK Online Safety Act Story

By Adrian Pingstone – Transferred from en.wikipedia, Public Domain, https://commons.wikimedia.org/w/index.php?curid=112727

A couple of weeks ago, I asked this question:

Is your firm affected by the UK Online Safety Act, and the future implementation of the Act by Ofcom?

From https://bredemarket.com/2023/10/30/uk-online-safety-act-story/

Why did I mention the “future implementation” of the UK Online Safety Act? Because the passage of the UK Online Safety Act is just the FIRST step in a long process. Ofcom still has to figure out how to implement the Act.

Ofcom started to work on this on November 9, but it’s going to take many months to finalize—I mean finalise things. This is the UK Online Safety Act, after all.

This is the first of four major consultations that Ofcom, as regulator of the new Online Safety Act, will publish as part of our work to establish the new regulations over the next 18 months.

It focuses on our proposals for how internet services that enable the sharing of user-generated content (‘user-to-user services’) and search services should approach their new duties relating to illegal content.

From https://www.ofcom.org.uk/consultations-and-statements/category-1/protecting-people-from-illegal-content-online

On November 9 Ofcom published a slew of summary and detailed documents. Here’s a brief excerpt from the overview.

Mae’r ddogfen hon yn rhoi crynodeb lefel uchel o bob pennod o’n hymgynghoriad ar niwed anghyfreithlon i helpu rhanddeiliaid i ddarllen a defnyddio ein dogfen ymgynghori. Mae manylion llawn ein cynigion a’r sail resymegol sylfaenol, yn ogystal â chwestiynau ymgynghori manwl, wedi’u nodi yn y ddogfen lawn. Dyma’r cyntaf o nifer o ymgyngoriadau y byddwn yn eu cyhoeddi o dan y Ddeddf Diogelwch Ar-lein. Mae ein strategaeth a’n map rheoleiddio llawn ar gael ar ein gwefan.

From https://www.ofcom.org.uk/__data/assets/pdf_file/0021/271416/CYM-illegal-harms-consultation-chapter-summaries.pdf

Oops, I seem to have quoted from the Welsh version. Maybe you’ll have better luck reading the English version.

This document sets out a high-level summary of each chapter of our illegal harms consultation to help stakeholders navigate and engage with our consultation document. The full detail of our proposals and the underlying rationale, as well as detailed consultation questions, are set out in the full document. This is the first of several consultations we will be publishing under the Online Safety Act. Our full regulatory roadmap and strategy is available on our website.

From https://www.ofcom.org.uk/__data/assets/pdf_file/0030/270948/illegal-harms-consultation-chapter-summaries.pdf

If you want to peruse everything, go to https://www.ofcom.org.uk/consultations-and-statements/category-1/protecting-people-from-illegal-content-online.

And if you need help telling your firm’s UK Online Safety Act story, Bredemarket can help. (Unless the final content needs to be in Welsh.) Click below!

What Is Your Firm’s UK Online Safety Act Story?

It’s time to revisit my August post entitled “Can There Be Too Much Encryption and Age Verification Regulation?” because the United Kingdom’s Online Safety Bill is now the Online Safety ACT.

Having passed, eventually, through the UK’s two houses of Parliament, the bill received royal assent (October 26)….

[A]dded in (to the Act) is a highly divisive requirement for messaging platforms to scan users’ messages for illegal material, such as child sexual abuse material, which tech companies and privacy campaigners say is an unwarranted attack on encryption.

From Wired.
By Adrian Pingstone – Transferred from en.wikipedia, Public Domain, https://commons.wikimedia.org/w/index.php?curid=112727

This not only opens up issues regarding encryption and privacy, but also specific identity technologies such as age verification and age estimation.

This post looks at three types of firms that are affected by the UK Online Safety Act, the stories they are telling, and the stories they may need to tell in the future. What is YOUR firm’s Online Safety Act-related story?

What three types of firms are affected by the UK Online Safety Act?

As of now I have been unable to locate a full version of the final Act, but presumably the provisions from this July 2023 version (PDF) have only undergone minor tweaks.

Among other things, this version discusses “User identity verification” in 65, “Category 1 service” in 96(10)(a), “United Kingdom user” in 228(1), and a multitude of other terms that affect how companies will conduct business under the Act.

I am focusing on three different types of companies:

  • Technology services (such as Yoti) that provide identity verification, including but not limited to age verification and age estimation.
  • User-to-user services (such as WhatsApp) that provide encrypted messages.
  • User-to-user services (such as Wikipedia) that allow users (including United Kingdom users) to contribute content.

What types of stories will these firms have to tell, now that the Act is law?

Stories from identity verification services

From Yoti.

For ALL services, the story will vary as Ofcom decides how to implement the Act, but we are already seeing the stories from identity verification services. Here is what Yoti stated after the Act became law:

We have a range of age assurance solutions which allow platforms to know the age of users, without collecting vast amounts of personal information. These include:

  • Age estimation: a user’s age is estimated from a live facial image. They do not need to use identity documents or share any personal information. As soon as their age is estimated, their image is deleted – protecting their privacy at all times. Facial age estimation is 99% accurate and works fairly across all skin tones and ages.
  • Digital ID app: a free app which allows users to verify their age and identity using a government-issued identity document. Once verified, users can use the app to share specific information – they could just share their age or an ‘over 18’ proof of age.
From Yoti.

Stories from encrypted message services

From WhatsApp.

Not surprisingly, message encryption services are telling a different story.

MailOnline has approached WhatsApp’s parent company Meta for comment now that the Bill has received Royal Assent, but the firm has so far refused to comment.

Will Cathcart, Meta’s head of WhatsApp, said earlier this year that the Online Safety Act was the most concerning piece of legislation being discussed in the western world….

[T]o comply with the new law, the platform says it would be forced to weaken its security, which would not only undermine the privacy of WhatsApp messages in the UK but also for every user worldwide. 

‘Ninety-eight per cent of our users are outside the UK. They do not want us to lower the security of the product, and just as a straightforward matter, it would be an odd choice for us to choose to lower the security of the product in a way that would affect those 98 per cent of users,’ Mr Cathcart has previously said.

From Daily Mail.

Stories from services with contributed content

From Wikipedia.

And contributed content services are also telling their own story.

Companies, from Big Tech down to smaller platforms and messaging apps, will need to comply with a long list of new requirements, starting with age verification for their users. (Wikipedia, the eighth-most-visited website in the UK, has said it won’t be able to comply with the rule because it violates the Wikimedia Foundation’s principles on collecting data about its users.)

From Wired.

What is YOUR firm’s story?

All of these firms have shared their stories either before or after the Act became law, and those stories will change depending upon what Ofcom decides.

But what about YOUR firm?

Is your firm affected by the UK Online Safety Act, and the future implementation of the Act by Ofcom?

Do you have a story that you need to tell to achieve your firm’s goals?

Do you need an extra, experienced hand to help out?

Learn how Bredemarket can create content that drives results for your firm.

Click the image below.