Privacy regulations can change when you cross country or even city lines, and they can also change depending on who you are: an individual, a business, or a government agency.
On the other extreme, some entities in some jurisdictions must obtain express written consent. If I am a homeowner in Schaumburg, Illinois, and I use a doorbell camera to identify friends or foes approaching my door, the Biometric Information Privacy Act (BIPA) prohibits me from capturing their biometrics without their consent, and lets them sue me if I do it anyway.
Before you collect PII, check the laws in your jurisdiction first.
Oh, and check the laws in other jurisdictions in case they try to enforce their laws in your jurisdiction.
By the way: if you’re a software or hardware vendor, don’t assume that you bear no responsibility and that only your customer does.
You must educate your customers.
And Bredemarket can help you with my content, proposal, and analysis services.
Yet another state has passed its own data privacy law, with the Oklahoma Consumer Data Privacy Act signed last month and taking effect in 2027. The key particulars:
“OKDPA grants consumers a set of rights…including rights of access, deletion, correction, and portability, and rights to opt-out of targeted advertising, sale, or profiling ‘in furtherance of a decision that produces a legal or similarly significant effect concerning the consumer.’”
As for enforcement:
“Enforcement authority rests with the Oklahoma Attorney General. The bill includes a mandatory 30-day cure period, which does not sunset. The law imposes civil penalties of up to $7,500 per violation.”
As of now, between 19 and 22 states have privacy laws, depending upon how you count.
Some aren’t counting Florida because of its limited scope: it only applies to companies with over $1 billion in revenue.
Some aren’t counting Illinois because BIPA only applies to biometrics.
Some aren’t counting Oklahoma yet because it’s so new.
But we can agree that many states have privacy laws.
For now.
And if some have their way, they will all disappear, replaced by a single uniform federal law. However, the degree to which a federal law would preempt state laws is a matter of debate. The Future of Privacy Forum has addressed preemption here.
And if you need to write about privacy, biometric or otherwise, Bredemarket can help. Click below to book a free meeting with me.
In the past, I have gone on ad nauseam about how mobile driver’s licenses are more private than physical driver’s licenses. Here is how I stated it in July 2024:
“When you hand your physical driver’s license over to a sleazy bartender, they find out EVERYTHING about you, including your name, your birthdate, your driver’s license number, and even where you live.
“When you use a digital mobile driver’s license, bartenders ONLY learn what they NEED to know—that you are over 21.”
Which is extremely limited information.
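The data-minimization idea behind an mDL age check can be sketched in a few lines. This is a simplified illustration, not the actual ISO/IEC 18013-5 protocol (a real mobile driver’s license exchanges cryptographically signed data elements); the record fields and function names here are hypothetical, modeling only *which* attributes each kind of check reveals.

```python
from datetime import date

# Hypothetical license record; every field below is printed on a physical card.
FULL_LICENSE = {
    "name": "Jill Example",
    "address": "123 Main St",
    "license_number": "D1234567",
    "birth_date": date(1968, 7, 4),
}

def physical_license_check(license_data: dict) -> dict:
    # Handing over a physical card reveals every field at once.
    return license_data

def mdl_age_check(license_data: dict, today: date, min_age: int = 21) -> dict:
    # A selective-disclosure age check releases only a boolean attestation,
    # never the name, address, license number, or birth date.
    bd = license_data["birth_date"]
    age = today.year - bd.year - ((today.month, today.day) < (bd.month, bd.day))
    return {f"age_over_{min_age}": age >= min_age}

print(physical_license_check(FULL_LICENSE).keys())   # every field
print(mdl_age_check(FULL_LICENSE, date(2026, 2, 1))) # {'age_over_21': True}
```

The sleazy bartender in the quote above gets the first function’s output; the mDL verifier gets only the second’s.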
But some age verification systems may provide your age in years, without necessarily revealing your exact date of birth.
That single number—whether it is 17, 27, or 57—reveals a lot more than we realize.
Let’s say that we know that Jill is 57 years old. This means that she was born in either 1968 or 1969. If Jill has lived her entire life in the United States, we immediately know several things about her with some certainty.
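The birth-year arithmetic is trivial, which is exactly the point: an age in years narrows the birth year to two candidates. A minimal sketch (the function name is mine, not from any age verification system):

```python
from datetime import date

def possible_birth_years(age: int, today: date) -> tuple[int, int]:
    """Given only an age in years, return the two candidate birth years.

    Someone who is `age` years old today was born either this calendar
    year minus `age` (birthday already passed) or one year earlier
    (birthday still to come).
    """
    return (today.year - age - 1, today.year - age)

# If we learn in early 2026 that Jill is 57, she was born in 1968 or 1969.
print(possible_birth_years(57, date(2026, 2, 1)))  # (1968, 1969)
```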
First, we know that she is part of Generation X, which means she may exhibit skepticism rather than corporate loyalty, and a comfort level with email rather than Telegram or what we now refer to as “voice calls.”
Second, we know the types of experiences she probably had in her childhood and teenage years. She probably played with Star Wars toys as a kid. She knew a little bit about Billy Carter, the funny Presidential brother. She feared for the lives of the hostages in Iran.
Third, we know the types of experiences she didn’t have. She never saw a cigarette commercial on TV. If she watched Star Trek, she saw it on an “independent” station, not on NBC during prime time. She never feared for the lives of the Israeli Olympians in Munich.
It’s not a lot to go on, and it may not be 100% accurate if Jill grew up in a household that viewed television as demonic.
But it’s enough for a product marketer to shape age-sensitive product marketing.
If your product appeals to some ages more than others, knowing the ideal age of your target audience personas shapes your content. If your target audience is just out of college, “I can’t believe I ate the whole thing” is meaningless to them.
As I’ve noted before, healthcare is a pioneering user of artificial intelligence, although (hopefully) under robust controls to maintain accuracy and preserve HIPAA-level privacy.
“We are living through a generational shift, one where AI doesn’t just augment how organizations work but fundamentally transforms them from the inside out,” said Mohamad Makhzoumi, Co-CEO of NEA, who will join Qualified Health’s Board of Directors in conjunction with the financing. “From NEA’s nearly five decades of company-building experience, we believe the organizations shaping the next era of healthcare innovation will be those helping health systems reimagine every administrative and clinical workflow from the ground up, and Qualified Health is exactly that company. We are thrilled to lead this financing and to partner with Justin and team to accelerate healthcare’s AI transformation and shape the future of healthcare enterprises across the country.”
“Health systems today are operating under extraordinary pressure, from rising labor costs to tightening reimbursement, while managing increasing complexity in patient care,” said Jared Kesselheim, MD, Managing Partner at Transformation Capital. “What stood out to us about Qualified Health is that the team approaches this work as medical care specialists, with a deep understanding of the realities health systems face every day. That perspective allows them to identify where AI can create meaningful clinical and operational impact. We’re excited to partner with Justin and the Qualified Health team as they help leading health systems navigate this next phase of healthcare.”
Many years ago I was driving on Holt Boulevard in Montclair, California, preparing to make a left turn on Central. I followed the vehicle ahead of me and made my left turn…only then noticing that the left turn light was now red.
As the registered owner of the vehicle I was driving, I received an email from the city of Montclair a few days later. This was when Montclair was using cameras for traffic enforcement.
Off to traffic school.
Montclair doesn’t use traffic cameras any more, but all sorts of cameras are owned by, or accessible to, law enforcement agencies.
But how should they be used?
404 Media reported that the Georgia State Patrol accesses Flock cameras, for the intended purpose of gathering information for serious crimes. But what happens when the camera captures something not serious?
“Georgia State Patrol used its system of Flock automated license plate reader (ALPR) surveillance cameras to issue a ticket to a motorcyclist who was allegedly looking at his cell phone while riding, according to a copy of the citation obtained by 404 Media….The incident happened December 26 in Coffee County, Georgia. The ticket lists the offense as ‘Holding/supporting wireless telecommunications device,’ and includes the note ‘CAPTURED ON FLOCK CAMERA 31 MM 1 HOLDING PHONE IN LEFT HAND.’”
The man went to court and the ticket was dropped, but 404 Media is still outraged that the ticket was issued in the first place. Not because of Georgia’s policies, but because of other policies.
“Many police departments go out of their way to tell community members that Flock cameras are not used for traffic enforcement. For example, the City of Glenwood Springs, Colorado, states in a FAQ that ‘GSPD [Glenwood Springs Police Department] does not use Flock cameras for traffic enforcement, parking enforcement, or minor code violations.’ El Paso, Texas, tells residents ‘these are not traffic enforcement cameras. They do not issue tickets, do not monitor speed, and do not generate revenue. They are investigative tools used after crimes occur.’ Lynnwood, Washington, tells residents ‘these cameras will not be used for traffic infractions, immigration enforcement, or monitoring First Amendment-protected expressive activity’ (Flock cameras have now been used for all of these purposes, as we have reported.)”
You will recall that I addressed another Flock Safety case, in which a citizen made public records requests from two Washington state jurisdictions. The jurisdictions said that they didn’t have the data; Flock Safety did. Flock Safety said that it had deleted the data.
Basically, Flock Safety is controversial, and some people are going to oppose ANYTHING they do. Even when Flock Safety technology protects people from dangerous drivers.
My view is that if a camera is used by a law enforcement agency, and there is no law prohibiting the law enforcement agency from using a camera for a particular purpose, then the agency can use the camera. There appears to be no such law in Georgia, so I’m not bent out of shape over this.
What are your thoughts? Is this a privacy violation?
A little more detail, courtesy EU Brussels, regarding the policy brief published by the EU Innovation Hub for Internal Security, coordinated by eu-LISA together with the European Commission, Europol and Frontex.
As I noted earlier today, one proposal is for Europe to perform its own independent biometric testing, reducing Europe’s dependence on the American National Institute of Standards and Technology (NIST).
“The second is a centralised evaluation and testing platform connected to that repository, allowing standardised, independent and continuous assessment of biometric technologies, including benchmarking across vendors.”
But if there is a second proposal (European testing) in the cited European biometric policy brief, there must also be a first proposal—one I failed to discuss this morning.
“The first is a common EU biometric data repository containing datasets that comply with European rules, reflect the demographics and use-cases relevant to EU authorities and are stored in a secure environment.”
Makes sense. If you are going to test you need test data. And NIST has no obligation to ensure its test data complies with the General Data Protection Regulation (GDPR). The subjects in NIST test databases rarely provided the “explicit consent” mentioned in GDPR, and the “right to erasure” from a NIST database is…laughable.
Yes, it’s extremely challenging to construct a testing database that complies with GDPR.
And NIST certainly ain’t gonna do it.
Will a European entity construct it?
And if the right to erasure is maintained, how will you maintain historical consistency of test results?
I’ve seen two articles about a possible move by Europe to set up a Europe-wide biometric testing agency, bypassing the need for National Institute of Standards and Technology (NIST) biometric testing.
One reason is that a European-controlled testing methodology can incorporate European regulations, such as the General Data Protection Regulation (GDPR).
A second related reason for Europe to bypass NIST biometric testing is that U.S. government agencies, including NIST and the Federal Bureau of Investigation (FBI), naturally place prime importance on American interests.
Remember when the U.S. House of Representatives Select Committee on the Chinese Communist Party complained that the FBI Certified Products List contained Chinese biometric vendors (the Certified Communist Products List)?
Wait until they discover all the Chinese companies that participate in NIST testing.
And wait until someone in the legislative or executive branches decides that the FBI or NIST shouldn’t list products from other countries deemed unfriendly to the United States. Denmark? Germany? France?
For these reasons, Europe may be compelled to set up its own biometric testing organization.
If you’re on a platform such as Facebook, you sometimes receive advertisements that are VERY specific. Such as, “This is the perfect drink holder for California white males over the age of 50!” It’s almost as if they know everything about you…because they do.
Unless you implement privacy restrictions and don’t allow platform advertisers to reference your personal information.
Of course, if the advertiser isn’t able to narrowcast directly to you, the advertiser will broadcast to everybody.
And Facebook will start showing you advertisements in Chinese.
Qiaobi.
And if you complain to Facebook and ask why you’re seeing Chinese ads, Facebook will simply reply, “We are prohibited from using your personal information. Since there are a billion Chinese, we take a guess that you’re Chinese and show you those ads.”
Which brings us to age and social media.
The Under 16s Are Blocklisted
Back when Marky Mark created The Facebook, he initially targeted college-age users. But as time went on, Facebook and its competitors started aiming for younger ages.
This makes sense. Advertisers want to target consumers who are susceptible to changing their minds and are not set in their ways. So while a super kewl soft drink manufacturer isn’t going to target me, it is going to target 18-year-olds…and 16-year-olds…and 14-year-olds…and 12-year-olds.
A recent DKC report stated that 42% of all household spending is influenced by 8- to 14-year-olds, and that this age group is DIRECTLY spending over $100 billion per year.
So you can bet that advertisers are clamoring to purchase ad time on Facebook, TikTok, and the other social media services to get a pipeline to the brains of these 8- to 14-year-olds…whoops, 13- to 14-year-olds, since most social media services require you to be at least 13 years old to have an account.
But what if access to that entire age group is cut off entirely?
We’re seeing all over the world that jurisdictions are enacting or trying to enact bans on the use of social media for people under 16 years of age. The latest country to propose such a move is Indonesia:
“Authorities in the country, which is Southeast Asia’s largest economy, said Friday they expect social media platforms to deactivate the accounts of under-16s from March 28, starting with YouTube, TikTok, Facebook, Instagram, Threads, X, Bigo Live and Roblox.”
In other words, all the popular sites that teens love.
And in certain jurisdictions, the companies will implement age verification and age estimation technology to ensure that kids don’t lie about their ages to get in.
Assuming these prohibitions stand, this causes a huge problem for B2C marketers that target teens: how do you market to them when the direct pipelines to this age group are cut off?
I’m just thankful that Bredemarket and its clients sell to adults. You don’t really see 13-year-olds buying biometric technology.
The facial recognition brouhaha in southeastern Wisconsin has taken an interesting turn.
According to Urban Milwaukee, the Milwaukee County Sheriff’s Office is pursuing an agreement with Biometrica for facial recognition services.
The, um, benefit? No cost to the county.
“However, the contract would not need to be approved by the Milwaukee County Board of Supervisors, because there would be no cost to the county associated with the contract. Biometrica offers its services to law enforcement agencies in exchange for millions of mugshots.”
“Milwaukee Police Department has also attempted to contract Biometrica’s services, prompting pushback, at least some of which reflected confusion about how the system works….
“The mooted agreement between Biometrica and MPD would have added 2.5 million images to the database.
“In theory, if MCSO signs a contract with Biometrica, it could perform facial recognition searches at the request of MPD.”
See Bredemarket’s previous posts on the city efforts that are now on hold.
No guarantee that the County will approve what the City didn’t. And considering the bad press from the City’s efforts, including using software BEFORE adopting a policy on its use, it’s going to be an uphill struggle.
“Abstract 3D render of a human silhouette made of shimmering frosted glass, iridescent light refracting through, symbolizing secure data encryption and zero-knowledge proofs, elegant and high-end.”
Personally I think it’s TOO abstract, but perhaps that’s just me.