A biometric expert (I’m not the only one) was challenged to find a picture of a particular cartoon character in a particular setting, but was worried about copyright infringement.
I suggested that the expert substitute some other character in place of the copyrighted cartoon character.
I can’t share the particular example above, but the picture in this post illustrates the point. You subconsciously know which characters are being referenced, but the substitute characters (pre-copyright days) take care of the copyright issue.
That is, as long as the rest of the image doesn’t infringe on copyright either. MLB may come after me, even if “the fruit company” doesn’t.
“The State Bar of California announced Friday that its beleaguered leader, who has faced growing pressure to resign over the botched February roll out of a new bar exam, will step down in July. Leah T. Wilson, the agency’s executive director, informed the Board of Trustees she will not seek another term in the position she has held on and off since 2017. She also apologized for her role in the February bar exam chaos.”
(With a special message at the end for facial recognition and cybersecurity marketing leaders)
Years ago, when I was in Mexico City on a business trip, one of my coworkers stated that he never uses biometrics to protect the data on his smartphone.
His rationale?
Government officials can compel you to use your biometrics to unlock your smartphone, but they can’t compel you to provide your passcode.
Ironically, we both worked for a biometric company at the time.
But my former coworker isn’t the only one making this claim. With the recent protests, and with searches of people crossing the U.S. border by plane or otherwise, the same advice is echoed everywhere.
ZDNET quotes law firm managing partner Ignacio Alvarez on passcodes:
“But the majority of the courts have found that being required by law enforcement to give your code to your devices violates your Fifth Amendment right against self-incrimination.”
Note what Alvarez said: the MAJORITY of the courts. So if you end up before the “wrong” court, you might have to provide your passcode anyway.
ZDNET also quotes attorney Joseph Rosenbaum:
“Passwords or passcodes, because they represent information contained in a person’s mind, seem to generally be considered the same as requiring someone to testify against themselves in court or in a deposition,” he told ZDNET. “That information is more likely to be legally protected under the Fifth Amendment as potentially self-incriminating.”
Notice his “seem to generally be” and “more likely to be” language. Again, you could still be compelled to give your passcode.
But that’s the easy part.
Biometrics: it’s complicated
Biometrics are much more of a gray area.
The rationale behind not giving up your biometric is similar to the rationale behind the Miranda warning. As Dragnet fans know, “Anything you say can and will be used against you in a court of law.” Regarding passcodes, the courts (well, some of the courts) hold that since a passcode can be “spoken,” it is testimonial in nature, and compelling you to provide it would violate your Fifth Amendment right against self-incrimination.
What about biometrics? (Excluding voice biometrics for the moment.)
“…since a biometric isn’t spoken, production of that biometric may not legally qualify as the act of testifying against yourself and therefore, you can be compelled to unlock a phone or an app without necessarily having your rights violated.”
Again, note the use of the words “may not.” It isn’t clear here either.
And even these wishy-washy definitions may change.
“This area of law is a seriously moving target. Over time, things could favor passcodes being non-testimonial or biometrics being testimonial.”
In other words, a few years from now lawyers may advise you to use biometrics rather than passcodes to protect your private data on your smartphone.
Or maybe they’ll say both methods protect you equally.
Or maybe they’ll say neither method protects you, and your private data is no longer private.
But most likely they’ll say “It depends.” In the same way that our 18,000 law enforcement agencies have 18,000 different definitions of forensic science, they could have 18,000 different definitions of Miranda rights.
And one more thing…
Bredemarket has two openings!
The formal announcement is embargoed until Monday, but Bredemarket has TWO openings to act as your on-demand marketing muscle for facial recognition or cybersecurity:
Regarding facial recognition, I wrote this in a social media conversation earlier today:
“Facial recognition CAN be used as a crowd checking tool…with proper governance, including strict adherence to a policy of only using FR as an investigative lead, and requiring review of potential criminal matches by a forensic face investigator. Even then, investigative lead ONLY. Same with DNA.”
I received this reply:
“It’s true but in my experience cops rarely follow any rules.”
Now I could have claimed that this view was exaggerated, but there are enough examples of cops who DON’T follow the rules to tarnish all of them.
“The complaint alleges that the surveillance footage is poorly lit, the shoplifter never looks directly into the camera and still a Detroit Police Department detective ran a grainy photo made from the footage through the facial recognition technology.”
There’s so much that isn’t said here, such as whether a forensic face examiner made a definitive conclusion, or if the detective just took the first candidate from the list and ran with it.
But I am willing to bet that there was no independent evidence placing Williams at the shop location.
Why this matters
The thing that concerns me about all this? It just provides ammo to the people who want to ban facial recognition entirely.
In business, it is best to use a three-legged stool.
A two-legged stool obviously tips over, and you fall to the ground.
A four-legged stool is too robust for these cost-conscious days, when jettisoning employees is policy in both the private and public sectors.
But a three-legged stool is just right, as project managers already know when they strive to balance time, cost, and quality.
Perhaps the three-legged stool was in the back of Yunique Demann’s mind when she wrote a piece for the Information Systems Audit and Control Association (ISACA) entitled “The New Triad of AI Governance: Privacy, Cybersecurity, and Legal.” If you only rely on privacy and cybersecurity, you will fall to the ground like someone precariously balanced on a two-legged stool.
“As AI regulations evolve globally, legal expertise has become a strategic necessity in AI governance. The role of legal professionals now extends beyond compliance into one that is involved in shaping AI strategy and legally addressing ethical considerations…”
This is a remote education post, but not an educational identity post.
I have previously discussed online test taking, and I guess the State Bar of California reads the Bredemarket blog, because it decided that an online bar exam would be a great idea: it would reduce the costs of renting large halls for test taking.
“The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text from test questions into the exam’s response field — a function officials had stated would be possible.”
No surprise, but the remote bar exam debacle was so bad that students are filing…lawsuits.
“Some students also filed a complaint Thursday in the U.S. District Court for the Northern District of California, accusing Meazure Learning, the company that administered the exam, of “failing spectacularly” and causing an “unmitigated disaster.””
You may remember the May hoopla regarding amendments to Illinois’ Biometric Information Privacy Act (BIPA). These amendments do not eliminate the long-standing law, but they lessen the damages that offending companies must pay.
The General Assembly is expected to send the bill to Illinois Governor JB Pritzker within 30 days. Gov. Pritzker will then have 60 days to sign it into law. It will be immediately effective.
While the BIPA amendment has passed the Illinois House and Senate and was sent to the Governor, there is no indication that he has signed the bill into law within the 60-day timeframe.
A proposed class action claims Photomyne, the developer of several photo-editing apps, has violated an Illinois privacy law by collecting, storing and using residents’ facial scans without authorization….
The lawsuit contends that the app developer has breached the BIPA’s clear requirements by failing to notify Illinois users of its biometric data collection practices and inform them how long and for what purpose the information will be stored and used.
In addition, the suit claims the company has unlawfully failed to establish public guidelines that detail its data retention and destruction policies.
Image from the mid-2010s. “John, how do you use the CrowdCompass app for this Users Conference?” Well, let me tell you…
Because of my former involvement with the biometric user conference managed by IDEMIA, MorphoTrak, Sagem Morpho, Motorola, and older entities, I always like to peek and see what they’re doing these days. And it looks like they’re still prioritizing the educational element of the conference.
Although the 2024 Justice and Public Safety Conference won’t take place until September, the agenda is already online.
Subject to change, presumably.
This Joseph Courtesis session, scheduled for the afternoon of Thursday, September 12, caught my eye. It’s entitled “Ethical Use of Facial Recognition in Law Enforcement: Policy Before Technology.” Here’s an excerpt from the abstract:
This session will focus on post investigative image identification with the assistance of Facial Recognition Technology (FRT). It’s important to point out that FRT, by itself, does not produce Probable Cause to arrest.
Re-read that last sentence, then re-read it one more time. The known wrongful arrest cases would have been avoided if everyone adopted this one practice. FRT is ONLY an investigative lead.
And Courtesis makes one related point:
Any image identification process that includes FRT should put policy before the technology.
Any technology that could deprive a person of their liberty needs a clear policy on its proper use.
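The “investigative lead only” policy described above can even be expressed as a simple gate in software. The sketch below is purely illustrative (the `Candidate`, `investigative_leads`, and `examiner_review` names are my own inventions, not anyone’s actual system): no FRT candidate becomes a lead without independent examiner review, and no lead ever carries probable cause on its own.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One candidate returned by a hypothetical FRT search."""
    subject_id: str
    score: float

def investigative_leads(candidates, examiner_review):
    """Return investigative leads only.

    Every FRT candidate must pass independent review (in a real
    agency, a trained forensic face examiner). The FRT output alone
    is never treated as probable cause to arrest.
    """
    leads = []
    for c in candidates:
        if examiner_review(c):  # human review gate
            leads.append({
                "subject_id": c.subject_id,
                "status": "INVESTIGATIVE LEAD ONLY",
                "probable_cause": False,  # corroborating evidence still required
            })
    return leads

# Illustration only: a real examiner is a person, not a score threshold.
candidates = [Candidate("A123", 0.97), Candidate("B456", 0.52)]
leads = investigative_leads(candidates, lambda c: c.score >= 0.90)
```

The point of the sketch is the hard-coded `"probable_cause": False`: policy before technology means the system itself should be incapable of presenting an FRT match as anything more than a lead.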
September conference attendees will definitely receive a comprehensive education from an authority on the topic.
But now I’m having flashbacks, and visions of Excel session planning workbooks are dancing in my head. Maybe they plan with Asana today.
Biometric security company Facewatch…is facing a lawsuit after its system wrongly flagged a 19-year-old girl as a shoplifter…. (The girl) was shopping at Home Bargains in Manchester in February when staff confronted her and threw her out of the store…. “I have never stolen in my life and so I was confused, upset and humiliated to be labeled as a criminal in front of a whole shop of people,” she said in a statement.
While Big Brother Watch and others are using this story to conclude that facial recognition is evil and no one should ever use it, the problem isn’t the technology. The problem is when the technology is misused.
Were the Home Bargains staff trained in forensic face examination, so that they could confirm that the customer was the shoplifter? I doubt it.
Even if they were forensically trained, did the Home Bargains staff follow accepted practices and use the face recognition results ONLY as an investigative lead, and seek other corroborating evidence to identify the girl as a shoplifter? I doubt it.
Again, the problem is NOT the technology. The problem is MISUSE of the technology—by this English store, by a certain chain of U.S. stores, and even by U.S. police agencies who fail to use facial recognition results solely as an investigative lead.
A prospect approached me some time ago to have Bredemarket help tell this story. However, the prospect has delayed moving forward with the project, and so their story has not yet been told.
Does YOUR firm have a story that you have failed to tell?