It is the DOCUMENTED trail of every person who accessed (“touched, moved, or stored”) a piece of evidence. The documentation is chronological and ideally logged in real time, not hours after the fact.
What could go wrong?
Let’s say that at 1:23 pm, Diana placed a piece of evidence in a locker and locked it. Then at 3:53 pm, Denis went to the locker, which was now unlocked, and removed the evidence. Defense attorneys will have a field day with that, and may persuade the judge to rule the evidence inadmissible because of possible tampering by an unidentified person.
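The Diana/Denis scenario can be caught mechanically. Here is a minimal sketch of a custody log checker; the field names and actions are invented for illustration, not any agency’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CustodyEntry:
    timestamp: datetime  # when the evidence was touched, moved, or stored
    handler: str         # who touched it
    action: str          # e.g. "stored", "removed", "transferred"

def custody_gaps(log: list[CustodyEntry]) -> list[str]:
    """Return human-readable warnings about breaks in the chain."""
    warnings = []
    for prev, curr in zip(log, log[1:]):
        if curr.timestamp < prev.timestamp:
            warnings.append(f"Out-of-order entry at {curr.timestamp}")
        # A different handler touching the evidence with no documented
        # transfer in between is exactly the Diana/Denis scenario above.
        if curr.handler != prev.handler and prev.action != "transferred":
            warnings.append(
                f"Undocumented handoff: {prev.handler} -> {curr.handler} "
                f"between {prev.timestamp} and {curr.timestamp}"
            )
    return warnings

log = [
    CustodyEntry(datetime(2026, 2, 1, 13, 23), "Diana", "stored"),
    CustodyEntry(datetime(2026, 2, 1, 15, 53), "Denis", "removed"),
]
for w in custody_gaps(log):
    print(w)
```

The point of logging in real time, as noted above, is that checks like this can fire while the gap can still be explained, not months later in a courtroom.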
Cross-Contamination
Closely related to the chain of custody is the possibility of cross-contamination. In the case of DNA, this happens when DNA from someone unconnected to the crime is mixed into the crime scene samples.
I’ve told the story of the mystery female criminal whose DNA was found at crime scenes in three different countries over a decade. The only problem was that the woman’s DNA was found on the burned body of a MAN. Eventually the authorities figured out that the woman worked in a packing center specializing in medical supplies…and that her DNA had contaminated the medical supplies and ended up in the crime scene samples.
How to Minimize Errors
Forensic’s Blog suggests several ways to avoid chain of custody issues and cross-contamination. These include established and enforced procedures for handling, packaging, and labeling data; separating tasks; and establishing zones of exclusion at crime scenes.
In addition to establishment and enforcement, education is essential. (Sorrye.) Not only for the forensic professionals gathering the evidence, but for everyone who deals with the evidence…including biometric vendors and their customers.
If you need assistance in creating educational content for proper evidence handling, Bredemarket can help. Talk to me.
How should businesses, governments, and individuals treat private data? In the United States, the answer varies from state to state, and even from city to city.
A proposed national solution to the hodgepodge of differing state and local privacy regulations is the so-called SECURE Data Act.
“Congress’s latest foray is a new bill called the SECURE Data Act, a piece of garbage cooked up by Republicans as a gift to industry in a climate where the public is deeply concerned about privacy, outraged at the harms tech is causing, and yearning for ways to hold Big Tech accountable.
“I can’t stress enough how awful this bill is. On balance, if passed into law, it will do dramatic harm to privacy. It will leave people less protected than if it didn’t exist. I’d call it more of an anti-privacy law than a privacy law.”
Um…I don’t think Solove likes it.
Preemption
One of the huge issues that privacy advocates have is preemption. To ensure uniform privacy across the United States, the proposed SECURE Data Act preempts any state laws that exceed its protections. Therefore the privacy protections in states such as California, Illinois, and Texas could be revoked.
Of course, this is a classic struggle in state-federal relations. The staunchest states’ rights advocate will suddenly switch sides if they agree with a federal regulation, and vice versa. Citing one example, gay marriage began as a states’ rights issue when only a few states supported it, then became a federal rights issue.
Another example is the federal drive to eliminate the U.S. Department of Education. This has not happened yet. And it never will, once the powers that be realize that its elimination will allow blue states to teach whatever they want.
Secure
But let’s go back to the title of the bill, the SECURE Data Act. Upon seeing the title, the average voter would assume that the bill secures our individual privacy rights.
But the privacy advocates believe that the bill actually secures the right of entities to do what they want with our personally identifiable information, with minimal restrictions.
I think the bill’s title may have been written by Eric Arthur Blair.
(Image by Google Gemini. Sorry, Ontario Canada; Gemini may make mistakes.)
“The term ‘scientific’ to describe his opinion ‘arguably verged on suggesting that the ACE-V process is more scientific than warranted,’ and there was one instance in which Dolan testified without using the term ‘opinion.’ The court concludes that there was no error because, ‘viewed as a whole,’ his testimony was largely expressed in terms of an ‘opinion’ and his testimony did not claim that the ACE-V process was infallible or absolutely certain.”
“The friction ridge examination process is commonly referred to as ACE‐V: Analysis, Comparison, Evaluation, and Verification.
“Analysis: An initial information‐gathering phase in which the examiner studies the unknown print to assess the quality and quantity of discriminating detail present. The examiner considers information such as substrate, development method, various levels of ridge detail, and pressure distortions. A separate analysis then occurs with the exemplar.
“Comparison: The side‐by‐side observation of the friction ridge detail in the two impressions to determine the agreement or disagreement in the details.
“Evaluation: The examiner assesses the agreement or disagreement of the information observed during Analysis and Comparison and forms a conclusion.
“Verification: In some agencies is a review of an examiner’s conclusions with knowledge of those conclusions; in other agencies, it is an independent re‐examination by a second examiner who does not know the outcome of the first examination.”
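The two Verification models in that last paragraph (a review with knowledge of the first conclusion, versus a blind independent re-examination) can be sketched as a configurable workflow. This is a toy illustration with invented names, not any vendor’s actual ABIS module.

```python
from enum import Enum, auto
from typing import Callable, Optional

class Phase(Enum):
    ANALYSIS = auto()      # study the unknown print's quality and quantity of detail
    COMPARISON = auto()    # side-by-side observation against the exemplar
    EVALUATION = auto()    # form a conclusion from the observed agreement
    VERIFICATION = auto()  # review or independent re-examination

def verify(first_conclusion: str,
           second_examiner: Callable[[Optional[str]], str],
           blind: bool) -> str:
    """Run the Verification phase under either agency model.

    In a blind verification the second examiner sees nothing of the
    first conclusion; in a non-blind review they see it.
    """
    prior = None if blind else first_conclusion
    second_conclusion = second_examiner(prior)
    return "verified" if second_conclusion == first_conclusion else "conflict"

# A stand-in second examiner who independently reaches "identification".
print(verify("identification", lambda prior: "identification", blind=True))
# prints "verified"
```

A single `blind` flag is of course an oversimplification, but it shows why software supporting ACE-V has to be configurable per agency policy rather than hard-coding one review model.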
To make things easier for latent examiners, some automated biometric identification system (ABIS) software packages incorporate ACE-V either as a separate module or as an integrated part of their workflow. I know that IDEMIA and Thales include it, and it appears that CSIpix, Eviscan, Noblis, and ROC also include it.
But I’m going to talk about Innovatrics in this post.
“Innovatrics is expanding ACE-V capabilities in ABIS for Criminal Investigation with new features that help forensic teams manage examinations more clearly, support different review models, and keep unresolved latent evidence active as investigations move forward….
“ACE-V is widely used in forensic work, but the way it is applied can vary across agencies, regions, and countries, often shaped by local legislative requirements. Some agencies require clear separation between roles, while others rely on smaller teams with a more flexible way of dividing work. Innovatrics is shaping the workflow to support different use cases and agency ACE-V policies, from small teams to large departments, without forcing agencies to change the established workflows they already rely on.”
Privacy regulations can change when you cross country or even city lines, and they can also change depending on who you are: an individual, a business, or a government agency.
On the other extreme, some entities in some jurisdictions must obtain express written consent. If I am a homeowner in Schaumburg, Illinois, and I use a doorbell camera to identify friends or foes approaching my door, the Biometric Information Privacy Act (BIPA) prohibits me from capturing their biometrics without their consent, and lets them sue me if I do it anyway.
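In software terms, a vendor or integrator might gate biometric capture on a per-jurisdiction consent table. The table and function below are a toy illustration, not legal advice; the real statutes are far more nuanced than a boolean.

```python
# Illustrative rule table: does this jurisdiction require express consent
# before capturing biometrics? Invented for the example; check the
# actual statutes before relying on anything like this.
CONSENT_REQUIRED = {
    "IL": True,   # BIPA
    "TX": True,   # Texas also regulates biometric capture
    "GA": False,
}

def may_capture_biometric(state: str, has_written_consent: bool) -> bool:
    # Default to requiring consent when the jurisdiction is unknown:
    # the safe failure mode is to NOT capture.
    return has_written_consent or not CONSENT_REQUIRED.get(state, True)

print(may_capture_biometric("IL", has_written_consent=False))  # prints False
```

Note the default: when the lookup fails, the sketch refuses to capture, which is the conservative posture the next paragraph recommends.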
Before you collect PII, check the laws in your jurisdiction first.
Oh, and check the laws in other jurisdictions in case they try to enforce their laws in your jurisdiction.
By the way: if you’re a software or hardware vendor, don’t assume that you bear no responsibility and that only your customer does.
You must educate your customers.
And Bredemarket can help you with my content-proposal-analysis services.
I’ve discussed the electronic health record (EHR) before, and plan to do so again. But before I dive into EHRs and “the A word,” I want to take a look at WHY we have EHRs.
When dinosaurs roamed the earth
In the old days, even within the lifetimes of some of us, there were no ELECTRONIC health records. There were PAPER health records, stored in large file cabinets. If you were lucky, the health records were typed; heaven help you if they were in a doctor’s famously illegible handwriting.
When a relative’s doctor retired in the 20th century, the relative requested their health records and received a huge pile of paper dating back to who knows when. In that form, it was about as useful as the huge file cabinets in which the U.S. Federal Bureau of Investigation used to store its millions of fingerprint cards. And unfortunately, paper health records didn’t have the health equivalent of a “Henry system” to find individual records quickly.
The two purposes of an electronic health record
So now that we have electronic health records, why do we have them?
To make life easier for the doctor? Of course not.
To make life easier for the patient? Definitely not.
Electronic health records have evolved to serve two OTHER parties.
First, electronic health records serve the billers
I can’t speak to countries other than my own, but in the United States the health “system” is a mishmash of multiple parties. For example, when I had a colonoscopy a few years ago, the following entities were somehow involved:
The doctor who performed the colonoscopy.
The facility where the doctor performed the colonoscopy.
The anaesthesiologist who assisted with the colonoscopy.
My insurance company.
My former company (via COBRA), which provided me with the insurance.
And probably a half dozen other entities that I missed who somehow got a cut.
So this one procedure created one, or perhaps multiple, electronic health records (perhaps even with pictures) describing every chargeable thing that could be itemized during my time in the facility. All with the proper billing codes (Current Procedural Terminology or CPT codes) and the like, so that every entity can pay what they’re supposed to pay. And if a particular thing wasn’t covered by insurance, then I had to pay it.
The most important thing is to get the billing codes right…never mind how hard it is to ENTER all the billing codes.
“By integrating EHR and billing software, healthcare providers can automate various aspects of the billing workflow, resulting in increased efficiency, reduced manual work, and other tangible benefits.”
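As a toy illustration of that integration, here is how itemized charges might flow from the record to the payers. The CPT codes, descriptions, amounts, and coverage flags below are invented for the example.

```python
# Each line item on the claim carries a CPT code; the insurer covers
# some items and the patient pays the rest. Codes and amounts are
# made up for illustration.
line_items = [
    {"cpt": "45378", "desc": "colonoscopy",              "charge": 1200.00, "covered": True},
    {"cpt": "00810", "desc": "anesthesia",               "charge": 400.00,  "covered": True},
    {"cpt": "99999", "desc": "facility convenience fee", "charge": 75.00,   "covered": False},
]

insurer_pays = sum(i["charge"] for i in line_items if i["covered"])
patient_pays = sum(i["charge"] for i in line_items if not i["covered"])
print(f"insurer: ${insurer_pays:.2f}, patient: ${patient_pays:.2f}")
# prints "insurer: $1600.00, patient: $75.00"
```

Multiply this by every entity with a hand in the procedure, and you can see why the record is structured around billing codes rather than around the doctor or the patient.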
Second, electronic health records serve the lawyers and regulators
But it’s not only the billers who need information.
If you perform a colonoscopy in the State of California, you have to do so in accordance with medically approved procedures. And you have to document that you did so.
If I had died on the operating table during my colonoscopy, then a number of private and government entities would have a keen interest in what was performed during the colonoscopy. And the electronic health record would be one of the main sources of information about what happened, and perhaps what went wrong. And who was responsible. The doctor? The facility? The anaesthesiologist? Someone else?
“The ‘EHR mandate’ refers to the federal requirement for eligible healthcare providers to adopt and use certified EHR technology. Primarily affecting providers who accept Medicare, participation in MIPS and the Promoting Interoperability program requires CEHRT to avoid negative payment adjustments, which effectively necessitates EHR use.”
The result
So now the medical field has these wonderful EHRs that comply with billing requirements and legal requirements.
“For instance, emergency medicine physicians at one health system must click 14 times to order Tylenol—that’s a lot. Yet, those at another health system using the same EHR must click 61 times!”
And that’s just for Tylenol. I’m sure it’s a lot worse for the camera that looked at my colon.
And it can be even worse, because many Americans are not healthy.
“[O]ur patients have increasingly complex health needs. More than 40% of American adults have at least two chronic conditions, one-third take at least three medications, and one-fifth suffer from mental illness.”
Put these and other things together, and EHRs have become (as I said before) “a pain in a particular body part.”
So that’s the problem with EHRs. Later I’ll look at the solutions.
Yet another state has passed its own data privacy law, with the Oklahoma Consumer Data Privacy Act signed last month and taking effect in 2027. The key particulars:
“OKDPA grants consumers a set of rights…including rights of access, deletion, correction, and portability, and rights to opt-out of targeted advertising, sale, or profiling ‘in furtherance of a decision that produces a legal or similarly significant effect concerning the consumer.’”
As for enforcement:
“Enforcement authority rests with the Oklahoma Attorney General. The bill includes a mandatory 30-day cure period, which does not sunset. The law imposes civil penalties of up to $7,500 per violation.”
As of now, between 19 and 22 states have privacy laws, depending upon how you count.
Some aren’t counting Florida because of its limited scope. It only applies to companies with over $1 billion in revenue.
Some aren’t counting Illinois because BIPA only applies to biometrics.
Some aren’t counting Oklahoma yet because it’s so new.
But we can agree that many states have privacy laws.
For now
And if some have their way, they will all disappear, to be replaced by a single uniform federal law. However, the level of preemption of state laws remains a matter of debate. The Future of Privacy Forum has addressed preemption here.
And if you need to write about privacy, biometric or otherwise, Bredemarket can help. Click below to book a free meeting with me.
Many years ago I was driving on Holt Boulevard in Montclair, California, preparing to make a left turn on Central. I followed the vehicle behind me and made my left turn…only then noticing that the left turn light was now red.
As the registered owner of the vehicle I was driving, I received an email from the city of Montclair a few days later, because this was when Montclair was using cameras for traffic enforcement.
Off to traffic school.
Montclair doesn’t use traffic cameras any more, but all sorts of cameras are owned by, or accessible to, law enforcement agencies.
But how should they be used?
404 Media reported that the Georgia State Patrol accesses Flock cameras, for the intended purpose of gathering information for serious crimes. But what happens when the camera captures something not serious?
“Georgia State Patrol used its system of Flock automated license plate reader (ALPR) surveillance cameras to issue a ticket to a motorcyclist who was allegedly looking at his cell phone while riding, according to a copy of the citation obtained by 404 Media….The incident happened December 26 in Coffee County, Georgia. The ticket lists the offense as ‘Holding/supporting wireless telecommunications device,’ and includes the note ‘CAPTURED ON FLOCK CAMERA 31 MM 1 HOLDING PHONE IN LEFT HAND.’”
The man went to court and the ticket was dropped, but 404 Media is still outraged that the ticket was issued in the first place: not because of Georgia’s policies, but because of other jurisdictions’ policies.
“Many police departments go out of their way to tell community members that Flock cameras are not used for traffic enforcement. For example, the City of Glenwood Springs, Colorado, states in a FAQ that “GSPD [Glenwood Springs Police Department] does not use Flock cameras for traffic enforcement, parking enforcement, or minor code violations.” El Paso, Texas, tells residents “these are not traffic enforcement cameras. They do not issue tickets, do not monitor speed, and do not generate revenue. They are investigative tools used after crimes occur.” Lynwood, Washington tells residents “these cameras will not be used for traffic infractions, immigration enforcement, or monitoring First Amendment-protected expressive activity” (Flock cameras have now been used for all of these purposes, as we have reported.)”
You will recall that I addressed another Flock Safety case, in which a citizen made public records requests from two Washington state jurisdictions. The jurisdictions said that they didn’t have the data; Flock Safety did. Flock Safety said that it had deleted the data.
Basically, Flock Safety is controversial, and some people are going to oppose ANYTHING they do. Even when Flock Safety technology protects people from dangerous drivers.
My view is that if a camera is used by a law enforcement agency, and there is no law prohibiting the law enforcement agency from using a camera for a particular purpose, then the agency can use the camera. There appears to be no such law in Georgia, so I’m not bent out of shape over this.
What are your thoughts? Is this a privacy violation?
“[T]he technology is easy. The business part is the difficult part.”
But Chris Burt of Biometric Update phrased it more succinctly:
“[P]olicy chases modernization”
As Burt notes, examples of policy chasing modernization include:
Digital sovereignty, a topic of discussion with everyone from ID4Africa to an organization called the World Ethical Data Foundation. (As an aside, a Bredemarket client and I were recently discussing the pros and cons of managing digital identities in the cloud vs. peer-to-peer synchronization.)
Cybersecurity and digital identity, a topic of discussion in government (the White House, NIST) and industry (Jordan Burris of Socure).
Other topics, including police facial recognition policy. (Hmm…I recall that both government and vendor biometric policies were the topic of a Biometric Update guest article last year.)
All of you recall Pandora’s Box. I’ve used the story multiple times, including when discussing my creation of Bredebot and its nearly-instantaneous hallucinations. Yes, I do have “policies” regarding this “modernization,” including full disclosure.
On Wednesday, I described how Meta’s Kenyan data labelers ended up watching explicit videos from people who presumably didn’t know that smart glasses were recording their activity.
“In the newly filed complaint, plaintiffs Gina Bartone of New Jersey and Mateo Canu of California, represented by the public interest-focused Clarkson Law Firm, allege that Meta violated privacy laws and engaged in false advertising.
“The complaint alleges that the Meta AI smart glasses are advertised using promises like “designed for privacy, controlled by you,” and “built for your privacy,” which might not lead customers to assume their glasses’ footage, including intimate moments, was being watched by overseas workers. The plaintiffs believed Meta’s marketing and said they saw no disclaimer or information that contradicted the advertised privacy protections.”
“Clear, easy device and app settings help you manage your information, giving you control over what content you choose to share with others, and when.”
Except that according to Clarkson, people can’t opt out of the data labeling process.
A private social media comment got me thinking. I will gladly credit the author, with their permission.
“If a U.S. federal court says that you can’t copyright AI generated content, an appellate court upholds that ruling, and the SCOTUS refuses to hear the case, what are the implications for software generated by LLMs?”
Think about that the next time Company X publishes its marketing message “we use AI.”
What if Company X’s code and prompts were themselves written with AI?
Couldn’t Company Y take Company X’s non-copyrightable code and run it without penalty, like open source code?
Now Company X would be forced to prove that it does NOT use AI. For its code, anyway.