Your LMM Pharmacy

On Threads, Dr. Jen Gunter called our attention to the newly-introduced H.R. 238, “To amend the Federal Food, Drug, and Cosmetic Act to clarify that artificial intelligence and machine learning technologies can qualify as a practitioner eligible to prescribe drugs if authorized by the State involved and approved, cleared, or authorized by the Food and Drug Administration, and for other purposes.”

Ultra-modern healthcare?

Presumably these non-person entities would not be your run-of-the-mill consumer generative AI packages, but rather specially trained Large Medical Models (LMMs).

Kinda like my “Dr. Jones, NPE.”

Even so, don’t count on this becoming law in the next two years. For one, Rep. David Schweikert introduced a similar bill in 2023 which never made it out of committee.

Why is Rep. Schweikert so interested in this and related topics? Because medical bills are too damn high:

“How do I make sure we’re embracing technology and using it to bring disruptive cures to market, or other opportunities to market?” Schweikert asked. “And does that also now help lower drug pricing?”

Before you reject this idea entirely, Rep. Schweikert cited one example of technology decision-making:

“Schweikert noted that the FDA last month approved Apple Watch’s atrial fibrillation feature for use in clinical trials — the first such digital health tool approved for inclusion in the agency’s Medical Device Development Tools program.”

But before anything like this happens with prescriptions, the FDA will insist on extremely rigorous testing, including double-blind trials in which some prescriptions are written by currently authorized medical professionals, while other prescriptions are written by LMMs.
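If such a trial were ever run, the randomization and blinding could be sketched in a few lines. Everything below (the function name, the arm labels, the seed) is illustrative, not an actual FDA protocol:

```python
import random

def assign_arms(case_ids, seed=42):
    """Randomly assign each case to a human-prescriber or LMM-prescriber arm.

    Returns a blinded view (case -> opaque arm label) for the evaluators,
    plus the unblinded key held only by the trial administrator.
    """
    rng = random.Random(seed)  # fixed seed so the assignment is auditable
    key = {cid: rng.choice(["human", "lmm"]) for cid in case_ids}
    # Evaluators see only opaque labels, never which arm is which.
    labels = {"human": "arm_A", "lmm": "arm_B"}
    blinded = {cid: labels[arm] for cid, arm in key.items()}
    return blinded, key

blinded, key = assign_arms(range(10))
```

The point of the sketch: whoever scores the prescriptions for safety and efficacy only ever sees `arm_A`/`arm_B`, so neither the patient nor the evaluator knows whether a human or an LMM wrote the prescription.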

And even when the ethical questions surrounding this are overcome, this won’t happen overnight.

If Tik Stops Tokking (in the U.S.)

Not that it’s authoritative, but I asked Google Gemini what may happen to TikTok users on Sunday, January 19. The text below was written by Gemini.

On Sunday, January 19th, 2025, if the current legal situation remains unchanged:

  •  New users will not be able to download TikTok.
  •  Existing users will likely still be able to use the app on their devices. However, the app will no longer receive updates or bug fixes, potentially leading to performance issues and security vulnerabilities over time.

Important Note: This is based on the current understanding of the legal proceedings. The situation could change if:

  •  The Supreme Court intervenes: They could potentially block or delay the ban.
  •  TikTok is sold: If TikTok is sold to an approved buyer, the ban may be lifted.

I recommend staying updated on the latest news and legal developments regarding this issue.

Idiot

(Imagen 3 Image)

On a Bredemarket Instagram story shared Friday afternoon (to disappear Saturday), I noted Meta’s AI advice NOT to call someone who flew a drone near firefighting equipment an “idiot.” I respectfully disagree. The term is appropriate.

Let me clarify that Meta is not trying to curtail free speech. Only governments can curtail free speech. Private entities cannot.

For example, if I still worked for IDEMIA and used IDEMIA social media channels to declare the Thales ABIS the best ABIS ever, IDEMIA would have every right to delete that post—and me.

In the same way, if Zuck insists that Meta users cannot refer to people threatening lives as “idiots,” that is Meta’s right.

But it’s idiotic.

NEC’s Other “Biometric” Information: Digital Pathology

(Image: AI-predicted cell detection results of Biomy’s DeepPathFinder™. From the January 10, 2025 NEC press release.)

When I interact with the worldwide company NEC, I am usually dealing with automated biometric identification systems (ABIS).

Of course, ABIS is only a small part of what NEC does. It’s also involved in healthcare.

Consider…artificial intelligence and deep learning-powered digital pathology (“a field involving the digitization and computational analysis of pathology slides”).

Per today’s press release:

“NEC Corporation (NEC; TSE: 6701) and Biomy, Inc. (Biomy) have signed a Memorandum of Understanding (MoU) for a joint marketing partnership to develop and expand artificial intelligence/deep learning (AI/DL)-based analytical platforms in the field of digital pathology. Through this partnership, the two companies aim to promote precision medicine for cancer patients and contribute to the advancement of the healthcare industry.”

So what is Biomy contributing?

“Biomy, which aims to realize personalized medicine through pathological AI technology, has developed DeepPathFinder™, a proprietary, cloud-based, AI/DL automated digital pathology analytical platform.”

And NEC?

“NEC has positioned healthcare and life sciences as a core pillar of its growth strategy. With a strong foundation in image analysis and other AI technologies, NEC has a long history of providing medical information systems such as electronic medical records to healthcare institutions.”

As I’ve said before, healthcare must deal with privacy concerns (protected health information, or PHI) similar to those NEC addresses in its other biometric product line (personally identifiable information, or PII). I personally can’t do nefarious things if I fraudulently acquire your digital pathology slide, but some bad actors could. Presumably the Biomy product is well protected.

Vision Language Model (VLM)

(Flood image from Indian Navy)

A Vision Language Model (VLM) is a particular type of Large Multimodal Model (LMM).

Hugging Face:

“Vision language models are broadly defined as multimodal models that can learn from images and text. They are a type of generative models that take image and text inputs, and generate text outputs…. There’s a lot of diversity within the existing set of large vision language models, the data they were trained on, how they encode images, and, thus, their capabilities.”

Ipsotek:

“VISense (is) a groundbreaking addition to its VISuite platform that redefines real-time video analytics with Vision Language Models (VLMs). VISense represents a major advancement in Generative AI integration, using VLMs to achieve detailed scene understanding and contextual insights empowering operators to make informed decisions promptly….

“VISense allows users to ask questions like, “Let me know when something unusual is happening in any camera view” and receive a detailed response describing the unusual aspect of the captured behaviour. For instance, it might respond, “Yes, there is a flood; water levels are rising in the northern section, and several vehicles are stranded, causing heavy traffic congestion,” providing actionable insights that enable quick decisions.”

Type AI

In conclusion—and I will delve into this later—your beloved AI detector may deliver a bunch of false positives, or Type I errors.

For example, if every word in a post is spelled correctly, that’s an obvious sign the text wasn’t written by a human—correct? In the ever-expanding world of virtual communication, correct spelling is a dead giveaway of non-human content—as is the use of characters unavailable on a standard keyboard. Motörhead made a bunch of £ and € despite, by that logic, not being real. As the band never said,

“Timothy Leary’s dead

No, no, no, no, he’s outside, looking in”

(I had to include one hallucination in this post.)

Use MFAID (multi-factor AI detection) to increase accuracy when you claim to detect generative AI.
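I haven’t seen a published MFAID specification, but the idea can be sketched as simple signal fusion: flag text as AI-generated only when enough independent detectors agree, which trades some sensitivity for fewer Type I errors. The detectors below are deliberately silly toys; every name and threshold is invented for illustration:

```python
def mfaid(text, detectors, threshold=0.75):
    """Flag text as AI-generated only if enough detectors agree.

    detectors: functions returning True if they think the text is AI.
    threshold: fraction of detectors that must agree; raising it
    reduces false positives at the cost of more false negatives.
    """
    votes = sum(1 for d in detectors if d(text))
    return votes / len(detectors) >= threshold

# Toy detectors, each keyed on one weak "tell" from the paragraph above.
no_typos = lambda t: "teh" not in t and "recieve" not in t
fancy_chars = lambda t: any(c in t for c in "£€ö")
long_words = lambda t: any(len(w) > 12 for w in t.split())

detectors = [no_typos, fancy_chars, long_words]
result = mfaid("Motörhead made a bunch of £ and €", detectors)
# -> False: only 2 of 3 detectors fire, below the 0.75 threshold
```

With a single detector, Motörhead gets flagged as a bot; requiring a supermajority of factors lets the band stay human.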

(Timothy Leary image public domain; lyrics from the Moody Blues, “Legend of a Mind”)

If you want to delve into so-called signs of generative AI writing, see

When AI Jumped the Shark

Most product marketing references to artificial intelligence are meaningless. Some companies think that they can simply promote their product by saying “We use AI,” as if this is a sufficient reason for prospects to buy.

I’ve previously observed that saying “we use AI” is the 2020s equivalent of saying “we use Pentium.”

It’s a feature without a benefit.

It’s gotten to the point where meaningless references to AI have jumped the shark.

Literally.

“(Several organizations) received a three-year, $1.3 million National Science Foundation grant to teach Florida middle school teachers and students how to use artificial intelligence (AI) to identify fossil shark teeth….Florida teachers learn to use a branch of AI called “machine learning,” to teach computers how to use shape, color, and texture to identify the teeth of the extinct giant shark megalodon.”

(From https://www.floridamuseum.ufl.edu/earth-systems/shark-ai/)
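The grant’s “shape, color, and texture” pipeline is presumably a trained image model, but the core classification idea can be sketched with a toy nearest-neighbor classifier over hand-made feature tuples. All of the data and labels below are invented for illustration:

```python
import math

# Invented (shape, color, texture) feature tuples; a real system would
# extract these features from tooth images with a trained model.
training = [
    ((0.9, 0.2, 0.7), "megalodon"),
    ((0.8, 0.3, 0.6), "megalodon"),
    ((0.3, 0.8, 0.2), "sand tiger"),
    ((0.2, 0.7, 0.3), "sand tiger"),
]

def classify(features):
    """Label a tooth by its nearest neighbor in feature space (1-NN)."""
    _, label = min(training, key=lambda ex: math.dist(ex[0], features))
    return label

classify((0.85, 0.25, 0.65))  # closest to the megalodon examples
```

Which rather supports the point below: the “how” here is a few lines of distance arithmetic, and the customer only cares whether the tooth gets labeled correctly.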

Now I come from the identity/biometrics industry, which uses machine learning extensively. But customers in this industry don’t really care about the “how” (machine learning). They care about the “why” (identify individuals). For all the customers care, the vendors could use Pentium for identification. Or blockchain. Or Beatrice. As Loren Feldman says, “It doesn’t matter.”

Remember this the next time you want to identify extinct megalodon shark teeth. Now I admit the exercise serves an educational purpose by exposing teachers to the capabilities of machine learning. But if your sole interest is tooth classification, you can simply purchase the non-expurgated version of Olsen’s Standard Book of Extinct Sharks and get the job done.

Marketing executives, AI is no longer a differentiator. Trust me. If you need assistance with a real differentiator, I can help.

If you want to win business, learn more about Bredemarket’s content – proposal – analysis services here.

Career Detective: My AI-generated “Podcast”

I normally don’t listen to 20+ minute podcasts, but I listened to this one because it was all about me.

Seriously…there’s a 20 minute podcast that focuses on me.

The two people on the podcast spent the entire time talking about my most recent ten years of professional experience.

Except…the people weren’t people.

NotebookLM file-to-audio creation

The people were Google bots, powered by Google’s NotebookLM.

Per Google:

“Upload PDFs, websites, YouTube videos, audio files, Google Docs, or Google Slides, and NotebookLM will summarize them and make interesting connections between topics, all powered by Gemini 1.5’s multimodal understanding capabilities.

“With all of your sources in place, NotebookLM gets to work and becomes a personalized AI expert in the information that matters most to you….

“Our new Audio Overview feature can turn your sources into engaging “Deep Dive” discussions with one click.”

I uploaded the most recent version of my resume to NotebookLM.

Technically, this is not my resume; this is a PDF version of a portion of my LinkedIn profile. But my resume has similar information.

NotebookLM used the resume as source material to create a 20+ minute podcast called “Career Detective.” In the podcast, a male and a female pair of bots took turns discussing the insights they gleaned from the resume of John E. “Breedehoft.” (I use a short e, not a long e, but people can call me anything if I get business from it.)

Surprisingly, they didn’t really hallucinate. Or at least I don’t think they did. When the bots said I was deeply qualified, as far as I’m concerned they were speaking the truth.

They even filled in some gaps. For example, I used the acronyms for KYC, KYB, and AML on my resume to save space, so one of the bots explained to the other what those acronyms meant, and why they were important.

Probably the most amusing part of the podcast was when they noted that I had worked at two very large companies. (Just so you know, my resume only goes back to 2015, so Motorola isn’t even discussed.) While Incode and IDEMIA are both multinationals, I wouldn’t characterize Incode as massive.

Anyway, judge for yourself

So here’s the audio episode of “Career Detective” that focuses on…me.

By the way, I learned about NotebookLM via the Never Search Alone Slack workspace, but still need to explore NotebookLM’s other features.

Dr. Jones MD, NPE

I have a telehealth appointment next week with a medical professional whom I have previously met. And I assume she will participate in the telehealth appointment.

In the future, of course, she may not.

Way back in April 2013, I wrote a tymshft piece entitled “You will still take a cab to the doctor’s office. For a while.” It speculated about a future 2023 medical appointment in which the patient took a driverless cab to a medical facility. In the office, the patient was examined by remote staff…or so she thought.

“Well, I’m glad you’ve gotten used to the procedure,” replied the friendly voice. “I hope you like me!”

“I do,” said Edith. “You’ve been very helpful. But I’ve always wondered exactly WHERE you were. If you were in Los Angeles, or in Mississippi, or perhaps in India or China, or perhaps even in one of the low-cost places such as Chad. If you don’t mind my asking, exactly where ARE you?”

“I don’t mind answering the question,” replied the friendly voice, “and I hope you don’t take my response the wrong way, but I’m not really a person as you understand the term. I’m actually an application within the software package that runs the medical center. But my programmers want me to tell you that they’re really happy to serve you, and that Stanford sucks.” The voice paused for a moment. “I’m sorry, Edith. You have to forgive the programmers – they’re Berkeley grads.”

“Oh,” said Edith after a moment. “This is something new. I’m used to it in banking, but I didn’t realize that a computer program could run an entire medical center. Well…who picks up the trash?”

“That’s an extra question! Just kidding,” replied the friendly voice. “Much of the trash pickup is automated, but we do have a person to supervise the operation. Ron Hussein. You actually know him – he was your cab driver in 2018 when you came here.”

Re-reading this 2013 piece, I was amused at three things I got wrong.

  • First, Google, Facebook, and Apple did NOT merge to form Gaceapple, “the important merger that saved the tech industry in the United States from extinction.” American tech firms are still powerful…for now.
  • Second, my assumption of cab companies adopting driverless cars assumed the continued existence of cab companies. Ride-share services have dramatically reduced the presence of traditional cab companies.
  • Third, my assumption that medical firms would sink untold sums of money into centralized automated medical examination rooms could be questioned…especially for routine appointments like Edith’s. Why not just let Edith’s smartphone—perhaps with a single attachment—gather the data?

Of course, there are medical ethics questions that underlie this entire discussion of telehealth and the use of non-person entities (NPEs). And we are struggling with those right now.

Image of Dr. Jones MD, NPE from Google Gemini.