Can an AI Bot Decipher Medicare?

(Image generated with Imagen 3)

I’m not the only person interested in AI applications in health. Kerry Langstaff has been exploring several of them in a series of LinkedIn articles, and her recent article is entitled “How AI Became My Caregiving Superpower: Managing Medical Tests, Doctor Visits, and More.”

Langstaff explores six possible applications. I’m not going to delve into all of them; read her article to find out about her success in using generative AI to understand medical tests, take appointment notes (with consent), understand terminology, organize medications, and figure out how to fold a wheelchair to fit in a car.

Understanding a health insurance plan

But I will look at her fourth application: navigating Medicare and medical equipment.

Medicare, or any U.S. health insurance plan (I can’t speak to other countries), definitely needs navigation assistance: deductibles, copays, preventive versus diagnostic care, tiers, and the basic question of what is covered and what isn’t. Or, as Langstaff put it, it’s like solving a Rubik’s Cube blindfolded.

Such as trying to answer this question:

“How do I get approval for a portable oxygen concentrator?”

The old way

Now if I had tried to answer this question before reading the article, I would have found a searchable version of the health plan (perhaps from the government), searched for “portable oxygen concentrator,” not found it, eventually figured out the relevant synonym, and then confirmed that it is (or is not) covered.

But that still wouldn’t tell me how to get it approved.
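For the curious, the document-search half of that old way can at least be scripted. Here is a minimal Python sketch, assuming you have downloaded the plan document and extracted its text; the synonym list is my guess at the broader terms a plan is more likely to use than the consumer product name:

```python
import re

# Hypothetical synonym list: plan documents often describe a portable
# oxygen concentrator under broader headings such as "oxygen equipment"
# or "durable medical equipment" rather than the consumer product name.
TERMS = [
    "portable oxygen concentrator",
    "oxygen concentrator",
    "oxygen equipment",
    "durable medical equipment",
]

def find_coverage_mentions(plan_text, terms=TERMS, context=120):
    """Return (term, snippet) pairs surrounding each match in the plan text."""
    snippets = []
    for term in terms:
        for match in re.finditer(re.escape(term), plan_text, re.IGNORECASE):
            start = max(match.start() - context, 0)
            end = min(match.end() + context, len(plan_text))
            snippets.append((term, plan_text[start:end]))
    return snippets

# plan_text would be the extracted text of the downloaded plan document:
# for term, snippet in find_coverage_mentions(plan_text):
#     print(f"[{term}] ...{snippet}...")
```

Even with the snippets in hand, though, you still have to read and interpret them yourself.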

Langstaff was warned that the whole process would be a “nightmare.”

The new way

But generative AI tools (for example, NotebookLM) are getting better and better at taking disparate information and organizing it in response to whatever prompt you give them.

So what happened to Langstaff when she entered her query?

“AI walked me through the entire process, from working with my doctor to dealing with suppliers.”

But we all know that generative AI hallucinates, right? Weren’t those instructions useless?

Not for Kerry.

“I got it approved on the first try. Take that, bureaucracy.”

But wait

But I should add a caution here. Many of us use general-purpose generative AI tools, in which the data we provide may be used to train the underlying models.

Including any Protected Health Information (PHI) that we feed into the tool.

Imagine if Langstaff had inadvertently included some PHI in her prompt:

“Here is the complete prescription for Jane Jones, including her diagnosis, date of birth, Social Security Number, home address, and billing credit card. The prescription is for a portable oxygen concentrator. How do I get it approved?”

Oh boy.

Most medical providers freak out if you include PHI in an email. What happens when you submit it to Stargate?
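If you do experiment with these tools, it is worth scrubbing the obvious identifiers out of your prompt first. Here is a minimal, purely illustrative Python sketch with hypothetical regex patterns; it will not catch names, addresses, or diagnoses, and it is no substitute for real de-identification:

```python
import re

# Hypothetical patterns for a few obvious identifiers. This is NOT real
# de-identification -- names, addresses, and diagnoses won't be caught --
# but it illustrates the kind of scrubbing to do before prompting a
# general-purpose AI tool.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "date of birth": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace obviously identifying numbers with placeholders."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label} removed]", prompt)
    return prompt

print(scrub("DOB 03/14/1952, SSN 123-45-6789: how do I get a "
            "portable oxygen concentrator approved?"))
# -> DOB [date of birth removed], SSN [SSN removed]: how do I get a
#    portable oxygen concentrator approved?
```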

Be careful out there.
