Interoperability and AI with FHIR presented by Mark Braunstein

It's a pleasure to be here today. I'm going to move away from research now to talk about not only the practical applications of some of this, but even the commercial applications, in an area that I pay close attention to. First, SMART and FHIR have both been mentioned, but I want to make absolutely clear to everyone what the implications of those two technologies are for the practical use of some of the tools and applications that have been alluded to earlier. Everyone here has a smartphone in their pocket, or in their purse, or somewhere on their person, I'm sure. And we all take it for granted that whenever we wish we can download apps onto that phone, and that with our permission, hopefully with our permission, those apps can access whatever data is available on that phone. Well, the practical ramifications of FHIR and SMART

are that you can do the same thing with an electronic health record system. You should be able to download apps that a physician, or some other care provider, or even a patient feels would be of value to them, give the apps permission, and they can access whatever data they need to operate. And indeed, that's not only happening, it's happening commercially, particularly in the United States, where, I don't have time to go into what the federal government did, but the federal government has essentially mandated that this should happen; it's the law. So all of the major EMR vendors, and many of the smaller vendors, now support what I just described. We're going to look at Cerner.

I would be happy to look at Epic, but they won't let you show their stuff. It's kind of a strange company. And I'm going to illustrate some of this using three commercially available SMART on FHIR apps. The first is DoseMe, which is Australian; it comes from Brisbane. The second is Rimidi; for full disclosure, I'm an advisor to the company and on their board. They're based in Atlanta. And the third is Suki, which is in Silicon Valley.

Before we do that, I'd like to outline what I feel are some of the key success factors that are going to be necessary for this technology to do what it has the potential to do. The first is that we need a predictable data model. These apps are designed to work across EMRs. I can't overemphasize the importance of that to a commercial enterprise. You have one product where,

it's not quite that true, but close to one product, and it can work with all these EMRs. That has enormous ramifications for the cost of operating the company. Secondly, and this was alluded to earlier, I think by Sean, and I cannot overemphasize how important this is: the tools, no matter how good they are, to be successful must be integrated into the workflow and process of the intended user, particularly if it's a physician user, and there can't be duplicate data entry. We've spent 50 years demonstrating that if you don't do this, you can develop wonderful tools and nobody uses them. The third thing is that one of the objectives of many of these technologies is to provide clinical decision support.

To help physicians, other care providers, and patients make good decisions, something that they demonstrably don't do as often as we would like them to. The key here is not just to provide the support, but to provide it appropriately. If it doesn't fit this patient, or it doesn't fit the current clinical status of the patient, that quickly generates something that has a name, alert fatigue, and it's been shown that providers turn off and don't pay any attention to the alerts, even when they are appropriate.

And finally, it's becoming increasingly important that we consider, and indeed apply, artificial intelligence and natural language processing in these clinical tools to make them smarter. Something you've already heard about; I think some of the points I'm going to make have already been made, but I'll remake them, I guess. Well, FHIR provides the predictable data model. I haven't gone into any detail, but the word "resources", the last of the four words in the FHIR acronym, refers to the units of that data model, which are highly predictable, although not totally prescriptive, descriptions of patients, providers, clinical observations, conditions, medications, and a host of other things.
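To make that concrete, here is a rough sketch, not from the talk, of what pulling those resources looks like over FHIR's REST API; the server URL, patient ID, and the choice of hemoglobin A1c (LOINC 4548-4) are purely illustrative:

```python
import requests

# Hypothetical FHIR endpoint and patient ID, for illustration only.
FHIR_BASE = "https://ehr.example.com/fhir"
PATIENT_ID = "12345"

# Read a Patient resource -- the same request shape works against any
# FHIR R4 server, regardless of which EMR sits behind it.
patient = requests.get(f"{FHIR_BASE}/Patient/{PATIENT_ID}",
                       headers={"Accept": "application/fhir+json"}).json()
print(patient["name"][0]["family"])

# Search for hemoglobin A1c results (LOINC 4548-4) as Observation resources.
obs_bundle = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": PATIENT_ID,
            "code": "http://loinc.org|4548-4",
            "_sort": "-date"},
    headers={"Accept": "application/fhir+json"},
).json()
for entry in obs_bundle.get("entry", []):
    obs = entry["resource"]
    print(obs["effectiveDateTime"], obs["valueQuantity"]["value"])
```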

And an app developer can count on these resources being quite similar, no matter what the underlying EMR is. That's incredibly important. The second is integration of FHIR apps into the workflow and the process, and here's an example of that. This is DoseMe, as I said, an Australian FHIR app, running within Cerner's PowerChart. Here on the left, you see that it's initiated by clicking on a menu item that's in line with all the other menu items that the provider is using to do charting.

So it looks to the provider like part of the EMR, and it looks even more like part of the EMR when you consider that the area where the app operates is the same area where charting and clinical review is already done. So not only are these apps integrated into the EHR, they appear to be the EHR. The major difference is that the user interface is often quite a bit better than the EMR's, and if I had enough time, I would show you some apps whose primary purpose is exactly that. Third, and this was alluded to again by Sean, I think it was: these apps can get the data they require at launch, using a facility built within SMART, so that there is no duplicate data entry. In fact, some of them just come up with the results the provider wants; there's no need to enter any data at all.
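For anyone curious about the mechanics, here is a hedged sketch of the SMART App Launch sequence that supplies that context; the client ID, endpoints, and authorization code are placeholders, not any vendor's actual values:

```python
import requests

# The EHR launches the app and passes it "iss" (the FHIR base URL) and a
# launch token; the app discovers the authorization endpoints from there.
iss = "https://ehr.example.com/fhir"
smart_config = requests.get(f"{iss}/.well-known/smart-configuration").json()

# Step 1 (not shown): the app redirects the browser to
# smart_config["authorization_endpoint"] with scopes such as
# "launch patient/*.read openid fhirUser", aud=iss, and the launch token.

# Step 2: exchange the returned authorization code for an access token.
token = requests.post(smart_config["token_endpoint"], data={
    "grant_type": "authorization_code",
    "code": "AUTH_CODE_FROM_REDIRECT",          # placeholder
    "redirect_uri": "https://app.example.com/callback",
    "client_id": "my-smart-app",                # placeholder
}).json()

# The token response carries the launch context: the patient (and often the
# encounter) the clinician already has open, so the app never has to ask.
print(token["patient"])
headers = {"Authorization": f"Bearer {token['access_token']}"}
```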

The third critical success factor is providing appropriate clinical decision support. So here, Rimidi is an example of that, leveraging a technology from the same group in Boston that provided SMART. It's called clinical decision support hooks, or CDS Hooks, and this allows the embedding of clinical logic into the process, so that the clinical decision support is provided at the appropriate time for the appropriate patient. Here we see that the whole paradigm is event driven, so clinical decision support is evoked when the chart is opened, when orders are written, places where it might be appropriate to provide advice. So I emphasize, it not only needs to be appropriate for the patient, but for the point in the patient's care.
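As an illustration of the mechanics rather than Rimidi's actual code, a CDS Hooks service is essentially a web endpoint the EHR calls when a hook such as "patient-view" fires, and which answers with "cards"; a minimal sketch, with an invented service name and threshold:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical CDS Hooks service; the id, prefetch key, and 7.0% threshold
# are illustrative only.
@app.route("/cds-services/diabetes-advisor", methods=["POST"])
def patient_view():
    req = request.get_json()
    # The EHR fires this when the chart is opened and can prefetch the FHIR
    # resources the service asked for in its discovery document.
    a1c = req.get("prefetch", {}).get("recentA1c")
    cards = []
    if a1c and a1c["valueQuantity"]["value"] > 7.0:
        cards.append({
            "summary": "HbA1c above goal - consider therapy intensification",
            "indicator": "warning",
            "detail": f"Most recent HbA1c: {a1c['valueQuantity']['value']}%",
            "source": {"label": "Example diabetes CDS service"},
        })
    return jsonify({"cards": cards})

if __name__ == "__main__":
    app.run()
```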

And this is real; this is from Rimidi. Here a physician is provided with a CDS Hooks recommendation based on data from a continuous glucose monitoring device in the patient's home. So there's interesting stuff going on here: all of a sudden, data from the patient, or from devices in the patient's own home, is part of the EHR workflow and process. Something we've talked about for years, but now it's real. And the app alerts based on the percentage of time the patient is in glucose range.
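For a sense of what that metric is, here is an illustrative time-in-range calculation over made-up CGM readings, using the commonly cited 70 to 180 mg/dL band and 70% target rather than anything vendor-specific:

```python
from datetime import datetime, timedelta

# Made-up sample data: one CGM reading every five minutes, in mg/dL.
readings = [
    (datetime(2023, 4, 1, 8, 0) + timedelta(minutes=5 * i), value)
    for i, value in enumerate([95, 110, 145, 190, 210, 175, 160, 120, 88, 72])
]

# "Time in range" is simply the share of readings inside the target band.
in_range = sum(1 for _, glucose in readings if 70 <= glucose <= 180)
percent_in_range = 100 * in_range / len(readings)
print(f"Time in range: {percent_in_range:.0f}%")

if percent_in_range < 70:   # a commonly used clinical target, as an example
    print("Raise a CDS Hooks card: time in range below target")
```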

And here is an alert provided based on a guideline, in this case the American Diabetes Association guideline for diabetic care. And it's recommending that the physician order something, because the patient's most recent hemoglobin A1c, a test that measures the control of diabetes over the previous 90 days, is higher than it ought to be. And finally, at least from Rimidi, all of this is built into the EHR's order flow. So if the physician clicks the button to order what was just recommended, they're taken directly, in this case into Cerner's order flow, into the order flow of the EMR, making this as seamless and as built into the workflow and process as possible.
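The CDS Hooks piece that makes that handoff possible is the "suggestion": a card can carry a draft FHIR resource that the EHR, if the clinician accepts it, feeds into its own order workflow. A hedged example with illustrative codes and IDs, not Rimidi's actual payload:

```python
import json

# Hypothetical card recommending a repeat HbA1c; accepting the suggestion
# tells the EHR to create the draft ServiceRequest in its order flow.
card = {
    "summary": "ADA guideline: HbA1c above goal",
    "indicator": "warning",
    "source": {"label": "Example diabetes CDS service"},
    "suggestions": [{
        "label": "Order repeat hemoglobin A1c",
        "actions": [{
            "type": "create",
            "description": "Create a draft lab order for HbA1c",
            "resource": {                      # a draft FHIR ServiceRequest
                "resourceType": "ServiceRequest",
                "status": "draft",
                "intent": "order",
                "code": {"coding": [{"system": "http://loinc.org",
                                     "code": "4548-4",
                                     "display": "Hemoglobin A1c"}]},
                "subject": {"reference": "Patient/12345"},
            },
        }],
    }],
}
print(json.dumps(card, indent=2))
```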

So, we've looked at DoseMe and we've looked at Rimidi; now we're going to look at Suki. I was asked to talk about AI-powered SMART on FHIR apps. Suki is to healthcare, or to charting, what Siri and Google Assistant and Alexa are to everything we do every day. So here the physician says, "Suki, create a clinic note." It creates a note based on what's going on with this patient and on what it has learned about the provider's behavior when presented with similar clinical situations.

The provider goes on to say, "Insert my normal review of systems," and it does. And then the provider can say, "But I want you to change part of that to something else," and it does. So the provider can switch from giving commands to charting seamlessly, and Suki can understand what's going on. It's also used for what I choose to call intuitive data retrieval. So here the provider says, "Show me Mrs Ramirez's medications."

And it does. And the provider, let's say they're not familiar with Mrs Ramirez, notices that two of them are diabetes medications. So they say, "Send me a graph of the hemoglobin A1c over time." And it does.
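Suki's internals aren't public, but intents like those plausibly translate into ordinary FHIR searches; a speculative sketch, with placeholder patient ID and token:

```python
import requests

FHIR_BASE = "https://ehr.example.com/fhir"
headers = {"Authorization": "Bearer ACCESS_TOKEN"}   # from the SMART launch

# "Show me Mrs Ramirez's medications" -> active MedicationRequest resources.
meds = requests.get(f"{FHIR_BASE}/MedicationRequest",
                    params={"patient": "12345", "status": "active"},
                    headers=headers).json()

# "Graph the hemoglobin A1c over time" -> HbA1c Observations, oldest to
# newest, ready to plot as (date, value) points.
a1c = requests.get(f"{FHIR_BASE}/Observation",
                   params={"patient": "12345",
                           "code": "http://loinc.org|4548-4",
                           "_sort": "date"},
                   headers=headers).json()
points = [(e["resource"]["effectiveDateTime"],
           e["resource"]["valueQuantity"]["value"])
          for e in a1c.get("entry", [])]
print(points)
```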

So, how does this work? I am not a PhD computer scientist like Michael and other people here, so I'm going to give you a fairly high-level view, but that's probably appropriate. So, what technologies does it use? Well, this is a FHIR app, or I wouldn't be talking about it, and it pulls the data it needs

from the chart using FHIR and SMART. Then it uses natural language processing; two specific technologies the company tells me they use are transfer learning, where it applies knowledge gained elsewhere, and training the neural network against the kind of healthcare data sets that the app is going to see. It understands context and intent. This is critically important, and you'll see applications of AI in medicine that don't understand context, and then they do dumb things.

The kind of thing that ChatGPT, say, is capable of doing. So, it understands the action that's conveyed, what it is that the provider wants to do. And it has, through another technology, the ability to extract structured information from text; keep in mind, most of charting is still text. I'll talk about that in a bit. And then it can push the note to the patient's chart using FHIR, or where it has to, the APIs provided by the vendor. Then the provider reviews, makes any changes they want, and signs the note. And they're done.
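One plausible shape for that write-back, offered as an assumption rather than a description of Suki's implementation, is a FHIR DocumentReference carrying the note text:

```python
import base64
import requests

# Sketch of writing a note back through FHIR as a DocumentReference with the
# text base64-encoded in an attachment; whether a given EMR accepts this or
# requires a vendor-specific API varies.
note_text = "Clinic note: patient seen for diabetes follow-up..."
doc = {
    "resourceType": "DocumentReference",
    "status": "current",
    "type": {"coding": [{"system": "http://loinc.org",
                         "code": "11506-3",
                         "display": "Progress note"}]},
    "subject": {"reference": "Patient/12345"},
    "content": [{"attachment": {
        "contentType": "text/plain",
        "data": base64.b64encode(note_text.encode()).decode(),
    }}],
}
resp = requests.post("https://ehr.example.com/fhir/DocumentReference",
                     json=doc,
                     headers={"Authorization": "Bearer ACCESS_TOKEN",
                              "Content-Type": "application/fhir+json"})
print(resp.status_code)
```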

I made one change to this presentation, and I didn't get it in here, so let me just talk to it. Pretend you're looking at a slide from the American Academy of Family Physicians, the largest physician group in the United States and a major advocate for digital health. In fact, the AAFP operates an incubator for digital health companies, and Suki is their most successful example. They did a study: they offered 130-odd family physicians the opportunity to try Suki, and about half of them decided to buy it, to adopt it. And the academy surveyed those 60-odd physicians

and found that the amount of time they were spending doing documentation was reduced by 72%. I want to make sure that sinks in: 72%. Now, why is that important? Well, obviously, physician time is valuable. Secondly, if you survey physicians about electronic health records, it doesn't matter whether they like their record or they hate it, their number one complaint is going to be "It just takes too much time."

So, what about the future? Where are we headed? There's the slide; I put it in the wrong place. Apologies to David and Marianne. I just did

this last night. I realized, you know how your brain works, I realized I forgot a slide; it will be in here. But I've already covered all that. So, the National Health Service in England

actually commissioned Eric Topol, the famous physician researcher, futurist, and digital health guru, to look at the impact of digital technologies on the workforce: genomics, digital medicine, AI, and robotics. There is a link at the bottom if you want to read the report; it's really interesting. And this is an illustration from the report of the likely impact of these technologies over time, with the darker colors meaning greater impact. We're going to focus for a second on one of the areas that they looked at: the impact of automated image processing using AI, something we've already heard about this morning. And they actually did a study and projected that the widespread use of that technology would free up all of the time of 500 radiologists in the UK.

Well, this report is a few years old, and this stuff has moved from speculation to reality. In fact, two days ago, one of the front-page stories in The New York Times was about the success in using the technology to detect breast cancer, and the concerns radiologists are starting to have about whether they're going to have a job in the future. Now, we're going to look at something else that they focused on, which is the use of speech recognition and natural language processing. Now, this study was released in 2019, just four years ago, and here's an illustration of how fast things are moving. The study then, this is Eric Topol speaking, said that as digital technologies become more prevalent, there is a risk that a deluge of automatically transmitted data will overwhelm health professionals.

Well, I will stop and say: if you look today at the patients who drive most health care costs in the U.S., in Australia, and in the rest of the advanced industrialized world, patients with a variety of chronic diseases are seen by the healthcare system with great frequency. Their charts are already unwieldy and not, for all practical purposes, usable. So the application of AI to generate patient summaries provides a clinically useful solution to this problem, and I actually think this is one of the most important potential applications of the technology: first to save physician time, and secondly,

to make sure things aren't missed. Well, this is 2023, and lo and behold, there is in New York, not far from where I live, a new startup company called Abstractive Health, spun out of Cornell Tech, the new technology institute built in the middle of the East River on Roosevelt Island. This company is less than a year old, and it is currently focusing on the discharge summary, something that physicians write manually. I can tell you, I'm a physician, and physicians hate doing this. When I was in training, they wouldn't give me my paycheck until I had done my discharge summaries. That was the only way they could get us to do it.

And we already know that physicians complain about spending too much time doing charting anyway, and there are statistics that suggest that for each hour of patient care, physicians spend two hours in their electronic health record. So could you automate the production of a discharge summary, which tends to contain a history of the present illness, a summary of the daily course of care, and follow-ups? The most complicated part of that is the middle part, the daily narrative. So the company says they use encoder-decoder, sequence-to-sequence transformer models to get sentences that summarize each day of the care and put them in chronological order.
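That idea is straightforward to sketch with off-the-shelf tooling; the model below (facebook/bart-large-cnn) and the sample notes are stand-ins, not the company's actual pipeline or data:

```python
from transformers import pipeline

# An off-the-shelf encoder-decoder summarizer, used here purely to
# illustrate the "summarize each day, keep chronological order" idea.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

daily_notes = {   # made-up example: raw notes grouped by hospital day
    "2023-03-01": "Admitted with community acquired pneumonia. Started on ...",
    "2023-03-02": "Oxygen requirement decreased overnight. Afebrile since ...",
    "2023-03-03": "Ambulating independently. Plan for discharge tomorrow ...",
}

hospital_course = []
for day in sorted(daily_notes):                     # keep chronological order
    summary = summarizer(daily_notes[day], max_length=60, min_length=10)[0]
    hospital_course.append(f"{day}: {summary['summary_text']}")

print("\n".join(hospital_course))
```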

Well, that's all cool. Does it work? Well, currently, and bear in mind this company is not a year old yet, 62% of their summaries, as evaluated by physicians, meet the physicians' definition of sufficient quality to be clinically useful. And are they going to get these slides? So, quick quiz; it's going to be an easy quiz. I'm going to give you the answer, but we have time to do it.

Which is which? I can tell you, if you read these carefully, it's less than obvious which is which. One of these is a discharge summary written by Abstractive Health, and the other is a discharge summary written by a physician. In fact, the one on the left was done by the computer and was evaluated by a panel of physicians; on a scale of 10, it got an 8 on these four criteria: quality, readability, factuality, and completeness. This is the one done by the physician, and it ranks slightly better, but not that much better. And in fact, what I've highlighted here are the major points; it's hard to know why the physician ranked higher, because there's only one place where this is noted in there.

Well, I guess that's right, the physician ranked higher, but it's only one place; I got that backwards. And the main criticism on both sides was that there were things mentioned with no detail given; not that there was an error, it just didn't go into enough detail. Something that, I assure you, the computer can get better at. So, how do they do this? Well, I want to thank the CEO of the company, who worked with me to get this into a language that almost anybody should be able to understand. So, at the heart is not GPT-3, but a similar tool developed by Facebook called BART.

There are two reasons why they use this and not GPT-3. The first is that BART was designed to summarize text; if you use Facebook, the news summaries you see are done by BART. Secondly, it's open source, which means the company can operate it themselves, which is important if you're talking about putting protected health information into the thing; you can't just have it anywhere. And the OpenAI people, who they did talk to about using GPT-3, refused to make it open source and refused to sign what in the U.S. we call a business associate agreement, which you have to have if you're using protected health information. So, for practical reasons, they're using Facebook's BART, and I can't help but point out it was trained on CNN news articles; that's how it learned English. And it was trained on health care data, and that's how it learned the specifics of health care data. This is how this stuff is actually done.

And then it uses something called constrained beam search. Michael might know what that is; I really don't, but I know what it does. It tells the model that there are certain words that must be included in the output.
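Here is a small illustration of that idea as the Hugging Face transformers library exposes it; the model choice and the forced word ("insulin") are arbitrary examples, not Abstractive Health's setup:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")

text = ("Patient remained afebrile. Metformin was restarted on hospital day "
        "two and insulin glargine was titrated to 18 units nightly.")

# force_words_ids makes the decoder keep these tokens in whatever summary it
# generates; constrained beam search requires num_beams > 1.
force_words_ids = tokenizer(["insulin"], add_special_tokens=False).input_ids

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs,
                         num_beams=5,
                         force_words_ids=force_words_ids,
                         max_length=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```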

Think about it; that's pretty important. Then, down here, it uses Google's BERT, which is sort of the Swiss army knife of natural language processing. It's good at a lot of things, and in this case, it's good at figuring out what's important and identifying follow-on care. I meant to point out that there are three parts of the discharge summary; the last part is follow-on care, and you can't forget that; it has to be

in the summary. And of course, then there's the most important technology of all: FHIR. That's how it gets the data from the EHR, and that's how it puts the summary back in the EMR. This thing is, in fact, a FHIR app. All this technology just sits behind it; the physician clicks on the app and they get the discharge summary. A quick commercial for my book: if you want more, my book has about 40-odd examples of real-world applications of FHIR and FHIR apps, if you're interested in that sort of thing.

And listen, if you have an academic affiliation, it's free. Can't beat that; just go to SpringerLink through your institution and you can download the book for nothing. Thank you very much.
