Meditech leader: AI should automate tasks and augment clinical decision making
Editor’s note: This is the third in a series of features on top voices in health IT discussing the use of artificial intelligence in healthcare. To read the first feature, on Dr. John Halamka at the Mayo Clinic, click here. To read the second interview, with Dr. Aalpen Patel at Geisinger, click here.
Meditech is one of the largest and longest-standing electronic health records developers in the industry, and recent years have seen it diving deep into an array of artificial intelligence initiatives and partnerships:
- Meditech adopted advanced natural language processing and large language models through a collaboration with Google, established in 2021.
- The company has integrated Google Health’s search and summarization capabilities within its Expanse EHR, using NLP and LLMs to understand the intent behind a clinician’s query.
- It is working with Google on generative AI for auto-generating clinical documentation, including the hospital course narrative.
- Meditech is using conversational AI in its virtual assistant to search the chart and place orders.
- And it’s working on advancing its voice strategy through ambient listening to reduce documentation burden.
We spoke recently with Helen Waters, executive vice president and chief operating officer at Meditech, to glean a deeper understanding of this major vendor’s AI strategy and how healthcare provider organizations can integrate these kinds of AI technologies into their daily work, particularly through their EHRs.
Q. Meditech has a formal and succinct position on the incorporation of artificial intelligence technology into its systems. Please talk about what the position statement means to the company.
A. AI aligns quite well with our mission to provide technology that will enable healthcare organizations to deliver safe, efficient and impactful care. Our specific AI statement is that we embrace the potential of various forms of AI and data to help achieve that mission through task automation, through augmentation of clinical decision-making and intuitive user design.
Our approach to integrating AI into EHRs, and within Expanse in particular, is thoughtful and deliberate, driven by an understanding that incorporation must happen within daily workflows and must squarely focus on safety and an enhanced experience.
Most important, it helps our care teams address the challenges we see in healthcare today: physician burnout related to technology, the workforce shortage, the strain on other members of the care team, and the patient experience.
We carefully evaluate various types of AI to determine which can best improve the end-user experience and, more important, care outcomes.
We judiciously incorporate those capabilities into our workflows so we can support health systems in their objective to deliver the very best possible care with the highest probability of a quality outcome for their patients.
Q. Meditech has been exploring advanced natural language processing capabilities and large language models through its partnership with Google, established in 2021. You say you had a vision for where the market was headed and set off to execute on it. Describe the vision and how you have begun executing on it.
A. Our exploration into these areas began in earnest with Google in 2021. But our exploration with regard to natural language processing and how to embed newer technologies into the EHR began quite a bit before then.
In 2019, we moved to host Meditech customers in the public cloud and began developing applications in a native cloud environment, moving away from our historical development approaches and instead using native cloud capabilities to build the application suite.
We also worked with Nuance several years back, early in their incorporation of NLP and AI into conversational AI, on a tool we released called Virtual Assistant. It enables the caregiver to talk to the tablet to review information and add orders, so the physician can place orders verbally using very simple instructions.
So, we’ve taken AI to a number of different places over the last several years in a positive manner, going back to our mission. As far as delivering impactful care, one of the most obvious challenges in our industry documented across all EHRs is the clinical burnout [connected to] the technology.
In 2021 we began discussions with Google about their tool set, and together we decided to embark on a project, which we have now delivered to a customer, to natively embed the Google Search and Summarization capability within our record.
By Search and Summarization, I mean taking advantage of capabilities within the Google platform and within our Expanse workflows and using natural language processing and a large language model to understand the intent behind a clinician’s inquiry. Instead of relying on specific word matches, clinicians can more naturally interact with the patient’s record, both their current record and their historical records from a legacy Meditech system or perhaps another vendor’s EHR.
They can use terminology, abbreviations and medical language that actually makes sense to them. Google is the premier technology company with regard to search, and we saw an opportunity to embed that capability to make it easier and simpler to find information in a record. Physicians can also pull information from unstructured data, bringing into their inquiry scanned data and handwritten documents that exist within the current record and legacy platforms.
We’re using the LLM to understand the context and the concepts behind a user search, so we can also offer reassurance to the clinician that they have the most relevant information in order to make their next clinical decision. It’s brought forward in a very simple, expedient manner. Instead of physicians hunting and pecking through records, Google Health Search and Summarization goes beyond simply synthesizing the information to intelligently organize and present it so that the most important conditions are highlighted.
So, it’s correlating data based on the LLM contained within the solution, work that was done in advance of HIMSS23, when the AI hype really started. We saw the opportunity to intelligently surface information that physicians could use interactively in making the next clinical decision.
At the end of the day, they want as much information as they can gather as quickly as they can find it in order to make that best optimal decision. So, condition-related health data is brought in, like labs and vital signs and medications. Also, it’s easily accessible for immediate review by the physician and the nurse. I should add it reduces cognitive burden for clinicians.
We saw an opportunity. We worked with Google in concept in the fall and winter of 2021. We began coding together throughout all of 2022. And we have now delivered that to a customer and have seen a very high degree of interest from a lot of physicians on the Expanse platform and on non-Expanse platforms.
Q. You also are working with Google on a generative AI capability for automatically generating clinical documentation, including the hospital course narrative. Please elaborate on your work here.
A. We have seen a whole host of tools coming of age now that can interpret and bring back information that most would call generative AI. We’re looking initially at the hospital course of stay.
One of the greatest issues today in healthcare is clinician burnout. I don’t care whose medical record system they’re using. You can go to organizations large or small, that have spent a little on technology or a lot, and everyone’s complaining about the same thing: technology has greatly accelerated the rate of burnout.
So, we looked for ways for AI to help. Documenting the hospital course is one of the biggest challenges for physicians. We could take one of the most time-consuming elements of their day, associated with a patient’s discharge, and enhance it by allowing the system to meaningfully look through the record, the encounters, the medications and the tests, and bring back the hospital course narrative.
The AI tool would allow us to begin to build their note quickly and efficiently without them having to go back through the entire record and decide which elements they wanted to bring in. Transitions of care are certainly a well-recognized, vulnerable place for patients, and certainly that happens at the discharge process.
So, we think it could be facilitated nicely in terms of leveraging AI to take advantage of all the information that’s in the record, communicate it in a very concise manner and ensure we maintain a high quality of care and, more so, an efficient transition for the patient to their next environment, whether that’s going home or going on to another facility.
If you look at patients with high acuity levels, which tend to be those in the hospital, some of whom have had an extended length of stay, you can imagine how many data points exist in that record for the discharging physician to assess.
We think AI is a perfect tool to enable a course narrative for the entire hospital stay and begin to relieve the physician of that burdensome final task of the discharge process.
And we’re evaluating other use cases with potential time savings, such as the nurse shift transition. Shift transitions can be arduous and long. The system has been collecting all kinds of information about that patient, what the nurse has been documenting and what the assessment and plan speaks to, and it can bring that together in a concise manner for the next nurse coming on.
We think AI has the potential to add value and speed, and to eliminate a bit of the burden for our busy and overloaded nurses today.
Q. Meditech uses conversational AI in its virtual assistant system to search the chart and place orders. How does this work, and why is AI so important here?
A. The Expanse Virtual Assistant, which we worked with Nuance on, uses Nuance speech recognition to respond to verbal commands from the clinician. It can retrieve the information clinicians need and present it efficiently at the point of care, so a physician with a mobile device can harness the same AI power that drives smartphones, tablets and other intelligent devices.
By using voice commands, providers save time placing orders, finding the last set of lab results and reviewing whatever medications the patient has been on, giving them more opportunity to maintain eye contact with the patient and to use a natural form of communication, which is still voice in our day and age.
It’s ironic: we went from fighting with the pen, to the transition to a keyboard, eventually to tapping and swiping, and now we’re full circle back to the potential power of voice. Our first phase was providing the patient history, enabling clinicians to see transitions into new care settings and to round.
We recently added the ordering capability. Examples of commands you might give: open my rounding list, present my lab results, show me the I&O summary for the last 24 hours, show me the last three sets of vital signs. And now there’s medication and non-medication ordering, which is so critical for the inpatient and ambulatory settings. Physicians can clearly and easily use their voice, which is a big step forward.
We’re trying to provide a multitude of tools to ensure our customers in this industry have choice. And we are happily working with companies that are advancing rapidly in these areas. Obviously, Google has been a massive force and we have a wonderful partnership together, as is the case with Nuance. But there will be other players that make as big an impact in their own way through the use of these technologies.
Q. So you’re doing a lot of work with AI and healthcare. That’s clear. What would you caution healthcare provider organizations to beware when it comes to working with systems that incorporate AI? What should they know? What should they test? How should they proceed?
A. I would advise organizations to start by evaluating their use cases. Think about their clinicians, a very stressed and overtaxed workforce everywhere you turn; about administrative teams; and obviously about the patient experience, which continues to be at the forefront. Which workflows are time-consuming, repetitive and monotonous, causing some of the burnout?
Prioritize the use cases with a higher degree of feasibility to implement, and gain some experience. Walk cautiously toward the areas where you can have an impact in deploying new technologies, and work through the associated change management needed to support these transitions.
Have a very strong governance team that’s reflective of the organization from every different angle. Know the terms and concepts that you’re looking for. There are a lot of terms out there in AI that I think can be a bit confounding for people.
So, if you’re an organization that hasn’t yet delved deeply into this, start by understanding what machine learning is versus deep learning, natural language processing and large language models. The industry is still on its journey. There have been statements from Gartner that true usability and applicability at broad scale will be two to five years out.
Organizations need to really be introspective about their desires. Look at your partnerships with your existing vendors, understand what they’re working on, and realize that the technology is still evolving. There have been a lot of appropriate cautionary tales about the fact that it’s not foolproof at this point in time, in every capacity.
There will always be a need for human intervention, particularly clinician intervention, in every decision that gets made relative to patients. But AI should be able to relieve a lot of the administrative burden, overhead and frustration clinicians face, and those monotonous everyday tasks.
And there’s a lot of low-hanging fruit there that we think can be helpful in the near term. As a vendor that’s been in business as long as we have, we’re going to be an active participant. We feel as energized today as we’ve ever been about this next era that we’re sitting on top of.
We’ve digitized all this information. The organizations have spent a lot of money. The country has spent a lot of money. Now it’s time to leverage that, to recognize EHRs are a component of this digital shift in healthcare. But there’s augmentation to be had out there that can really make a difference for healthcare.
Editor’s Note: To watch the video of this interview, which contains bonus content not in this story, click here.
Follow Bill’s HIT coverage on LinkedIn: Bill Siwicki
Email him: bsiwicki@himss.org
Healthcare IT News is a HIMSS Media publication.