This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.
Newsday: We Implemented Ambient AI Backwards with Angel Mena
Speaker 47: [00:00:00] You already know healthcare operations are challenging: fragmented systems, manual processes, a ton of separate solutions. Symplr brings essential workflow operations together, supporting the back office functions that make healthcare go. Nine out of 10 hospitals and 400 plus health plans already use Symplr.
Learn more at symplr.com/cio
Speaker: I'm Bill Russell, creator of This Week Health, where our mission is to transform healthcare one connection at a time. Welcome to Newsday, breaking down the health IT headlines that matter most. Let's jump into the news.
Bill Russell: All right, it's Newsday, and today we are joined by Angel Mena, CMO for Symplr. And of course our intrepid, intrepid, that's a good word, regulars
angelmena: Okay.
Bill Russell: Drex DeFord and Sarah Richardson. Welcome everybody. Look forward to the conversation.
Sarah Richardson: [00:01:00] Yeah.
Drex DeFord | This Week Health: Good to see you.
Bill Russell: I'm looking forward to it. You know, as always, I keep you guys in the dark up until the last minute because I like the spontaneity of the discussion, which I know some of you on this call may not like as much. But today, since we have a CMO here,
we are gonna talk about the clinicians, the impact of AI on clinicians, workforce changes, impact on quality, safety, those kinds of things. Maybe even what this looks like for training the next generation of clinicians. Now, the thing you have over the three of us is we've all worked in hospital systems, we've all been on the IT side, but none of us is a clinician. You're the only clinician on this show, so you get to speak with a lot of authority in this space, and then we get to chime in from our experience standpoint. I'm curious, just to start this conversation off: what is the state of mind of clinicians today with the change, the technology that's being pushed [00:02:00] at them? There was a time when we were talking about the EHR and it needed to be optimized, and it was never quite optimized well enough, and it was creating more problems than it was solving for clinicians. Then it got a little better. And then all of a sudden we threw this whole AI engine on top of it.
And now we have things like Agent Factory and whatnot coming down the pike. We have a lot of stuff. What's the mindset of clinicians with regard to technology in healthcare today?
angelmena: That's a great question. But before I hit on the technology side, I'm gonna say that, you know, we are clinicians, that's true, and that's the difference between the four of us here. But something we've realized as physicians is that we can't deliver care by ourselves. We really have to acknowledge that there's a team of specialists across our healthcare systems and our industry that are required to improve the outcomes of our patient care.
So we need all of us to improve our care. In regards to the mindset, what I would say is, in this journey I've been [00:03:00] blessed and lucky enough to follow the path of being a teacher, an educator for the next generation of physicians, but I was also involved in technology.
Early on in my training days, I started out doing paper charts. So I was part of that transition from paper charts to the EHR, and I saw how the EHR retired a few physicians. So early on we acknowledged that there was gonna be a disruption of our workflows. And I think that's key here, right?
It's understanding how technology has improved clinical outcomes and improved our work-life balance and how we deliver care. But it does disrupt our workflows, and we have to understand and acknowledge that. Coming to today, what is the mindset? I would go back, sorry, eight years ago, where I first encountered AI, or at least the first company that added AI to
their technology. And it was very interesting. I remember at that time this was a [00:04:00] stroke diagnosis technology. And I went to my CEO at that time and said, we need to invest, we need to talk to them, this is gonna change the way that we deliver care. And it did, and we still see it today.
It wasn't until what I call the commercial AI, the gen AI, the ChatGPTs of the world, that we started to look into what other problems we could solve, what workflows we could solve. So the mindset now is that we can really implement the technology that we have today to solve the problems that we've been seeing for
decades in healthcare and deliver better care to our patients. But we have to understand it's still gonna disrupt workflows.
Bill Russell: Uncle Drex, what was your first recollection of AI in healthcare?
Drex DeFord | This Week Health: I think for us it was just, you know, I don't know that we necessarily called it AI at the time.
angelmena: Yeah.
Drex DeFord | This Week Health: Analytics, right? We had a lot of [00:05:00] data, and we were doing things like figuring out if you had these 12 indicators on a patient and they were moving in this direction, that probably meant that sometime in the next hour they were gonna code or they were gonna have some kind of an incident.
So it was sort of like figuring out how to use data to intercept the problem. It wasn't really AI in the AI sense of, you know, GPT and Anthropic and the things we think of today, but we were using it to try to do predictive analytics. And, you know, when you think about how long have we been doing AI, we've been doing AI-ish stuff for a really long time.
So it's interesting. Let me ask you a question too. Given everything that's going on in the hospital, and all the things that a physician's doing on a regular daily basis, how aware are they of all this other stuff [00:06:00] that's going on in the background with AI, and the work that's coming, and all of those sorts of things?
They have a chief medical officer who's watching this stuff happen and maybe has their hand on the wheel, but the daily frontline physician, are they just trying to make it through their day?
angelmena: For the most part, I would say yes. I mean, as you know, physicians are extremely busy, and our quotas for patient care have been increasing. We continue to ask ourselves, how are we gonna finance the technology these days? And that comes at a cost. Now some healthcare systems are acknowledging
that this has to be a pillar, because it does provide efficiencies, it does bring better care to their patients, and it really facilitates the day-to-day of the physicians. But for the most part, I don't think they're aware until it's in front of 'em,
Drex DeFord | This Week Health: Until the new tech
angelmena: until it's in the new workflow, for the most part, obviously.
Drex DeFord | This Week Health: Mm-hmm.
Bill Russell: When I go to see my primary care doctor, I'll talk to the nurse who brings me in and whatnot. And, you know, if I say Agent Factory, they just look at me like I'm insane. If I ask them if they're using AI, they have no idea.
But that's what we want technology to be, right? We want it to fade into the background. And we want people to go about doing their jobs like they normally do, but just with the assistance, so they're like, man, the machine has gotten smarter, it's surfacing the information I need at the right time.
It understands that I'm seeing this kind of patient versus that kind of patient, therefore this information's more important to me. Maybe it's doing some proactive things before Sarah shows up in the office, and it's saying, oh, by the way, you're seeing Sarah today, boom, boom, boom, make sure you ask her these questions.
How long has it been since this? Has she done this test? You know, those kinds of things. I mean, we want the machine to get smarter, but do we really want the [00:08:00] clinician to know that it's AI? I mean, wouldn't it be best if they didn't know it was AI?
angelmena: That's a great point. And I remember when we started having these conversations: how do we implement AI? How do we understand what problems we need to solve? You remember the days of the hype, you know, was it too much? Was it not enough?
Bill Russell: are we past the days of the
hype
angelmena: Yeah, exactly. But at the beginning it was. And I would say probably in the past three, four years, it was more about everyone wanting to use AI for absolutely everything, and we really needed to organize ourselves and put a framework around that, put the right governance in place, because it's not just about how we implement it, but also, does it have the right cybersecurity?
Does it have the right use cases to solve? But did we get it wrong? And let me explain myself, right? One of the biggest pushes in the AI world in healthcare has been ambient AI, right? So your [00:09:00] documentation. But if you think about it, we've been dictating for a long time. I actually had to dictate when I started training, you know, and someone used to transcribe it in the background, and they would call me and say, hey, I didn't understand half of what you said in this, because we would go a hundred miles an hour.
We have to understand why documentation became such a problem, and that was because of all the regulatory needs that we needed to meet, but also the billing and the coding behind it. So I question myself: as we implement more of this ambient AI, for example, should we have gone first to the
reason behind the documentation becoming a problem, which is revenue cycle, which is, again, the billing and coding and all the regulatory stuff? Solve that first and then come back to the documentation, because we actually enjoy dictating. In fact, that's what we're doing nowadays to some extent, having a conversation with the patient [00:10:00] that's being transcribed
into the documentation. But there's still a lot of hallucination from the AI, and we need to review the notes. So if we had fixed that other thing first, I think we would be in a better place.
Bill Russell: It's interesting, the starting point. You know, as we look at transformation, if you were doing this greenfield, like if you were coming in as a consultant, one of the first things you would say is, hey, let's not just accept everything the way it is.
So ask why, you know, the five whys around something. And we would look at this like, why are we doing this? What we recognize is we're in a highly regulated environment, a highly structured environment, and payer-provider relationships have been there for decades. And so we are bringing AI into this world and saying, hey, make this efficient, and we all agree it's highly inefficient. I don't know what I'm being charged. Even after I receive the bill and read the bill, I don't know what I'm being charged. I wait until I get that last notice that says, hey, you haven't paid this, and I'm like, oh, I guess I need [00:11:00] to pay this, 'cause I don't understand the bill. If you ask the doctor how much it costs, they have no idea, so they're not gonna be able to help you. And you have the prior auth issue and all this other stuff. So you have this highly inefficient thing. And I heard a CIO in one of our 229 meetings talk about the AI wars, and what they were referring to is the payer AI versus the provider AI. They're like, that's the front line right now, 'cause they're going back and forth on prior auths, they're going back and forth on coding. Blue Cross Blue Shield came out with a study that said the use of AI by providers, think about this, this is what Blue Cross Blue Shield said, is gonna increase the cost of healthcare by, you know, a couple billion dollars. Of course, that's their perspective. You could make the case on the other side that, you know, the providers are finally coding all the things that they're actually doing. [00:12:00] Right?
Drex DeFord | This Week Health: And they're being paid what the care actually costs. I mean, even that would be a debate point, but they're actually documenting all this stuff now. So it is funny how the blame game around AI has turned into a
Bill Russell: Well, let, let.
Drex DeFord | This Week Health: payer AI versus provider AI fight.
angelmena: Yeah, it's,
Bill Russell: Let's go off the cost stuff. Let's go to quality and safety,
angelmena: yeah.
Bill Russell: because I think we can all agree: we want higher quality, we want better safety within the system. We wanna make sure, you know, no more wrong-side surgeries, no more quality events.
The safety events go down, the quality goes up, across the board. How is AI technology being applied to quality and safety today? How are you seeing that?
angelmena: Going back to Drex's point about back in the day, when we would have all this data, the big data world, right, and we were trying to do analytics on it. Quality [00:13:00] and safety is a great representation of that problem. We have all this data, and we have problems attributing the data
to the right providers, to the right departments, to the right units in the hospital. And then we lack information on how to make it actionable: what is the next step? And I think that's where AI has a big opportunity to help in quality and safety. Also in redefining the goals, the targets, the benchmarks.
I feel we've been targeting the same quality and safety metrics, and we are not necessarily seeing an improvement in the overall health of our communities. So, going back, if you remember the, oh my God, in baseball, what's that movie called? The book?
Bill Russell: Money.
angelmena: Right. So basically they redefined the metrics that explain who is a great player, who's the player that [00:14:00] you need.
I think we need to do that in healthcare. We need to start redefining quality and safety, and that's where AI can really help us.
Bill Russell: Redefining it. Would we really redefine it, or just redefine what, yeah, I'm trying to figure out what the question is here.
angelmena: Yeah,
Bill Russell: It's, I mean,
angelmena: well
Bill Russell: The metrics are the metrics, right?
angelmena: The metrics are the metrics, but are they really capturing quality and safety? Let me try to give you an example. In clinical practice, in diabetes care, right, we measure an A1C, which is a three-month average of your sugars. Now we have these devices called CGMs that provide you with a frequent reading of your sugars.
So not a three-month average, where you can have a really high and a really low that average out; it's a frequent reading of your sugars, and we've redefined the metric. Now it's, what percentage of time are you at target? But that came [00:15:00] because we can gather more data and we can analyze it in a different way.
So that's my point: we are gonna have to find new metrics that are gonna help us define how we deliver care.
Bill Russell: When we talk about the practice of medicine, what are the biggest time wasters right now for clinicians?
angelmena: Administrative tasks. Anything that takes me away from spending time with the patient is a waste of time. And that's where we've implemented some technology that has helped us, obviously, in documentation. You know, we call it pajama time, right? It's the time that you spend documenting, putting in orders, reviewing records, reviewing labs
after you finish your day. So we've seen significant improvement in that, but there's a lot of other mundane things that we do in a hospital, in the healthcare setting, that can be facilitated. Let me give you an example. When I'm being credentialed in a hospital system so that I can work at that [00:16:00] hospital system, right?
I believe the latest figures we have, it's more than three months. Okay? And that's just emails back and forth. Some systems still fax documents. I mean, we're 20, 30 years behind. We need to leverage the technology that we have in place to make those systems more efficient.
Bill Russell: Yeah, there's a couple of startups that are looking at that whole credentialing process, and not only looking at it, they've been doing stuff around it for the last couple of years. And it's one of those areas where I was kind of surprised when I said, all right, give me a list of all the doctors that can practice in our hospitals. This was one of several questions. Like, I remember when I walked in and said, hey, give me an inventory of all of our computers on the network, and they said, well, here it is, plus or minus 10%. I was like, plus or minus 10%? That's 200 machines. Like, what do you mean, plus or minus 10%? But when I asked, you know, hey, tell me what doctors can practice in our hospitals, that number wasn't nearly as solid as I thought it should be. In [00:17:00] fact, it was kind of nebulous. And I was like, well, wait a minute. Like, we don't know the exact number of doctors that can practice in our hospitals? And they sort of looked at me like, well, we sort of do.
I mean, you know, they're not gonna show up and start doing surgeries tomorrow. Like, we will catch that if that happens. But no, the list isn't something we could easily generate. I'm like, that's,
angelmena: Yeah,
Bill Russell: to me.
angelmena: You would hope, though. I mean, one of the challenges, like, no one's gonna come in and do a surgery, I would agree with that. But then what if they have some certification that's lapsed, you know? And I think we miss a lot of that. And then Joint Commission comes in and says, I wanna see your list of doctors, and can they practice?
And what are the quality metrics? Right.
Bill Russell: The credentialing systems weren't owned by the health system, oddly enough. So I couldn't get access to 'em without permission from the physician groups that manage them. And I was like, okay, because I thought, oh, well, this is easy, we'll just go to the physician [00:18:00] credentialing systems, that'll give us the list.
And they were like, oh yeah, we're not giving you access.
angelmena: I know of systems that, in their bylaws, state that they have to vote for their officers, right? For the medical staff. And they've had to redo the votes a few times just because they had the wrong list of doctors,
Bill Russell: This is one of those things where the more we dig in, the more we talk about it, you realize technology still remains a small part of the answer to a lot of these problems.
angelmena: of course.
Bill Russell: Like, we have to plow through the political environment that is healthcare.
We have to plow through the regulatory environment that is healthcare,
angelmena: Yeah.
Bill Russell: the complexity around coding and billing that is healthcare. And so none of those things actually change when you introduce AI; they're all still there. The question becomes, as we start to introduce AI, are we going to take the time to look at these processes and say, if we do this with AI, we no longer [00:19:00] need to do X, Y, and Z? Or are we just going to get AI to fill out the paperwork that we did with the old processes?
angelmena: I believe you can do both separately. You have to review your processes and procedures, and then see where the gaps are and work in AI. As you said, if you start implementing technology on the wrong processes, you're just gonna continue to get something other than the desired result of the process.
And some places are missing that; they don't have the right governance in place. Also, I have to say, I don't want to put more red tape around implementing technology that's gonna really help us. And I do see a lot of that happening, because we're trying to be very careful about it and very structured, which I agree with, but we have to be careful not to delay the implementation of the right technology.
Bill Russell: Right. Sarah, you've been quiet today. In fact, I'm not even sure you've said a word yet. Come on. What do you got?
Sarah Richardson: Can you imagine that? No, the more I dig [00:20:00] into the best practices around AI and its responsibilities, it's exactly to the point we just made. If you do not truly have an investment in Six Sigma and change management and data strategy before you start to throw all the different elements of AI on top of it, you're destined to fail at one of those three points, and they are equally important in order for clinicians and patients to wanna trust AI. Back in like 2019 there were the ethics guidelines, and now there's been a ton of soft rules put out there about AI across the continuum. But if you really are thinking about human agency and oversight, technical robustness and safety, what you need from privacy and data governance, transparency, making sure that your systems aren't biased in how they pull information forward, how that affects your patient populations, and who's ultimately accountable,
yeah, those tenets, along with those three foundations, help human beings bring forward a perspective [00:21:00] that Dr. Mena and others would begin to trust more thoughtfully in an organization. And so if you don't have everything in place to make things work well, as Drex loves to say, you're just gonna make that train wreck go a whole lot faster.
Drex DeFord | This Week Health: To your point, Bill, the idea is that sometimes the processes that we run, we run because they're part of a larger, very inefficient process.
And so sometimes we bring things like AI to that and it just makes the inefficiency faster. There's a lot of, like, can we empty all the toys out of the toy box and figure out the right way to put this Lego set back together again, understanding that it's inside of this much bigger Lego set, and that there is no moving the parts of the federal government or insurance companies or whatever.
You have a finite amount of control, but you're trying to make sure that you're rejiggering that as much as you can to make the AI work on the right things.
angelmena: Yeah.
Bill Russell: You know, I'm looking at the Signal report, and as I'm looking at it, [00:22:00] there's still a strong emphasis towards the administrative, the back-room operations side, over the clinical, in terms of the use of AI. On the clinical side, ambient listening is the predominant use case across the board. In imaging we've seen a fair amount of AI get implemented, but again, from a cost standpoint, it has to be determined whether there's a valid use case around imaging. So I hear a lot of people have AI in imaging, but they have not expanded it significantly as they struggle to find those points.
And I think the advent of Agent Factory is going to make builders and coders out of a lot of people, who are going to start to reach into that clinical data set and do certain things. I don't even know what those things are yet. I haven't had a really good [00:23:00] discussion around Agent Factory. I know a handful of systems that are early adopters and doing some things around it, but I don't have the use cases yet. I think we will see those clinical use cases go up significantly in the next couple of months.
angelmena: Yeah. And if you don't mind, I'll touch on two points that were just made. For change management, we would need another hour, because I agree a hundred percent with you. I mean, you can obviously have the right technology, the right processes, but if you don't have the right change management with the stakeholders, your adoption is gonna be very poor.
And we tend to explore technologies and then we implement, or we adopt, 10% of their capabilities. And then we have everyone in the healthcare system asking you, so, can this do this? And I say, yes, since we implemented it two years ago we've been able to do this, but they didn't adopt it well.
So change management is necessary, and so is having the right [00:24:00] partner and vendor. Going back to your point on agent AI: we have to acknowledge that the hospitals and healthcare systems have the data. Patients have the data, right? And we have to partner with startups and companies that are trying to solve problems.
But we have to provide the data. So how do we create a framework around that, so that they can validate their use cases and create the agents that are gonna truly solve the problems in healthcare? Because we won't get there if we don't collaborate.
Bill Russell: No, I think that's true, and I think that's part of the observability of these models moving forward, and creating the ability to capture the statistics that we need in order to validate the use cases, and of course examine the impact on quality and safety as it moves forward. Angel, Sarah, Drex, that's all for today, but I wanna thank you guys for being here.
Sarah Richardson: Thank you.
angelmena: [00:25:00] Thank you very much for having us.
That's Newsday. Stay informed between episodes with our Daily Insights email. And remember, every healthcare leader needs a community they can lean on and learn from. Subscribe at thisweekhealth.com/subscribe. Thanks for listening. That's all for now.