This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.
Newsday: AI, EHRs, and the Fight Health Systems Can't Ignore with Jacob Hansen
Speaker 50: [00:00:00] Over 1,200 hospitals rely on AvaSure's AI-enabled virtual care platform to enable their smart hospital strategy. It's the eyes and ears in the room: scalable, clinically grounded, and built for enterprise impact that translates into smarter labor optimization and improved throughput where it matters most.
Find out more at thisweekhealth.com/avasure.
Bill Russell: I'm Bill Russell, creator of This Week Health, where our mission is to transform healthcare one connection at a time. Welcome to Newsday, breaking down the health IT headlines that matter most. Let's jump into the news.
Bill Russell: All right. It's Newsday, and today we're gonna have some fun and it's gonna be spontaneous. I remember when I interviewed Mark Cuban and I said, Hey Mark, did you get the questions? And he said, you know, Bill, I never read the questions 'cause it loses the spontaneity.
Well, Jacob is prepared. Drex and Sarah have [00:01:00] prepared, and I just told them we're not doing any of that. So we're gonna go in a little different direction; this will be spontaneous and off the cuff. I wanna start with congratulations and talk a little bit about it. So Brad Reimer, Sanford Health CIO, took on a new role, right?

CTDO. A little different. We see CIDO, we see CDIO, whichever order they wanna put 'em in, but Chief Information Digital Officer is what we've seen. This one is Chief Transformation Digital Officer. Okay, CTDO. I'm curious what you guys think of that one. And they're not the only one; there are some others out there that are starting to throw the T toward the IT organization, which I find very fascinating.

Does that resonate with you? Does that make sense in terms of where the transformation is happening? Is the CIO really sitting at the center of that intersection where they can actually help transform the [00:02:00] health organization?
Jacob Hansen: Certainly from my perspective, you can't wander through a discussion about technology selection at a health system without, first of all, finding point solutions that were selected by different, probably clinical, decision makers. The CIO has to sit in the middle of: which of these is providing the best value?

How can it serve all of the different outcomes? How do we rationalize these purchases into an enterprise contract and select one that actually changes clinical outcomes? To me, transformation is probably the best title I've heard for an IT leader in a while, because it highlights the need for ROI and meaningful change in a way that just CIO doesn't necessarily speak to.
Bill Russell: Are we ready for the T? Drex, Sarah, are we ready for the T in our title?
Drex DeFord | This Week Health: I think it's a sort of natural evolution of the job. There are a lot of folks who are already doing that. I was talking to a chief innovation officer this morning who was [00:03:00] sort of part of that conversation too. The CIO is also, in many cases now, the chief innovation officer and the transformation officer, and in some ways they almost feel like they're the chief operating officer.
Bill Russell: gonna say, why don't we just give 'em the E,
Drex DeFord | This Week Health: they see so many things that are happening in the organization and they can kind of map these things together.
They're in an interesting role that hardly anyone else plays, and so they can bring efficiencies where there aren't efficiencies, and they can bring innovation where it's challenging to have innovation.
Bill Russell: The CIO just sees things that other people don't see, right? You're having clinical conversations, you're having supply chain conversations. You just see things that other people don't see. You have a purview of the data.
Jacob, we're gonna come back to a data conversation, 'cause you're seeing that a lot. They see the data. Nobody else really sees the data, per se, in its raw form. There are so many things there; it's interesting to think about them as a transformation officer. What do you think, Sarah?
Sarah Richardson: I wanna be the [00:04:00] CGSD, because that's always what I've been and done. So honestly, I don't care what you call it, as long as the organizational responsibility and accountability sit with whatever you wanna call that person. What's important is how the work actually gets done internally.
How are you achieving the goals? Is your strategy aligned with the way you're rolling out your operating models, et cetera, et cetera? Call it whatever you wanna call it, but is the intent of the role effective within the structures your organization has created?
Bill Russell: Go ahead.
Drex DeFord | This Week Health: The folks doing those jobs and doing them well right now are the folks more likely to keep their jobs, less likely to be marginalized into a "you just keep the trains running" role, and more likely to be promoted to chief operating officers and CEOs and presidents of hospitals and all of that.
Bill Russell: So for those who have read my fiction, we fired Sarah Chen in the last chapter, and people were like, I can't believe you fired her. I'm like, yep, she got fired. Four years, man. You gotta pull off a project over four years, and there's an awful lot of things that can [00:05:00] go wrong. Well, something went wrong, and she got fired.

We don't really know what. But in the next book, which is written and being read by some people right now, hopefully you two, Sarah gets hired back into an organization that already has a CIO, and she gets put into that transformation officer role, and we explore this whole idea.

'Cause I think it is a transformation officer role. And what was interesting to me, as I did it, is that I played around with this whole idea of how clinicians view a chief transformation officer who's not a clinician. It's like, how can you transform a healthcare organization when you're not a clinician?

And it's one of the things we have to wrestle with as technologists: we might see [00:06:00] more of the organization, but we don't necessarily have that clinical background. We're gonna have to rely on some people and be able to rally people, that kind of stuff. Jacob, I want to get
Sarah Richardson: Is that any different than a clinician being a technologist? I mean, they need
Bill Russell: Oh,
Sarah Richardson: I mean,
Bill Russell: oh, [00:06:00] Sarah
Sarah Richardson: but at the end of the day, you
Bill Russell: You did. Can you see the buttons on my back? Do you just press each one? Is that what you're doing?
Sarah Richardson: I thought that was part of my job description,
Bill Russell: I think it's important that clinicians have very deep influence on technology decisions, data decisions, all those things. I think that's absolutely true. And I know many CIOs who are clinicians, and I think they're fantastic and I love 'em. But the ones that choose to say, I'm a clinician, I'm not a technologist, I have really smart technologists:

I think that's a mistake. The clinicians who have excelled in technology in this role, CIO or CTO or whatever it's gonna be, have been the ones that have not only embraced the role but dive deep into the technology and try to understand it.

Jacob, we started a conversation before we got on here, and it was one of the stories we were looking at. Data is becoming so [00:07:00] critical in the discussion, and I would almost put out there that at the senior executive table, very few people see the data the way the CIO does.

And it's becoming critical across the board, for AI projects for sure, but also RevCycle projects, every project. It's becoming critical how we organize the data, define the data, how we structure it, how we handle it: the old terminology of data governance. I'm curious what you're seeing as you're out there working with clients.
Jacob Hansen: What amazes me is that we are years into an exercise as an industry that focuses on data capture and the use of data to train AI models to, speaking of transformation, drive change across how technology is used. And yet I still run into settings where there's disagreement about [00:08:00] who owns the data and how the data is governed, in spite of all this conversation that's been happening. And then, depending on who owns it, what can it be used for? Is it acceptable that an organization who built AI models deploys those in production, learns from that production environment, and refines and improves those models? Does the EHR vendor have some control over that? Should they be allowed to have controls over how the data is captured and used? Does the health system have some claim on the advances that were created in the AI models? These are all active discussions. We're excited to try and help lead the conversation, because so much of it is tied to being smart and avoiding risk along the way. But those are some of the core variables we're spending a lot of time talking about.
Bill Russell: It [00:09:00] is.
Drex DeFord | This Week Health: This reminds me of my days back as an independent consultant. I would go into a health system and work on some projects there, do something, learn a lot about that health system, but also about how they did things. That stuff, whether I wanted it to or not, went into my head and was useful the next time I went into a different consulting engagement and helped them solve those problems.

This is just life and growing up, in a lot of ways, right? You learn things as you go along and you re-leverage that information in the next experience you have. That's a really interesting thought, though, that somebody controls the data and somebody says, well, you learned this here, so we own part of that experience.
That's the kind of stuff you're talking about.
Jacob Hansen: I would say it's a pretty common question mark for a health system: where's the line between deploying new technology to your benefit as a customer versus being a co-development partner? What's the difference between those things? [00:10:00] We have some customers we're talking to right now about what are very clearly co-development projects, where we come to the table and they say, we have a novel problem we wanna solve.

We say, well, we have an AI foundation, a platform, that we could draw from if we took this thing and applied it this way, and we're gonna learn from what you do, and we work together to create something. That's an easy one, right? We're gonna co-create something. But if we, a health tech company, come to a system and say, we have a thing, this is what we do with that thing, and we're happy to deploy it in your environment, then, as is true of any software product, AI or an application, we're always going to learn from your use of it. So I think the thing the market's trying to understand is this. With traditional software, with a front-end UI/UX, you deploy it and yes, you learn, but it's generally a known thing before it's used. AI models could face a totally different environment: pediatrics versus geriatrics. You could have rooms with a ton of light or very little, an interesting ambient [00:11:00] impact on those models. And you might have to deploy and just watch it in shadow mode for a few months, then redeploy and watch again for a few months. How do you make sure that the health system understands the line between a co-creator and a user that gives feedback?
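The shadow-mode idea described here can be sketched roughly in code: the model runs and its predictions are logged and scored against later-confirmed outcomes, but nothing is surfaced to clinicians until accuracy in that specific environment has been demonstrated. This is an illustrative Python sketch; the class name, thresholds, and interface are invented for the example, not any vendor's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ShadowModeMonitor:
    """Runs a vision model in 'shadow mode': predictions are logged and
    scored against confirmed outcomes, and no alerts are sent until
    observed accuracy in THIS care setting clears a threshold."""
    promote_threshold: float = 0.95   # hypothetical accuracy bar
    min_samples: int = 500            # don't judge on a tiny sample
    outcomes: list = field(default_factory=list)  # (predicted, actual) pairs
    live: bool = False                # False = still in shadow mode

    def record(self, predicted_event: bool, actual_event: bool) -> None:
        """Log one prediction alongside what actually happened."""
        self.outcomes.append((predicted_event, actual_event))

    def accuracy(self) -> float:
        """Fraction of logged predictions that matched reality."""
        if not self.outcomes:
            return 0.0
        hits = sum(1 for p, a in self.outcomes if p == a)
        return hits / len(self.outcomes)

    def maybe_promote(self) -> bool:
        """Go live only after enough local evidence accumulates."""
        if (len(self.outcomes) >= self.min_samples
                and self.accuracy() >= self.promote_threshold):
            self.live = True
        return self.live

    def handle_prediction(self, predicted_event: bool) -> str:
        # In shadow mode we only log; once live, we actually alert.
        return "ALERT" if (self.live and predicted_event) else "LOGGED"
```

The point of the sketch is the redeploy-and-watch loop Jacob describes: the same model can clear the bar in one environment and sit in shadow mode for months in another.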
Drex DeFord | This Week Health: You've kind of created a shortcut here, right? Because when I'm using software, I'm usually sending complaints to the software developer.
Jacob Hansen: Right?
Drex DeFord | This Week Health: Can you make it do this? Can you make it do that? This isn't working well for me, can you change that? With the vision models that you're fielding, you are almost fixing problems before they come up.

Right? That's part of what the AI [00:12:00] model does. Very...
Jacob Hansen: Yeah, especially because the users have to trust it; they have to have confidence in what the data is producing, right? Computer vision models are generally probabilistic models. These are not deterministic, logic-driven rules engines. This is something that is trying to produce notifications or alerts or recommendations or summaries for clinical teams. So we're always trying to make sure that we are being effective stewards of our customers' time. What are we demanding? What are we asking of them in participating in these deployments? And how clear can we be about a novel versus a unique use case? You're doing something we already do, but your environment is just unique or different from everywhere else we've done it before.
And that's okay, but it doesn't mean co-creation.
Bill Russell: So Sarah is taking this MIT course on AI, so this is going to act [00:13:00] as a test for her. I'm gonna go to Jacob, then I'm gonna come back to Sarah. Jacob, you saw the, a breach? No, it wasn't even a breach; it was just a leak of Claude Code to the world. I wrote an article last week, and essentially I said the biggest learning from this is that it's the harness: the amount of code that Claude puts around the AI model is pretty amazing.

And then what happened afterwards was people took that harness and put it on a different model. Once the harness was released, they said, all right, let's see how it performs with this other model. And it turns out the harness very much marries the model itself. It does work, but what it comes back with marries the model.

So I'm curious what you learned. And then Sarah, I'd love for you to use what you're learning in the course to talk about why this [00:14:00] data conversation even matters. But Jacob, what did you learn from that release?
Jacob Hansen: I'm sure my takeaways were probably inappropriately biased toward my day-to-day. But for me, looking at computer vision models in particular in our business, we're always trying to be really sensitive to what the promise of those models is as an industry, and how effective our AI platform is, the harness, so to speak, that fits these models. Because as an industry, there have been some who've been pretty cavalier with statements of what AI can or can't do, and I wish we were a little more transparent about what the models are. [00:15:00] You see a leak and you're thinking, oh, now everybody has access to something. Well, maybe that transparency is good, because it would reframe clinical users' understanding and expectations of what's possible and what's not. I dunno if that makes any sense the way
Bill Russell: Okay.
Jacob Hansen: you were headed, Bill, but.
Bill Russell: Yeah, it does. And computer vision models are interesting. They're not in a different category, exactly, but because they're probabilistic models, they're very different in how they interpret things. Sarah, what are we learning in class this week?
Sarah Richardson: So I've completed four modules thus far. What I appreciate about the program is that it works even if you've never done AI or healthcare or any of those aspects; the use case they're using is a hospital, and it's been kind of fun for sure. If you were new to it, you would probably be overwhelmed by the expectations of what it takes to really do data and AI well.

For me, it's more of a reinforcement of everything we did at my last company around creating the right structures for data ownership, the [00:16:00] foundation, the strategy deployment, the insights, the responsibility layers. It's fascinating. It all truly comes down to how you think about governance strategically: that's data governance, that's clinical and digital governance, that's your AI governance laid on top of it. One of the things I love, when we talk about the Toyota process that Drex is so intimately involved with, is that Lean Six Sigma helped us fix problems faster. When you have trusted data and the right stewards and owners, AI can actually help you avoid problems altogether. However, your data programs are not
Bill Russell: Somebody made that point in New York. It was really interesting, 'cause they were saying, oh, we gotta get our data right first, and somebody else just said, why don't you use AI to get your data right? Like, what are you doing?
Sarah Richardson: No, no. You have to get your data right so that you can then trust and keep bringing AI appropriately into the mix. The thing is, though, getting your data correct is one of the hardest things that can [00:17:00] happen, because everybody in the organization has to wanna be a part of it, and that data fabric is a forever thing.

You don't just fix it and walk away and hope the new feeds are gonna be clean: the interoperability components, how you're sourcing it, how you're trusting it, how you're utilizing it. Say you have a partner like Jacob, and let's just assume you've got your data in a space that you trust. Now, whether you're co-creating or co-viewing or doing all the amazing things you can do with AvaSure, you have a subset of information that could be light years ahead of the rest because of the way your partner's allowing you to apply it. But if you don't have those structures in place ahead of time, and the people willing to do it... What's so fascinating is that the layer over all of it is your capability for true OCM and sponsorship from the executive suite. If the organization doesn't really buy into the change component and the cost of keeping data clean, then it doesn't matter how much AI you apply to it. You can get strong partners for that reinforcement. So if Jacob and I were running [00:18:00] a hospital and he was my partner, I'd say, here's where we need to reinforce these capabilities so we have better outcomes and the better clinical information we need to serve patients and clinicians. So there's a lot to be said for all of it, but you have to be willing to want to get it right, not take the easy path.
AI is not an easy button.
Jacob Hansen: It's also not the same answer for everybody, and that has to be okay.
Sarah Richardson: Yeah.
Jacob Hansen: Even in computer vision in healthcare, look at a rehab facility or a SNF, a skilled nursing facility, where they want patients ambulating, up and moving around. What's the purpose of computer vision there? They need to know if somebody fell down, after it happened, because they want them up and out of bed. And that's quite deterministic. Literally, that's not probability: has somebody gone from vertical to horizontal, and are they on the ground? Person, object, relationship, and it's just a simple answer, and that's totally normal and fine. In med-surg, it's [00:19:00] unacceptable to have somebody ever fall down. So even a simple boundary model is not acceptable. Why? Because the minute they cross the boundary, they are moments from falling down anyway. Now I need probability. Now I need to know: they're edging closer to the edge of the bed, they're agitated, they probably need to go to the bathroom.

There's a lot of motion happening. Maybe they've verbally been expressing frustration, and now I'm gonna signal to somebody: this person is probably going to get out of bed. Same use case, falls detection, two different care settings, same overall industry. The need for those different use cases, even in something that narrow, is super important, and clarity about what you expect the data to do plays a role in that.
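The two falls use cases Jacob contrasts, a deterministic "has a fall already happened" check versus a probabilistic "is a fall about to happen" score, can be sketched in code. This is purely illustrative Python: the function names, features, and weights are invented for the example and are not clinically derived.

```python
def fall_detected(posture: str, on_ground: bool) -> bool:
    """Rehab/SNF use case: deterministic, after-the-fact detection.
    Patients are supposed to be up and moving, so the only question
    is whether someone has gone from vertical to horizontal on the
    ground. A simple person/object relationship, a yes/no answer."""
    return posture == "horizontal" and on_ground


def bed_exit_risk(edging_to_bed_edge: bool, agitated: bool,
                  high_motion: bool, verbal_frustration: bool) -> float:
    """Med-surg use case: probabilistic, before-the-fact risk score.
    A fall must never happen, so leading indicators are weighed and
    staff are signaled before the patient is out of bed.
    Weights are made up for illustration, not clinically validated."""
    weights = {
        "edge": 0.35,       # edging toward the edge of the bed
        "agitation": 0.25,  # visibly agitated
        "motion": 0.20,     # a lot of motion happening
        "verbal": 0.20,     # verbally expressing frustration
    }
    score = 0.0
    if edging_to_bed_edge:
        score += weights["edge"]
    if agitated:
        score += weights["agitation"]
    if high_motion:
        score += weights["motion"]
    if verbal_frustration:
        score += weights["verbal"]
    return score  # e.g. signal a nurse when the score crosses a threshold
```

Same use case, falls detection, but the first function answers a settled fact while the second has to estimate a probability from partial signals, which is why the acceptable model design differs so much between the two care settings.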
Bill Russell: We had an interesting conversation at one of the city tour dinners. It's interesting: you go into those meetings and some people have a view of AI that is the wonderful Oz, right? It's an easy button, it's a magic trick. Oh my gosh, it [00:20:00] can do amazing things; it's gonna be doing surgery next week, kind of thing.

And then you have the people who are like, it's the bumbling, stumbling fool: your nephew who shouldn't really be in the business but, because of nepotism, happens to be in the business. And you're like, why are they even here? They keep making mistakes; they should be kicked out.

Invariably we find people at our tables, really smart people with real AI experience, on both ends of that spectrum. And we start talking about what you believe and where you think it's going to be in the next three to five years.

And I'm gonna close with this question, and it's really about career and trajectory. I ask this question to kind of get people to a point: would you recommend to your child, your senior or junior in high school, that they study computer science and computer programming when they go to college [00:21:00] in the year 2027?

Knowing what you know about AI and the amount of code it's writing? By the way, the amount of code just Claude is writing has increased to something like 7% of all code being put on GitHub. That's just Claude, not Codex and the rest of 'em. So if that's where coding's going, what advice are you giving them?

Will coding become a field where the really spectacular people go to MIT and Stanford and redefine these models, while the middle section just gets wiped out because writing standard code becomes a commodity? What advice do you give, and how are you viewing this?

I use coding because it seems to be the area where AI is making the most gains the quickest, and as a proxy for whether it will make those kinds of gains elsewhere. All right: computer programming, what's your advice?
Drex DeFord | This Week Health: I hope the computer science [00:22:00] programs are evolving fairly quickly too. Programming, to me, as much as anything else, is a good way to learn to think and problem-solve. So I don't know that it's a bad idea; it's just that if you're really getting soaked in one particular language and that's what you plan to do when you grow up, that may not be a job for you.

So I like the thinking process. I like channeling the problem solving that goes into programming. But I don't know if I would want to send my kid into that career field right now.
Bill Russell: By the way, what's your degree in?

Drex DeFord | This Week Health: I have a master's in health informatics and one in public administration.
Bill Russell: There you go. Interesting. Cool. Sarah, what about you? What advice are you giving?
Sarah Richardson: So my first degree is hospitality. My second degree is a master's in business. Then I got a bunch of goofy, you know, certs on top of that. Differentiation is gonna be key. I honestly would probably do a dual major: if you love the engineering side of computer programming, focus completely on the engineering capabilities, but I'd throw in something [00:23:00] specific to an industry you believe you wanna serve. Is that hospitality? Is that healthcare? Is that finance? Because you're gonna need that next layer of learning and understanding of how to apply and bring all of these things together in ways that are most meaningful now and into the future.
Bill Russell: Cool. And Jacob?
Jacob Hansen: I would say something similar to what Drex and Sarah have said. It depends on what you mean by the words computer science. Is it applied computer science, where you're coming away with a degree that allows you to become a translator for what AI is doing, a rudder for AI in the direction it's aimed or pointed? I talked to somebody this morning who's using AI coding really effectively, and they're hiring more engineers than they've ever hired. If you're passionate about computer science, go study [00:24:00] computer science, and do it with all of the current events in mind.
Sarah Richardson: Mm-hmm.
Jacob Hansen: Be a pioneer in that way. I would tell my kid, chase it, and here are the ways you should at least be thinking about it as you go.
Bill Russell: I think that's interesting. The job will not look like it did before. It's not a job where you are actually writing the code. Again, I like what you guys were saying: essentially it teaches you how to think. I have my degree in economics.
Bill Russell: It teaches you how to think.
Drex DeFord | This Week Health: The job will not be like the job is today: that really applies to almost every job that exists today, right? Everything is changing so fast that there are very few jobs that probably aren't gonna be touched by AI over the next six months or a year.
Bill Russell: Yep. But I love the rudder idea: having conversations with the business, [00:25:00] understanding what the business is trying to solve, understanding what the tool can do. Everything from talking to AI and coding with it, to the selection of AI, which is a deep dive I wanna do at some point. Because we're buying these AI tools.

Do we really know what we're buying? How do we evaluate them? What does that look like? It's interesting to me: as an industry, we're gonna be spending billions of dollars in the next couple of years on AI. Are we the best people to evaluate what those things are, and are our companies the best companies to be evaluating them? And I realize there's a certain trust that goes along with it.

Interesting. So no job is gonna be the same. Does that include the physician and the nurse? Hmm, probably. It's gonna look a little different.
Jacob Hansen: Yes.
Bill Russell: Yeah. That was the point I was trying to make in that meeting. The doctors pushed back pretty hard on it, just so you know.[00:26:00]

You know, no AI can do what a doctor does: empathy, whatever. I'm like, well, we're hearing it's more empathetic. Which, if you really wanna tick off a room full of doctors, go ahead and just keep pushing on that one for a little bit.
Sarah Richardson: But the
Drex DeFord | This Week Health: The computer.
Sarah Richardson: One of the more AI-resilient roles in healthcare is nursing.
Bill Russell: Right. Well, a significant portion of nursing is physical, right?
Sarah Richardson: Yeah, I mean, it's a human being taking care of a human being
Bill Russell: Right.
Sarah Richardson: And it pays about 135 grand a year. So I would also be steering people there. Especially over the next 25 to 30 years, with the population of the United States, you want a job that will
Bill Russell: But Jacob wakes up every day trying to think about how to make that less physical and more digital.
Sarah Richardson: Even if it's more digital, you still need a human being taking care of a human being.
Bill Russell: Yes.
Jacob Hansen: You could actually argue we're trying to help it be more physical, meaning more time at the bedside, more time physically with the patient, and less digital: less [00:27:00] time sitting in front of a computer, less time manually entering things into the EHR.
Drex DeFord | This Week Health: And giving better coaching and teaching and advice and guidance, not by sitting at the keyboard, but in other ways.
Jacob Hansen: We're trying to make the digital part of it automated and let the physical be physical.
Sarah Richardson: The
Bill Russell: Go.
Sarah Richardson: The physical part is sometimes the empathetic conversation: here's how you take care of yourself when you get home; here's what I know you're experiencing while you're here. So much of our lives has become digitized, and that human element is exactly what Jacob's giving back.

And so when people say, I want to go into healthcare, I encourage them to think about nursing as an option, because there are so many facets of it available, especially over the next two and a half to three decades, inclusive of digital tooling that makes those roles more efficient. That is a space I would recommend to people every day.
Bill Russell: Jacob, I wanna thank you for coming on the show and for the conversation. Sarah, [00:28:00] Drex, always a pleasure to catch up with you guys.
That's Newsday. Stay informed between episodes with our Daily Insights email. And remember, every healthcare leader needs a community they can lean on and learn from. Subscribe at thisweekhealth.com/subscribe. Thanks for listening. That's all for now.