This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.
Newsday: Security Culture Saves Hospital and Resource Gaps with George Pappas
GMT20241127-170018_Recording: [00:00:00] This episode is brought to you by Intraprise Health, a health catalyst company.
Make cyber security a priority, not a headache. Cyber attacks put patients at risk and cost healthcare organizations millions. But with convoluted software systems and risk and vulnerability data lost in silos, leaders know their organizations are vulnerable and they feel little control over the safety of their patients, resources, and healthcare.
Reputations are bottom line. Intraprise Health brings together cybersecurity experts with over 100 years of combined experience in healthcare to offer a comprehensive suite of innovative software and services. It helps leaders finally unlock a unified human centric cybersecurity approach. With Intraprise Health, you can improve your cyber security posture, protect your patients, and simplify your employees lives.
Visit thisweekhealth. com slash Intraprise health to find out more.
Drex DeFord: Hey everyone. I'm Drex Deford, one of the principles of this Week Health and the 2 29 project [00:01:00] here. Our mission is healthcare transformation powered by community. This is Newsday on the UN hacked channel, breaking down the cyber and risk stories that are impacting healthcare. Here's some stuff you might want to know about.
hey everyone, I'm Drex and that's Sarah Richardson up there in the corner. Say Hi Sarah.
Sarah Richardson : Hi, Sarah
Drex DeFord: and George Pappas, who's the CEO of Intraprise Health, A health catalyst company is with us again today.
How's it going, George?
George Pappas : Great. Always a pleasure and great to have my old friend Sarah here along with you. Usually it's the, just the two of us, so this is a nice treat.
Sarah Richardson : I know. Yeah. Thanks. Lemme cross your party.
Drex DeFord: I feel like. A lot of these stories in one version or another we've talked about in the past, and we will probably talk about in the future.
So, as always, I'm glad you're well prepared and you came with a bunch of stories that you wanna talk about and so what do you think you want to just tie in?
George Pappas : You know, they say history rhymes, but it doesn't repeat. Well, we're seeing a lot of that in these, just picking [00:02:00] up these stories out of the ether of this last couple of weeks, ironically.
And the first one was really about this, hack the Cerner system.
Drex DeFord: Yeah, the Oracle stuff and regional
George Pappas : health system here. Yeah. Oracle, Cerner. And, you know, it just brought up so many things that we've helped clients navigate over the last several years at Intraprise because. , The third party problem that has been widely acknowledged has some pretty important kind of tectonic plates that keep preventing it from really being solved. Right. And sort of the primary one that starts is, wait a minute, if I have a mission critical vendor and they choose not to give me the things my lawyer wants. If they're gonna tell me to go pound sand.
Okay, well then what are you gonna do with that?
Drex DeFord: In the contract, you mean?
George Pappas : Right? Yes, in the contract, because, I've been doing contracts a long time. This is when we had perpetual licenses, which, , sounds old fashioned, right? But ultimately you have, you know, damages intellectual property, different forms of other liability that are carved out.
You have several things and [00:03:00] software pricing. And services and the margins of the software community are all based on a set of foundational assumptions.
That's why Drex, and we've talked in the past, you heard me mention the report that, the senator from here Virginia. Yeah. Senator Warner.
Speaker 4: Yeah.
George Pappas : Senator Warner. And it was like fall 22 and the smartest thing he had in there. Was some form of safe harbor as long as people were doing what they needed to do.
And so, of course in our political food fighting culture, you know, it really hasn't gone anywhere yet. But if you really think about where we are today.
We're not gonna make substantial progress on the sort of vulnerability a vendor has to put themselves in a place to, to truly be collaborating on incidents. Transparency, yes.
Drex DeFord: Is the key to the operation. Yeah. But when you're not protected. Right. Sarah, you've seen the same thing too, right? Where something bad happens and you would love to get [00:04:00] more out of your vendor partner, right?
But they are lawyered up and that's kinda where they stand.
Sarah Richardson : Yeah. The whole like investigation integrity versus your ability to have a timely patient notification right. Tends to get in the way. And that's really just. I mean more increasingly about your incident response maturity and some of those conversations that need to be brought to the table, either in the contracting phase or the renewal phase.
George Pappas : And in that article, by the way, this was kind of interesting. It was like a six month delay until they notified the health system. Law enforcement wanted us to do that. Really? Who? Who's law enforcement? Is that? Yeah. Is that your shareholder value? Law enforcement. Yeah. Is it your state or local law enforcement?
Because this is where, by the way, if we'll talk about certifications in a minute. If you look at the state laws that have been. Put in a law by New York last year, the Ohio law the Maryland law that was passed by the legislature survived the governor vetoed it?
They all had shorter notice periods.
Why? Because if [00:05:00] you're a health system serving a large chunk of a state, a regional operation, you can't wait. You know, the HIPAA security rule has like 60 days, right? No, not really. That's a long time for people
Drex DeFord: to be able to steal
George Pappas : your identity. Yeah. The practicalities of the situation demand more, and we all know what happened to the.
The HIPAA NPRM that's been kind of delayed and put on ice now for a while.
Yeah.
Just like that FCC thing that I would've complained with you for 30 minutes, that's why I pulled it off the list. , We're more safe by not having regulations on protecting our bumble networks. That's really good.
Good job guys. So ultimately, the notification has to be tighter. The transparency, but you have to allow the vendors to manage their liability and ultimately their financial posture before they do that openly, epic to its credit. They have real time or near real time feeds in your SOC and your sim.
They also have an encrypted sort of email channel of [00:06:00] updates. That's the general, hey, in audit logs that if you're an on-prem client. You've had an incident, you have some material to examine along with them besides your ancillary systems to determine what was the cause of the leak and everything else of the attack.
So they're doing, I think, a pretty good job within the realm of what they can handle. But ultimately, if you look at these like, and I use high hitrust 'cause, just 'cause it really is the gold standard of all these things.
If you look at all those different domains. They speak to how you use what you're buying, not just what you're buying, but then your next kinda derivation of your security risk is if you're buying a cloud system.
Be careful, make sure that the certifications for the application, not your cloud environment. Right, right. I've seen that kind of switcheroo game to some of our clients before and it's nice that. Your compute envelope is high Trusts certified or HIPAA compliant, but you have to use it [00:07:00] and your application in a high trusts certified or HIPAA compliant fashion.
So they're two different things.
Drex DeFord: Yeah. The, a lot of the cloud providers talk about that, as security. Of the cloud and security in the cloud and Right. They kind of draw some lines about who's responsible for what. Yeah. So, just because you're in the cloud doesn't mean you're able to sort of shed all your security responsibilities.
It's just. You still have all that risk and all that work that you have to do. And you know, you read a lot. I mean, I just read something actually this morning that was saying that a huge number of the breaches that are occurring right now with applications in the cloud is due to misconfigurations.
And it's just, you know, the world that we live in right now, a lot of folks are still making that transition from OnPrem to into the cloud. And they do that. They're not necessarily. Facile with that world yet, and they're making mistakes and leaving themselves exposed. So,
Sarah Richardson : When you're the partner, [00:08:00] as an example, essentially responsible for keeping a lot of the space safe, George, how often are you going into a client and saying, Hey, you know what?
Your old architecture is part of the issue. And so either securing intentional funding to retire or sandbox some of these legacy systems safely, or even requiring a checkoff of configuration management to a degree, like what level of recommendation versus requirement do you see coming of these types of vulnerabilities that are almost like little mouse holes in the wall, but they're the space with cheese.
Yeah. Getting in.
George Pappas : It's a great question, Sarah, and I'll tell you, because we engage with the leadership of every one of our clients. We give them the list. In fact, in several cases we present to their audit committee their annual security posture. Simple fact of the matter is many of them don't have the capacity to finish the job.
We even have automation that we provided that if they [00:09:00] don't have a headcount to use it alongside us. Or their CFO is looking and parsing the exact extent of the regulation. Oh, really? Do you do a hipaa SRA every year? Well, we did it once with an external person. Now we're gonna do our internal kind of check.
So, and it's not a complaint, it's a fact of the margin life of these hospital systems. That's why one of the reasons why I think the NPRM that was published. In January this year got kind of buried because it was tightening up all the loopholes that we've seen in clients and, but how are you gonna do it?
Where's the money coming from? This gets back to, I think, Drex one of our earlier discussions where, you know, previous administrations have had funding for. the critical access hospitals of the world or hospitals that are marginally profitable to do some of these controls, or Microsoft had their program, et cetera.
But [00:10:00] it's such a fundamental mismatch right now. The patterns are still there. It's gonna take a while to address. Yeah, Sarah, it's a great point. It's our responsibility to leave them with the list, but ultimately they have to act on it.
Drex DeFord: I think, you know, we just talked about this in a CISO summit too, the NPRM, which is essentially dead.
Yeah. But the NPRM material, the requirements, I think everyone said these are actually really good things. We agree. We should totally do them. The timelines. Seemed a little overly aggressive. Yeah. And then the fact that there was no money, there was no resources to actually make any of it work. Wound up kind of coming back to the reinforcing exactly what
you're saying Sarah, is that somebody can tell you that you have problems, but you don't have the resources to fix the problems. Then the problems just languish and you're probably exposed. Because you know about the problems and you haven't done anything about them.
George Pappas : Yeah. And there's also, in the third party problem mean there's this nexus where, and HITRUST is the most complete [00:11:00] repository we have of people who are certified, and it's the richest framework, but there's a nexus of, if you can agree on that common set.
And then each general council of a large system might have their three or four other things that they want. You can find a way to map this something Jeep, by the way we did at Intraprise.
But
essentially allowing the redundant work to be done once or twice and be shared by many, but allowing the extension of that for the particular needs.
I mean, that's the kind of direction that we're gonna have to go in as an industry. 'cause it certainly doesn't make sense. The way it's being done today, but you can't tell a general counsel or a cyber insurance carrier to accept, you know, a very vague promise of a form that was filled out three years ago.
Without the other questions you need to be answered. So, you know, there's gotta be some room there and if we can come up with some kind of a liability management of that and a little more transparency, we actually would make a lot of progress.
Drex DeFord: Maybe
George Pappas : my next life I'll look at [00:12:00] something like that.
Drex DeFord: Yeah. Right. I wanna hop to another story that you had sent the community hospital in Colorado family West Health, that had a cyber attack and kind of took the turn everything off approach protecting themselves. Do you see that often?
George Pappas : Not a lot, but what really struck me about it, which I appreciated.
because I looked them up after I saw the article. You know, so many critical access hospitals and they're 25 bed critical access designation, right?
You
walk into one of those places, you realize this is like a small business. That's doing a very complex thing with massive regulation and management of its payment model that keeps it on its knees, right?
So let's start with that. And so in that environment, a lot of these operators kind of just raise their hands, throw their hands up and say, oh, we can't do it. But these people had enough common sense and realizing that it isn't about fancy security policies, though they're [00:13:00] needed to some extent. It's about teamwork.
Oh, and having some kind of a plan. A plan, actually practicing the plan, right?
Yeah.
Actually then testing the plan. Right. And those are things that don't require a lot. I mean, I think within our network, the best person I've seen do this as an Anahi Christiana Care, she involves, she's a big system. She's been there a long time.
She's very smart, but she got. All the different stakeholders in a region involved.
Drex DeFord: I know. They do that every
year.
Yeah. Yeah.
George Pappas : But that's like, everybody should do that. It doesn't require, you know, a massive software license of 15 things. It really requires everyone else in the organization realizing that security is part of their job too.
Right. And reinforces that, so I really like that example for that reason. The other thing I would add is having a security partner on standby. You don't need to have a ongoing monthly retainer. You [00:14:00] can have a. One-off, you know, prearranged license agreement, maybe a couple thousand dollars to be on warm backup.
Right. Kinda like disaster recovery.
When it happens, get people in. That can be your separate set of eyes and ears. You don't have to make it super expensive. It's if you kind of slim down the service to the economic ability to sustain, you can actually make something that can live and work for a while.
Sarah Richardson : Well, their whole statement of it being like a culture of preparedness.
George Pappas : Yeah.
Sarah Richardson : Which I loved. It was like the staff was trained, they knew exactly what to do. Yeah. And I mean, they had the right technical controls to a degree, but they said that human readiness is what stopped the attack. Right. It was like, you know, intruder.
And I looked at, I mean, you think of a 25 bed critical access hospital. You're kind, it's almost like its own unit at a big system. But the fact that they had that wherewithal to know how to shut it down and run. Yeah. How many conversations do we have with different leaders that say, whose responsibility is it to main continuity of operations, of patient care when there [00:15:00] is a, let's just say a shutdown in this case?
It's not, it, like those floors need to know how to run on their own, but they need to understand those technical implications. But each unit being able to shut itself down and sustain through the isolation and recovery is pretty key. And if you can do that across multiple aspects of your health system more power to you.
Kudos to a 25 bed hospital saying to shut the whole thing down until we get this thing ameliorated. And they did. I mean, there was no exfiltration of patient data. Right,
George Pappas : right. And then they brought it back up when they were comfortable it was working. And again, easy to do in a small operation, but still the concept of teamwork and kind of pragmatism, I think sometimes gets lost in the need for all these things.
And you know, you mentioned earlier, drex, it is so hard. Because healthcare, I'm sure someone measures this somewhere, but the amount of technical debt
That these Intraprise systems have, these organizations are carrying, especially if they've had m and a activity. Because it's a very low margin [00:16:00] business for the operators, right.
It accumulates and how can you not make a configuration mistake? Right? How can you not do all these other things? Yeah. So that's really where a little more practicality would go a long way.
Drex DeFord: For me, one of the takeaways here too is there's a ton of work that you can do, but sometimes it's the really simple things like who in your health system. Has been assigned the authority. To disconnect from the internet to make the decision. We're gonna, makes the choice disconnect from the internet and we're gonna start shutting systems down. And we've got a process to do that. Like these guys are very small, you know, a lot of these 25 bed critical access hospitals only have two people in it, or three or four people in it.
And maybe they're doing supply chain and contractors. Right? In addition, in, in addition to the work in it. But. To have the wherewithal to kind of say, and sometimes we find about, we find out about this after the fact. When somebody does it, it turns out it was probably some low level person who, you know, like didn't [00:17:00] carry a VP title, but were just smart enough to go over and like, take down the switch.
They didn't have anybody tell 'em to do it. They just knew it was the right thing to do. Right. And it saved the hospital's bacon. It'll be interesting to kind of hear the rest of the story on this one, but.
Sarah Richardson : I'll be honest, in a previous lifetime I had my incident response guy was a formula nuke sub engineer.
I was like hired because
George Pappas : the better recovery procedures, you know,
Sarah Richardson : how to get out of a sub alive. And so I was like, you're perfect for the job. And so he didn't have as much security background, but he knew how to do that and so we taught him how to manage a security incident response and tell you the guy was amazing.
Yeah.
Yeah. You know, and another
George Pappas : dynamic of that I'm thinking back drex to, that one we had about you, Chicago, I think it was in our last session. But you know what doesn't get enough attention, especially in medium and large organizations, is crisis communications. Because that's where everything you say, remember now the class action lawyers are out there.
As soon as it happens,
Drex DeFord: They're cooking. Yep.
George Pappas : Has downstream impact. And so have you planned for [00:18:00] that? Do you know how to handle it? You won't know the answer even though you want to give them an answer. I mean, all these things that are really challenging.
If you're winging it the first time through, you're gonna cause downstream ripples that you really, you know, are gonna harm your
Drex DeFord: organization.
we can't get away without talking about ai. No. And there's a story about how AI adoption is moving faster than it can be secured or governed and all of that. What's your thought when you look at that story about the things we should be thinking about in healthcare?
George Pappas : Yeah. I mean. Yeah, obviously it's the topic of the decade the basic thing that I thought was relevant for our discussion today was that there's so many, so attractive possibilities that the risk reward kind of trade off almost gets a little overweighted emotionally. At the same time, the more important thing to realize is that we don't yet.
Have a truly reliable set of practices to test [00:19:00] AI in the way we need to. And I call it a few of those items because as an old software engineer and a data architect and all that jazz, right? You and I have been doing this a long time. And Sarah, you've been in this game a long time. Older technology, you could look for deterministic things the code was doing.
You could organize different kinds of black box te penetrate other tests. But because AI is so indirect, how do you indirectly see what indirect things it's slipping into the system that very clever cyber criminals are training. And I mean, there's so many layers of indirection that it's very hard. To see that going in, you've gotta actually invest more in some very potent penetration testing that is really the black box kind.
Make sure you know it's actually fully sequestered. And you know, the other thing that I would say, we did this with the agents we released in our product, you better keep what you're doing to a very narrow [00:20:00] task range that you can validate. I mean, one thing AI is very good for, by the way, is generating test data.
Test cases, right? They used to be the, that was the task the junior programmer had gotten. They hated it. The only thing worse was documentation, right? So this is a way to really. Scale a fairly important task, but you have to invest the time and consideration and some cash to have an outside organization really beat the tar out of it.
Now, OASP and hitrust, NIST. They're all grappling with AI standards. They're gonna change five times over the next 12 months. But if you look at those principles, think about what you wanna deploy, put it in its own box, long enough. At least you can minimize the probability right, of some of these issues because they're there and they're gonna pop up.
Drex DeFord: I was just gonna say, Sarah, there's a whole bunch of other aspects to this too. There's the software as a service stuff that we are running and I've talked about this before, but like on Tuesday, a new button shows up the AI [00:21:00] button in the application that we've already bought.
Sometimes we have folks that are using chat, GPT or other open source. Apps and they're doing it not in a bad way. Right. They're doing it 'cause they're trying to get their job done. And then we have the sort of model where we actually put some of this stuff in house and we use our own version.
Right. Of the LLM Sarah how do you reconcile all of these challenges?
Sarah Richardson : I put AI in the same category as we do, and I'm not oversimplifying, but like data governance. Like if you. No, you have to do these things. I mean, you've got data in your org and data's a precursor, obviously, to successful ai, but it's not about whether or not you're gonna use ai.
It's how you operationalize and secure it. But consider data governance. You have the data lineage, then you have your model validation, and then you have your access controls aspect of all of that. You do all of that the same way with ai. You're gonna be at least ahead of it enough to know that, hey. It's not that all the people are gonna get replaced by [00:22:00] ai, but if you're not being prepared.
With all of your humans in these spaces, then you're not gonna be ready for what is next. And so you'll get exposed if those aren't the things that are true. So I would go look and see what is the maturity level of an organization's data governance. If they have strong data governance, they're likely going to be able to bring AI governance in a very thoughtful way to the organization.
But if their data is a dumpster fire, there's no way they're gonna get ahead of anything when AI comes to town as well.
Drex DeFord: From a third party risk perspective, George, Sarah and I have had a lot of conversations about this how people are dealing with AI based on the vendor partner that they're working with.
So if this is you know, one of your core vendor partners and they're introducing some new AI capabilities, you may. Feel like you need to put as much attention on that as you do a brand new company that you're bringing in. Who's bringing in AI capabilities? And there's a lot of questions now that I have to ask.
Are you seeing that [00:23:00] is, is that similar thinking that you're seeing out there?
George Pappas : And that's why some of these isolate and examine things are so important because it's so easy to take the path of perceived least resistance, right?
There's trust already, but they're handing you something they don't fully understand either.
And I think Sarah, your strategy on governance is the exact right one. Part of the challenge is this is like liquid helium. We don't know how to control it yet, so how can we validate a model? We really can't. Right. And that's where it gets more challenging. And if you can at least bound the functionality, you have a better chance of validating it and testing its edge cases better.
But you know, some of the stuff that we've seen, you've been in the relational data world for a while. We all have where you used to have SQL injections, now that there's prompt injections. Yeah. From a model. I mean that is such a level of indirection. That you better be very clear for the things you're doing with it, that you're really ready to move properly.[00:24:00]
Speaker 4: And that's where I think
George Pappas : we have
Drex DeFord: work to do. Sarah is it like doing close monitoring of a new employee to make sure, I'm just trying to find an analogy that I think we can wrap our head around. Is it like with a new employee, you're gonna pay a lot more attention to the work that they're doing and how they're doing it and making sure that they're doing it right before over time you start to back away and let them operate on their own.
Is that a way to think about it?
Sarah Richardson : To would agree. Although I still believe there's a level of responsibility of all of this that lies within your engineering team, whether you do software or otherwise. And so having that occasional QA of certain things, you're going through all the validation modeling when it starts, and so you're like, okay, we feel good about this.
Well, it's gonna morph a change super fast because that's the rapidity of what's occurring. But if you're going in and you have people doing the actual QA work, so everyone's like, oh, I don't need all these QAs anymore. Well, you need some. This is an example of where their basic. Functions can be augmented by AI to give them more time to then go check on the more advanced aspects of ai.
So it's a little bit of this weird [00:25:00] loop, but it reallocates people to be looking at things in a more detailed manner. And then you can tell the risk side of it and you can tell the story side of it. And you're bringing those things together in a way that other people can understand. So constantly creating that trust mechanism inside of an organization is huge because you're as transparent as you can be.
And then the readiness aspect of what we're looking at, that's one of the things that if I was still running a software team or running a bigger team. I'd be reallocating people's ability to go and validate these things a little further down the chain and making sure they're staying up to speed on how to do that in a way that tells a story that the organization at least feels like they're doing everything that they know is right,
Drex DeFord: because it still comes down to value versus risk. You know, this is the benefit we're getting. We're gonna have to grant take some risks.
George Pappas : Well, that's the top of the funnel of your governance process, Sarah, because if the impact isn't worthy, why would you take the team's precious time and opportunity costs to go do all that when there are the things they could do, right?
Sarah Richardson : Yeah. I mean, your quality team and [00:26:00] clinical is the same as your quality team. Operationally, that's inclusive of what's happening within the IT organization, right?
Drex DeFord: George, we could talk all day about this stuff. I really appreciate you being on the show, sir Richardson, thanks for being on the show too.
Of course.
Sarah Richardson : Thanks for having me.
Drex DeFord: Thank you. Thank you. I'll catch up with you soon
That's Newsday on UNH. Hack with Drex De Ford. Get daily security insights delivered to your inbox because every healthcare leader needs a community to lean on and learn from. Sign up at this week, health.com/subscribe and stay safe out there. I'll see you around campus.