This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.

[00:00:00] Today's episode is brought to you by Censinet. Healthcare organizations face mounting enterprise and third-party cybersecurity risks across vendors, applications, patient data, and medical devices. The Censinet RiskOps platform is the only cloud-based risk exchange that's purpose-built for healthcare, connecting hundreds of hospitals and health systems and more than 50,000 vendors and products.

Take the risk out of healthcare with Censinet's collaborative and community-led network. Learn more at thisweekhealth.com/censinet.

I'm Bill Russell, creator of This Week Health, where our mission is to transform healthcare one connection at a time. Welcome to Newsday, breaking down the health IT headlines that matter most. Let's jump into the news.

Drex DeFord.: Hey everyone. Welcome to Newsday. I'm Drex and there's a whole gaggle of geese on the show today.

Bill Russell.: That's what we [00:01:00] are, the

Drex DeFord.: gaggle Sarah Richardson.

Say hi.

Sarah Richardson.: Hi, Sarah Richardson,

Drex DeFord.: and Ed Gaudet from Censinet is with us too. How you doing? Hey Drex, welcome back. Thank you so much. It's good to be here, and it's good to have you on the show. I've been on your show a couple of times. I saw my name on the back of your t-shirt.

Wait a

Bill Russell.: minute, you've been on Ed's Show, Bill. You're always invited. Sarah, have you been on Eds

Show? Sarah? You're always invited. Oh my gosh, I haven't, but it's not

Sarah Richardson.: like, I haven't like. Now you're getting invites to get prime time.

Ed Gaudet.: You want to be on, because when we get to 200 — and we're getting close —

we're gonna send out the next installment of concert t-shirts. Ah, Sarah, your name could be on the back of that. That would be

Drex DeFord.: so good. I can't believe, I don't somewhere in this office have mine here, but it has risk never sleeps on the front. And then it has like everybody's name who's been a guest on the back and it is, that's right.

It's a who's who of cybersecurity and healthcare leaders.

Bill Russell.: And you're gonna do 200 and you're gonna hand embroider each one? That's [00:02:00] amazing. Yes, Bill. Bill, we're gonna need extra space for your name. Oh man.

Drex DeFord.: Bold, italic, and a special font. That's —

Bill Russell.: This is the first Newsday we've done.

Yeah. And we're already completely sidetracked.

Ed Gaudet.: Yeah, is this the new format? Oh, this is the new format — I'm the first one on the new format? Yeah, you are the first in the new format. I knew when the new name

Drex DeFord.: showed up, I was like, this is gonna be — well, I think I love it. I loved that in the email.

It's gonna be fun and probably a little chaos. This is what's gonna happen.

Ed Gaudet.: Dude, I've got multiple plates spinning, so you'll keep me active. Anyway, I thought we played Jeopardy too. I came up with a new Jeopardy.

Did. Careful.

Sarah Richardson.: It's like Celebrity Jeopardy. That's still my favorite SNL skit of all time.

Bill Russell.: I'll tell you, just be careful. Ed spins a web and all of a sudden you're sharing stuff you didn't want out there. More of the ad wizards that came up with that one, Bill.

Drex DeFord.: Alright, I'm gonna hit a couple of news stories here, ed.

All right. I'll try to get us on track, and then we can kind of just [00:03:00] see where we want to go from there. There's a thing that's happened over the last few days from a company called SalesLoft. They have a product called Drift — it's like an AI chatbot that a lot of companies have integrated into their product as a way to engage customers — and the bad guys have figured out how to breach the SalesLoft Drift chatbot and then move laterally inside to Salesforce and Slack and all the things that Drift is connected to.

So, what are you thinking about that? Have you seen any of that splash over into healthcare? Have you heard from anyone struggling with their SalesLoft Drift — or, just in general, talk about third-party risk. And AI — sorry, we can't have a show without talking about AI

Ed Gaudet.: and maybe chatbots. Yeah, no, it's another example of supply chain risk gone bad.

This time it's integration, right? So it's integration through the Salesforce partnership [00:04:00] program. We haven't heard from any customers yet directly. I'm sure it affected somebody out there in healthcare.

Bill Russell.: Well, we have two cybersecurity experts on — I'm not putting you down here, Sarah, 'cause I'm not putting myself in this category either — but we have two experts on the line.

I'm curious: if I put you guys in charge of threat actors — like you're in charge of a group of threat actors — how would you be utilizing AI? This is for Drex and Ed. Like, I'm gonna fund you, I'm gonna bankroll you, I'm gonna be your VC. Yeah. To get your hacking group up and running.

How are you gonna use AI to infiltrate my healthcare system? Yeah.

Drex DeFord.: This is one of those things we've kind of already seen. So this idea of a zero day that comes up, or a zero day that's announced by a company or for a product: we have a problem, we haven't figured out the patch yet, but just know that this is a problem that we're having.

So now the arms race is on, right? How do I get from the zero day to an exploitable product that I can [00:05:00] now run at all of those companies who run that product, to try to get inside and steal data and do ransomware and all the other nasty things that they do? So it's all about speed. The company's trying to build a patch to fix the problem, and the bad guys now are using AI.

It used to be — and it still is — a complicated thing. You have to take the zero day, the announcement; you have to kind of reverse engineer it, figure out where the problem is. Then you have to build the exploit. Then you have to figure out how you're gonna field the exploit, and then you have to field it and execute the business plan, right, of the bad guy.

And this is where AI has really helped that bad guy go faster because it's much easier now. And a lot of people can do this without having expertise in how to reverse engineer zero days and create an exploit and then figure out how to field it. And so if I was a bad guy and I had a bunch of money, that's probably one of the places I would put it.

Bill Russell.: Are there specific devices you're looking at? I mean, do you care, as long as you get on the network and are able to — I don't [00:06:00] think I'd use it that way. And

Ed Gaudet.: Drex, I don't think you answered his question either. Oh, okay, go ahead then. I might use the deepfake aspect of it.

Oh my gosh. To get credentials, I might send out a Zoom invite to somebody and get on a Zoom call looking like that person's boss,

or a colleague, or possibly someone from the help desk. There's a whole group of hackers right now that are leveraging AI this way to get in.

Bill Russell.: And so you just wanna come in the front door, like, why not? It's easy. Good credentials. Get in, start tooling around. Yeah, I like that

Ed Gaudet.: exfiltrate. Sit around on the network, look around, really understand where the value is, and then hit them hard, or maybe coordinate it in a way that takes down the entire industry.

Bill Russell.: I don't think I'll get in trouble — I think the statute of limitations is up on this. But the last time I had Deloitte do a white hat attack on St. Joe's, the first thing they did was set up a website: instead of stjoes.org, it was [00:07:00] stioes.org.

It looked — I mean, it was perfect. It looked perfect. And then they sent the email out to a bunch of people, and with only doing that for about a day, I think they had 85 good credentials they could use to get into the system. And then once they were into the system, they could start to tool around.

As you said, I mean, it's still the easiest way to get in, isn't it? It is the easiest way.
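A minimal sketch, in Python, of the kind of lookalike-domain check that would have flagged the stioes.org/stjoes.org swap Bill describes. The domain names, the 0.8 similarity threshold, and the function names are illustrative assumptions, not details from the episode; real email-security tooling does far more than string similarity.

from difflib import SequenceMatcher

# Hypothetical legitimate domain, taken from Bill's St. Joe's story.
OUR_DOMAIN = "stjoes.org"

def lookalike_score(candidate: str, legit: str = OUR_DOMAIN) -> float:
    # Similarity between 0 and 1; very close but not identical is the danger zone.
    return SequenceMatcher(None, candidate.lower(), legit.lower()).ratio()

def is_suspicious(candidate: str, threshold: float = 0.8) -> bool:
    # Exact matches are fine; near-misses look like typosquatting.
    return candidate.lower() != OUR_DOMAIN and lookalike_score(candidate) >= threshold

if __name__ == "__main__":
    for sender_domain in ["stjoes.org", "stioes.org", "example.com"]:
        print(sender_domain, "suspicious:", is_suspicious(sender_domain))

In practice a mail gateway would combine a check like this with newly registered domain feeds, SPF/DKIM results, and user reporting.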

Drex DeFord.: I'm curious, can we

Ed Gaudet.: go back to the topic, the initial topic? So how do you think they got into the SalesLoft Drift?

Do you think it was a phishing attack? Because they haven't really declared

Drex DeFord.: the entry point yet. Right. And I think you're probably right — it probably was some kind of a phishing attack, 'cause it looks like they somehow compromised the OAuth credentials to be able to get into Drift, right?

And then from there they moved laterally. But how did they get that first nibble, right? That first bite at the apple? Yeah, humans are the [00:08:00] weak point in the system. I mean, not that technology isn't weak too. There's a bunch of stuff being published now about AI-generated code and how insecure a lot of that stuff is. And so we're laying the foundation for a bunch of other holes that we're gonna — you seem surprised.
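Since the Drift incident reportedly came down to compromised OAuth credentials for an integration, here is a minimal, vendor-neutral sketch of the kind of integration-token review a team might run. The record fields, scope names, and 90-day rotation threshold are assumptions for illustration; real data would come from each platform's own admin tooling, which varies.

from dataclasses import dataclass

@dataclass
class Integration:
    app: str
    scopes: list[str]
    days_since_rotation: int

# Placeholder scope names; map these to whatever your platforms consider "too broad."
BROAD_SCOPES = {"full_access", "admin", "read_all", "write_all"}

def review(integrations: list[Integration], max_token_age_days: int = 90) -> list[tuple[str, str]]:
    findings = []
    for item in integrations:
        if BROAD_SCOPES & set(item.scopes):
            findings.append((item.app, "over-broad scopes"))
        if item.days_since_rotation > max_token_age_days:
            findings.append((item.app, "stale token, rotate credentials"))
    return findings

if __name__ == "__main__":
    sample = [
        Integration("chatbot-connector", ["read_all", "write_all"], 400),
        Integration("ticketing-sync", ["read_tickets"], 30),
    ]
    for app, issue in review(sample):
        print(app, "->", issue)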

Bill Russell.: No, I'm not. We're gonna end up with more followers now, because they're just gonna send this out to all the physicians and the nurses.

This is how we really think about you guys. Like you're the weak link. If it weren't for you, we could have really secure networks.

Drex DeFord.: It doesn't matter if they're doctors and nurses or if they're — yeah, no, that's true. I mean, it really is amazing. I think about even my days at CrowdStrike — some amazingly good phishing emails that even hardcore cyber pros like me fell for, because they're really good. And

AI really has made them better, as you say, Ed.

Ed Gaudet.: And then we do stupid things like put policies in place that say, after the second offense, [00:09:00] we're gonna fine you or fire you or something. So then nobody brings any of these suspect emails forward. It's almost a disincentive.

It is. It's a disincentive. It's crazy. So, oh, I

Drex DeFord.: clicked on that link, but I don't want to tell anybody, but I

Ed Gaudet.: don't wanna tell anybody. It's my second strike. I could be fired. Yeah.

Drex DeFord.: What's the best approach for that, do you think?

Ed Gaudet.: I think you have to educate.

What was the best approach for hand washing? Right.

Drex DeFord.: Is there ultimately any penalty for clicking on stuff? I don't know.

Ed Gaudet.: I don't think so, because it's just getting harder and harder. Right? So let's assume the next big thing comes out and you click on it — it's after your third time, but it's some new innovation, like, of course I should click on it.

Is it your fault? Probably not.

I think you hire good people, you do your background checks, you trust them.

Drex DeFord.: You and I have talked about this before, that whole, you have to create a culture where it's okay to say like, eh, I clicked on something and I don't know, I might have made a mistake.

And you gotta be able to say that and not get smashed in the mouth [00:10:00] for it.

Ed Gaudet.: Agreed, because if people don't wanna talk, everyone will shut up. Yeah.

Drex DeFord.: It's all about speed. I mean, if you, it's all about speed. You gotta be able to catch it quick. And if you don't say anything, then it goes on for a while. So the

Ed Gaudet.: category is hot for hackers, but that's what

Drex DeFord.: I was gonna ask you.

Are you really

Ed Gaudet.: gonna do Jeopardy? The category is Hot for Hackers, and here's the clue: Ziggy's Space Invaders.

Drex DeFord.: Sarah and Bill are just looking at us now. They don't — the clue is Ziggy's Space Invaders. Come on, this one's easy.

Bill Russell.: Well, there's a reason I don't go on Jeopardy.

Ed Gaudet.: All right, I'll give you this one: Scattered Spiders from Mars. Yeah, I would've never gotten that one.

I see what they did there. Alright, the next one: Stipe's sleepy, smiley archers.

Sarah Richardson.: Got it. Well, I hear Stipe, I think of Michael Stipe, and I think of REM. So if I'm on the right path there, then I'm gonna just say: who is REM?

Drex DeFord.: I'm gonna say it — it's Shiny... ShinyHunters, happy people, or something like that. Yeah. Shiny,

Ed Gaudet.: happy hunters.

Drex DeFord.: Yes, shiny happy hunters. Okay.

Sarah Richardson.: Okay. [00:11:00] Well, here's part of your trivia: my senior year, my friends and I for Halloween all dressed up as shiny happy people,

'cause that was the year that the song was out. So we had like these metallic costumes. Nice Billy Bass. But yeah, it was a very clever costume.

Ed Gaudet.: No, I met Michael Stipe in the back of a bus once, trying to break back into a concert that I was thrown out of. Oh yeah. He said nothing —

nothing to me. Because I was babbling about how much I loved him and how I needed to get back into the concert, and that I would not act like an idiot, I'd sit in the corner just listening to the music. He just got up, walked out, took me to the back, opened up the door, and let me in past security.

Bill Russell.: He was in the back. He was in the back. I feel like the kid in the back in middle school, hanging out with the cool kids, going, I don't know what they're talking about.

Sarah Richardson.: Meanwhile, Bill was listening to Enya.

Bill Russell.: Exactly. That's true. But I could go to Enya too. Bill.

Ed Gaudet.: Bill, what did you think about that Enya musical you were?

Bill Russell.: Oh, that was — actually, I was hoping it was real and I could buy tickets. But I guess this is the part of the conversation where I'm gonna step aside. Someone's gonna create a

Sarah Richardson.: whole [00:12:00] thing for buying tickets to see Enya, and Bill's gonna click on that link all day long.

Ed Gaudet.: We could produce it. We could produce it.

Drex, you had a couple other stories too. You had CISA, right? I did, yeah.

Drex DeFord.: There's a law called CISA — the Cybersecurity Information Sharing Act of 2015. It's a law that was put into place, and it's supposed to expire after 10 years or be renewed after 10 years.

It seems to be in some kind of turmoil — like maybe it's not gonna get renewed. And the whole idea behind this CISA — not the organization CISA, but the law, CISA — is that it creates sort of a safe harbor where, when you've been attacked and there's something going on, you can actually talk about it and share information about it without the fear of having that used against you later.

And so there's a lot of folks that are kind of scared about this CISA law disappearing off the books. What do you think? Yeah, it's

a

Ed Gaudet.: problem. Government, get your act together, let's go. Like, we need [00:13:00] this. Like, what are you doing? You're back from recess.

Bill Russell.: What's the rationale for doing away with it?

Drex DeFord.: I don't know that there's really necessarily a rationale. It's one of those things where the law has to be renewed after 10 years, and there seems to be no consensus — we can't get the parties together to say, this is an important thing that we should actually do.

Politicians, Bill. It's politicians.

Bill Russell.: The pace of government — 10 years was not enough time to evaluate the effectiveness of this. But essentially, if I understand you guys correctly — 'cause I don't know the ins and outs of this — it's a framework where, if I get attacked, I'm gonna share the information with other people within our industry.

Yeah. So that they can respond. Is that it? Yeah, that's pretty much it in a

Drex DeFord.: nutshell. Yeah.

Ed Gaudet.: And sharing it at the pace that makes sense and is effective, right? Versus: you can eventually share it, but it's gotta go through all these legal hurdles, and by the time it gets shared, it's too late at that point.

Right.

Bill Russell.: Maybe it should have been for [00:14:00] 15 years, and they would have had enough time to do it.

Ed Gaudet.: Note to self: make the law longer.

Drex DeFord.: Or just make the law the law, and then, when you decide you don't want it anymore — don't make it so it has to be renewed.

Just make it so it has to be repealed. Don't renew the law.

Bill Russell.: Yeah. Are there others in this category, in the cybersecurity space? I mean, there's been a lot of change in the cyber space since we've had the new administration. I remember just prior to the new administration coming in, there were really strict things coming toward healthcare, and everyone was sort of up in arms, like, oh my gosh, if this goes through, we're in trouble.

Like, we are not gonna be able to do half of these things in a timely fashion. And then it all sort of evaporated overnight. Drex is looking at me like he knows — see, I just know the concepts, I don't know the specifics. What Bill's referring to is —

Drex DeFord.: You're spot on. The NPRM that came out around the HIPAA updates, and a lot of the stuff that was in that proposed rule.

And then the new administration came in, and it's sitting on a back shelf somewhere in somebody's [00:15:00] office and nothing's ever gonna be done with it. That's how a proposed rule works, too: they put stuff out there, people make comments, and then they adjust and change the rules around that.

But it never went anywhere. Between that and cuts at CISA and lots of other things. And one of the other stories I was gonna talk about — there's an industry group that is focused on this — there was a four-year cyber grant program for state and local governments, who traditionally don't have a huge amount of money to put into cybersecurity.

They don't necessarily have the cutting-edge kind of staff — those folks go to tech companies — and you have state and local government employees who are working their butts off and doing their best, but they've only got what they've got, and they're responsible, obviously, for things like: are the toilets gonna flush tomorrow?

Are the other things that are really important to us going to happen? And so those grant programs are ending, and there's a lot of concern, obviously, about what happens in state and local government, [00:16:00] which are being attacked pretty vigorously. We regularly see cities being attacked and state governments being attacked.

So all of this kind of conspires to — I'm not sure the government's necessarily gonna come to help in the near term. Maybe in some places they will, but this is more reason than ever for us to build community and have these conversations, and for laws like CISA to be renewed, so that we can help ourselves as much as we can.

'Cause I don't know that anyone else is riding to the rescue.

Ed Gaudet.: Well, AI's riding to the rescue. So

Drex DeFord.: tell me more.

Ed Gaudet.: Yeah, just be patient. You'll actually be working for the AI overlords soon. You won't have to worry about these people. It'll be the —

Drex DeFord.: I saw somebody say something the other day about, I, for one, welcome our AI overlords.

That's right. If you wanna suck up to the AI overlords in advance, that probably would be a good thing.

Bill Russell.: I'm gonna throw something out here. So OpenAI did their [00:17:00] launch of ChatGPT-5, and they had that little section in the middle that was healthcare related.

And you had a person come in who was fighting, I believe, cancer. And she talked about how she used ChatGPT to really understand the diagnosis, understand the test results, and understand all those kinds of things. And it turns out, just this week, OpenAI appointed Ashley Alexander, a

veteran from Facebook and Instagram, as VP of health products — signaling a deeper ambition within healthcare technology and patient access transformation. I'm sure that makes healthcare people crazy. I mean, just everything about that makes them crazy.

It's like, why would you not bring somebody from healthcare who really understands healthcare? But

Ed Gaudet.: why does that make you

Bill Russell.: crazy?

Ed Gaudet.: How many failed attempts — Microsoft going into healthcare, Google going into healthcare, Amazon going into healthcare? How many times do we have to relive this? Right? Yeah.

[00:18:00] Oracle buying Cerner, right? I don't know. Do you think at some point they get it right? I don't know if they ever do get it right. I thought Amazon would get it right, actually — I was a big proponent. I'm also an early investor in Amazon. I really thought they'd figure it out, but they haven't figured it out.

Bill Russell.: But here's the thing, and every doctor will tell you this.

I mean, Dr. Google was a thing for many years. People would take their stuff, they'd go to Google, and it was worse, right? Because they would bring in these piles of paper to the doctor visit and say, I think I have bubonic plague, or whatever it happens to be. And the doctors would just shake their heads and be like, oh my gosh, I can't believe this is happening.

The difference now is, instead of getting 55 sites that I have to wade through and print stuff out, it feels a lot more like I'm talking to somebody who has a medical degree, who is giving me answers. Now, I've done enough programming and other things on this thing that — the reason it scares me a little bit is I know it's going to be widely [00:19:00] used for this purpose.

And on the programming side, every now and then it does something that I look at and go: you didn't understand the question. Like, I don't know where you're going, but let's come back to where I started this question and let me ask it a little differently, so you don't head off to whatever field you just went to — I want you to come back here.

Well, it's okay if I'm doing it with programming, and it's okay if I have enough background in programming to look at it and go: that AI just went into a dark place. But in healthcare, if I asked the question, I'm not sure I'd have enough knowledge — no clinical

Drex DeFord.: background.

Bill Russell.: Yeah, no clinical background to say.

And so that's who's gonna be accessing this. I know. Sarah, are you concerned about this at all?

Sarah Richardson.: I am. And as mentioned, I have the big Ed questions — this is most likely for the group as well. Consider how AI is being used at the point of care. So you have — let's just make it up — AI stethoscopes; ambient listening is a big deal across the board.

AI stethoscope, ambiance a big deal across the board. Even some of the predictive tools, we used to call that decision support, but it's taken on a whole new look [00:20:00] and feel these days. How are we talking to the board and the executives about the cybersecurity risk? Like we wanna stay innovative, and yet now we have this variable that is still being learned and morphed every single day.

What does that conversation look like with the board — about how to stay innovative and use these new tool sets, and yet keep a level of risk awareness that still keeps us resilient when it comes to patient care?

Ed Gaudet.: A lot to unpack. But, you know, first of all, we have to change our perspective on the way we think about third-party risk. Most people have been thinking about it from a vendor and product perspective, and what we believe is:

in order to really get your arms around risk at a holistic level, you have to think about it from a business process and critical function perspective. All of your business processes and critical functions today are supported by technology, and that technology, that relationship, has to be considered, has to be mapped, in order to really get a [00:21:00] sense of the true risk

in your organization — the true priority of risk or potential risk, whether it's systemic, like we saw with Change Healthcare, or whether it's a single incident. And part and parcel with that is thinking differently about how we manage risk, to be much more inclusive, not based on silos, right? So for decades we've been managing risk in the silo of cybersecurity, and that was necessary at some point.

But today it's not sufficient, because we've all learned over the last couple of years that risk is so much bigger than just that silo, that slice of cyber. It impacts medical devices, and oftentimes medical devices sit outside of cyber. So now you've got two functions that are pretty much duplicating efforts, not coordinating.

Right. Then you talked about resiliency. You mentioned continuity, disaster recovery — that often sits outside in its own group too. These things need to be consolidated. They should be brought together under the view of business processes and critical [00:22:00] functions. Because ideally, if you understand your critical functions in a healthcare setting — to your point, the point of care — that's a critical function.

If I can't deliver care, I cannot deliver business. I cannot generate business, right? I cannot serve my patients, and I can't do it safely. And so I need to get my risk profile correct, my tolerance, my risk controls set properly, so I can actually function in the context of the things I need to do to support my business.

I better prioritize that, 'cause I don't have infinite resources — I can't spend infinite dollars either. So getting that priority lens over those critical functions, the business processes that matter, is really important.

Sarah Richardson.: So Ed, if you're the CEO of the hospital

Where do you want that function to reside?

How do you wanna see it? Oh, governed is probably the best word, although not my favorite in this case. But you're a brand new CEO at a hospital. You walk in, this is a truism. Where do you see this living?

Ed Gaudet.: Yeah, that's a great question. First of all, forget about where [00:23:00] it's living so much as I would say it needs to be elevated at a level that is on par with the finance committee and audit committee, and comp committee at the board.

And so, right outta the gate, change the structure and function of the board to include cybersecurity. Whether it's under the CIO, the CISO, the head of GRC, or compliance — as long as it's at the board level.

Bill Russell.: What about the makeup of that board? Because I had that group, and every time we went in, it was like I was educating.

I mean, I hope none of 'em are listening, but it was like Cybersecurity 101. There was always one person in the room who was like the tech person for the board, who acted like they knew what we were talking about but had no freaking idea what we were talking about.

Ed Gaudet.: The individual at the board level needs to communicate, just like a finance committee communicates the finance details at a level the board understands.

So the audit committee communicates the audit — not the details behind the audit, but what's necessary for proper governance [00:24:00] at the board level. The same with cybersecurity. It's so fundamental to survival, and yet we don't have an individual at the board level governing cybersecurity, translating on behalf of the board — not bringing in a CISO, not bringing in a director of IT to communicate on a PowerPoint slide.

But talk about it in terms of critical functions, in terms of uptime and resiliency and continuity — that's what they care about. Or gaps in the program that need to be funded, because we understand where we are relative to our peers. Right. Not in terms of the latest and greatest patch from CrowdStrike.

It's not a technical

Drex DeFord.: conversation.

Ed Gaudet.: It's a business conversation. Exactly. It's a risk conversation, 'cause risk is business. It's not technical. That's the problem we have — we treat risk like it's technical.

Drex DeFord.: It's interesting too, because we have a lot of conversations about risk at the board level from a clinical perspective — absolutely —

where we make [00:25:00] mistakes or where there's a potential for mistakes. So we have maybe the wherewithal to pull this off. We just need to broaden that conversation out to the rest of the organization — everything depends on technology, right? You're

Ed Gaudet.: framing the Da Vinci in terms of the business, not in terms of the mechanics behind how the thing works, right?

At the board level, right? The board doesn't care about that. Why is it important to the business, and what does the board have to know in order to do its job and govern appropriately? Right? And I think there's an impedance mismatch between risk and cybersecurity and the board and governance. And, I mean, we're getting better at it, but it's so slow.

It's people — Soylent Green is people! Oh God. Boy. So we go back to a movie reference.

Drex DeFord.: Oh my gosh. Sorry about that. That's a classic. Hey, Sarah, you have anything else you wanna ask Ed?

Sarah Richardson.: One last question. Okay. Yeah — because of the [00:26:00] newness of, let's say, AI and some of the generation that's occurring out there, we're seeing more and more pilots. And yet, philosophically, we always say a pilot is bound to fail because it's rarely time-bound.

It's really not often measured, and yet at the same time we're seeing, like, five head-to-head pilots. I'm curious, specifically from a cyber perspective: at what point do you know the pilot has been successful? And do you prefer a pilot when it comes to a cyber and risk perspective? Or do you say, if it's the right solution, let's go for it, and put the same parameters in place that we would for a controlled environment like a pilot?

Ed Gaudet.: The pilot is sort of an illusion of control that one puts in place to say, we got this — we're gonna manage the risk of AI by piloting it. The reality is AI already exists. It's already there. It's in existing

products and vendors that they have deployed, that they're using today, that they don't even know about — that's the bigger risk in the organization. So go ahead and [00:27:00] do the pilots for new stuff, I think that's fine, but you gotta get a handle on your existing inventory. You have to understand — I remember when AI came out in Acrobat; Adobe jammed AI down our throats.

They put in this AI capability by default that you couldn't even turn off. I looked at it and I couldn't accept it, so I had to uninstall Acrobat. Reddit blew up, and eventually they recanted and took it out — they put it in as a knob. So I think organizations need to be much more vigilant about policy and principles as it relates to adoption of technology.

Have the courage to say, if you don't go through our process, we're not going to buy you. Right. Have the courage to say, we're not going to renew the contract if you don't fix those corrective actions we told you about nine months ago.

Drex DeFord.: These are conversations about how you build your contracts and language.

You put in your contracts upfront, right? Exactly. We had this huge conversion during COVID — we went to a bunch of software-as-a-service products. And now we have [00:28:00] AI. And on Thursday you open up the app and, like you said, there's a new knob and it's an AI thing, and nobody told me about it.

It's just there now — yeah, my whole company is using it, has access to it.

Ed Gaudet.: Nobody stopped and did an assessment of the new capability. Right. And that's why these certificates are dead, right? SOC 2 certificates and others are dead. It worked decades ago, when technology changed twice a year, maybe, at the most.

Right, right. Now it changes every single day — hour, minute. How can you manage on a certificate that, once it's published, is already out of date by six months?

Drex DeFord.: That's on the product side. And on the risk side — yeah, on the cybersecurity side — the bad guys are also evolving and changing every day. That's right.

Sarah Richardson.: That conversation flows back up to the board, where you say: you have a product everybody loves, and by the way, it's no longer safe to use.

Are we willing to take it out and replace it with something that meets other criteria within our organization? That's a tough board conversation.

Ed Gaudet.: Well, it is until it isn't.

Until it's [00:29:00] SalesLoft and you have to turn it off anyway, unfortunately.

Sarah Richardson.: Yep. As long as they support you in those decisions. That's where we talk about where that accountability lies — whoever is willing to make those tough calls needs to have the support to do the right thing. And sometimes we've seen that's not always true.

Bill Russell.: I've seen some agentic AI coming out — meaning it operates on its own and it functions against something. The simplest form is: there's a folder in your share, whatever that share happens to be — Microsoft, Box, whatever it happens to be — and there's an agent that sits there and monitors everything that comes through.

And a use case for that would be a folder that handles all your contracts, right? And then it pulls the information out of those contracts and puts it into discrete data elements and puts it into a database automatically. So all you do is you do the DocuSign, you drop it into this folder, and it does all these things.
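A minimal sketch of the folder-watching contract agent Bill describes: poll a shared folder, pull a couple of fields out of each new document, write them to a small database, and move the file aside so it is not processed twice. The folder paths, field names, and regex extraction are illustrative assumptions; a production agent would call a document parser or an LLM at the extraction step, and — per the governance point that follows — be registered with an owner, scope, and review cadence.

import re
import shutil
import sqlite3
from pathlib import Path

INBOX = Path("contracts/inbox")          # where signed documents land (assumed layout)
PROCESSED = Path("contracts/processed")  # files are moved here after extraction
DB_PATH = "contracts.db"

def extract_fields(text: str) -> dict:
    # Toy extraction: this is where the AI/parsing step would live in a real agent.
    vendor = re.search(r"Vendor:\s*(.+)", text)
    term = re.search(r"Term:\s*(.+)", text)
    return {
        "vendor": vendor.group(1).strip() if vendor else None,
        "term": term.group(1).strip() if term else None,
    }

def run_once() -> None:
    PROCESSED.mkdir(parents=True, exist_ok=True)
    con = sqlite3.connect(DB_PATH)
    con.execute("CREATE TABLE IF NOT EXISTS contracts (file TEXT, vendor TEXT, term TEXT)")
    for path in INBOX.glob("*.txt"):     # plain-text files only, to keep the sketch simple
        fields = extract_fields(path.read_text())
        con.execute("INSERT INTO contracts VALUES (?, ?, ?)",
                    (path.name, fields["vendor"], fields["term"]))
        shutil.move(str(path), PROCESSED / path.name)   # avoid reprocessing the same file
    con.commit()
    con.close()

if __name__ == "__main__":
    run_once()   # in production this would run on a schedule or on a file-watch event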

I'm more concerned about the proliferation of these agents, either built by us or [00:30:00] not built by us. These agents are going to be sitting there, performing work, doing things on an ongoing basis. I think it's a significant attack surface — but even if it wasn't an attack surface for threat actors, I think it's a significant area.

Like, we can't manage the stuff we see, let alone stuff that all of a sudden disappears into the background. We are gonna have to get really good at documenting where these agents are. What's the context they're operating in? What are their parameters for governance? How often do we visit that agent to ensure that it's doing what we thought it was going to be doing?

There's a lot of that coming.

Drex DeFord.: This goes back to that risk conversation that Ed was having, and you're really sort of talking about two things. One is the AI that's built in and is a feature of something that we already own. And then there's this other version of AI — the agent stuff and other things that we're building ourselves or that we may be buying separately.

And there's risk involved — there's risk on both sides — and we're really rookies at managing it. [00:31:00] Ed, SalesLoft affected —

Ed Gaudet.: 700 plus.

Drex DeFord.: Yeah.

Ed Gaudet.: Right? Yep, 700-plus companies. It was an attack on an API integration. And the largest integration we just went through is AI. It's AI — I mean, it's a feature, but it's also an integration.

And so the minute someone figures out the vulnerabilities, SalesLoft will look like child's play — which it is child's play, because I think there are young hackers involved, right? The ShinyHunters.

Yeah. Yeah. But yeah, no, we're just scratching the surface of, really, AI. We haven't even — and I think there's been an unnerving quietness about it. I mean, even though this is a big event, I wonder what they're doing now as they're organizing. 'Cause you know they're organized, right?

Everyone knows they're organized now, right? And so this SalesLoft thing looks like a little test in my mind. So what are they really planning? How big is this going to be, and when's it gonna hit? And [00:32:00] AI is probably gonna be at the heart of that, Bill, to your point earlier. That's why I think this has to be a board-level conversation.

Bill Russell.: Is that why risk never sleeps?

Ed Gaudet.: Yes. Yes. Bill? Yes.

Drex DeFord.: Well, I think maybe we should wrap up the show. This was a good news day. It was a good time. We went real long, but it's nice to hang out with you.

I really appreciate you being here. Thank you.

That's Newsday. Stay informed between episodes with our Daily Insights email. And remember, every healthcare leader needs a community they can lean on and learn from. Subscribe at thisweekhealth.com/subscribe. Thanks for listening. That's all for now.