This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.

[00:00:00] We want to thank our partner, AvaSure. Over 1,100 hospitals are using AvaSure's virtual care platform to engage with patients, optimize staffing, and seamlessly blend remote and in-person treatment at scale, drive measurable outcomes, and augment clinicians with an AI-powered solution that deeply integrates into your clinical workflows.

AvaSure offers virtual care solutions supported by a secure, scalable infrastructure that helps you lead your organization into a future where cutting-edge technology is at your fingertips and compassionate care converges. For more information, check them out at thisweekhealth.com/avasure.

Today on Newsday.

So for me, when you ask about what the future looks like, to me the future looks like making smart choices about who or what can take advantage of those devices. Otherwise, you could end up with so much data, it could actually drive down value over time.

My name is Bill Russell. I'm a former CIO for a 16-hospital system and [00:01:00] creator of This Week Health, where we are dedicated to transforming healthcare, one connection at a time. Newsday discusses the breaking news in healthcare with industry experts.

Now, let's jump right in.

(Main) Hey, I'm Drex from This Week Health, and we are doing Newsday today, live from VIVE 2025. I'm with Jacob Hansen from AvaSure. How's it going? It's going awesome. It's been fast and furious to start the conference. Have you had a ton of people come by? Had a lot of people here at the booth, and then we've also been doing lots of meetings in the Connect area down at the end. And we've had some really exciting launches here at the conference. So it's been a ton of fun so far.

That's awesome. I'm going to ask you about some of those, probably in the context of Newsday. I'm taking the cheat sheet version of this. It's pretty hard to open up any website today that has anything to do with healthcare and not find a story about AI or agentic AI.

What do you think? You're involved in it. You guys do a lot of that kind of work with healthcare organizations.

It's a great point. It's amazing how much interest and how much AI [00:02:00] just captures every conversation that we're a part of. And as a virtual care platform, as you can imagine, it's critical that we not just bring virtual care team members to the bedside, but that we're helping augment or automate those things, and we're really excited.

Actually, at the conference this week, we are launching our first ambient listening model. So we have 10 different AI models, but this is our first ambient listening model for a virtual care assistant to support bedside operational triage. So think about, let's say, a care team member needs a second pair of hands. The virtual care assistant's name is Vicky, by the way. I just saw Vicky around the corner. Yeah, Vicky's pretty incredible. A care team member can say, Hey, Vicky, I need a second pair of hands to help me with this wound care for this particular patient. And then Vicky will ask a couple of clarifying questions. That request goes to a coordinator, who can then find somebody to bring to the bedside.

It's really interesting, too, in the conversation that we just had with Vicky. The patient can say [00:03:00] something like, I think I have a tummy ache. Yeah. Or something like that. That would be very colloquial.

That's the way that we would describe it. But she actually converts it into like the medical diagnosis, the heads up for the next clinician to come in to check for stomach pain.

Yeah, it's the amazing thing about working with great partners on getting this prototype together. We've worked with Oracle and NVIDIA, and that's allowed us to take advantage of some of the tools that they have. And so that means that, like you said, Vicky can understand the connection between tummy and abdominal pain. She can understand the concept of I want to go home and how that relates to discharge. All of those things.

And part of that means that we have to be really clear about what we expect AI to do. This AI model that we've brought here to the conference is focused on operational bedside ambient listening. But then you think about also, our system needs to be a gateway for other algorithms. Our platform has microphones and speakers there in the room. How can we [00:04:00] let other partners who do clinical documentation, for example, tap into our platform for capturing something and sending it into the EMR?
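
As an aside for technical readers, here is a minimal sketch of the kind of colloquial-to-clinical mapping described above. The phrase table and the normalize_utterance helper are assumptions made up for this example, not AvaSure's model, which the conversation says is built with tooling from Oracle and NVIDIA.

```python
# Minimal sketch (not AvaSure's implementation): map colloquial patient
# phrases to clinical concepts so the next clinician gets a useful heads-up.
# The phrase table and concept names here are illustrative assumptions.

COLLOQUIAL_TO_CLINICAL = {
    "tummy ache": "abdominal pain",
    "stomach ache": "abdominal pain",
    "i want to go home": "discharge request",
    "can't catch my breath": "shortness of breath",
}

def normalize_utterance(utterance: str) -> list[str]:
    """Return the clinical concepts suggested by a patient's phrasing."""
    text = utterance.lower()
    return [concept for phrase, concept in COLLOQUIAL_TO_CLINICAL.items()
            if phrase in text]

if __name__ == "__main__":
    print(normalize_utterance("I think I have a tummy ache"))
    # ['abdominal pain']  -> flagged as a heads-up for the next clinician
```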

That is to me a really interesting part of this. I think when most people who don't know about AvaSure think about AvaSure, they think about the cameras and the microphones. But there is a whole lot more that you guys are working on.

Yeah, it's the blessing and the curse of having been a part of this space for 16 years. Yeah. We started, we pioneered the virtual sitting category, and our brand TeleSitter became ubiquitous with this concept of sitting. But in most cases, and we say this all the time now, the longer you've known AvaSure, the more likely it is you need to get to know us again, because so much, just out of necessity of what the market needs, has changed on the platform, right? Now we have a new form factor for our hardware. That's what I was going to say.

Besides Vicky, all the other things. Tell us about some of the other stuff folks will see here.

One of the exciting things that we've got here with this new [00:05:00] form factor of hardware is, think about the challenge of piloting or experimenting with these things. With some devices you have to run cables, and you have to do all these different things that require a lot of spend.

This new device, you could set it up, you could run it on Wi-Fi, you could connect it into a private 5G network. Now you could set it up for that particular unit to run that experiment. We can do virtual visits, virtual sitting, all on that same device, while it also runs a separate camera so that AI runs unobstructed.

So AI is running in the background watching for falls or elopement, patient turns, mobility, all of those things. And then at the same time, while AI is doing all of that, a virtual nurse could be talking to a patient about respiratory therapy, and a sitter could be watching for falls, all of that happening simultaneously.

Through a single device?

Yep.
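
For a rough picture of that single-device, multi-stream idea, here is a small Python sketch: one device feeding a background AI worker, a virtual sitter view, and a virtual visit at the same time. The queue-based structure and names are illustrative assumptions, not the platform's actual architecture.

```python
# Rough sketch (hypothetical, not AvaSure's code): one device serving three
# independent consumers at once -- background AI on a dedicated camera,
# a virtual sitter view, and a two-way virtual-nurse visit.
import asyncio

async def camera(name: str, queue: asyncio.Queue, frames: int = 5) -> None:
    """Simulate a camera feed by emitting a few frames, then a stop marker."""
    for i in range(frames):
        await queue.put(f"{name}-frame-{i}")
    await queue.put(None)

async def consumer(role: str, queue: asyncio.Queue) -> None:
    """Stand-in for the AI worker, the sitter console, or the nurse session."""
    while (frame := await queue.get()) is not None:
        print(f"{role} handled {frame}")

async def main() -> None:
    # Each consumer gets its own independent copy of a feed from the device.
    ai_q, sitter_q, visit_q = asyncio.Queue(), asyncio.Queue(), asyncio.Queue()
    await asyncio.gather(
        camera("ai-cam", ai_q),
        camera("main-cam", sitter_q),
        camera("main-cam", visit_q),
        consumer("fall/elopement AI", ai_q),
        consumer("virtual sitter", sitter_q),
        consumer("virtual nurse visit", visit_q),
    )

if __name__ == "__main__":
    asyncio.run(main())
```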

That's amazing. Yeah. What about patient privacy? Sometimes when I'm talking to CIOs or CISOs around the country, there's some concern from patients. They're starting to express the, I [00:06:00] don't know how much I want to be involved in AI, or I don't know how much I want AI involved with my care.

How do you address some of those issues?

Yeah. So number one, it's something everybody should be thinking about and worrying about. Makes a ton of sense. I would say a couple things. One is awareness at the bedside of when these devices are active or not. There need to be really clear signals. So some of our devices have a light that will say, look, this is either active or not. And then the newest device can point down so that it's not even looking at the patient.

Oh, so the patient can clearly just look up and see whether the camera's pointing at you or not. It's not seeing me. And then for the secondary camera, we can choose whether that should be active or not. And then the last thing I want to mention is there's always going to be a debate about AI and whether it runs at the edge or in a centralized setting, or whether it's a federated learning model.

The key is we don't capture and record that video. So what patients need to have confidence in is the fact that this is directly consumed, not stored. It doesn't end up anywhere where it's being held. Their [00:07:00] image isn't being captured and kept track of, right? We use synthetic data, or we have a new element that we've actually launched where we take a patient and we completely anonymize the patient, right?

They become an aggregation of all sorts of different people's images. Huh. It makes them look like a zombie in the actual video. Huh. But it creates privacy.
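
To make "directly consumed, not stored" concrete, here is a hedged sketch of the general pattern: frames are analyzed in memory, only derived events are kept, and anything retained for training is de-identified first. The function names and the anonymization placeholder are assumptions for illustration, not AvaSure's pipeline.

```python
# Sketch only (an assumption about the pattern, not AvaSure's pipeline):
# video frames are consumed in memory at the edge, reduced to events,
# and never written to disk; anything retained for training is de-identified.

def deidentify(frame: bytes) -> bytes:
    """Placeholder for the anonymization step described above
    (replacing the patient with an aggregated, synthetic appearance)."""
    return b"<synthetic-composite>"

def detect_events(frame: bytes) -> list[str]:
    """Placeholder for on-device models (falls, elopement, patient turns)."""
    return []

def process_frame(frame: bytes, training_sink: list[bytes]) -> list[str]:
    events = detect_events(frame)             # inference happens in memory
    training_sink.append(deidentify(frame))   # only anonymized data retained
    return events                             # raw frame goes out of scope here

if __name__ == "__main__":
    retained: list[bytes] = []
    print(process_frame(b"raw-camera-frame", retained))  # []
    print(retained)                                      # de-identified only
```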

And you can use that for training and for... Correct. Oh, awesome. Yep, exactly. What's the future look like as we continue down the road, as you continue down the road? Your work with NVIDIA, obviously a premier chip maker for this kind of technology. What's the future look like for you?

I would say there's a couple things I'd want to mention. One is a huge part of the future is tied to the way we allow third parties to connect to our device.

We can't be all things to all people. If we try to be, we'll be bad at all of them. So it's really critical that we allow these other applications and algorithms to make use of our device in a compelling way that delivers value back to the health system. They get [00:08:00] so much more out of the cost that they put in to run those devices if they can use what we do and what somebody else does and what yet another group does.

The key to that is arbitration. So for me, when you ask about what the future looks like, to me the future looks like making smart choices about who or what can take advantage of those devices, and when, and why they can do that. Otherwise, you could end up with so much data, a flood of data, a flood of opportunities, it could actually drive down value over time.

Part of the learning model, then, is working through making the determination of which pieces of information from different partners you need to be able to make a good decision for that patient.

Yeah, and let's say that you've got AI running on the digital camera. You've got clinical documentation, ambient listening. You've got a sitter watching. And you've got a hospitalist rounding. All at the same time. And now another group wants to connect in. Do they want to just stream video? If so, great, we'll let them do it. Or do they need to take [00:09:00] control of the camera? And how do we let the other parties that are actively engaged there know? Hey, so-and-so would like to get engaged in this room. No different from what would happen if you're physically there and people are in the room already, you'd come and knock. Hey, I need to do X. Is that okay? Oh, I'm in here for five more minutes. Can you wait? Oh, sure. We need to facilitate that kind of interaction in a virtual setting.
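
As a sketch of the arbitration idea, the example below assumes a simple priority scheme and two access levels, stream-only versus camera control. The purposes, priority values, and arbitrate function are hypothetical, not AvaSure's actual policy engine.

```python
# Minimal sketch, assuming a simple priority scheme -- not AvaSure's actual
# arbitration engine. Stream-only requests can always share the feed; camera
# control is granted only if the request outranks the current controller,
# e.g. an urgent clinical need displacing routine rounding.
from dataclasses import dataclass

PRIORITY = {"urgent_clinical": 3, "virtual_nursing": 2,
            "rounding": 1, "third_party_ai": 0}

@dataclass
class Request:
    requester: str
    purpose: str          # key into PRIORITY
    needs_control: bool   # stream-only requests just tap the feed

def arbitrate(req: Request, current_controller: Request | None) -> str:
    if not req.needs_control:
        return "grant: stream only"
    if current_controller is None:
        return "grant: camera control"
    if PRIORITY[req.purpose] > PRIORITY[current_controller.purpose]:
        return f"grant: displaces {current_controller.requester} (notified)"
    return "queue: ask current user to finish, like knocking on the door"

if __name__ == "__main__":
    rounding = Request("hospitalist", "rounding", needs_control=True)
    print(arbitrate(Request("doc-ai", "third_party_ai", False), rounding))
    print(arbitrate(Request("rapid response", "urgent_clinical", True), rounding))
```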

I'm just thinking about rounding, having been involved in tons of rounds as a CIO. Yeah. And seeing those 30 people trying to cram into that room, and making the patient really uncomfortable in a lot of cases.

So it sounds like you guys are addressing some of that.

Yeah, we're thinking really hard about it. In fact, something that we're working on for the future is a notion of how we handle all of this inbound traffic and then the outbound decision making. And so when you think about rounding as an example, this comes down to arbitration and smart rules, right? How do we build rules so that if there's an urgent [00:10:00] need, it displaces somebody else that's doing something else? How do we make sure they know what urgent thing happened? And then, from the patient's experience standpoint, we've got things like a doorbell when somebody connects, so they know, and they always start in privacy mode. So they'll start out with something like, instead of knocking on the door, Hey, can I come in now? It's a doorbell. Hey, my name is Nurse Joe. Everybody introduces themselves. I'd like to chat with you. I can't see you and you can't see me. Are you decent? Would this be a good time to chat about, tomorrow you're getting ready to head to a rehab facility? Do you have a moment?

All of this supports this notion of, I'm not having my space invaded.

Yeah, so I mean that whole piece of just the protocols around how you manage the communication with the patient and what the patient should expect going into the room. Normally somebody would just knock and walk in, whether the patient was decent or not. So this idea of really respecting the patient more and [00:11:00] their privacy, their comfort, is a big part of what you're doing.
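
Here is a small sketch of the patient-facing connection flow just described: doorbell, start in privacy mode, introduce yourself, and only enable video once the patient agrees. The step names and prompts are illustrative assumptions, not the product's exact sequence.

```python
# Small sketch (illustrative assumptions only) of the connection flow above:
# doorbell chime, start in privacy mode with no video in either direction,
# introduce yourself, and only enable video after the patient agrees.

def connect_to_room(clinician: str, topic: str, patient_accepts: bool) -> list[str]:
    steps = [
        "play doorbell chime in the room",
        "session starts in privacy mode: no video in either direction",
        f"announce: 'Hi, this is {clinician}. I'd like to chat about {topic}. "
        "I can't see you and you can't see me. Is now a good time?'",
    ]
    if patient_accepts:
        steps.append("enable two-way audio/video for the visit")
    else:
        steps.append("disconnect and offer to come back later")
    return steps

if __name__ == "__main__":
    for step in connect_to_room("Nurse Joe", "tomorrow's move to rehab", True):
        print("-", step)
```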

And all of that, and the speed with which we provide the care, right? I had a hip replacement about 13 years ago, and when it was my turn to be discharged, they told me, hey, somebody's coming. An hour passed. Two hours passed. Three hours passed. Four hours passed. And the whole time I was like, should I press my call button? Should I ask somebody? I don't want to bother anyone. That was my mindset, right? Now, I have a much simpler means, without pressing a button that implies I have a major problem, to check in on what the plan is. The only reason that they hadn't come into my room is I'd slipped through the cracks. They could have been making use of that room. And I was just sitting there wondering what was going on.

This is something you can check on with Vicky. Exactly. Vicky, I was supposed to be seen, is somebody coming? What's the plan? Exactly. Hey, thanks for doing the show today. I really appreciate it. It's been a lot of fun.

Yeah, great to chat with you as well. Thank you so much.

Thanks for listening to Newsday. There's a lot happening in our [00:12:00] industry, and while Newsday covers interesting stuff, another way to stay informed is by subscribing to our daily insights email, which delivers expertly curated health IT news straight to your inbox. Sign up at thisweekhealth.com/news.

Thanks for listening. That's all for now.