This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.

Flourish Sound Bytes: Attorney Insights on Navigating Privacy with Helen Oscislawski

[00:00:00]

Sarah Richardson: I'm Sarah Richardson, a principal here at This Week Health, where our mission is healthcare transformation powered by community. This is Flourish Soundbites: unfiltered conversations with healthcare leaders. Let's get real.

Welcome back to Flourish Soundbites. I'm your host, Sarah Richardson, and today we're talking about a topic that sits at the heart of every transformation story in healthcare: trust. As innovation accelerates, with interoperability, digital identity, and AI, the laws governing patient privacy are being tested like never before, and few people understand that evolution better than today's guest, Helen Oscislawski. Helen is a nationally recognized healthcare attorney and founder of Attorneys at Oscislawski LLC. For more than two decades, she has guided health systems, payers, innovators, and policymakers through HIPAA, HITECH, information blocking, and the ever-expanding space of digital data rights.

What makes Helen remarkable is how she translates complexity into clarity, championing patient rights, challenging [00:01:00] assumptions, and helping leaders navigate the messy intersection of innovation and integrity. Helen, welcome to the show.

Helen Oscislawski: Thank you for having me.

Sarah Richardson: Yeah. I am so happy to have you, because your panel at Bluebird during Soar was incredible, and there were so many things people did not know that I feel everybody should know. So I was like, hey, come on the show and let's chat about this. You have literally spent your career helping organizations advance interoperability while still protecting patient data.

What's the privacy paradox that leaders are facing right now?

Helen Oscislawski: Yeah, the privacy paradox. I think this can be best described by the push right now for digital information to move instantly. But at the same time, we don't want that instantaneous access to data to result in lost trust. Organizations and individuals are expecting instant access, frictionless data liquidity, but they're also expecting the [00:02:00] utmost guardrails to ensure that data isn't compromised.

And so it becomes a huge challenge, to be honest.

Sarah Richardson: Well, and you mentioned that regulation often lags technology. Where do you see the biggest gaps today?

Helen Oscislawski: I think that's the point: many of the privacy and confidentiality regulations that have defined confidentiality over the years have been on the books for decades.

One of the largest federal laws that governs substance use disorder information just went through an overhaul at the congressional level for the first time since 1972. They did that specifically because, as movement of data is needed for things like care coordination, improvement of quality, interoperability, and whatnot, that law from 1972 just could not keep up, and it couldn't allow for the ways that we're expecting that data to move, as I mentioned in my previous comment, that we're expecting instant liquidity of data. [00:03:00] And so it's just really important that those laws are updated. We really need to look at privacy by design, making sure that we're building systems to honor privacy as we're allowing the information to move quickly.

There need to be guardrails in place, and that requires the technology, but it also requires the legal framework and the governance structures for doing that. This way, if we design for privacy, or what we refer to as privacy by design, and the interoperability is by design, then the trust factor becomes the engine of how the information moves rather than being perceived as a barrier or an afterthought.

Sarah Richardson: And you've been on the front lines of HIPAA since the early days, then through HITECH, and now information blocking rules. When you look across 20 years of evolution, what stands out to you as the biggest shift?

Helen Oscislawski: I think one of the biggest shifts is that we have gone from a historical position [00:04:00] of focusing solely on confidentiality.

You may have had the experience, I've had the experience, where we've gone to the doctor's office and we wanted a copy of our own paper record, and they would make us sign paper forms consenting to them releasing the information to ourselves. And the point is that this is the way the healthcare sector thought about data privacy. It is important to keep healthcare data private and confidential, but we are now seeing the laws change, and I think one of the biggest seismic shifts in the healthcare industry right now is the information blocking rule. That was legislation that, I like to say, turned HIPAA and privacy on its head, because rather than protecting and preventing information from flowing, it says you must share electronic health information unless you have an absolute legal basis upon which you cannot do [00:05:00] it, or certain other reasons, like technological infeasibility, where you can't do it. That was a major shift in, I think, the thinking of how we look at data. It doesn't mean that confidentiality and privacy have been thrown out the window.

That's not the case, but it has forced organizations to look at their governance structures. You can no longer simply rely on and fall back to a baseline or a default of, well, I'm just not gonna give it to them because it could be a disadvantage to my organization. That no longer flies under the information blocking framework.

You can no longer claim things like inconvenience or just generalized risk concerns. You have to have those things pinned to an actual legal restriction for not allowing the information to go. So on the one hand, I think that's sort of good, right? Because patients should have access to the information.

Others should have access when it's legally [00:06:00] permissible to do so. But it has caused a lot of, I think, confusion and other sorts of results, including lawsuits. As recently as a few weeks ago, the Trump administration, through HHS Secretary Kennedy, said that they are going to be looking very rigorously at enforcing the information blocking rule.

And so I think many of us are gonna be watching to see how that all pans out, because it's gonna push everybody who hasn't been paying close attention to information blocking to pay attention to these rules even more. It's a seismic shift in the healthcare industry.

Sarah Richardson: How has that led to leaders maybe unintentionally creating risk, simply because the regulations are so nuanced and change in ways that are hard for organizations to keep up with?

Helen Oscislawski: Many organizations have looked at this as a technology problem, but it's really a governance issue. And some of the mistakes that are made out there by organizations can go both ways. They [00:07:00] can oversimplify things and release data when they shouldn't be releasing data. And then on the flip side, they may be holding back data when they shouldn't be holding back data.

So I think one of the risk factors is that organizations, and the individuals who are responsible for these frameworks and for how information is released, may rely on general understandings of things. And when you rely on a general understanding of a concept, it's riskier to get it wrong, or to begin structuring frameworks that just don't match what you're supposed to be doing. I think it was Steve Jobs who said something like: simple can be harder than complex; you have to work really hard to get your thinking clean and simple, but once you do, you can move mountains. Or something like that.

It's not a direct quote, but the point is, there's something to be said for simplicity. You don't want to overcomplicate things to where everything comes to a screeching halt, because [00:08:00] you'll never get anything done. But consider what happens when you rely on your general understanding of, say, what de-identification is.

Many people look at that term and think, oh, we can send that information because it's anonymous, it doesn't have the patient's name. But that's not technically true. Anonymized information that doesn't have the patient's name is not legally de-identified. You actually have to understand how to create a legally de-identified data set before you can release it without any potential legal consequences, and without a consent in many instances. And so that's a great example to illustrate how somebody looking at these issues may apply an oversimplified understanding of something, and then allow a governance or release-of-information structure that doesn't match what the law actually requires.

And that gets back to the Jobs point. You really have to do the hard work and understand the complexity of it before you can get to the point where you can do privacy by design and simplify things to make sure they're being [00:09:00] set up accurately.

Sarah Richardson: And you shared with me a real life example of being a proxy for your mom.

Mm-hmm. And even as an expert, you had to escalate the situation to get the information that was needed. What did that experience teach you, and what does it need to teach us about the gap between policy and practice?

Helen Oscislawski: Yeah, absolutely. We won't name names of the organization, but it was a very frustrating experience, as you know, and it's okay for me to share.

I'm in the throes of taking care of my elderly parents, and my mom had a hospitalization scenario, and as you mentioned, I was a legal proxy for my mother. I had a very competent law firm that I worked at draft those documents. I had input on them and all, and suffice it to say that I had 100% certainty that the language contained in those documents allowed me to essentially stand in the shoes of my mother during her debilitated state.

And the [00:10:00] organization, or I should say the individuals who were the first points of contact with the patient and family care members, which was me in this case, provided information that simply was not accurate. They were telling me that I needed to get two doctors to declare her incompetent before the proxy documentation triggered.

They were telling me I had to get my debilitated mother, essentially comatose, not comatose, but she was struggling with pain, to sign forms first. And I knew that wasn't the case, and I had to escalate it all the way to the legal department, which eventually agreed with me.

But, you know, to answer your question directly, you asked what it taught me. It taught me that organizations internally have a lot more work to do to implement accurate guardrails and rules, I should say, to operationalize things the correct way, where a family member who has the correct legal credentials is able [00:11:00] and permitted to access information without unnecessary barriers, without unnecessary frustration, during one of the most vulnerable times that patients and their family members have, which is when they're hospitalized and going through a very difficult period.

So I think organizations have a lot of work to do. They need to look at the complexity and then simplify, and once you've gotten the right answer for the use case, if you will, then you have to operationalize it: how do you pull this through your organization so that every single workflow, every single individual, and every touch point is getting the answer correct and relaying it? And so you have optimal interaction and optimal operationalizing of that use case and that situation, in accordance with the law and what needs to be done.

Sarah Richardson: Yeah, not everybody has either the understanding or the ability to escalate all the way to the legal team just to get what they need for their family.

I mean, that

Helen Oscislawski: right,

Sarah Richardson: It just takes so many more cycles than it actually should, which to a degree has brought [00:12:00] us to what we like to call the rise of the third-party applications. One of your big focus areas is digital records and the question: who really owns the data? Many consumers assume that if an app handles their health information, then HIPAA applies, but you taught me that's rarely true.

Helen Oscislawski: Yeah. So many individuals and patients still don't realize that once your record leaves the custodian healthcare provider and goes into any third party that you're controlling, those consumer applications are not regulated by that federal privacy law we refer to as HIPAA.

That law doesn't apply to the health information by itself; it applies to the custodian, to the type of custodian. Companies that have consumer apps that may collect health data, or allow health data to be transmitted to them, fall outside of that [00:13:00] law. Then you have to take a look at things like state consumer privacy laws, which vary from state to state.

I think we're getting to maybe 40 or 45 states that have their own consumer data privacy laws. And the FTC gets into this space. The bottom line is, if you are using any kind of consumer app to transmit your medical record into that application, or even generating your own health information into a wearable device or any kind of consumer app, you need to read the terms and conditions.

That's really what controls. And if those terms and conditions say, we can use your data, we can learn things about you with your data that we're gonna use for our own internal commercial purposes, we're gonna sell your data, these are all red flags for that kind of product that may signal to you that you may not want to put your health information into that app. So it's really a situation where the consumer needs to look [00:14:00] out for himself or herself with these new products that are hitting the field.

Sarah Richardson: Yeah, I mean, most consumer apps, we're talking wellness trackers, fertility apps, mental health tools, medication reminders, genetic kits, a whole nother ballgame. None of them are covered by HIPAA, and there's no single federal law that protects health data once it leaves that ecosystem. So if you've got all of these things that are known, and without a BAA the app has zero HIPAA obligations, what should CIOs and CISOs, and even an everyday consumer, be thinking about when choosing or recommending a health app?

Helen Oscislawski: Absolutely. I did wanna add one point: just last week, the Cassidy bill was proposed in Congress. They are attempting to apply HIPAA's protection standards to these apps and consumer products. But that's just been introduced, so for your audience out there, keep an eyeball on that [00:15:00] one, 'cause it will be interesting if that gets through in some shape or form, and it could add a whole new layer. But as far as right now, and assuming that doesn't happen: CIOs and other organizations, if they're looking to evaluate consumer apps, I have two points, and it really needs to be looked at in two buckets.

The first bucket is if your organization is looking to partner with a company that is going to provide this as a feature, some sort of positive product that your patients can now use as a convenient, portable way of downloading their information, and maybe having it portable to other places or other providers, and that's something your organization is directly holding out as a service. Then the consumer app becomes your business associate, and you are in some way, shape, or form tied to that consumer app. So there are different questions to ask. Certainly, selection of that kind of app [00:16:00] needs to be done very carefully. I think most organizations will vet the kinds of apps they wanna partner with. Again, it goes back to things like terms and conditions and so forth, but it also goes back to requiring that app vendor to contractually align with HIPAA standards. And you do need to take some pains there and make sure, because there are many apps out there saying, oh, we're HIPAA compliant, but they actually have no idea what they're talking about.

So you definitely wanna do some due diligence there, and CIOs are generally pretty good about vetting those things. In the second bucket, though, when you ask the question about CIOs and vetting apps, you have to be a little more careful, because the patient may be coming to your organization with an app of their choice, and that patient is exercising his or her right of access under HIPAA and under information blocking, saying: I have a right, I'm exercising that right, I want to download my patient record into this mobile app. There is very little that the [00:17:00] organization can now legally do to prohibit that from happening. There are certain exceptions; for example, if it's technologically infeasible for them to actually connect to the app, the expectation isn't that the organization would expend new resources or funds to create interfaces or whatnot. But in the very near future, and it's coming, most electronic health records that are certified through ASTP and are accepting Medicare and Medicaid are gonna be tied to certain certification criteria, and that is gonna require almost instantaneous FHIR-based API access that can connect to any app. And at that point it becomes a challenge, because you really legally cannot say no if that's where the patient wants their data to go. So, does the organization have a legal duty to inform patients how to make the decision about good apps versus predatory apps? They don't necessarily have a legal duty to [00:18:00] do that, but the federal government did say, as a matter of guidance, that it wouldn't be considered an interference to provide patients with education on how to select a good app, if you will. Again, going back to the points about reading the terms and conditions, looking for red flags in the terms and conditions that should discourage individuals from selecting that vendor as their app of choice.

Things like sale of data, or scrubbing and scanning the data for things that reveal information about you and then selling that, say, for example, to health insurance companies, which is very concerning. So while organizations can't prohibit an app interface connection that is the patient's choice, I think there's an opportunity there, and organizations as stakeholders have a role to play in educating consumers about making those selections, 'cause it's gonna be an uphill battle getting the public to understand what's a good, trustworthy app versus one that's [00:19:00] not.

Sarah Richardson: Giving people an opportunity to make an informed decision, which you would expect, to a degree, from your healthcare provider. And now, with AI accelerating faster than regulation, interoperability expanding, and identity becoming fully digital: from your vantage point, what worries you the most, but also what gives you hope?

Helen Oscislawski: I mean, AI, as we all know, we're sort of drinking from a fire hose here, as the saying goes. It's just coming fast, and you really don't even know where to start sometimes. But organizations definitely need to start with governance and understanding. I think that's the key. AI is going in so many different directions and doing so many different things. It goes back to that theme of getting to the point of simplicity by understanding the complexity. With AI, the first thing, I think, is to pause, stop, and really understand what that AI tool is exactly doing: not just what it's telling me it's doing for my [00:20:00] organization, but what are they doing on the back end?

How are they processing the data? Where are they processing the data? Are they reselling that learned intellectual property to organizations like insurance plans or pharma? And I'm not meaning to pick on insurance plans and pharma, because there's a lot of good information that can be shared, and should be shared, with insurance companies and pharmaceutical companies for research and whatever.

But the point is, again, understanding the complexity of what the AI does and then making those decisions. Back to your point about informed decisions: the understanding underpins all of that. And I do have hope, because there are some amazing things, you've probably seen it too.

The efficiencies that are being gained in healthcare, even accuracy in terms of quality of data, the potential for improving delivery, improving early detection. There are so many opportunities, and I do think there are a lot of good people looking at the issues and trying to reel it in.

So slow and steady, I think. Again, understanding what's in front of [00:21:00] you, having that governance structure in place, and then just moving forward and reaping the benefits of what it potentially has to offer.

Sarah Richardson: For sure: the slow and steady, the informed decision, the different aspects of being able to thoughtfully determine how you wanna utilize these technologies, even if you don't understand how all of them work, and what that could mean for you as a patient and a consumer. So before we wrap, I wanna have a little fun with a quick speed round. Are you ready?

Helen Oscislawski: Okay.

Sarah Richardson: If you were not a healthcare attorney, what completely unexpected career would you have chosen instead?

Helen Oscislawski: I think probably a clinical psychologist, because all the people in the interoperability space need therapy. So I'd just be on the other side of the couch, I guess, in that sense. I actually intended to become a clinical psychologist and ended up in law school somehow. So that's part of my answer there.

Sarah Richardson: I love that. And where can people stay up to date on all these regulations and different things that are [00:22:00] constantly coming through? What's a great source of information for the average person?

Helen Oscislawski: Well, I would say the federal websites are not bad. The Assistant Secretary for Technology Policy, ASTP, has a great website regarding information blocking. For HIPAA, you can go onto the Department of Health and Human Services website, and there's quite a bit of information, both for providers and for patients, explaining HIPAA and all the elements of it. That's actually quite good as well. Another really great resource is the Sequoia Project. They have numerous resources now specifically aimed at interoperability.

They just posted a brand new map of the 50 states and the litigation that's occurring across various states on things like antitrust and information blocking, which is really great. So that would be another one. And of course, I have to plug my blog, Legal [00:23:00] HIE, where we sometimes post about various things dealing with the legal side of health information exchange.

Sarah Richardson: Which is always worth following, because you're constantly staying up to date. So follow her blog when you get a chance. Helen, your insights remind us that innovation does not move healthcare forward unless trust comes with it, and privacy is not a brake on progress. It's the foundation that allows us to build boldly and responsibly.

Thank you for bringing clarity to such a complex and increasingly urgent part of healthcare.

Helen Oscislawski: It was my pleasure.

Sarah Richardson: And for our listeners, if today's episode sparked questions about digital privacy, consumer data rights, or interoperability, share this conversation with a colleague. These are discussions every organization should be having right now.

Until then, keep flourishing.

That's Flourish Soundbites. Find your community at thisweekhealth.com/subscribe. Every healthcare leader needs a community to learn from and lean on. Share the wisdom.

That's all for now.