This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.

Hey everyone. I'm Drex, and this is the Two Minute Drill, where I cover some of the hottest security stories in healthcare, all part of the 229 Project cyber and risk community here at This Week Health. It's great to see you today. Here's some stuff you might want to know about. AI isn't just growing, it's exploding.

We just finished 229 Project summits in California and Georgia last week. Those summits included CIOs and CMIOs and CXOs, and of course CISOs, chief information security officers, and at every one of those events we spent some time talking about artificial intelligence. And while everyone was talking about what the models can do, almost nobody.

Not just those groups, almost nobody is talking about what the models cost to run, or how the infrastructure that powers them could become our next big headache. See, training one large AI model can burn through as much electricity as 5,000 homes use, and once the models go live, there's the inference side, the day-to-day question machine, the summaries, the image generations.

Once that inference model goes live, that's also a constant draw. In cities like Amsterdam and Dublin and parts of Virginia, they've already cried uncle: no more, we can't take any more data centers, our power grids are maxed out. Meanwhile, Texas and Ohio and other locations are handing out building permits like candy, but their electrical utilities are kind of sweating it out 'cause they're not sure how they're gonna keep up.
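To put a rough, illustrative number on that training claim, here's a back-of-the-envelope sketch in Python. The training-energy figure and the household figure are assumptions picked for the sake of the math, not measurements from any specific model or utility:

    # Back-of-the-envelope: how a big training run compares to household electricity use.
    # Both numbers below are illustrative assumptions, not measurements.
    assumed_training_energy_mwh = 50_000   # hypothetical energy for one large training run, in MWh
    avg_home_mwh_per_year = 10.5           # rough annual electricity use for one US household, in MWh

    home_years = assumed_training_energy_mwh / avg_home_mwh_per_year
    print(f"One training run is roughly the annual electricity of {home_years:,.0f} homes")
    # With these assumed inputs, that works out to roughly 4,800 home-years,
    # which is the ballpark behind the "as much electricity as 5,000 homes" comparison.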

We built the internet on abundance. And now we're sort of realizing, I think, that the future of AI might have to be built on scarcity. And I've said it before, but scarcity, the lack of resources, is what drives real innovation. That's where we're seeing a hardware renaissance. Nvidia still rules the GPU world.

Those are the super high-intensity chips that are needed to train really complicated models, but companies like Groq, that's Groq with a Q, and IBM and AMD, and a dozen or so other startups, are trying to rewrite the math around AI efficiency. The Groq-IBM deal was just announced earlier this week.

It's a really good example. IBM gets blazingly fast, low-latency, way less power-hungry, inference-capable chips in their cloud environment, and Groq gets the enterprise credibility that only comes with a century-old tech giant like IBM. Groq's chips, called language processing units, are optimized to deliver AI results with a fraction of the energy while doing inference-type AI way faster than more traditional GPUs.
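One way to think about why "a fraction of the energy" matters is energy per unit of work. Here's a small, hypothetical comparison; the wattage and throughput numbers are made-up placeholders, not published specs for any GPU or LPU:

    # Hypothetical energy-per-token comparison between two kinds of inference hardware.
    # Power draw and throughput figures are placeholders, not vendor specifications.
    def joules_per_million_tokens(power_watts: float, tokens_per_second: float) -> float:
        # energy (J) = power (W) * time (s); time for 1M tokens = 1_000_000 / tokens_per_second
        return power_watts * (1_000_000 / tokens_per_second)

    general_purpose_gpu = joules_per_million_tokens(power_watts=700, tokens_per_second=100)
    inference_optimized = joules_per_million_tokens(power_watts=300, tokens_per_second=500)

    print(f"General-purpose accelerator: {general_purpose_gpu / 1e6:.1f} MJ per million tokens")
    print(f"Inference-optimized chip:    {inference_optimized / 1e6:.1f} MJ per million tokens")
    # With these assumptions, the inference-optimized part is both faster per token
    # and uses a fraction of the energy for the same amount of work.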

So now every big player is racing not just for smarter AI, but for cheaper, cleaner, faster AI. And that's where this story kind of turns into a cybersecurity and resilience story. Every new chip fab, every new data center, every experimental interconnect, it's all part of the world's critical infrastructure now.

And right now we're expanding that faster than we can secure it. Yesterday, and this was not a security event, but everybody's heard about it by now, I think it was a resilience event, AWS had a large outage. They're one of the providers of computing infrastructure for things like Ring doorbells and Venmo and the British government and the US government and lots of apps that your own health system uses.

Those apps all went offline yesterday because AWS went offline. So imagine, in this new AI world, a supply chain attack hits a major AI chip foundry, or a data center cooling system is compromised in the middle of a heat wave. You don't just take out compute, you take out intelligence capability.

In a world where compute equals power, that becomes a real strategic vulnerability. And we did see it yesterday, because when AWS went down, OpenAI lost some of its capabilities, and a lot of companies that have built products and services on top of the OpenAI platform went down too. So look at all these new relationships: Groq and IBM, Microsoft and OpenAI, Oracle and Nvidia, everybody and AWS.

They're blending hardware and cloud and IP at a scale that creates complex, interdependent risk. One breach doesn't just mean a data leak. It could disrupt global AI capability, and that capability, that availability, powers business and infrastructure and national security and healthcare. Everything is connected to everything else, literally.
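One way to see that interdependence concretely is to sketch the relationships as a dependency graph and ask what goes dark when a single provider does. The graph below is a simplified, hypothetical map, not an inventory of anyone's real contracts:

    # A toy dependency graph: who goes down when one provider goes down?
    # Edges point from a service to the things it depends on; this map is hypothetical.
    from collections import deque

    depends_on = {
        "hospital_app": ["ai_vendor", "cloud_provider"],
        "ai_vendor": ["model_api"],
        "model_api": ["cloud_provider"],
        "doorbell_service": ["cloud_provider"],
    }

    def impacted_by(failed: str) -> set:
        # Walk the graph backwards: anything that depends, directly or transitively,
        # on the failed component is impacted.
        impacted, queue = set(), deque([failed])
        while queue:
            current = queue.popleft()
            for service, deps in depends_on.items():
                if current in deps and service not in impacted:
                    impacted.add(service)
                    queue.append(service)
        return impacted

    print(impacted_by("cloud_provider"))
    # One cloud provider failing takes the hospital app, the AI vendor, the model API,
    # and the doorbell service down with it.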

We only get more connected from here on out, definitely under this model. So yeah, the AI story initially sounds a lot like a software story, but underneath it's all physics and infrastructure and power grids and geopolitics and supply chain security. They're all connected, which brings me to resilience.

Obviously it's not just about firewalls and backups anymore. It's about making sure that when the power grid hiccups, or a chip fab gets hacked, or AWS goes down, the lights and the algorithms stay on, because we have become addicted to them, and it looks like we're only going to become more addicted.
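In practice, that kind of resilience often looks like having a second path ready before the first one fails. Here's a minimal sketch of the idea; the provider names and endpoints are stand-ins, not a real integration with any service mentioned in this episode:

    # Minimal failover sketch: try a primary AI provider, fall back to a secondary,
    # and degrade gracefully if both are unavailable. Names and endpoints are hypothetical.
    import urllib.request

    PROVIDERS = [
        "https://primary-ai.example.com/v1/summarize",
        "https://backup-ai.example.com/v1/summarize",
    ]

    def summarize(text: str) -> str:
        for url in PROVIDERS:
            try:
                req = urllib.request.Request(url, data=text.encode("utf-8"), method="POST")
                with urllib.request.urlopen(req, timeout=5) as resp:
                    return resp.read().decode("utf-8")
            except Exception:
                continue  # provider down or unreachable; try the next one
        # Degraded mode: no AI available, but the workflow keeps moving.
        return "[AI summary unavailable, original text attached]\n" + text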

So the next time you hear someone say AI will change everything, remember, it's already doing that. The question is, as much as it drives new capabilities forward, can we power it and protect it and keep it from accidentally becoming the next great business liability? Tell me what you think. I'd love to hear your comments. More on this story and all the latest healthcare, tech, and security news at thisweekhealth.com.

And I'll post a link in the comments that'll take you directly to Spotify or Apple, so you can sign up for my UnHack podcast channel too. That's where all these shows live. And as they say, smash the like and subscribe button. And that's it for today's Two Minute Drill. Thanks for being here.

Stay a little paranoid. I'll see you around campus.