Speaker:

Welcome to show 350 of data driven.

Speaker:

In this episode, Frank and Andy interview Chris McDermott,

Speaker:

VP of engineering at Wallaroo. Wallaroo helps

Speaker:

customers operationalize machine learning to ROI in the cloud,

Speaker:

in decentralized networks, and at the edge. It's

Speaker:

a fun conversation on MLOps and the future of intelligent systems

Speaker:

and model management. Now on to the show.

Speaker:

Hello, and welcome to data driven, the podcast where we explore the emergent

Speaker:

fields of artificial intelligence, data science, and, of course, data

Speaker:

engineering, the fundamental thing that kind of underpins it all. And with

Speaker:

me on this epic road trip down the information superhighway

Speaker:

is my favorite data engineer of all time, Andy Leonard. How's it going

Speaker:

Andy? Good Frank, how are you? I'm doing alright. I'm doing

Speaker:

alright. We just, we're chatting

Speaker:

in the, virtual green room about some of the logistical challenges we had,

Speaker:

with Microsoft Bookings and how, kind of like, you can only have, you

Speaker:

know, like, remember the pick-any-two triangle, right? Good, fast, and

Speaker:

cheap? Yep. Yep. Like, we can only have 2 things, 2 features of what we

Speaker:

needed to do. Right. Alright.

Speaker:

Despite logistical challenges, we are excited here to have,

Speaker:

Chris McDermott who is the VP of engineering at Wallaroo,

Speaker:

and, he is a passionate, and intellectually

Speaker:

curious professional with excellent communication skills, he

Speaker:

loves hard problems, then he must have definitely loved the

Speaker:

process to get on the show, And,

Speaker:

has yet to meet one he couldn't solve somehow. Maybe we should get you,

Speaker:

Chris, to help us with our scheduling stuff. Really? We'll revisit that

Speaker:

later? Yeah. So welcome to the

Speaker:

show, Chris. Thank you. Thank you. It's great to be on. It's nice to meet

Speaker:

you both. Well, likewise. Likewise. So you're coming to us from the,

Speaker:

Mile High City. That's right. Awesome place. It's, I was

Speaker:

there once, for internal Microsoft

Speaker:

conference actually. Oh, nice. And beautiful town, like, it was

Speaker:

just really cool. I think it was the 2nd biggest

Speaker:

event in Denver history, the Microsoft thing. Wow.

Speaker:

And they they literally ran out of hotel rooms like it was.

Speaker:

Oh, wow. It was pretty wild. Yeah. I think it was, just

Speaker:

before one of the big parties had a convention there. And,

Speaker:

they Oh, yeah. Yeah. Yeah. Yeah. I was so I'm actually

Speaker:

slated to head back there next year for a Red Hat

Speaker:

conference, so we'll see. Let's see if the hotel situation has

Speaker:

improved. I think it's improved a little bit. The city's been growing a lot. So,

Speaker:

Yeah. Lots of growth. Isn't Denver the place that has, like,

Speaker:

the large bear up against the conference center that

Speaker:

Yeah. Yeah. That's exactly right. A giant blue bear peering in the

Speaker:

window of the conference center. Yeah. I was there. And,

Speaker:

and I remembered that. That was the first thing I

Speaker:

remember. It was, I was there in

Speaker:

2007 for a kind of a Microsoft conference. It was

Speaker:

a, Professional Association for SQL

Speaker:

Server. That's what it was called back then. And, I was

Speaker:

actually the first one I spoke at. I've spoken at a bunch since then,

Speaker:

but 2007 in Denver was the first. And,

Speaker:

yeah. Like, I echo what Frank said, beautiful city

Speaker:

and, just very picturesque. Yeah.

Speaker:

Yeah. The weather and the mountains are beautiful. Mhmm. Yeah. And

Speaker:

it's funny, like, you know, on the East Coast, we talk about mountains,

Speaker:

but it's nothing like that. Right. Yeah. Not quite the

Speaker:

same. You would laugh at what we call mountains. Yeah. Right. But

Speaker:

I remember a Robin Williams bit where he said something like, he admired

Speaker:

the people in Denver because they got to

Speaker:

Denver and they looked at the mountains and went, Well, I can't say what

Speaker:

he said, but he had a kind of an Elon

Speaker:

moment.

Speaker:

There's so many of those. There's so many. No more. We're stopping right here. We're

Speaker:

not going over those mountains. So,

Speaker:

You're VP of engineering at Wallaroo. So tell us a little bit about Wallaroo.

Speaker:

Mhmm. Plus, you're also ex-DataRobot too. That's interesting.

Speaker:

Yes. Yep. Ex-DataRobot. Yeah. So I've been working in the machine

Speaker:

learning and AI space for, about 7 years now, I guess, or 6

Speaker:

years. And, it's been really fun. You know, it's, it's a

Speaker:

good time to be in the business. There's a lot of development

Speaker:

happening, very fast pace of change, which I appreciate.

Speaker:

And, you know, Wallaroo has been really great. Like,

Speaker:

the team is fantastic, and the people are wonderful. And it it's a lot of

Speaker:

fun working, with people that you enjoy hanging out with and and you respect

Speaker:

and everything, that's that's very important to me. That's awesome. But I also just

Speaker:

I think the product is awesome. It's really, I think,

Speaker:

playing well in the market. Like, we are focusing on making it as easy as

Speaker:

possible to deploy and manage machine learning models.

Speaker:

And the focus is really on any model using any framework and being

Speaker:

able to deploy onto sort of any architecture, any hardware,

Speaker:

and being able to leverage GPUs if you need them or different kinds of CPUs,

Speaker:

different acceleration libraries that people have tailored to the different

Speaker:

architectures. And, honestly, there are not a lot

Speaker:

of other solutions that tackle those 2 problems for people. Right.

Speaker:

A lot of the other companies that we're competing with, they are trying to be

Speaker:

like an end to end solution or, like, really force people into, you know, their

Speaker:

platform. So you train on their platform, you deploy on their platform, you manage on

Speaker:

their platform. But it's very limiting in terms of what you can bring on to

Speaker:

the platform and and being able to, deploy on the different types of

Speaker:

architectures and, platforms and things like that. So it's really

Speaker:

exciting. It's fun. I think that's really important that you bring up

Speaker:

the CPU solutions. As I've been tinkering,

Speaker:

you know, the past couple of years with, you know, with the

Speaker:

different, different platforms that are out there, it's

Speaker:

That's definitely a smaller market, but maybe it's emerging now. I'm

Speaker:

just not sure. Mhmm. Yeah. I wonder yeah. Sorry. Go ahead.

Speaker:

Well, I was gonna say, you know, a lot of the time people conflate training

Speaker:

and inferring, which is, you know, sort of the 2 different stages. Like,

Speaker:

first, you have to train a model, but then you use the model to make

Speaker:

inferences, which, you know, it's really like asking the model to make a prediction

Speaker:

or you give it some input and it gives you some output. And,

Speaker:

they're they're very, very different tasks. And just because, you know,

Speaker:

like, you may wanna use some hardware, GPUs, for training doesn't

Speaker:

necessarily mean that you need the GPUs when you are in production and

Speaker:

you're asking it for predictions. A lot of the time, you know,

Speaker:

the model is small enough that you really don't need to, but there's

Speaker:

so much hype. It it's hard sometimes to separate the hype from the, you

Speaker:

know, the real stuff. Yeah. Yeah. The hype

Speaker:

machine is real. I mean, like, it is. And I wanna get your

Speaker:

thoughts on, you know, I mean, I love generative AI. I'm not

Speaker:

knocking generative AI, but it feels like it's taken all the oxygen out of the

Speaker:

room for all the other kinds of AI.

Speaker:

Yeah. Yeah. Yeah. Because there are a lot of, you

Speaker:

know, great models. Like, XGBoost is a very standard one. It's

Speaker:

been around for, you know, a long time, meaning at least for, you know, 5

Speaker:

or 10 years now or something. But, that really honestly solves so

Speaker:

many problems, and it's such a small, easy model to deploy.
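To make that concrete, here is a minimal sketch of the point, using scikit-learn's gradient-boosted trees as a stand-in for XGBoost (same model family; the toy dataset and all numbers are invented for illustration). The whole trained model serializes to a small artifact that's easy to version and ship:

```python
import pickle

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Toy tabular data standing in for a real business dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# A small boosted-tree ensemble, the same model family as XGBoost.
model = GradientBoostingClassifier(n_estimators=50, max_depth=3).fit(X, y)

# The entire trained model pickles to a modest blob: easy to store,
# version, and load wherever it needs to run.
blob = pickle.dumps(model)
restored = pickle.loads(blob)
assert (restored.predict(X[:10]) == model.predict(X[:10])).all()
print(f"serialized model: {len(blob) / 1024:.0f} KiB")
```

XGBoost itself ships its own compact `save_model`/`load_model` serialization, but the takeaway is the same: a boosted-tree model is kilobytes to a few megabytes, not a GPU-sized artifact.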

Speaker:

I I wish people would focus more on on that kind of thing rather than

Speaker:

hype. Right. No. That's a good point. And I think you

Speaker:

bring up an interesting point because not all not

Speaker:

all AI workloads are created equal. Right? Obviously, there's,

Speaker:

I heard this term the other day and I had to spit my coffee out

Speaker:

because it was just so funny. Legacy AI. Yeah.

Speaker:

Yeah. There's generative AI now. There's legacy AI. That's

Speaker:

crazy talk. You know? And I was just like,

Speaker:

wow. But,

Speaker:

you know, because, you know, legacy AI, basically,

Speaker:

you're not using deep learning, you're not using neural networks,

Speaker:

generally, you don't get a good boost from GPUs.
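A rough sketch of why that's true, with a plain scikit-learn model chosen purely for illustration: inference for a classical model is a small amount of arithmetic, and a CPU gets through thousands of predictions in a fraction of a second.

```python
import time

import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic training data; the model and sizes are illustrative only.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(10000, 20))
y_train = (X_train @ rng.normal(size=20) > 0).astype(int)

# Training: the heavier, one-time step.
model = LogisticRegression(max_iter=500).fit(X_train, y_train)

# Inference in production: plain CPU arithmetic, no GPU involved.
X_live = rng.normal(size=(10000, 20))
start = time.perf_counter()
preds = model.predict(X_live)
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"{len(preds)} predictions in {elapsed_ms:.1f} ms on CPU")
```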

Speaker:

Correct. Right. And that's something that when when you tell that to

Speaker:

even technical decision makers, they

Speaker:

kinda look at you like, you know, what sorcery is that? Like, you know, because

Speaker:

they'll say, like, oh, we don't have enough GPUs. There's no budget for

Speaker:

GPUs. Like, what what what types of workloads are you running? And I

Speaker:

tell them, it's like, well, it's not really a concern for you. Like, you don't

Speaker:

need them. Yeah. And you see, you know,

Speaker:

the the the the people who are doing the actual data science, they're like, yeah,

Speaker:

duh, that's what we're trying to tell you. Yeah. But you see, like, the leaders

Speaker:

of these teams are like, like, you know,

Speaker:

it's... Now, just for my own education,

Speaker:

wasn't there something called RAPIDS? It was an

Speaker:

acronym that let you use GPUs for

Speaker:

certain types of models. Like, XGBoost, I think, was one of them. Random

Speaker:

forest, maybe. I don't know. Oh. See, it's funny because

Speaker:

it's an NVIDIA thing, and obviously it only optimizes on their hardware. But,

Speaker:

like, it was I remember Hearing about

Speaker:

it in 2019, and I'm thinking, wow, this is gonna change

Speaker:

everything, and you haven't heard of it. Only,

Speaker:

like, one other person I've met in the wild has ever heard of

Speaker:

it, and he was at the same conference I was at where we heard about

Speaker:

it. So I'm like, that's kind of unusual,

Speaker:

but, you know, things move so

Speaker:

fast, you know, and it's really hard to tell sometimes what

Speaker:

what, which new developments are gonna end up being the future and which ones

Speaker:

are gonna end up as dead ends? Right. You know, and even all

Speaker:

the transformer stuff that that is powering GPT and and those similar types

Speaker:

of models, I think that was originally written up in a

Speaker:

white paper in, like, 2017 something. Mhmm. And it just kinda sat around for a

Speaker:

while, and nobody paid a whole lot of attention to it until OpenAI really

Speaker:

ran with it. So yeah. "Attention Is All You

Speaker:

Need." I think that was the paper? Sounds right.

Speaker:

Yeah. And then we're gonna go. Oh, sorry. Go ahead. Sorry,

Speaker:

Andy. I cut you off your point. No. I I don't wanna go too far

Speaker:

downstream before I say cred boost for using the phrase "I don't

Speaker:

know." Oh, nice. Somebody with your

Speaker:

credentials, you know, saying I don't know. That's that's super

Speaker:

cool. So, cred. Honestly, there's way too much to know. There's no

Speaker:

way any one person could know that. I like to joke: I

Speaker:

haven't checked my phone or, like, news feed in, like, half an hour

Speaker:

and I'm like woefully behind now. Yeah.

Speaker:

But it feels that way, like, in the whole... Oh, no. It does. Yeah. Especially

Speaker:

it was especially interesting when the whole drama on OpenAI, and I

Speaker:

don't wanna go down that rabbit hole too far. But when all of that soap

Speaker:

opera kinda unfolded Yeah. Yep. It was kind of like,

Speaker:

what's the latest? Like, is he back? Is he gone? Is he working at Microsoft?

Speaker:

Like, he did work at Microsoft for like 10 minutes, and now he doesn't.

Speaker:

Like, Yeah. You know, at

Speaker:

at some some point down the middle of it, it's like, call me when this

Speaker:

is over, and I'll deal with the, things yeah. I'll

Speaker:

check in again. But that's just the human

Speaker:

side of it, let alone the let alone technology side of it.

Speaker:

So, operationalization. I think that's gonna

Speaker:

be the buzzword. Obviously, ChatGPT and gen AI is taking

Speaker:

all the air out of there. And I think the next buzzword is gonna be

Speaker:

operationalization. One, because it's kinda

Speaker:

hard to say, and I'm not gonna lie, I've had to practice.

Speaker:

But, it's something that I think companies and organizations

Speaker:

that adopt AI, whether it's legacy AI

Speaker:

or generative AI, they're gonna have to realize, like, it's one thing to build

Speaker:

the model, and then it becomes a, okay, now

Speaker:

what? Yeah. Yeah. Well and models

Speaker:

really are just like any other software. It's not something that you just

Speaker:

write once and, you know, throw it out there, and it runs forever

Speaker:

without being touched. Right? All of it requires care and feeding, and

Speaker:

and machine learning models are no different. So, I think

Speaker:

part of it is, you know, how do you deploy it? And then, you know,

Speaker:

how do you keep that that deployment up to date, you know, getting critical

Speaker:

patches and vulnerability fixes and things like that. But also

Speaker:

how do you monitor the model and how it's performing and how it's performing

Speaker:

relative to the real world, because the world doesn't stand

Speaker:

still, right? So even if the model was trained on some data and it was

Speaker:

98% accurate when it was trained, as the world shifts and

Speaker:

and the situation around it shifts, that accuracy will

Speaker:

almost certainly start to degrade over time. So you need to monitor that. You need

Speaker:

to know when to retrain the model. And you have to be kind of

Speaker:

keeping track of, new training data. Right? So the

Speaker:

the the new environment that the model is operating in, you need to be recording

Speaker:

all of the inputs and also paying attention to the ground truth of, you

Speaker:

know, what was the outcome of that prediction that the model made? Was it accurate

Speaker:

or not, after the fact? And and correlating that back into your

Speaker:

training data so you can retrain the models and, you know, keep them going

Speaker:

over time. And that's just, you know, assuming you're gonna be using the same model

Speaker:

forever. But as we just finished talking about new models coming out all the

Speaker:

time, new approaches, new techniques. So, yeah, it really is

Speaker:

is something you've gotta pay attention to. It's an

Speaker:

extremely... Yeah. It's an extremely dynamic space.
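The monitoring loop described above can be sketched in a few lines: log each prediction, join it with the ground-truth outcome once that arrives, and track a rolling accuracy; when it drops below a threshold, flag the model for retraining. The window size, threshold, and drift pattern here are all invented for illustration:

```python
from collections import deque

class DriftMonitor:
    """Track a deployed model's rolling accuracy against ground truth."""

    def __init__(self, window=500, min_accuracy=0.90):
        self.outcomes = deque(maxlen=window)  # 1 = prediction was right, 0 = wrong
        self.min_accuracy = min_accuracy

    def record(self, prediction, ground_truth):
        # Called once the real-world outcome for a logged prediction is known.
        self.outcomes.append(1 if prediction == ground_truth else 0)

    @property
    def accuracy(self):
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 1.0

    def needs_retraining(self):
        # Only alert once the window is full, to avoid noisy early readings.
        return len(self.outcomes) == self.outcomes.maxlen and self.accuracy < self.min_accuracy

# A model that was, say, 98% accurate at training time can drift as the world shifts:
monitor = DriftMonitor(window=100, min_accuracy=0.90)
for i in range(100):
    predicted = 1
    actual = 1 if i % 5 else 0   # the world now disagrees 20% of the time
    monitor.record(predicted, actual)
print(monitor.accuracy, monitor.needs_retraining())  # 0.8 True
```

In a real system the `record` calls would be driven by logged inference inputs joined with later business outcomes, exactly the "correlating that back into your training data" step described in the conversation.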

Speaker:

Mhmm. I've heard this called

Speaker:

MLOps for the longest time. Mhmm. Mhmm. But I've also heard a new term

Speaker:

kinda pop up on the radar called AI ops Mhmm.

Speaker:

For this. What do you call it? I

Speaker:

generally call it MLOps. You know, one, I

Speaker:

I sort of per like, AI and ML, there's an

Speaker:

interesting, you know, difference there in in terms of who uses the different terms and

Speaker:

when they use them. For me, AI is

Speaker:

more of a general term that I use conversationally. And most of the time if

Speaker:

I'm trying to be fairly technical and specific, I'll usually revert to ML,

Speaker:

because, in fact, most of these things are machine learning. AI is a much more

Speaker:

nebulous concept, and I I don't even think everybody agrees on on what AI is

Speaker:

or what the threshold would be. You know, if you're

Speaker:

doing statistical analysis, I think most people probably would not call that

Speaker:

AI. But there are a lot of machine learning models that do work that way.

Speaker:

And and that's definitely, like, part of the gradient. You know? I've

Speaker:

noticed that too. Like, there it is a gradient too. Like, there's not a, like,

Speaker:

a hard, like, line. You know, typically it depends on the audience. Right? If they're

Speaker:

if they're BDMs, business decision makers, they're gonna use

Speaker:

AI. Yeah. If they're technically focused people, they tend to prefer the term

Speaker:

ML. Yeah. That's also been my experience Interesting.

Speaker:

quite often. So I like MLOps because, one,

Speaker:

it sort of grounds you a little bit more in that technical perspective. Mhmm.

Speaker:

And it's sort of, like... To me, I think I came up

Speaker:

through DevOps a lot of my, you know, first half of my career was DevOps

Speaker:

and infrastructure and things like that. And, I

Speaker:

think part of the appeal of the term MLOps is it taps into a

Speaker:

lot of the DevOps, associations. Right? And

Speaker:

Right. The concepts and the themes of DevOps, which is really about,

Speaker:

merging different skill sets and breaking down silos and getting different teams to

Speaker:

communicate with each other and to collaborate more,

Speaker:

being more dynamic. Not just, you

Speaker:

know, putting software out there and letting it run forever, but keeping it up

Speaker:

to date, monitoring it, recording the logs, you know, all of that kind of

Speaker:

stuff, and and getting into a flow of continuous

Speaker:

deployment, you know, continuous integration, continuous testing, continuous

Speaker:

deployment. And I think on the ML side, that's also where

Speaker:

MLOps really shines and and is bringing those themes

Speaker:

to the party, rather than a data scientist training a

Speaker:

model, deploying it, and, you know, throwing it over the

Speaker:

wall to to, like, an operations team or something. It's

Speaker:

getting all these different teams and skill sets to work together. It's

Speaker:

building a continuous, you know, pipeline, with

Speaker:

monitoring and and feedback loops and so on. So that's that's why

Speaker:

I like MLOps. No. I like that too. So in order to prevent

Speaker:

any hate mail coming in, or, actually, comments: AI

Speaker:

ops is also used, I've heard, in,

Speaker:

the telcos and network operators tend to have a term

Speaker:

called AI ops, where they use AI to help operate their network. So that is

Speaker:

Got it. It's a namespace collision,

Speaker:

which is a further reason I prefer MLOps: to avoid the namespace

Speaker:

collision, plus all the reasons you said. You

Speaker:

know, what's interesting is, I came from a software engineering

Speaker:

background and, you know, and I'll be honest, I was not

Speaker:

initially a big, believer in in DevOps, but

Speaker:

kind of as time went on, I became a convert. But I think

Speaker:

that, you know, when you look at how AI models, ML

Speaker:

models, whatever, how they get operationalized.

Speaker:

You look at it, and I often can tell

Speaker:

who the fans of the new Star Wars movies are by using this analogy,

Speaker:

because I'll say it's the 2015 Star

Speaker:

Wars movie and the 1977 movie, DevOps.

Speaker:

DevOps being kinda like the original, episode 4, and then the

Speaker:

new, the first new one, right? It's the same

Speaker:

plot. I mean, the characters have changed, some things are

Speaker:

different, But very effectively, it's the same plot. And, you

Speaker:

know, some people will laugh like you did, and some people I can

Speaker:

see will... their faces turn red. But,

Speaker:

but I mean, it's the same plot. The names, the places

Speaker:

have changed. But you're right. I mean, I think there's a lot

Speaker:

of lessons we can learn, yeah, in the ML community

Speaker:

from the DevOps world. Right? Because, you know, prior

Speaker:

to DevOps, you know, the developers and operations had a

Speaker:

very antagonistic relationship for the most part. I'm sure there's always

Speaker:

exceptions. You know, I was joking that they would only meet,

Speaker:

they only have to interact 3 times a year, and one of those was the

Speaker:

holiday Christmas party. You know what I mean? And

Speaker:

Yeah. But if you wanna deploy something in a far more agile

Speaker:

way where they have to, you know, put it out. In some extreme cases, every

Speaker:

few hours, some new bit of code gets pushed up. That's obviously on

Speaker:

one far end of the spectrum. But for the most part, you know, a

Speaker:

couple times a month is not unreasonable. You have to automate that. You have to

Speaker:

have processes in place. Yep. And I I see a day, and if that day

Speaker:

has not already come, I would be surprised, that AI is gonna be the

Speaker:

same thing or ML. Right? You're you're gonna have to get but to your point,

Speaker:

right, this is a continuous process, you know? Yep. Yeah.
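One concrete form that continuous process takes, borrowed straight from DevOps-style continuous deployment, is an automated promotion gate: a freshly retrained candidate model only replaces the deployed one if it beats it on held-out data. A hedged sketch, where the models, data, and margin are all invented for illustration:

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic data split into a training set and a held-out evaluation set.
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] > 0).astype(int)
X_train, y_train = X[:800], y[:800]
X_holdout, y_holdout = X[800:], y[800:]

# The "production" baseline versus a freshly trained candidate.
baseline = DummyClassifier(strategy="most_frequent").fit(X_train, y_train)
candidate = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def should_promote(candidate, baseline, X, y, margin=0.01):
    # Promote only if the candidate clearly beats what's already deployed.
    cand_acc = accuracy_score(y, candidate.predict(X))
    base_acc = accuracy_score(y, baseline.predict(X))
    return cand_acc >= base_acc + margin

print(should_promote(candidate, baseline, X_holdout, y_holdout))
```

In a pipeline, a check like this runs automatically on every retrain, so no human has to hand-approve each of the hundreds of models a large organization ends up operating.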

Speaker:

We can't get away with, you know, you have this isolated team of data scientists.

Speaker:

They kinda go off to their little Area 51 type labs

Speaker:

in secret, right, and then come back with some model,

Speaker:

and and I'm guilty of this too. I've done this. Right? Where I'm like, I

Speaker:

built the model. I'm done. I did the math. I did the hard

Speaker:

part. How do you deploy it? Not my problem. Not my

Speaker:

problem. And it's funny, like Yeah. You know, I caught

Speaker:

myself. Right. I caught myself doing that as I, you know, you

Speaker:

know, doing that. Like, recently, I had to do a demo

Speaker:

and I had to work on, kind of, it's basically a predictive

Speaker:

maintenance type thing, and I took all this data, had the model, and I

Speaker:

just said, here's the here's the link to the model, Have at it,

Speaker:

pal. Mhmm. And then as I sent that, I was like, you know, I should

Speaker:

probably be more involved in getting this operationalized.

Speaker:

Right. Yeah. Yeah. Yeah. Yeah. No. I think that's a big part of it.

Speaker:

Another big part of it though is, scale, you know. And I think scale

Speaker:

scaling of compute and, how How people were using compute and how

Speaker:

much compute was required was a big part of what drove DevOps.

Speaker:

You know, if you were a sysadmin responsible for a 100

Speaker:

servers, that's, you know, challenging, but it's feasible. Like, you can do

Speaker:

that. You can keep them all up to date. You can keep them all in

Speaker:

sync with each other. Make sure they they all have the same patch levels and

Speaker:

and so on. But you scale that up to a thousand servers?

Speaker:

That gets a lot trickier. You try to go to a 100,000 or, you know,

Speaker:

if you're doing internet-scale things like Google or Facebook or somebody, we're talking

Speaker:

millions, tens of millions. And Right. That level of scale

Speaker:

requires you know, everything has to be automated. Everything has

Speaker:

to work that way, and it has to be resilient and it has

Speaker:

to, you know, have automatic fail over and stuff. You know, there's the,

Speaker:

xkcd where, you know, they get to a certain point. They're just

Speaker:

roping off entire data centers and being like, alright. We're throwing that one away and

Speaker:

moving on to the next one. And for AI, I

Speaker:

think a lot of the same stuff is happening. When, you know, 10 years ago

Speaker:

or so, when people were just getting started on this journey. And as an

Speaker:

entity, as a business entity, if you're talking about 1 or 2 use

Speaker:

cases, you know, you can have humans curate that stuff and hand

Speaker:

craft it, hand roll it, hand deploy it, and hand manage it. But

Speaker:

if you're a a big enterprise company and you you wanna have hundreds of use

Speaker:

cases in production or thousands or tens of thousands, there's just no way.

Speaker:

You have to automate it. No. That that that's a that's

Speaker:

an excellent point. Like, one way I've heard it described is that if you're

Speaker:

baking a loaf of bread for your family and friends, or loaves of

Speaker:

bread, you can do it in your kitchen. Right? You don't have to do anything

Speaker:

special, but if you're the Wonder Bread Corporation or

Speaker:

Mhmm. And you wanna deliver at that scale, that's no longer an

Speaker:

option. Mhmm. And I think that we're at that point where and

Speaker:

correct me if I'm wrong, where I think AI and ML adoption or AI

Speaker:

adoption is still new enough that there's enough naivete out

Speaker:

there of, oh, we don't need to scale to that degree. Like, we don't need

Speaker:

the production line. I think I think that's ending. I think we're getting close to

Speaker:

the the end of that era, but that's kind of been my yeah. I think

Speaker:

so too. Yeah. Because there are more and more ML

Speaker:

tools in everybody's toolbox. Right? So you were talking about telcos

Speaker:

routing network traffic using ML models. That's not

Speaker:

gonna be 1 model. Right? Like, with latency and and

Speaker:

everything else, you're gonna need, you know, lots

Speaker:

and lots of very small models deployed on every router, every top of rack

Speaker:

switch, every, you know, whatever, 5G cell phone tower,

Speaker:

whatever you're talking about. There are a lot of cell phone towers. So you're

Speaker:

not managing 1 model. You're managing a fleet of models, right, across

Speaker:

different geos and all kinds of things. No. That's that's an

Speaker:

excellent point. Sorry, Andy. That's okay. It does seem to scale like that,

Speaker:

though. Right? It's almost it's almost tectonic. There's

Speaker:

a whole new layer going down. You know? That's that's the new surface.

Speaker:

I noticed on the website, I popped over to wallaroo dot

Speaker:

ai, wallaroo.ai.

Speaker:

And I noticed a familiar-looking blurb just

Speaker:

below the top of the page. And it's familiar to me because I

Speaker:

started off in business intelligence. I'm still working in BI.

Speaker:

And there's a note, 90% of AI

Speaker:

initiatives fail to produce ROI. And I saw this

Speaker:

in, you know, a very similar number, 85%,

Speaker:

in BI back in the day. It's probably still true. So where

Speaker:

does that number come from? Well, I think it reflects a lot of

Speaker:

things. You know? Some of them we were just talking about and

Speaker:

and where MLOps is coming from: a lot of the failure

Speaker:

modes were teams not really working with each other. Right?

Speaker:

Somebody decided we should be doing AI, so they hired the data scientist.

Speaker:

And the data scientist works in the corner for a while and,

Speaker:

You know, one, they don't have access to all the data. They don't know what

Speaker:

the data is, where to find it, how to access it, how to clean it,

Speaker:

what it means to the business. There there are a whole set of challenges there.

Speaker:

And then, you know, they may train some models and and get something, you

Speaker:

know, to a point where they think that it's gonna solve a problem. But then

Speaker:

you've got to work with an IT organization to stand up infrastructure. You've got to

Speaker:

work with somebody to package the model and build, you know, an API around

Speaker:

it or a UI of some sort, and figure out how to deploy

Speaker:

it, train people on how to use it, and and actually, like,

Speaker:

somehow integrate it into your business process so that it's

Speaker:

it's driving business outcomes. And all of those are really tough

Speaker:

challenges. And all of them require breaking down those

Speaker:

silos and getting a bunch of different people within an organization to

Speaker:

talk to each other and communicate and to work together to solve something.

Speaker:

I don't think ML or AI is a magic wand that you just

Speaker:

wave to magically provide value to a business. You've got to really

Speaker:

think about: What is your business doing? And, you

Speaker:

know, machine learning at at heart, it it's

Speaker:

really just like a a more efficient way of

Speaker:

making decisions, you know, faster and more accurately,

Speaker:

and with less human input. And so you've got to look for places where your

Speaker:

business can either save a lot of money or make a lot of money by

Speaker:

being able to answer a a simple question repeatedly very,

Speaker:

very efficiently. And that sounds easy, but in practice,

Speaker:

defining a business problem is often one of the hardest parts. So now I'm

Speaker:

seeing even more parallels. Uh-huh. Yeah.

Speaker:

You know, that was the problem we were trying to solve, with

Speaker:

business intelligence as well. So didn't mean to cut you off. Sorry about

Speaker:

that. No worries. So, yeah, I think I agree with you. There

Speaker:

are tons of parallels there. I think there are a lot of similar lessons to

Speaker:

be learned, and I think we are applying them in this space in

Speaker:

ways that we've applied them to other spaces in the past. I also

Speaker:

think there are technical challenges. You know, part of it is the field is moving

Speaker:

so fast. So there's just this constant stream

Speaker:

of of new frameworks, new models, new techniques, and you

Speaker:

have to kinda stay on top of that. You have to be careful with your

Speaker:

tool selection, to make sure you're not, you know,

Speaker:

going whole hog into some tool that sounds

Speaker:

great today, but it's just not flexible, and it's not gonna be able to support,

Speaker:

like, all these new things that are coming out. Yeah.

Speaker:

Or that company could have internal political

Speaker:

strife, which is crazy talk, right? Absolutely. Right. Casting

Speaker:

doubt on their future. Alright. That would never happen. That would never

Speaker:

happen. Sorry. Yeah. You were talking about privacy, which I think is

Speaker:

another key thing. Yeah. Data residency, data privacy, data

Speaker:

security. You know, all of those things matter tremendously.

Speaker:

And for for a business trying to, get

Speaker:

value out of AI and ML. You know, a lot of it, depends on

Speaker:

having good data and cleaning it and curating it

Speaker:

and getting it ready for things. But then it it forces the

Speaker:

the organization to really kind of do an inventory. What do we have? What's useful?

Speaker:

What's not useful? Well, how much do we store? How much do we not store?

Speaker:

How do we comply with various regulatory

Speaker:

environments? Right? GDPR is is the big one everybody, you know,

Speaker:

loves to throw out there. It's it's big and it's complicated, but, you know,

Speaker:

things like that matter a lot. And there's 300-plus million people behind

Speaker:

that. They're covered, or whatever. I think, you know, not

Speaker:

only do they have a big stick, but they have a big arm

Speaker:

that they can wave that stick with. Yes. You

Speaker:

know, if a small country with, like, you know, 50 people in it

Speaker:

enacted something like GDPR, people would just walk around it. But I think

Speaker:

that, a block with I've heard different numbers, but

Speaker:

it's, you know, pushing 4 to 500 million

Speaker:

people. That's huge. That's a big enough market nobody can really ignore.

Speaker:

Yeah. What's interesting is on the LinkedIn page

Speaker:

for Wallaroo I love the website, by the way. I checked that out too. Thank

Speaker:

you. It talks about decentralized

Speaker:

networks. Mhmm. And at the edge. Yes. How would

Speaker:

you define a decentralized network? Yeah. This is a big new push for us that we've

Speaker:

been focused on for, I mean, we've been focused on it kinda for the

Speaker:

last year, but it was a lot of development on the back end. And

Speaker:

we just released kind of our first edge features and product,

Speaker:

in October, so it's kind of a new thing for us. But,

Speaker:

as you think about ML and edge or ML and AI,

Speaker:

and the the fleets of models that we talked about and all these use cases

Speaker:

And, you know, telcos and 5G cell phone towers and all of those

Speaker:

types of things, intersecting with data and data

Speaker:

residency and privacy and security, it really seems to

Speaker:

indicate to me, and to us at Wallaroo in general, that the

Speaker:

future is lots and lots of models being deployed in lots of

Speaker:

locations. And I think that one

Speaker:

big sort of industry-wide theme that I'm seeing is if the

Speaker:

last 20 years, let's say, was the story of everybody

Speaker:

picking up from their colos and moving to the cloud and centralizing

Speaker:

all of their IT, I think that the next 20 years are gonna be

Speaker:

not like deconstructing the cloud. I think the clouds are here to stay, and they're

Speaker:

gonna continue to grow, right, year over year. But there will be more

Speaker:

of a push out to more edge computing environments. Cell phones

Speaker:

are getting more and more powerful. Cars are getting more and more powerful. Like, there's

Speaker:

more computer stuff happening, all over the place, and the compute

Speaker:

available, the memory and the storage available is all through the roof compared to

Speaker:

what it was 20 years ago. And, I think we're

Speaker:

gonna see more push for smaller, more specific machine learning models, and

Speaker:

they're gonna be pushed out to all these edge locations so that they can run

Speaker:

close to where the data is. So you're not schlepping this sensitive data all over

Speaker:

the Internet and other people's networks. Yeah.
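
The pattern Chris describes, scoring data with a small model right where the data lives and shipping only the decisions, can be sketched roughly like this. The model below is a hypothetical stand-in, a hand-set logistic scorer rather than anything trained, and the point is only that the raw readings never leave the device:

```python
# Toy sketch of edge inference: score sensor readings locally and ship only
# the decisions upstream, never the raw (potentially sensitive) readings.
import math

WEIGHTS = [0.8, -0.5]   # hypothetical coefficients for two sensor features
BIAS = -0.1

def score(features):
    """Run the 'model' on-device: a plain logistic scorer."""
    z = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return 1.0 / (1.0 + math.exp(-z))

def edge_filter(readings, threshold=0.7):
    """Keep raw readings local; emit only (index, score) for anomalies."""
    alerts = []
    for i, features in enumerate(readings):
        s = score(features)
        if s >= threshold:
            alerts.append((i, round(s, 3)))
    return alerts  # this small payload is all that leaves the device

readings = [[0.2, 0.1], [3.0, -1.0], [0.5, 0.4]]
print(edge_filter(readings))  # → [(1, 0.943)]
```

Only the second reading scores above the threshold, so only one small tuple crosses the network instead of the full sensor stream.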

Speaker:

But, you know, you're taking advantage of of compute resources that you

Speaker:

have local to the data and making very fast decisions,

Speaker:

you know, very efficiently. So I I have to jump in

Speaker:

because you just made me feel really good.

Speaker:

About a year ago, I built a large server here

Speaker:

at home, which I hadn't done in a decade. Actually, my

Speaker:

20-year-old son built it. And he helped me

Speaker:

with picking out the new shiny fast parts on it because I was

Speaker:

so out of practice with this. Just confessing.

Speaker:

But, and it's really cool to see, you know, all of

Speaker:

his skills. He does edge. The

Speaker:

Raspberry Pis are back in stock, finally. Yep. And I just picked up,

Speaker:

like, 3 for $35. You know, the 1-gig fours.

Speaker:

Yep. Anyway, super excited about that. One of the things I built

Speaker:

at the time I built the box. About a year ago, you

Speaker:

couldn't do a local GPT or anything close

Speaker:

to that. And I said, eventually, we're

Speaker:

gonna be able to do this. I I made that guess, and it was a

Speaker:

guess. Yeah. But about 6 months later, about 6 months

Speaker:

ago, all of a sudden, I started seeing these 7-billion-parameter

Speaker:

models showing up and it started clicking.

Speaker:

It was like, holy smokes, you can do this. I did make one stupid mistake

Speaker:

and he didn't catch me on it. I bought a 12 gig GPU

Speaker:

because that's super crazy huge from 10 years

Speaker:

ago. And that wasn't super crazy huge at all. No. No.

Speaker:

No. But it's interesting now. They can run on, you know, on

Speaker:

the 12 gigs. And like you said, you mentioned the CPU models. So I just

Speaker:

learned a ton as I've been going through this. And

Speaker:

it's very encouraging to hear that. I had not heard anybody

Speaker:

say edge and running small ML models on the edge.
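
Andy's 12-gig story comes down to simple arithmetic: whether a model fits a GPU is mostly parameter count times bytes per parameter. A rough sketch of that math, ignoring activations, KV cache, and framework overhead, which all add more on top:

```python
# Rough VRAM needed just for the weights of an N-parameter model at
# different precisions. Real usage adds activations, KV cache, and
# framework overhead, so treat these numbers as lower bounds.
def weight_gb(n_params, bits_per_param):
    return n_params * bits_per_param / 8 / 1e9  # decimal GB

n = 7_000_000_000  # a "7B" model
for bits, label in [(16, "fp16"), (8, "int8"), (4, "int4")]:
    print(f"{label}: {weight_gb(n, bits):.1f} GB")
# fp16: 14.0 GB
# int8: 7.0 GB
# int4: 3.5 GB
```

At fp16 the weights alone overflow a 12 GB card, but at 8-bit or 4-bit quantization they fit with room to spare, which is roughly why those 7-billion-parameter local models started clicking on consumer hardware.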

Speaker:

That's, I mean, that's what we've been trying to do here. And I I love

Speaker:

the redundant you know, the idea of a redundant array of whatevers,

Speaker:

you know, MLs. It's almost like a swarm of MLs. I've heard,

Speaker:

yeah. Yeah. Yeah. That's true. Right? And, you know, I think there's a lot

Speaker:

of interesting stuff happening on the battlefields in Ukraine right now with drones.

Speaker:

And Right. That Yeah. Was also a fascinating space and

Speaker:

very much, I think, heading in the direction of lots of ML running at the

Speaker:

edge. It's funny you mentioned that. So I live in the DC area,

Speaker:

and, I was at a government tech

Speaker:

symposium about 2, 3 weeks ago now. And

Speaker:

they were talking about that, you know, edge is gonna be much more important

Speaker:

in the future of warfare. And he said presumably

Speaker:

elsewhere too. Right? It was primarily a government and defense event, definitely a

Speaker:

military-industrial-complex type of event. But he was

Speaker:

explaining like, you know, in the past, you know, 20 years,

Speaker:

we've not dealt with adversaries. We've

Speaker:

only dealt with adversaries in battlespace conditions

Speaker:

where it was, you know, we controlled the airwaves.

Speaker:

Mhmm. And he, I think he used an interesting term. We

Speaker:

had airspace and electromagnetic

Speaker:

dominance. I was like, wow. Yeah. Yeah. I was, like, oh,

Speaker:

that's interesting. So, like, the whole idea of these disconnected

Speaker:

decentralized networks, I mean, I think

Speaker:

you're I think you're spot on. It's the future for

Speaker:

geopolitical reasons, but also just for, you know,

Speaker:

privacy and just kind of flexibility reasons. Yeah. The

Speaker:

question I have though is, like,

Speaker:

organizations can barely manage the infrastructure they have now and barely manage

Speaker:

the software they have now. What are they gonna do when the software starts, not

Speaker:

thinking for itself, but, like, this becomes another workload Yeah. On

Speaker:

top of that. Like, what? Well, for one thing, that's why Wallaroo

Speaker:

is focused where we are, and we're trying to build this platform to help people,

Speaker:

you know, with this capability of being able to deploy models and manage a fleet

Speaker:

of them at the edge. Because, yeah, there aren't a lot of good

Speaker:

solutions for that today. Yeah. Interesting. I think the

Speaker:

general answer to your question is probably some combination of cloud and edge.

Speaker:

You know, like, it does make sense to centralize a lot of things, and it

Speaker:

makes the maintenance easier and more efficient. And

Speaker:

you can get some economies of scale and, you know, all that kind of stuff.

Speaker:

But, we are gonna have to get good at managing a bunch of,

Speaker:

disparate types of things in disparate locations. I think all of

Speaker:

us. Interesting.

Speaker:

So this is the part of the show where we'll switch over to

Speaker:

the premade questions, and for your convenience,

Speaker:

I will, paste that in here.

Speaker:

Hopefully, paste it. And there we go.

Speaker:

So you had an interesting career, looking at LinkedIn. You were at

Speaker:

SendGrid. Then you were at DataRobot, and you said you made a switch

Speaker:

into the data world, which begs the question: how did you

Speaker:

find your way into data? Did data find you or did you

Speaker:

find your way to data? I

Speaker:

guess that is a good question. I think that, it was probably a

Speaker:

little bit of both.

Speaker:

Finding my way to data, I think that the beginning of the story is probably

Speaker:

at SendGrid. And I joined SendGrid as a DevOps engineer.

Speaker:

And to be honest, I had not really heard of SendGrid at the time. I

Speaker:

knew a little bit about it, but it, you know, I didn't really understand what

Speaker:

it was or the scale. SendGrid, by the way, is now owned by

Speaker:

Twilio. But they have an API for sending email, and

Speaker:

they make it just really easy to integrate with, websites and applications

Speaker:

and software so you don't have to worry about SMTP and, you know,

Speaker:

DKIM signing and all the other, like, gnarly bits of

Speaker:

email. Turns out that SendGrid had a

Speaker:

ton of data. They're handling billions of emails a day,

Speaker:

and, you know, there's a lot of metadata there. The actual data of the

Speaker:

email and so on, the recipients and who to send it to and all

Speaker:

that stuff. And so working in that space,

Speaker:

I was dealing with tons and tons and tons of data. I mean, we

Speaker:

had, we were using mostly MySQL, and we had these

Speaker:

massive, massive clusters. I think we had,

Speaker:

like, 30 or 40, you know, schemas under management. Each

Speaker:

schema was a cluster of anywhere from, like, 6

Speaker:

to 40-plus servers. Wow.

Speaker:

you know, with lots of compute and everything else. So that was probably my

Speaker:

first foray into, like, really thinking about data as a first-class

Speaker:

citizen. And even to the extent of, like,

Speaker:

you know, building an architecture around the data. Right? So

Speaker:

that you can optimize the flow of the data, and being able to store it

Speaker:

and process it and transmit it fast enough to keep up with the

Speaker:

flow. And so, yeah, from there,

Speaker:

you know, had a lot of fun, learned a lot of things about, startups

Speaker:

about industry, about DevOps and all kinds of

Speaker:

things. Management as well and leadership because that's where I first,

Speaker:

started managing teams. And then moved to DataRobot and,

Speaker:

into the ML space. And then it was a whole other learning journey

Speaker:

about, you know, data,

Speaker:

engineering, feature engineering, transformation tools. How do you

Speaker:

curate your data? And how do you really, like, know what you

Speaker:

have and inventory it and, make it available

Speaker:

to people within the business so that they can get value out of it.

Speaker:

Interesting. Very much. So our next question

Speaker:

is what's your favorite part of your current gig?

Speaker:

I think it's actually, I'm gonna cheat and I'm gonna say I have 2 favorite

Speaker:

things. And I

Speaker:

figured out this formula a while back, in terms of what

Speaker:

motivates me. And it's one part the people that I work with

Speaker:

and another part, the problems that I have yet to solve.

Speaker:

So I wanna work with smart people. I really don't like feeling

Speaker:

like the smartest person in the room. I much prefer to surround myself

Speaker:

with people that are smarter than me and I respect and I can learn

Speaker:

from. But that also, you know, I enjoy. Right?

Speaker:

We spend a lot of time at work, so it helps to enjoy the

Speaker:

people that you're working with. True. So that's a big part of it. And

Speaker:

then, finding tough problems, hard challenges. You know, if I

Speaker:

don't have hard challenges to keep me, to keep my mind

Speaker:

engaged and occupied, I start to get bored and, that's no fun. I

Speaker:

prefer to always have something new to, you know, be chewing

Speaker:

at. So, yeah, good people, smart people,

Speaker:

and hard challenges. That is really awesome. I feel the

Speaker:

same way about both of those things. For me though,

Speaker:

trying to find people that are smarter than me is

Speaker:

really easy. So I enjoy that part a

Speaker:

lot. Like Frank. Frank is smarter than me.

Speaker:

Well, thank you. So

Speaker:

we have a couple of, complete this sentence, questions.

Speaker:

The first one is, when I'm not working, I enjoy

Speaker:

blank. When I'm not working, I enjoy

Speaker:

reading. I enjoy movies. I go biking sometimes.

Speaker:

That's part enjoyment, part exercise. You know, it's good for me, but,

Speaker:

there's a lot of good road biking in particular around Denver and a lot of

Speaker:

beautiful scenery. So you can, you know, just ride for a while and find yourself

Speaker:

up in the mountains or something, which is great. Yeah.

Speaker:

Traveling, cooking, all these things are good.

Speaker:

Our next fill in the blank is I think the coolest thing about

Speaker:

technology today is blank.

Speaker:

I don't think it's necessarily something about today, but I think the coolest thing

Speaker:

about technology is how it builds on itself. I remember

Speaker:

years ago, I was studying for the CCNA exam, and

Speaker:

that was such a formative moment for me to

Speaker:

suddenly understand how networks worked all the way

Speaker:

from the physical, you know, sending

Speaker:

electricity down a copper wire, and it can be on or it can be off.

Speaker:

And that's it. Right? And you can do that really, really fast. Switch from on

Speaker:

to off, on to off, on to off, all the way up to,

Speaker:

Web 2.0 and Ajax and, you know, asynchronous

Speaker:

JavaScript stuff happening in Google Maps. Right? And I can just drag my map

Speaker:

around. It's just mind blowing. And, honestly,

Speaker:

like, that journey from the zeros and the

Speaker:

ones up to Google Maps, that was, you

Speaker:

know, what, 50, 60 years of,

Speaker:

technology building on itself of people solving very small simple

Speaker:

problems, but you add up all those small simple solutions and you get

Speaker:

something incredibly complex and absolutely mind-blowing.

Speaker:

Excellent. Very interesting. The last, the 3rd and

Speaker:

final complete-the-sentence: I look forward to the

Speaker:

day when I can use technology to do blank.

Speaker:

I would love a personal assistant, you know, like

Speaker:

Jarvis from Marvel Comics or something or, I don't know,

Speaker:

I'm big into sci-fi and things like that when I read.

Speaker:

So, there are plenty of examples, but some kind of a smart personal

Speaker:

assistant that, you know, I can chat with and it keeps track of my calendar

Speaker:

and reminds me of appointments and, you know, when to call

Speaker:

my dad and whatever else, stuff like that. I just think that's

Speaker:

so cool. And, you know, especially with all the new LLMs

Speaker:

and GPT stuff that's happening, I don't think we're super far from that. So

Speaker:

it's kind of exciting to me. No. You're right. Like, I

Speaker:

you know, if you watched, you know, when I was a kid, Star Trek: The Next

Speaker:

Generation was on, and the way that they were able to interact with the

Speaker:

computer just through their voice. Yep. And I mean, the first Star

Speaker:

Trek show had that too, but, like, the conversations I thought were

Speaker:

richer and more kinda interactive. Mhmm. Mhmm. We

Speaker:

have a lot of that now. Yeah. I think some of the fundamental pieces are

Speaker:

in place now. Yeah. It'll probably take a little while to put

Speaker:

them all together and make it work right. But yeah. Agreed.

Speaker:

So our next one is, share something different about yourself.

Speaker:

But we always remind guests that we're trying to keep our clean

Speaker:

rating. Yeah. On iTunes. So

Speaker:

I don't know. I think one of the more interesting things about my

Speaker:

journey is that I don't have a background, like a degree

Speaker:

in anything technical. I went to college and I got

Speaker:

my undergrad studying Greek and Latin and classics. And

Speaker:

so it was mostly history, archaeology, languages, and things like

Speaker:

that. And Computers have always been a hobby of mine and and I

Speaker:

definitely did some computer science stuff in high school. I took 1 or 2 classes

Speaker:

in college, but I didn't really make my way into that

Speaker:

professionally until a few years after college.

Speaker:

And, you know, honestly, I don't think it's hurt me at

Speaker:

all. And in many ways, I think it's helped me partly

Speaker:

because, you know, it helps a lot with management and leadership, just

Speaker:

to kind of have a broad background and understand, you know,

Speaker:

different people and perspectives and where they might be coming from.

Speaker:

And I'm sure that some of the languages, you know, studying languages helped me

Speaker:

picking up computer languages as well. I think there are a lot of similarities in

Speaker:

human languages and computer, you know, programming languages. But

Speaker:

What? Yeah. But, yeah, it is somewhat unique, and I don't run

Speaker:

into too many other classics majors at, you know, tech startups.

Speaker:

I could definitely see the convergence, especially now when we're talking about

Speaker:

LLMs and the like. Right. You know, the

Speaker:

nearest neighbor algorithms and all of that that are being applied

Speaker:

because my understanding is that's, you know, how that

Speaker:

works, as it picks the next best word. Right. You know, in a

Speaker:

sentence. And so syntax and grammar and all of the things you

Speaker:

studied in depth, that should be very helpful.
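
Frank's "picks the next best word" description is a simplification; modern LLMs are neural networks over subword tokens, not lookup tables. But the generation loop he's gesturing at can be sketched with a toy bigram model:

```python
# Toy next-word prediction from a bigram frequency table. Real LLMs use
# learned neural networks over subword tokens, not raw counts, but the
# generation loop -- repeatedly pick a likely next word -- has this shape.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat ran".split()

# Count which word follows which.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def generate(word, steps):
    out = [word]
    for _ in range(steps):
        followers = bigrams.get(out[-1])
        if not followers:
            break  # dead end: this word never appeared mid-corpus
        out.append(followers.most_common(1)[0][0])  # greedy: most frequent next word
    return " ".join(out)

print(generate("the", 3))  # → "the cat sat on"
```

Real models also sample from the whole probability distribution instead of always taking the single most likely word, which is why their output doesn't loop the way a greedy bigram chain eventually does.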

Speaker:

Yeah. No. That's awesome. There

Speaker:

is good value in,

Speaker:

like, a classics education. I went to Jesuit

Speaker:

High School and Jesuit College, you know. Mhmm. I was forced into studying Latin

Speaker:

and things like that, like, didn't do it voluntarily. I'm not gonna

Speaker:

admit that. But, like, as I get older, it's

Speaker:

definitely like, oh, I get this. Like, you

Speaker:

know, especially when dealing with a lot of lawyers, there's a lot of Latin in

Speaker:

that. And so I'll hear them, like, you know, use some

Speaker:

words. I'm like, I think I know what that means. Yeah.

Speaker:

Audible sponsors Data Driven.

Speaker:

And you mentioned you read a lot. Do you do audiobooks and

Speaker:

sci fi? Do you have any recommendations? Yeah.

Speaker:

There was a really good book that I read recently. Like, this is maybe

Speaker:

a year ago or something, but, best book I've read recently.

Speaker:

The title of the book is Seeing Like a State,

Speaker:

and it's by James C. Scott. The longer

Speaker:

subtitle is something like how some

Speaker:

schemes to improve the human condition have failed or something like that. But,

Speaker:

it talks about this concept of legibility and how a lot of

Speaker:

the developments over the course of the Enlightenment, the Industrial Revolution,

Speaker:

and, in the last few hundred years in

Speaker:

our society have been primarily

Speaker:

driven by the centralization of power in states

Speaker:

And the state needing to administer all of these people,

Speaker:

taxes, lands, land ownership, and all these different things.

Speaker:

And, you know, as part of, like, the Enlightenment, the

Speaker:

scientific revolution, we all got very enamored with, like,

Speaker:

rational thought and logic and all of this stuff. And

Speaker:

we thought, we're understanding the principles of the universe. We can predict

Speaker:

the motions of the planets and all these things. Well, we can solve all these

Speaker:

problems, you know, around human civilization and humans as well.

Speaker:

And in a lot of cases, it failed. Right? And we didn't know as much

Speaker:

as we thought we did. And one of the sort of basic,

Speaker:

like, premises of the book, I guess, or arguments that it's trying to make is

Speaker:

that we routinely underestimate the

Speaker:

complexity of the natural world and how necessary it is.

Speaker:

And we think we can simplify things and strip out all these

Speaker:

variables and go, you know, monocultures in our agriculture,

Speaker:

for example, and do industrial scale agriculture. You need

Speaker:

timber for building ships. Great. We'll just plant Norwegian pines in straight

Speaker:

rows. This is gonna be great. It's so predictable. We know exactly what,

Speaker:

you know, an acre of that will yield after 10 years.

Speaker:

But it turns out you can't strip out all the variables because the whole thing

Speaker:

falls apart. You need the complexity of the ecosystem to keep all those trees

Speaker:

healthy. And so all that predictability you thought you had

Speaker:

disappears within a couple of generations because, it can't

Speaker:

sustain itself. Wow. So, anyway, it's a very, like,

Speaker:

complicated book. I'm not really doing it justice,

Speaker:

but I definitely recommend it. Interesting. It's on Audible.

Speaker:

Yeah, yeah, so definitely check it out.

Speaker:

The show. So if you go to thedatadrivenbook.com,

Speaker:

you'll be routed to an Audible page. And if you choose to get a subscription,

Speaker:

to Audible,

Speaker:

you'll get a free book, and then we'll get like a little bit of a

Speaker:

bump on the head, a pat on the back, and probably enough to

Speaker:

buy a cup of coffee. Which we'll share. Which we'll

Speaker:

share. Yes. Yes. And the final question,

Speaker:

Where can people learn more about you and Wallaroo? And they even

Speaker:

made that rhyme. Yeah. Great. I

Speaker:

think the best place to go is the Wallaroo website, which, as

Speaker:

Andy mentioned earlier is wallaroo.ai. So wallaroo.ai.

Speaker:

And we've got a ton of great stuff on there. Lots of, you know,

Speaker:

documentation and white papers and tutorials and things about the

Speaker:

product and what we're doing there. And for myself,

Speaker:

I'm on LinkedIn. That's probably the easiest place to find me, Chris McDermott.

Speaker:

And, I think I even have that as my, like, LinkedIn

Speaker:

profile name or whatever sits in the, you know, in the URL.

Speaker:

Cool. It is, actually c s m

Speaker:

c s McDermott. Okay. Well, thank you. Close. I was just looking at

Speaker:

it, and I was also looking at the website. It's a very nice website. Thank

Speaker:

you. Great design. And, although I can't design

Speaker:

great websites, when I look at one, I can tell whether it's great or

Speaker:

not. Me too. Me too. Same boat. I can't do it myself, but I definitely

Speaker:

appreciate it. I can't cook, but I appreciate a good meal. There

Speaker:

we go. Yeah. That's it. And with that, we'll let

Speaker:

Bailey finish the show. Thanks, Frank and

Speaker:

Andy. And thank you, Chris, for putting up with our broken

Speaker:

calendaring system. Satya should really look into that

Speaker:

now that the drama around OpenAI is over.

Speaker:

Well, over for now at least. Maybe GPT-5

Speaker:

can fix it.