Speaker:

How much is media contributing relative

to customer base is a really nice place

Speaker:

to start.

Speaker:

And the benefit of running

incrementality and media mix modeling is

Speaker:

informing the model with

some of that causal data.

Speaker:

Well, hello and welcome to another edition

of the E-Commerce Evolution podcast.

Speaker:

I'm your host, Brett

Curry, CEO of OMG Commerce.

Speaker:

And today we have got

a doozy of an episode.

Speaker:

We're talking about the three

horsemen of measuring your

Speaker:

marketing effectiveness. We're

talking MTAs, multi-touch attribution.

Speaker:

We're talking MMMs. Media mix

modeling. We're talking incrementality.

Speaker:

It's going to be nerdy,

Speaker:

but I also promise you it's going to

be practical and it will make you more

Speaker:

money. And so we'll hopefully

make it fun as well.

Speaker:

And so my guest today is Tom Leonard.

Speaker:

We are LinkedIn friends first.

Speaker:

So I saw Tom on LinkedIn posting about

incrementality, talking about MMM,

Speaker:

throwing shade on certain tools and stuff

like that on LinkedIn. And I'm like,

Speaker:

this is my type of guy. So I reached

out, we had a call, and then we're like,

Speaker:

Hey, we got to record a podcast.

Speaker:

Let's create some insights

for people on the pod.

Speaker:

And so Tom is a fractional

marketing leader.

Speaker:

He's operationalizing MMM

and incrementality testing,

Speaker:

and I'm delighted that he's my guest

today. So Tom, with that intro,

Speaker:

how's it going? And welcome to the show.

Speaker:

Good. Thanks for having me, Brett.

Excited to be here. And yeah,

Speaker:

some of my favorite things to talk

through, so excited to do it. Good stuff.

Speaker:

It's good stuff, man. So briefly,

Speaker:

before we dive into the

meat of the content here,

Speaker:

what's your background and

how did you become a guy who's

Speaker:

operationalizing MMMs and incrementality?

Speaker:

Yeah. And what does that even mean?

Speaker:

That's a good point.

Speaker:

For sure. Yeah, totally. Yeah.

Speaker:

So spent most of my career thus far on

the agency side at performance agencies.

Speaker:

And I'd say the crux of

how I got to where I am now,

Speaker:

or I've been reflecting back a little

bit more on the why I have such a passion

Speaker:

for measurement. And I was at

a pretty hardcore DR agency,

Speaker:

and it was right shortly after TrueView

for Action came out, when YouTube was

Speaker:

starting to invest in DR.

Speaker:

Moved into a new role we had created

with a centralized group of basically

Speaker:

people who had different areas of subject

matter expertise and a few analysts

Speaker:

that ran tests across a

pretty large client base.

Speaker:

And I was our YouTube SME,

Speaker:

and worked with a couple

analysts to run a bunch of tests.

Speaker:

And really it was to evangelize how to,

Speaker:

and is YouTube a platform to drive growth?

And it was really interesting

Speaker:

because I started spending a lot of time

on YouTube and then also connected TV

Speaker:

and broader programmatic video.

And it was this really interesting,

Speaker:

for me, the biggest learning was less

about how to make YouTube as effective as

Speaker:

possible,

Speaker:

but more how to help brands think about

demand creation as opposed to just

Speaker:

demand capture. And frankly,

Speaker:

the difficulty of getting brands

to leverage YouTube relative

Speaker:

to connected TV,

Speaker:

because YouTube sat so close to Google

Ads and therefore last-click attribution,

Speaker:

and CTV, you couldn't click it

and it was sexier in a deck.

Speaker:

And it was just this sort

of recognition of the

Speaker:

irrational kind of human behavior just

in any sort of industry or anything

Speaker:

in life.

Speaker:

But it sort of helped frame up this

idea of you really have to do more than

Speaker:

just, I don't know,

Speaker:

represent logic or rational arguments.

You really have to also

Speaker:

bring the easy to understand

clear data. And that's,

Speaker:

I think what draws me to incrementality

testing specifically and why

Speaker:

that's sort of the backbone

of a lot of what I do now.

Speaker:

And I think I used the words

operationalizing MMM and

incrementality testing.

Speaker:

And really what I mean by that is a lot

of people will run media mix models or

Speaker:

run incrementality tests,

Speaker:

but oftentimes they'll sit in a slide

or in a report to be shown once,

Speaker:

but never to be looked at again.

Speaker:

And so what I'm really trying to do

with brands now is how do you build a

Speaker:

framework and a repeatable methodology

to get insights from tests,

Speaker:

but not just leave them as

insights but to take action?

Speaker:

Because the only way that you create

value from any of these sort of testing

Speaker:

methodologies and measurement

methodologies is by

acting on the insights.

Speaker:

And so that's sort of what I mean by my

funky little headline of those words.

Speaker:

Yeah, it's so good, man.

Speaker:

And it's one of those things where data

really doesn't matter if you don't take

Speaker:

the right actions from it.

And what's so interesting,

Speaker:

and our paths are similar in that

I got my start in actually TV and

Speaker:

radio and doing traditional media, and

then I got into SEO and paid search,

Speaker:

but I loved video. Video was my

thing, but I love paid search as well.

Speaker:

And then when TrueView and TrueView

for Action came out, I was like, whoa,

Speaker:

these are all my worlds colliding.

Speaker:

This is.

Speaker:

Video and there's some search components,

Speaker:

at least some search intent involved

there. And it's direct response.

Speaker:

I've always been a direct response guy.

Speaker:

I believe that marketing

should drive an outcome, right?

Speaker:

Advertising should drive

a measurable outcome,

Speaker:

and that should be measured in terms

of new customers and profitable new

Speaker:

customer acquisition. And

what's really interesting, Tom,

Speaker:

and I think this kind of feeds into

the conversation we're having today.

Speaker:

There was a period of time, so I

grew up reading some of the classics.

Speaker:

So David Ogilvy of course, but John

Caples' Tested Advertising Methods,

Speaker:

Claude Hopkins' Scientific Advertising.

Speaker:

And they would do things like they would

run an ad in a newspaper or magazine

Speaker:

and people would clip a

coupon and bring it in,

Speaker:

or they would call a certain number and

they would track it and they would have

Speaker:

codes and stuff.

Speaker:

And I remember thinking once I got

into e-commerce, I was like, oh man,

Speaker:

we've got so many tools. The world is

so clear now we have every piece of

Speaker:

data at our disposal.

Speaker:

And now the more I've gotten into it

and the more I've matured, I'm like,

Speaker:

we've got more data. But I don't

know that we've got more insights,

Speaker:

and I don't know that we've

got any more clarity. In fact,

Speaker:

there's maybe more confusion.

Speaker:

And I think it goes back to

what you said a minute ago,

Speaker:

this idea of demand generation

versus demand capture.

Speaker:

We're really good at measuring channels

and campaigns that are demand capture,

Speaker:

meaning they're capturing

demand that's already out there.

Speaker:

That's harder to measure

the demand generation,

Speaker:

which is usually where the magic happens.

Speaker:

And so super excited to dive in here.

Speaker:

I think what might be useful

is let's talk about what

Speaker:

are these kind of three horsemen that

I laid out there, MTAs, multitouch,

Speaker:

attribution, and incrementality.

So let's start with MTAs first.

Speaker:

So Multitouch attribution tools,

Speaker:

what are they and what

is your take on them?

Speaker:

Yeah, big question. Great

question. Yeah, I mean,

Speaker:

MTA been around for a while,

Speaker:

different flavors and ways

of trying to make it work,

Speaker:

especially as so much has changed

in privacy and the tech and tracking

Speaker:

landscape.

Speaker:

But ultimately the goal is to try

to give fractional credit to all the

Speaker:

touchpoints along a customer journey with

a recognition that the last touchpoint

Speaker:

click or last impression is

ultimately not what drove that person

Speaker:

to purchase.

Speaker:

That may be the last or the only thing

that you might see in something like

Speaker:

Google Analytics or your analytics suite.

Speaker:

But there's this general recognition

that that is not what drove the purchase.

Speaker:

So MTA, the kind of promise, which I

ultimately think is a failed promise,

Speaker:

is weighting all the different

touchpoints and then how can you

Speaker:

value those differently. So

maybe you use first touch,

Speaker:

maybe you use even distribution. The

idea of data-driven attribution was the

Speaker:

holy grail or the promise many years ago,

Speaker:

and I guess still to a

degree for some is like,

Speaker:

how do you know this channel was more

additive or more necessary and therefore

Speaker:

should get more credit than that channel?

Speaker:

Which I think makes a

ton of sense in promise.
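
To make "value those differently" concrete, here is a minimal sketch of two common weighting schemes, last touch versus an even split, over a single made-up journey; the channel names are purely illustrative and not taken from the conversation.

```python
# Minimal illustration of two attribution weighting schemes over one made-up journey.
journey = ["YouTube ad view", "Meta click", "Branded search click"]  # hypothetical touchpoints

def last_touch(touchpoints):
    # 100% of the credit goes to the final touchpoint.
    return {t: (1.0 if i == len(touchpoints) - 1 else 0.0) for i, t in enumerate(touchpoints)}

def even_split(touchpoints):
    # Equal fractional credit to every touchpoint.
    return {t: 1.0 / len(touchpoints) for t in touchpoints}

print(last_touch(journey))  # the branded search click gets 1.0, everything else 0.0
print(even_split(journey))  # each touchpoint gets ~0.33
```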

Speaker:

I think in reality it's really hard

and I would argue impossible to do,

Speaker:

especially as a lot of the ability to

track users at a one-to-one level degrades

Speaker:

Generally, my perspective is,

I'm very bearish on MTA,

Speaker:

so that'll probably come

through pretty strongly.

Speaker:

But I guess I don't think the toothpaste

is going back in the tube in terms of

Speaker:

the ability to track a customer across

all these different touchpoints,

Speaker:

especially as the ability to

track view-through or impression-based

Speaker:

touchpoints erodes. And then you

really get reliant on clicks,

Speaker:

which I think then leads to a lot of

all the issues that just last click in

Speaker:

general has.

Speaker:

So I think it's really hard to

make a compelling case for MTA.

Speaker:

I've seen too many brands,

Speaker:

especially trying to

build MTA tools internally

Speaker:

and just be a huge time and resource

suck. And then when you ask to compare,

Speaker:

show the multi-touch view versus

last click, it's like, I don't know,

Speaker:

80 or 90% only had one touch

point anyways, that's all

that MTA model could see.

Speaker:

So is it really that much

more useful than last click?

Speaker:

It's sort of multi-touch when that can

be measured, but usually it can't be.

Speaker:

Yeah, and it never really answers

the causality question either,

Speaker:

which we'll get to when we

talk about incrementality.

Speaker:

And I always kind of tell this,

Speaker:

I think the short story of why MTA

isn't really viable anymore is all the

Speaker:

tracking and privacy changes.

Speaker:

But I think the slightly longer story

is the kind of recognition that just

Speaker:

because an ad was shown or a

click occurred doesn't mean that

Speaker:

that medium was needed or

that channel was needed.

Speaker:

It doesn't answer the causal question,

Speaker:

what would've happened

without this ad running?

Speaker:

Did somebody just happen to use multiple

touchpoints as navigation or was it

Speaker:

more convenient to click on one of

these ads that happened to be served?

Speaker:

But if you're not comparing that to some

sort of control group, it's really hard

Speaker:

to assign causality to the fact

that there just was a touchpoint.

Speaker:

Yeah, it is so good. And it's one of

those things where I remember again,

Speaker:

early on,

Speaker:

you would look inside of Google ads or

you look inside of Meta or was back when

Speaker:

it was Facebook only, and you

were like, the data's here.

Speaker:

I see ROAS and I see clicks and

I see performance and all that.

Speaker:

Then you realize, well, wait a

minute, this isn't fully accurate.

Speaker:

If I add the two together,

that's double my total revenue,

Speaker:

so I can't just rely on

what's in the platform.

Speaker:

And that got worse as iOS 14 was

introduced and other privacy changes were

Speaker:

made. But then MTA came

along and it's like, oh,

Speaker:

finally we're going to get to see the

full picture. It's going to decipher,

Speaker:

decode the shopping journey,

Speaker:

and we're going to finally see with a

keen eye in perfection exactly what caused

Speaker:

this ad or what caused this purchase

to happen. And then we finally realized

Speaker:

MTA is maybe just a third

option. It's like, okay,

Speaker:

Google's imperfect, Meta's

data's imperfect, and then MTA,

Speaker:

it's just imperfect too.

Speaker:

So now we just got three imperfect

things to look at and make

Speaker:

decisions from.

Speaker:

And in some ways it leads to more

confusion than it leads to clarity.

Speaker:

And now I don't want to wholesale discard

Speaker:

MTAs because I do believe there's some

helpful insights that can be gained

Speaker:

there,

Speaker:

but it's incomplete,

incomplete at best.

Speaker:

And one of the best analogies I've heard,

and this actually comes from Ben Ter,

Speaker:

who's also a LinkedIn friend,

but I met him in person as well,

Speaker:

but he talks about this analogy of, Hey,

Speaker:

if we're trying to measure what

caused people to watch this

Speaker:

movie at our movie theater,

Speaker:

and we look at all these

results and 30% say they saw a

Speaker:

billboard for our movies,

20% say they saw a TV ad,

Speaker:

but you know what? A hundred percent

say they saw the poster on the

Speaker:

door. So we're like,

let's just cut everything.

Speaker:

Let's just do the poster at the door

and that's it. And you're like, well,

Speaker:

wait a minute. Everybody saw it.

Everybody was walking in the door.

Speaker:

But the movie poster is not

what caused someone to purchase.

Speaker:

It was the billboard and the TV

and some of the other things,

Speaker:

word of mouth and other things

that caused them to come in.

Speaker:

And so this idea of causality,

super, super valuable.

Speaker:

So that really leads us to incrementality.

So talk about incrementality.

Speaker:

What is it and why are you on

a quest to operationalize it?

Speaker:

Yeah, it's really the best way,

Speaker:

if not the only way to

establish that causal

Speaker:

portion that we've been talking about.

It has a distinct control group,

Speaker:

so it has a counterfactual,

Speaker:

it has what would've happened

without this intervention,

Speaker:

whatever that intervention is.

Speaker:

And there's a handful of ways to derive

that counterfactual that control.

Speaker:

The most common would be geographic

based. So like a match market test.

Speaker:

I've got this market over here that

historically has behaved similarly to this

Speaker:

market over here. I can

see that in an AA test,

Speaker:

the lines sort of move similar

to one another. And

Speaker:

if they're influenced by outside

factors, they're influenced equally.

Speaker:

And what's an AA test, for

those who don't know?

Speaker:

Before an intervention happens.

Speaker:

So just over time are those lines

essentially moving together?

Speaker:

Are external factors or stimuli equally

impacting both sides of that test

Speaker:

so that you can feel confident that

when you do intervene and it becomes

Speaker:

comparing A to B,

Speaker:

the delta is what was a

result of that intervention.

Speaker:

So oftentimes it's, say, Atlanta

Speaker:

and I don't know Memphis,

Speaker:

maybe some other midsize city that

you've done this market matching for.

Speaker:

Historically, they both

look like this on a line,

Speaker:

all of a sudden you turn off

ads on Facebook in Atlanta,

Speaker:

what happens to your top line? That

delta is what was attributed, or

Speaker:

should be attributed to

advertising in Atlanta.

Speaker:

Whereas the flip side of that would be

attribution would say basically anything

Speaker:

that could be attributed to it gets credit.

When really,

Speaker:

it should just be the gap between a

world where that ad does not exist

Speaker:

compared to a world where that ad

does exist. We can't take credit for

Speaker:

everything.

Speaker:

We can only take credit for as much

above and beyond what would've happened

Speaker:

anyways. And so that's the

basis of incrementality testing.
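
A rough sketch of the matched-market mechanics described here: check that the test and control geos track each other before the intervention (the AA check), then read the post-period gap against a scaled control as the incremental lift. All numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical weekly sales for a matched pair of markets (test = ads on, control = no change).
pre_test     = np.array([100, 104,  98, 102, 101,  99])  # weeks before the intervention
pre_control  = np.array([ 98, 103,  97, 101, 100,  98])
post_test    = np.array([118, 122, 119, 121])            # weeks after turning ads on in the test geo
post_control = np.array([101, 100, 102,  99])

# AA check: do the two geos move together before we intervene?
aa_correlation = np.corrcoef(pre_test, pre_control)[0, 1]
print(f"pre-period correlation: {aa_correlation:.2f}")   # want this close to 1.0

# Scale the control to the test geo's pre-period level, then read the post-period gap as lift.
scale = pre_test.mean() / pre_control.mean()
counterfactual = post_control.mean() * scale             # what the test geo "would have done"
lift = post_test.mean() - counterfactual
print(f"estimated incremental sales per week: {lift:.1f} ({lift / counterfactual:.0%} lift)")
```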

Speaker:

There's other ways to do it.

Speaker:

If you use a Facebook or Google

conversion lift study because they own

Speaker:

that auction or anybody

that owns an auction,

Speaker:

they can do that hold out

for you at a user level.

Speaker:

They can track all of those users

regardless of if you serve an ad.

Speaker:

Good examples are maybe easier to

describe in a first party data capacity.

Speaker:

If you're running email, you may blast

all of your customers and say, Hey,

Speaker:

I sent an email to all my

customers and this many purchased.

Speaker:

They went back to the website or

clicked it. But if you just said, Hey,

Speaker:

I'm going to serve just to odd-numbered

customer IDs and not to

Speaker:

even-numbered customer IDs,

I can then just compare,

Speaker:

forget about who clicked on ads,

Speaker:

who did anything.

I'm just going to look at my backend.

Speaker:

I know I exposed these users,

but not these users 50 50 split.

Speaker:

They've historically kind

of done the same thing.

Speaker:

All I did was even and odd, and just

measuring the difference between those two

Speaker:

groups.
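
A small sketch of that even/odd customer-ID holdout, with simulated purchase data rather than anything real: expose the odd IDs, hold out the even IDs, and just compare backend purchase rates between the two groups.

```python
import random

random.seed(7)
customers = list(range(1, 20001))
treated = [c for c in customers if c % 2 == 1]   # odd IDs get the email
holdout = [c for c in customers if c % 2 == 0]   # even IDs are held out

# Simulated backend purchases: a 3% baseline rate, plus a small bump for the emailed group.
def purchased(customer_id, base=0.03, bump=0.005):
    p = base + (bump if customer_id % 2 == 1 else 0.0)
    return random.random() < p

treated_rate = sum(purchased(c) for c in treated) / len(treated)
holdout_rate = sum(purchased(c) for c in holdout) / len(holdout)

# The gap between the two groups is the incremental effect of the email.
print(f"treated purchase rate: {treated_rate:.3%}")
print(f"holdout purchase rate: {holdout_rate:.3%}")
print(f"incremental purchase rate: {treated_rate - holdout_rate:.3%}")
```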

Speaker:

So really any way that you can

establish a true control that

Speaker:

passes that AA test. So

before you intervene, do they

continue to look similar?

Speaker:

Are they influenced at the same rate so

that you can feel confident that when

Speaker:

you do intervene with new

media, retracting media,

Speaker:

some new sort of test that you are

confidently comparing to what would've

Speaker:

happened in a world

without that intervention?

Speaker:

Yeah, yeah.

Speaker:

It's applying the scientific

method with some rigor behind

Speaker:

what happens when I turn this channel on,

Speaker:

or what happens when I

turn this channel off?

Speaker:

What is the actual impact of this channel?

Speaker:

And what's interesting is I

remember back in my early days

Speaker:

of being in the advertising world,

Speaker:

this was when online stuff was

just getting kind of warmed up.

Speaker:

I was talking to this furniture store

owner and I'm like, Hey, what do you do?

Speaker:

Do you invest in radio ads?

Do tv, do you do newspaper?

Speaker:

And so as I went through them, like,

Hey, do you do radio ads? And he is like,

Speaker:

yeah, I mean, yeah, I sort of do.

And I'm like, newspaper? He's like, yeah,

Speaker:

there's a big sale, something will

happen. I'm like, well, what about TV?

Speaker:

And he said, yes. And his

eyes lit up and he is like,

Speaker:

when I run TV ads, I feel

it. People walk in the door,

Speaker:

it happens. And I remember early on

in my online career thinking, man,

Speaker:

that was so unsophisticated. Did

that guy really know what's going on?

Speaker:

But now looking back, I'm like,

yeah, that's maybe all that matters.

Speaker:

That is incrementality in a real loose,

easy, just-observe-it-with-your-eyes way, I think,

Speaker:

because you had one. Totally.

Speaker:

Which I think people

take for granted. Yeah.

Speaker:

They do.

Speaker:

Yeah.

Speaker:

That's not exciting. That's not

like, where's all your data?

Speaker:

It's in my cash register.

That's where all the data.

Speaker:

Is, especially for smaller brands,

Speaker:

when you have the ability

to feel if something's

Speaker:

working or not working,

Speaker:

if you double spend in something that

you think is working really well because

Speaker:

attribution says it's working really well,

Speaker:

and all of a sudden

your cash just doubles,

Speaker:

even though your attributed number

scales linearly, something has to give,

Speaker:

right?

Speaker:

And what has to give is it wasn't really

causing any additional top line growth.

Speaker:

It was just really good at

getting the attributed credit.

Speaker:

So I think the feeling

it in the p and l is

Speaker:

definitely overlooked.

Speaker:

It's valid, and it is overlooked

though. You're a hundred percent,

Speaker:

especially now that we have

so many tools at our disposal.

Speaker:

And I think another way to look at

this, and look, I'm a Google guy,

Speaker:

YouTube and Google is kind of where

I really got my start in online.

Speaker:

Marketing.

Speaker:

But listen, branded search is a

perfect example here. What happens,

Speaker:

we see this all the time.

Speaker:

What happens if you turn branded

search completely off? Now, I believe,

Speaker:

and this is top of front of the podcast,

Speaker:

there are strategic ways to use branded

search and there's ways to run it and

Speaker:

not waste money, but a lot of people

could shut it off and nothing happens,

Speaker:

nothing. Maybe sales dip a little bit,

Speaker:

but you take Meta, Meta's really working,

and you shut it off and you feel it.

Speaker:

Sales go down and that's

an incrementality.

Speaker:

Same is true for YouTube if you're doing

YouTube the right way. And so yeah,

Speaker:

I really like this. And one

kind of anecdote here to share,

Speaker:

we just did a test with Arctic coolers,

a Yeti competitor,

Speaker:

my favorite cooler, my favorite drinkware

as well. And so they wanted to see,

Speaker:

Hey, can YouTube drive an incremental

lift at Walmart? So they had just

Speaker:

gotten into most Walmart

stores, coast to coast.

Speaker:

So we did exactly what you laid out

there. We had 19 test markets,

Speaker:

19 matched control markets.

So similar markets.

Speaker:

So think like a Denver and a

Kansas City, or the example

Speaker:

you used, Atlanta and whatever else

that's kind of comparable. And hey,

Speaker:

let's run YouTube in one

and not in the other.

Speaker:

And let's measure then the

growth in Walmart sales,

Speaker:

and let's do a comparison

between the two in Walmart sales.

Speaker:

And it was remarkable. It

was about an eight week test.

Speaker:

We had three test regions, so 19

markets, but three test regions,

Speaker:

test region. One, we saw an average

of 12% lift in Walmart sales.

Speaker:

The test region two was like 15% lift.

Speaker:

And then our final test

region was 25% lift.

Speaker:

And there were some standouts,

Speaker:

like Oklahoma City was up 40% and Salt

Lake City was up 48%. But it was one of

Speaker:

those things where, okay, now we

look at that and we can say, okay,

Speaker:

YouTube had a big impact. And

what's also interesting, Tom,

Speaker:

is we just ran the YouTube portion at OMG.

Speaker:

They also did a connected TV test

in other markets, not related,

Speaker:

didn't see a lift, didn't

see a measurable lift.

Speaker:

And so it could be lots of

things. That was not to throw shade on

Speaker:

CTV. I like CTV,

Speaker:

so maybe they just did it wrong, or

wrong creatives or who knows what.

Speaker:

But it's one of those things

where it's like, okay,

Speaker:

if you do this the right way,

you should see an impact.

Speaker:

And I think touching on the

piece that I didn't mention,

Speaker:

the other beauty or value of

incrementality testing relative to

Speaker:

attribution or MTA is the ability

to see beyond your .com, to be able to

Speaker:

see what's happening on third parties

like Amazon, what's happening in store.

Speaker:

If you can get that data, owned and operated

store or if you can get that through

Speaker:

wholesale data, it really simplifies.

Speaker:

There's so much complexity.

And I think that's, again,

Speaker:

one of the rubs that I have

with MTA is all of them,

Speaker:

all of the data you have to

wrangle together to try to

Speaker:

patchwork this kind of story together.

Speaker:

Whereas in incrementality testing,

it's pretty straightforward.

Speaker:

It's what did I spend and how

did I run that spend in these by

Speaker:

market by day or by week, and what

was my sales? What were my sales?

Speaker:

What were my new customers or whatever

metric I'd want to look at with that same

Speaker:

granularity and same dimension.

Speaker:

And that's really it because you're

really just trying to understand the

Speaker:

relationship that calls the

relationship between spend and outcomes,

Speaker:

all that kind of muddy middle,

trying to

get it at the user level,

Speaker:

which again is not going back in

the tube. Removing that really simplifies things.

Speaker:

Yeah, it does.

Speaker:

And another thing that was

kind of interesting that

came to light doing this test

Speaker:

for Arctic is all of the ads we

tagged with available at Walmart,

Speaker:

shop at Walmart, find it on the

shelves at Walmart, whatever.

Speaker:

We measured everything

though in those markets.

Speaker:

So you could look at Walmart sales,

online sales, so the.com and Amazon.

Speaker:

And what's interesting is the

push to Walmart really worked.

Speaker:

It's a reminder of what you ask someone

to do in an ad is what they're going to

Speaker:

lean towards. Because

in some of the markets,

Speaker:

we didn't see that much of an online lift.

Speaker:

We saw some clicks and stuff like

that, but the lift was at Walmart.

Speaker:

But we also saw a pretty

strong lift at Amazon as well,

Speaker:

because I think that just speaks to,

Speaker:

there's some people that are just going

to buy everything from Amazon right

Speaker:

there. Tell 'em to go online, and the value

proposition is, is it on Amazon? Yeah, yeah.

Speaker:

Yeah. Here in a day or two, it's hard.

Speaker:

To beat, dude. It's hard to beat

same price in a couple days.

Speaker:

I don't have to leave my house. But

yeah, really, really interesting.

Speaker:

And so we'll circle

back to that of course,

Speaker:

but let's talk about then

MMM or media mix modeling.

Speaker:

What is that? How are you using that?

Speaker:

And then how does that kind of relate to

incrementality testing? Because again,

Speaker:

going back to your tagline, Tom, you

did not say operationalizing MTAs.

Speaker:

You said operationalizing

MMMs and incrementality.

Speaker:

So what is MMM and how does

that pair with incrementality?

Speaker:

Yeah,

Speaker:

basically a big correlation exercise

trying to suss out without a true kind of

Speaker:

holdout group,

Speaker:

what is the impact and contribution of

each media channel and also what would

Speaker:

happen without media.

Speaker:

So trying to suss out a lot of the

same questions as incrementality,

Speaker:

but basically using correlation as

opposed to having a true holdout group.

Speaker:

So basically,

Speaker:

and I'm sure all the hardcore MMM people

and data scientists will thumbs down

Speaker:

this or whatever you can do to a podcast,

but hey, in this period of time,

Speaker:

sales went up and nothing could really

explain that other than the fact that

Speaker:

TikTok spend went up and essentially

doing that at a mass scale over longer

Speaker:

periods of time trying to take into

account anything that could explain that.

Speaker:

So you'll always kind of flag it with

these are promotions that happen,

Speaker:

it should because you're going to give

a model at least like two years worth of

Speaker:

data,

Speaker:

it'll bring in seasonality and try to

understand those sort of trends. So it's

Speaker:

trying to pull out if not

seasonality, if not promotions,

Speaker:

if not some other things

that we are flagging.

Speaker:

And it wasn't price reductions,

it wasn't all these pieces,

Speaker:

what was happening in media

that could explain that change.

Speaker:

And so that's ultimately

what MMM is doing.

Speaker:

It's a big correlation exercise,

Speaker:

figuring out roughly what is the channel

contribution to a top line revenue or

Speaker:

order number and what's really important.

Speaker:

I think the nicest part or the best

first step with MMM is trying to get an

Speaker:

understanding of a base,

Speaker:

which is what it's going to be called, or

the intercept. Without the presence of

Speaker:

ads,

Speaker:

what does this model think my sales would

be such that I can then calculate not

Speaker:

a total CAC of just looking at

total new customers divided by cost,

Speaker:

but an incremental-to-media CAC,

removing base from

Speaker:

that equation,

Speaker:

how many conversions were contributed

because of media, as this model sees it,

Speaker:

which no model is going to be perfect,

Speaker:

no measurement method

is going to be perfect,

Speaker:

but it's a really nice

place to start to say,

Speaker:

I knew I couldn't attribute all

new customers to advertising,

Speaker:

but what's a good number to use or

to start with? Well, it looks like,

Speaker:

and this will depend on the maturity of

the brand, but a really mature brand,

Speaker:

I mean super mature brand,

Speaker:

the big CPGs might be like 99% base,

a smaller brand might be something

Speaker:

like 50% because you've got

this word of mouth flywheel,

Speaker:

you've got product market fit,

Speaker:

but trying to get an understanding of how

much is media contributing relative to

Speaker:

customer base is a really

nice place to start.
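
A toy version of what an MMM is doing under the hood, built on fabricated weekly data: regress sales on channel spend, read the intercept as the base (the sales the model thinks you would get without media), and then look only at the media-driven share. Real MMMs layer on adstock, saturation, seasonality, and priors, so this is only the skeleton of the idea.

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = 104  # roughly two years of weekly data, as suggested in the conversation

# Fabricated spend and sales: sales = base + contribution per channel + noise.
meta_spend    = rng.uniform(5_000, 15_000, weeks)
youtube_spend = rng.uniform(2_000, 10_000, weeks)
sales = 40_000 + 1.8 * meta_spend + 1.2 * youtube_spend + rng.normal(0, 3_000, weeks)

# Ordinary least squares with an intercept column: the intercept plays the role of "base".
X = np.column_stack([np.ones(weeks), meta_spend, youtube_spend])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
base_est, meta_coef, yt_coef = coef

media_driven = meta_coef * meta_spend.sum() + yt_coef * youtube_spend.sum()
total_spend = meta_spend.sum() + youtube_spend.sum()
print(f"estimated base per week: {base_est:,.0f}")
print(f"share of sales the model credits to media: {media_driven / sales.sum():.0%}")
print(f"media-driven revenue per ad dollar: {media_driven / total_spend:.2f}")
```

The same structure works for new customers instead of revenue, which is how the incremental-to-media CAC mentioned above would be backed out.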

Speaker:

And the benefit of running

incrementality and media mix modeling is

Speaker:

informing the model with

some of that causal data.

Speaker:

You see that a lot, and a

really powerful feature of media mix

Speaker:

modeling is saying, Hey, yes,

that's a correlation exercise,

Speaker:

can't pull everything out,

Speaker:

but let me inform the model or at least

restrict the priors it can use or the

Speaker:

coefficient, whatever

you want to call 'em,

Speaker:

what it's searching for to try to find

a fit in this model and say, well,

Speaker:

I did a hold out test. I know

you don't have the causal data,

Speaker:

but we ran this in this channel and that

channel and helping that restrict the

Speaker:

model and giving it data that it can't

have without that human intervention can

Speaker:

be a really powerful flywheel.
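
One simple way to picture "informing the model" with a lift test: constrain the coefficient the model may assign to the tested channel to the range the holdout measured. The sketch below uses bounded least squares on fabricated data and hypothetical lift-test bounds; Bayesian MMMs do the analogous thing by tightening the prior on that channel.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
weeks = 104
meta_spend    = rng.uniform(5_000, 15_000, weeks)
youtube_spend = rng.uniform(2_000, 10_000, weeks)
sales = 40_000 + 1.8 * meta_spend + 1.2 * youtube_spend + rng.normal(0, 3_000, weeks)
X = np.column_stack([np.ones(weeks), meta_spend, youtube_spend])

# Unconstrained fit: whatever correlations the model happens to find.
unconstrained = lsq_linear(X, sales)

# Suppose a geo holdout measured Meta's incremental revenue per dollar at roughly 1.5-2.1
# (hypothetical numbers). Constrain the Meta coefficient to that range and leave the
# intercept (base) and the YouTube coefficient free.
lower = np.array([0.0, 1.5, 0.0])
upper = np.array([np.inf, 2.1, np.inf])
calibrated = lsq_linear(X, sales, bounds=(lower, upper))

print("unconstrained [base, meta, youtube]:", np.round(unconstrained.x, 2))
print("lift-test calibrated:               ", np.round(calibrated.x, 2))
```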

Speaker:

So using your incrementality test data,

Speaker:

feeding that back into your MMM

model to make it more accurate and

Speaker:

more causal and make that correlation.

Speaker:

Stronger.

Speaker:

Because the two things that you're really

trying to get,

Speaker:

but you don't get with multi-touch

attribution or attribution in general.

Speaker:

And you do get with the combination of

media mix modeling and incrementality

Speaker:

testing is the incremental impact,

Speaker:

the causal impact of what

would've happened without

the presence of ads as well

Speaker:

as the diminishing returns curve,

Speaker:

which we know can be really

powerful and important too,

Speaker:

is what happens over time as I

spend. That's sort of a big

Speaker:

feature of media mix modeling,

understanding where

are you on a diminishing

Speaker:

returns curve? And

if I keep spending more,

Speaker:

I know it's not going to scale linearly,

Speaker:

but are there channels

that diminish faster?

Speaker:

Is there more headroom in other channels?

Speaker:

And it really becomes this

true optimization game of

where do I put the next

Speaker:

dollar? Ultimately the

question that every marketer,

Speaker:

every finance team is

trying to answer is, Hey,

Speaker:

if I find $20,000 in the couch

cushions, where do I put it?

Speaker:

And if I need to give back $20,000,

where do I pull it from?

Speaker:

I want to hang out at your house and

look at your couch cushions and find 20

Speaker:

grand? That's.

Speaker:

Great. Yeah, it's easy to

give it back, but yeah, right.

Speaker:

We're trying to figure out what is going

to be the least impactful if I have to

Speaker:

give the money back and cut budgets

and where is it going to be the most

Speaker:

impactful if I have another $20,000?

Speaker:

Because the answer is not going to be

found in what has the highest or the

Speaker:

lowest ROAS in an attributed

view. And in fact,

Speaker:

that can have the complete

opposite impact that you want.
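
To make the next-dollar point concrete, here is a toy pair of saturating response curves with invented parameters: two channels can show a similar average ROAS while the return on the next dollar is wildly different, which is exactly what the diminishing-returns view is meant to surface.

```python
import numpy as np

# Toy saturating response curves: revenue = cap * (1 - exp(-spend / k)); parameters are invented.
def response(spend, cap, k):
    return cap * (1.0 - np.exp(-spend / k))

def next_dollar_return(spend, cap, k):
    # Revenue generated by the next dollar at the current spend level.
    return response(spend + 1.0, cap, k) - response(spend, cap, k)

channels = {
    # name: (current monthly spend, cap, k)
    "branded search": (20_000, 60_000, 8_000),
    "youtube":        (30_000, 400_000, 120_000),
}

for name, (spend, cap, k) in channels.items():
    avg_roas = response(spend, cap, k) / spend
    marginal = next_dollar_return(spend, cap, k)
    print(f"{name:15s} avg ROAS {avg_roas:.2f} | next-dollar return {marginal:.2f}")
# Similar average ROAS, but the next dollar in the saturated channel returns far less than $1.
```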

Speaker:

Yeah, yeah, it's really great.

Speaker:

So I want to actually talk about

that point in a minute where

Speaker:

if you've got cut budgets,

which hey, listen,

Speaker:

there's been some uncertainty even as we

record this, tariffs up, tariffs down,

Speaker:

markets up, market down, whatever

consumer sentiment is all over the place.

Speaker:

So if things get a little bit

tight, what are we going to do?

Speaker:

We can't slash marketing,

we can't slash growth.

Speaker:

I think that sends you

into a death spiral,

Speaker:

but we might have to pull

back and get more efficient.

Speaker:

And so let's talk about that

actually for a little bit.

Speaker:

So where can you be led astray?

Speaker:

I think you just made a post

on LinkedIn about this, right?

Speaker:

Where you start looking at performance,

which feels like the smart thing to do,

Speaker:

looking at ROAS and whatnot, and

you're like, well, great, well,

Speaker:

let's just cut the lowest ROAS

campaigns and channels. We'll be fine.

Speaker:

How does that lead you astray?

Speaker:

And if you want to talk about your

specific example to help illustrate these

Speaker:

points, that'd be great.

Speaker:

Yeah, totally.

Speaker:

I think the other one you're referring

to is I think branded search,

Speaker:

which we were talking about

earlier. And I love using it, both because

Speaker:

it can be really, if a brand

is spending a lot of money there,

Speaker:

it can be a really great place to go

find those savings without impacting top

Speaker:

line. But also frankly, it's

really easy to understand.

Speaker:

I think most people understand that

up and down the organizational chart

Speaker:

across departments, everybody sort

of understands the idea of, Hey,

Speaker:

if somebody's already

searching for my brand,

Speaker:

do I need to pay to get that

click and that conversion?

Speaker:

And I found that just the fact that

it's easy to understand can be a

Speaker:

really good gateway to incrementality

testing because it's easy to get buy-in.

Speaker:

Everybody understands that idea,

Speaker:

whereas it may be more challenging

to express that idea in

Speaker:

other types of campaigns.

But branded search is a good example,

Speaker:

and the example that you're referring to,

Speaker:

kind of a midsize brand that I was

working with went through that exact

Speaker:

exercise, had to cut budgets.

Speaker:

They looked at up and down the campaigns

they were running. It was like, Hey,

Speaker:

we just got to make the best decision

we can with the best available data.

Speaker:

They were basically running PMax,

non-branded search, and branded search. And

Speaker:

PMax and branded search were what had

the best attributed ROAS, best CPA.

Speaker:

non-brand was really hard to justify in

a lower budget kind of environment based

Speaker:

off the attribution data, so they cut it and leaned

a little bit more into branded search

Speaker:

as a percentage of their budget.

And over the next couple months,

Speaker:

new customers and total revenue

were declining despite the

Speaker:

attributed ROAS and CPA

looking even better than ever.

Speaker:

And that's where I was brought

in, looked at all these things,

Speaker:

saw the loose correlation between

non-brand and new customer

Speaker:

acquisition and top line,

Speaker:

just the general skepticism that

many have around branded search,

Speaker:

especially in a low

competition environment,

Speaker:

which they were in. There weren't many

competitors in the auction that we

Speaker:

could see in Auction Insights. So yeah,

Speaker:

ran a very blunt instrument

match market test,

Speaker:

which at a brand of that size and for a

branded search I don't think is ever a

Speaker:

bad idea. And yeah, no

impact from turning off branded search.

Speaker:

It was about 20% of their budget,

Speaker:

which was substantial. So you

can either make the decision,

Speaker:

I'm going to put that 20% back in

my pocket or save it for a rainy day

Speaker:

or give it to some other

place in the org or say, Hey,

Speaker:

I'm going to redistribute this to

something that I see in correlation

Speaker:

data that might help

drive top line backup.

Speaker:

Let's reinvest that in non-brand as

opposed to keeping it in branded. Again,

Speaker:

complete opposite of what

attribution would say.

Speaker:

And you see that a lot frankly with

branded search is an easy one to pick on.

Speaker:

Same with retargeting,

Speaker:

but really anything that's especially

challenging with the black box

Speaker:

solutions that blend,

Speaker:

and I'm sure we could do a whole talk

show on PMax, Advantage+, some of the

Speaker:

things that bundle together historically

radically different levels of

Speaker:

incrementality can be a real challenge

when you're then measuring on

Speaker:

attribution. But yeah, a

ranty way of saying yes,

Speaker:

finding areas to cut oftentimes

if you follow the attribution kind

Speaker:

of data can lead to really

impactful, in a negative way,

Speaker:

business outcomes because the attribution

view just does not take into account

Speaker:

what would've happened

without the presence of those

ads like incrementality does.

Speaker:

And so can definitely lead brands

astray as they're looking to cut.

Speaker:

Yeah, really interesting. And yeah,

Speaker:

PMax is notorious for leaning into

remarketing or branded search.

Speaker:

If you're not diligent about that, it

can lean into both of those things.

Speaker:

And so got to be mindful of that.

Speaker:

You also quoted something

that totally ties into this.

Speaker:

It's from a Shoptalk talk,

you went to Shoptalk the show,

Speaker:

and I can't remember who said

it, but if you see high ROAS,

Speaker:

I know something is wrong and that the

auto targeting is just finding existing

Speaker:

customers. Do you remember actually

who said that? And unpack it a little bit.

Speaker:

Yeah, I forget his name and I could

look real quick. He worked for.

Speaker:

Mic.

Speaker:

The Post Dan Danone, the big CPG.

Speaker:

Yeah, I just really appreciated

that quote because I

Speaker:

mean, I always wonder if I live in sort of

a bubble of being super passionate about

Speaker:

incrementality versus attributed metrics,

Speaker:

but that was just really refreshing to

hear because I don't think that's the

Speaker:

natural.

Speaker:

It's not.

Speaker:

Thought in people's.

Speaker:

Head. Spend more.

Speaker:

But I really think it should

kind of spark some skepticism,

Speaker:

especially when your goal really

is to try to drive new customers.

Speaker:

My first,

Speaker:

especially if you think about both

incrementality in the context of ASC

Speaker:

or PMax that's blending retargeting

and prospecting by default

Speaker:

and knowing diminishing returns

Speaker:

are my first dollars, yes, they're

going to be the most effective,

Speaker:

but if they are focused on people that

are already buying from me and my goal in

Speaker:

my head is new customers,

Speaker:

I should be shocked that I can

spend a hundred dollars and drive

Speaker:

this amazing new customer revenue

Speaker:

and not think that something is up or

even over time as I continue to spend

Speaker:

our BS meters should probably

go up a little bit more.

Speaker:

And I don't think they do by default. So

I found that comment really refreshing.

Speaker:

Yeah, I think that

really illustrates that,

Speaker:

right where it's like most of us would

think, oh, ROAS is going up great,

Speaker:

we're printing money.

Speaker:

Whereas maybe you should say BS

detector, something's wrong here.

Speaker:

This campaign's leaning into customers

that were going to buy anyway.

Speaker:

And I'll give two examples here to

illustrate this a little bit more.

Speaker:

And I'll also, since we've been

picking on branded search so much,

Speaker:

I'll share a couple of ways I

think we should use it. One.

Speaker:

If.

Speaker:

Other competitors are

aggressively bidding on your brand,

Speaker:

just know that if you're not Nike and

you're not Adidas and you're not like Ford

Speaker:

or something, it's not a

lock. If it's a new customer,

Speaker:

they could be swayed by a competitor.

Speaker:

And that's generally how we

like to separate it out is like,

Speaker:

let's have branded search for returning

customers and let's make that crazy

Speaker:

efficient or just turn it off altogether.

Speaker:

If.

Speaker:

It's a new customer, then again,

we want it to be very efficient,

Speaker:

but maybe we want it on because we

don't want our competitor to come in and

Speaker:

swipe our

customer. And so one example of this,

Speaker:

I did a podcast with Brian Porter,

he's the co-founder of Simple Modern,

Speaker:

great drinkware brand, has become a friend,

and they did an incrementality

Speaker:

study, and they found, and I might

get these numbers slightly off,

Speaker:

but it was like branded

search was 10% incremental.

Speaker:

So basically what that means is if it

shows that I got a hundred new customers

Speaker:

from Branded Search,

Speaker:

I probably would've gotten 90 of

those if I had shut it off, right?

Speaker:

Only 10% were incremental.

Speaker:

So then what you would need to do there

is you need a 10x ROAS on branded

Speaker:

search for it to even make

sense. If it's below that,

Speaker:

you're completely wasting

money. Pair that with,

Speaker:

and you and I were commenting

on the Haus analytics, HAUS,

Speaker:

Olivia Kory and team did 190

incrementality studies involving

Speaker:

YouTube and they showed with

tremendous amounts of rigor

Speaker:

that hey,

Speaker:

YouTube is probably 3.42 times more

Speaker:

incremental, meaning if

you see a one in platform,

Speaker:

it's actually like a 3.42 in

terms of incremental impact.
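
Putting those two numbers side by side, the adjustment is just attributed ROAS times the measured incrementality factor; the attributed ROAS figures below are placeholders, and only the 10% and 3.42x factors come from the studies mentioned.

```python
# incremental ROAS = attributed (in-platform) ROAS x incrementality factor
channels = {
    # name: (attributed ROAS -- placeholder, incrementality factor from the studies cited)
    "branded search": (8.0, 0.10),   # only ~10% of attributed conversions were incremental
    "youtube":        (1.0, 3.42),   # ~3.42x more incremental than platform reporting suggests
}

for name, (attributed_roas, factor) in channels.items():
    print(f"{name:15s} attributed {attributed_roas:.2f} -> incremental {attributed_roas * factor:.2f}")

# Same rule of thumb as above: at 10% incrementality, branded search needs a 10x
# attributed ROAS just to break even on a 1x incremental basis (1 / 0.10 = 10).
```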

Speaker:

And so wildly different

between those two. But again,

Speaker:

we're just so drawn to in-platform

ROAS, man, we'll just spend, spend,

Speaker:

spend on PMax and branded search

when really we should be saying,

Speaker:

let me lean into YouTube or let

me lean into top of funnel meta.

Speaker:

I think both those examples

too are really good examples.

Speaker:

To me it also speaks

though to the importance of

Speaker:

cost per incremental almost being

more important than

Speaker:

percent incremental. And that's something

I always use with branded search.

Speaker:

I think you and I have a very similar

feeling around branded search.

Speaker:

There's definitely a

time and a place for it,

Speaker:

and it's one of those things where

it might not matter that it's 10%

Speaker:

incremental, 10% incremental relative

to what Google's attributing.

Speaker:

If your attributed CPA

is a dollar and now it's

Speaker:

$10,

Speaker:

but your margin when you sell a

product is a thousand dollars like

Speaker:

hammer that all day long,

Speaker:

that cost per incremental is still

extremely profitable and valuable.
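
The cost-per-incremental math being walked through here is just attributed CPA divided by the incrementality rate, then compared against margin; a quick sketch using the hypothetical $1 CPA, 10% incremental, $1,000 margin numbers from the conversation.

```python
attributed_cpa  = 1.00     # what the platform reports per conversion (hypothetical)
incrementality  = 0.10     # share of attributed conversions that are truly incremental
margin_per_sale = 1000.00  # contribution margin on the product (hypothetical)

incremental_cpa = attributed_cpa / incrementality  # $10 per truly incremental customer
print(f"cost per incremental customer: ${incremental_cpa:.2f}")
print(f"profitable on an incremental basis: {incremental_cpa < margin_per_sale}")
```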

Speaker:

And same with the YouTube piece.

Speaker:

If YouTube was four times as

incremental as Google said,

Speaker:

but your YouTube was crazy expensive,

Speaker:

it still might not be worth it

even though it's four times.

Speaker:

More.

Speaker:

Incremental than the platform was making.

Speaker:

And that's how I think a lot

about this with connected tv where

Speaker:

connected TV can be super powerful

and maybe more so than linear tv,

Speaker:

but if you can buy scatter

linear TV for a 10th

Speaker:

of the cost of CTV,

Speaker:

well it just has to be more

than a 10th as effective and

Speaker:

it's accretive, it's a positive.

Speaker:

So it becomes more of a comparison

of a cost per than just a

Speaker:

blanket.

Speaker:

How incremental is something which I

always think is important to focus on and

Speaker:

call out.

Speaker:

Yeah, it's so good.

Speaker:

I mean measuring something in terms of

percentages can provide insights and help

Speaker:

make decisions, but ultimately

it's the cost per right.

Speaker:

Translate that into real dollars

to see if it makes sense.

Speaker:

100% agree with you,

Speaker:

but I think this also goes back

to and use your linear TV example,

Speaker:

and I still love TV and

connected TV and stuff. Again,

Speaker:

I'll use YouTube just because

I've got the numbers in my brain,

Speaker:

but with YouTube sometimes

we'll see a $5 CPM or a

Speaker:

$7 CPM in certain audiences

compared to other channels that are

Speaker:

15, 20, 30, 50, whatever.

Totally. And I'm like, well,

Speaker:

if we're reaching the right person

and if the message and offer are

Speaker:

good, how could this not work? And it's

one of those things where it's like,

Speaker:

okay, either one of those is

off, we're talking to the wrong person,

Speaker:

that's the wrong message,

Speaker:

or we're just not measuring it properly

and that's where we need to look at it.

Speaker:

So did you have a thought on that?

Speaker:

I'll ask you another question on

MMM here in just a second.

Speaker:

Yeah, yeah, totally. But it

made me think of the idea of,

Speaker:

I think the reason I'm starting to become

way more bullish on any channel that's

Speaker:

historically been hard to measure

where I think there's that arbitrage

Speaker:

opportunity of costs are still relatively

low because people haven't all moved

Speaker:

in, because it's not easy to attribute.

Speaker:

It'll be really interesting

with the Haus example,

Speaker:

does that inspire a lot

more YouTube buyers?

Speaker:

That's something that Google

should have put out way long ago,

Speaker:

but I think it would

undermine search and that's their bigger

Speaker:

business. And I could do a whole

kind of rant and I'll save you that,

Speaker:

but the idea of incrementality first

measurement probably wouldn't be great for

Speaker:

the search business. So probably exactly,

Speaker:

haven't been able to make such a

good point that case on YouTube.

Speaker:

But you think about all the channels

that have historically been harder to

Speaker:

attribute,

Speaker:

that's where costs are deflated just

from a supply and demand perspective.

Speaker:

So when you can move in and get CPMs at

five to $7 and it's really effective,

Speaker:

but most people that are measuring

through attribution don't know it's really

Speaker:

effective, that's a huge win for a certain

period of time until everybody floods in,

Speaker:

everybody's in, and the costs go

Speaker:

Up the market.

Speaker:

I'm sure there's a lot of people that

were not excited to see that study from

Speaker:

Haus, like, dang it, that means my costs

are going up. I don't like that at all.

Speaker:

So really good man.

Speaker:

So we talked about incrementality testing

and I think you can use tools like

Speaker:

House and then there are others.

Speaker:

We're just talking about WorkMagic and

there's a number of others you can lean

Speaker:

into. Full disclosure,

they're pretty expensive,

Speaker:

but you can also do stuff on your own too.

Speaker:

If you've got someone that

can measure this stuff,

Speaker:

you can do a little bit of it on your

own. What about the MMM side of things?

Speaker:

What's kind of the easy way to start

there? Is there an easy way to start?

Speaker:

What do you recommend to people.

Speaker:

There? I don't know. I dunno if

there's an easy way to do anything.

Speaker:

I think, well, I guess

that's not totally true.

Speaker:

I think there's some ways to

run relatively easy incrementality tests.

Speaker:

So I think that's the

easier place to start.

Speaker:

Certainly you can always

ratchet up the scientific rigor.

Speaker:

I think the problem with looking

for an easy MM solution is

Speaker:

anybody could run a model with Robyn, or

there's a lot of open source packages,

Speaker:

but just because you can run a model,

Speaker:

it could say anything.

Speaker:

It's not necessarily rooted in truth, like it

can all of a sudden predict the future

Speaker:

and tell you exactly the

contribution from media.

Speaker:

Whereas incrementality can do

that a little more out of the box.

Speaker:

You may have wildly wide

confidence intervals,

Speaker:

but it answers the question.

It gives you the comparison.

Speaker:

I didn't do it in this market,

Speaker:

I did it in this market.

What is the delta? Media mix modeling?

Speaker:

You could build a model

to tell sort of any story.

Speaker:

The proof is sort of in the pudding of

if I do the thing that the model says,

Speaker:

does it change my top line?

Speaker:

Can I see over time that

when I listen to the model

Speaker:

that improves my top line?

Speaker:

So it's a lot easier to get started

with incrementality testing.

Speaker:

You can run poor man's match

market tests, as I said. You can just

Speaker:

sort of pick

Speaker:

some markets that historically behave

similarly and there's certainly some risk

Speaker:

there, but with a model you might

think that it's an amazing model.

Speaker:

I just don't feel like there's a great

place to DIY that together without some

Speaker:

real scientific or statistical

rigor. Or if you do,

Speaker:

you've just got to try to prove it over

and over by taking some big swings. And

Speaker:

that's really,

Speaker:

I sort of feel like you can get away

with the kind of feel it sort of tests

Speaker:

without really running a true

incrementality test or model.

Speaker:

If you're a small enough business and

you spend a decent amount on Facebook,

Speaker:

maybe you're not willing

to turn off Facebook,

Speaker:

but are you willing to drastically

increase spend and see if you can feel

Speaker:

something at the top line? Okay, then

what happens if you cut it in half?

Speaker:

What happens?

Speaker:

And start to understand those curves on

your own is probably a less risky way

Speaker:

than trying to, I've never done

anything in R and I'm going to run

Speaker:

or done any sort of media mix model, and

I'm going to try to run one.

Speaker:

That's probably a risky proposition.
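
For the "feel it" version of a test, one slightly more structured option is to compare the surge period against a naive baseline built from the prior weeks, and against normal week-to-week noise. The data below is simulated, and this is a rough sanity check, not a substitute for a proper matched-market design.

```python
import numpy as np

# Simulated weekly top-line revenue: 8 baseline weeks, then 4 weeks with Facebook spend doubled.
pre  = np.array([210_000, 205_000, 215_000, 208_000, 212_000, 209_000, 214_000, 211_000], dtype=float)
post = np.array([231_000, 236_000, 228_000, 234_000], dtype=float)

baseline = pre.mean()        # naive counterfactual: "more of the same"
noise = pre.std(ddof=1)      # how much weeks normally bounce around

lift = post.mean() - baseline
print(f"average weekly lift during the surge: ${lift:,.0f}")
print(f"typical week-to-week noise:           ${noise:,.0f}")
print("worth taking seriously" if lift > 2 * noise else "probably within normal noise")
```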

Speaker:

Yeah, it's a really good insight. I'm

glad you answered the question that way.

Speaker:

I think, yeah,

Speaker:

leaning into the poor man's incrementality

test or just leaning really heavily

Speaker:

into a channel and measuring your top

line if you've got a small enough business

Speaker:

to look at that, but probably if

you're going to lean into MMM, one,

Speaker:

you need a couple years of data so as

to be able to make some correlations and

Speaker:

you probably need to lean in to

someone or a tool with quite a bit of

Speaker:

experience because you can be led astray.

Speaker:

And on your comment on cost too.

Speaker:

I mean it's all relative and a lot of

times where you're going to need media

Speaker:

mix modeling is when you're spending

a significant amount in a significant

Speaker:

number of channels,

Speaker:

which you're probably only doing

if you are spending a lot total,

Speaker:

which you're probably only doing if your

revenue can support that high level of

Speaker:

spend,

Speaker:

which means that a tool may not be

all that expensive relative to the

Speaker:

opportunity you could derive from

it, which is where I always net out.

Speaker:

So I'm paying 10 or 20

grand for a tool monthly,

Speaker:

but it's allowing me to

redeploy millions in ad spend.

Speaker:

And it totally and completely

makes sense. So Tom,

Speaker:

this has been fantastic.

I'm just watching the clock.

Speaker:

I know we're kind of coming

up against it, but one,

Speaker:

I recommend people follow you on LinkedIn.

You put out some awesome content.

Speaker:

I love reading it.

Speaker:

Thank.

Speaker:

You. People should definitely follow

you on LinkedIn and you are, is it Tom,

Speaker:

what is your handle on LinkedIn?

You are Thomas B. Leonard.

Speaker:

Thomas B. Leonard. That's

probably confusing.

Speaker:

I'm very self-conscious on LinkedIn, so

I'm glad, thank you for saying that.

Speaker:

I think it's good, man. I think it's

really good. I like it a lot. Yeah.

Speaker:

Yeah, it's been fun to start

connecting with folks.

Speaker:

Definitely an area that I have a lot

of excitement and passion for,

Speaker:

it's fun to have these

sort of conversations,

Speaker:

so I appreciate you reaching out a

while ago and that we could connect.

Speaker:

Absolutely.

Speaker:

Man. Absolutely. So then if

other people were like, Hey,

Speaker:

I just want to talk to Tom because maybe

you can help my brand or my business,

Speaker:

how can they connect with you and who are

you looking to or who do you feel like

Speaker:

you can help?

Speaker:

Yeah, definitely appreciate that.

Yeah, reach out on LinkedIn.

Speaker:

I spend time there. I love reading

everybody's thoughts and content. So yeah,

Speaker:

reach out on LinkedIn mostly we work

with consumer facing brands that

Speaker:

are trying to understand where to

put the next dollar or where to pull

Speaker:

from in the scenarios where they have to. Really

kind of rescue people from attribution,

Speaker:

trying to better understand where they

can get more with their ad dollars.

Speaker:

I think to your point that you teed

up, now is such an interesting time, or

Speaker:

anytime that there's margin pressure,

Speaker:

there's more scrutiny

on a marketing budget.

Speaker:

Really want to try to help

empower marketing teams to

feel more confident with

Speaker:

what they're doing and ultimately the

finance teams to feel more confident with

Speaker:

what marketing team is doing. Hundred

percent. That's where I love to plug in,

Speaker:

but also just love to talk about this

stuff probably more than I should.

Speaker:

So always open to the conversation.

Speaker:

Yeah, I talk about that a lot.

Speaker:

I've read analytics and measurement

books on vacation and my wife

Speaker:

is like, what is wrong with you? And I'm

like, it's interesting. I don't know.

Speaker:

I like it. And so totally, we are

just a different breed I suppose,

Speaker:

but I love that.

Speaker:

And then I think this is a great way to

end it where if I've got an extra dollar

Speaker:

to spend on marketing, where do I put

it? If I need to cut a dollar of spend,

Speaker:

where do I cut it from?

Speaker:

And that's really what

this approach is about MMM

Speaker:

and incrementality. And so

I think their necessities,

Speaker:

I think attribution is broken and or

misleading in so many different ways.

Speaker:

There's some correlations there, so we

don't have to throw it out completely,

Speaker:

but I do believe you need to lean

into MMM and incrementality for short.

Speaker:

So connect with Tom on LinkedIn.

And with that, we'll wrap.

Speaker:

Tom's been fantastic. Thanks for the

time, the insights and the energy. Yeah.

Speaker:

Thanks so much, Brett.

Glad to connect.

Speaker:

Absolutely. And as always, thank you for

tuning in. We'd love to hear from you.

Speaker:

If you found this episode helpful,

Speaker:

if you know someone else in the D2C space or

marketing space, and you think, man,

Speaker:

they got to listen to this, please

share it. It would mean the world to me.

Speaker:

And with that, until next

time, thank you for listening.