Adam Outland: Today's guest is Terence Mauri, a global expert on the future of leadership, AI, and disruption. He is the founder of the future trends think tank Hack Future Lab, and also an acclaimed author. Mauri spearheads a movement for leaders to rethink leadership in a post-AI world. His newest book is called The Upside of Disruption, and it's out right now. Terence, great to meet you.

Terence Mauri: Thank you so much, Adam, for inviting me.

Adam Outland: Absolutely. Well, listen, there's so much to ask you about in terms of your work and what you're doing at this moment. But one of the things I always love hearing about from people who have generated a lot of success in life, on their different paths, is their roots and their beginnings.

Terence Mauri: You know, the philosopher Soren Kierkegaard said that life is lived forwards but best understood backwards, and I can really relate to that. I had my whole life and career mapped out in a very linear way. But as another famous philosopher, Mike Tyson, once said, you can have the best-laid plans until somebody punches you in the mouth, and this is what happened to me. I had a successful career in management consultancy. Things were going well; I was making good progress. One day I walked into a store, in the middle of the day, and a driver lost control and drove a car into the store. He mounted the curb. It was a terrible accident. Nobody lost their lives that day, but many people were injured, including myself. I woke up under the car in the store with the wheel still spinning, burning my legs. It was one of those accidents where your life flashes past. I spent a number of weeks in hospital, and when you have that time to think, when you're out of the building, that's when you get to reflect at a deeper level. For me, it was a reawakening. I reconnected with my values, my legacy, who I was and who I was becoming. And I realized there was a gap.
There was a gap between who I wanted to become and what I was doing right then in the world of management consultancy, which is a very transactional world, a very profitable world. I wanted to pivot to a transformational world, a world where I could bring my life and my values to life in a visceral and visible way. That happened about 20 years ago, and since then I've been on a mission, a higher mission, to inspire leaders around the world, ranging from NGOs and not-for-profits to the S&P 500, to harness the upside of disruption. What I mean by that is turning disruption, whether it's career disruption, technology disruption, industry disruption, or life disruption, big or small, life-changing or professional, into a tailwind: a platform for laser-like focus, strategic courage, and a more sustainable, values-driven life.

Adam Outland: There's a lot of risk in going out and doing something on your own, so what kind of process did you go through?

Terence Mauri: It's such a great question. It's what I call a catalytic question, and it's a question we should all be thinking about, because our relationship with risk is often not a good one. Our appetite for risk-taking is often squeezed out of us like a lemon from an early age. By the time we leave college, we're risk averse, and then we've got this paradox of companies demanding that we all be courageous risk-takers. There's a big rhetoric-to-reality gap there. For me, that accident reconfigured my relationship with risk. Life is inherently risky; you can die at any moment. Basically, the worst thing we can do is not take any risk, because being addicted to certainty can make us feel comfortable. But if that certainty is also causing stagnation or inertia, we're not learning anymore, we're not moving anymore. We're accepting a status quo that's not healthy and not giving us happiness. That's not a good place to be, and we know that if you look at different statistics around the world, whether it's the Gallup engagement survey, that hasn't changed for decades.
Now, the majority of the global workforce is disengaged; they still go to work, but they've mentally quit the job, right? Life is short. We're lucky; we get about 960 months to live, which is just 80 years of age. When I frame it that way, it's not to make people feel scared. It's the idea that it's never been easier to waste time and waste energy, because this is the age of abundance. We have technology that's incredible, but technology can make the trivial seem urgent. Think about your day: reacting to emails all day, everything seems exciting and urgent, but probably less than 5% is actually important. So for me, the accident did a couple of things. Number one, it was a great reawakening, and it helped me reconnect with the idea that not taking a risk is sometimes a risk. Number two, that the world will always be uncertain, will always be volatile, and risk and reward always come wrapped together. We forget that risk and reward travel in the same elevator. We always overestimate the risk of trying something new, whether it's a new way of working, taking a career break, or applying for a new job in a new sector, and we always underestimate the risk of standing still.

Adam Outland: I love that. You founded Hack Future Lab, a think tank focused on future trends. How do you define disruption?

Terence Mauri: Disruption, for me, is a secular and structural, here-to-stay inflection point. But there are different types of disruptors out there. If we look at megatrends, for example, we've got optimized reality, moving from doing AI to being AI. That means, for example, IT spending as a percentage of global GDP moving from five to 10% over the next seven years. That's a big, here-to-stay disruption that will impact, reshape, and redefine value creation, but also define completely new industries and companies.
Another example would be decarbonization, the whole transition to the green economy around the world. So these are big-bang disruptions, but if we take it down to a more granular level, disruption can also be a family disruption. It could be a divorce, a sudden death, or a career disruption. What matters is how we deal with them. It's this point of view of asking the question: how do I turn these disruptors into platforms, into tailwinds? How do we turn these into upside?

Adam Outland: It's a great skill to have for anyone, but in particular for leaders, to be able to recognize that every obstacle has inside of itself the key to its own solution.

Terence Mauri: I think so. I think constraints are often opportunities in disguise. For example, Hermes, the global luxury company: one of the big constraints facing the fashion industry is the idea that its business models are not sustainable. They waste a lot of money and have high carbon dioxide emissions. So what Hermes has done is turn that constraint into upside by creating new strategic partnerships with biotech companies. They develop and produce mycelium, a mushroom-based alternative to leather. The big, audacious goal at Hermes now is that at least half its global revenues will be mycelium-based leather by 2030. Now, that wouldn't have happened without turning a disruption into a tailwind.

Adam Outland: One of the big disruptors we keep hearing about is AI. How are you seeing leaders react to AI? And how can we best harness this trend without getting caught up in it?

Terence Mauri: I think it's helpful to take a historical perspective first of all. Go back to 1956 and Professor Marvin Minsky, for example, one of the pioneers of AI. Originally it was going to be called applied statistics, but that wasn't sexy enough. I think there are three time horizons to be aware of. The first one was the excitement phase.
It took ChatGPT two months to reach 100 million users. It took the cell phone 16 years to reach 100 million users. So the excitement phase, I think, has happened over the last couple of years especially: we've seen over a trillion dollars of capex going to AI infrastructure, and over $250 billion of VC money as well, and that's going up exponentially. From that excitement phase, we've now moved to the experimental phase. For example, T-Mobile and OpenAI have just formed a partnership for a sort of proactive AI decision-making model that will be able to proactively help solve customers' pain points. So, three phases. Number one is the excitement phase: exuberance, excitement. Number two is the experimental phase, which I believe we're in right now. The next phase, the next horizon, which we'll be entering over the next 18 months, is the embedded phase, and that's where AI eventually just becomes invisible. Every great technology, if it's truly great, should be invisible. It'll be embedded in our cell phones, in our toothbrushes, in our TVs. A trillion-sensor economy connected together, amplifying intelligence and cross-pollination, helping tackle some of the world's biggest existential challenges, from climate change to healthcare. We're at the embryonic stage of that, but it doesn't take much imagination to think about where we're going with this over the next couple of years. The cost of knowledge production is going to reach zero in the next 15 years. It would take you a lifetime to read 8 billion words; the fastest AI today can do that in the blink of an eye, and again, we can start to see the exponential opportunity of this platform. But I want to say as well that we have to be careful of artificial idiocy. Am I investing in warm AI or cold AI? Warm AI is humanity-first AI. It's a humanity-first future enabled by AI. It maximizes and elevates what makes us more human, and it protects well-being, democracy, truth, and transparency, and tackles loneliness. That's warm AI. The bad news is that right now, most governments and most organizations are not investing in warm AI; they're investing in cold AI.
Cold AI is a machine-first future enabled by AI. It elevates division, disinformation, and truth decay, and it erodes well-being. So that's the difference: are we investing in a warm tech future or a cold tech future?

Adam Outland: For the many of our listeners who are business owners themselves, you talk about the return on intelligence. What are some things leaders can do to prepare their organizations for AI and adopt it effectively?

Terence Mauri: What a great question. I think it's the question every leader, every manager, should be thinking about right now: how to use AI in the right way, an inclusive way, a sustainable way, in a way that sharpens the growth and talent agenda. We should be thinking about ROI, which is not just return on investment but a new, human-centric KBI, a key behavior indicator: return on intelligence, return on imagination. Imagine a cognitively enabled enterprise where your talent gets to solve the biggest problems, the biggest challenges it faces. That means 10x productivity, 10x engagement, 10x execution. We know that's not the reality for most organizations right now. I just had an article published a few weeks ago in Fast Company called "The Rise of Boreout," which is the opposite of burnout. Boreout is cognitive or emotional underload. It's boredom at work, and it's at record levels. If AI is just doing parts of the job we're already doing, and what we're left with is the other boring parts of the job, that's not return on intelligence. So that's a big question. We should be using AI for speed to insight, speed to innovation, speed of decision-making, for creating new scenarios, new products, new services, new platforms, and testing out hypotheses. It's a generative tool; the clue is in the name. But my worry is that many in the C-suite are just looking at AI to automate, to make cost savings, and to focus on a very narrow metric, which is shareholder return.

Adam Outland: Sure. Maybe just a quick question for you: is it a form of disruption to disrupt technology by being more in person?

Terence Mauri: Yes, I think so. I really think so.
Because, as I said, when the cost of this technology is coming to zero and everybody has access to the same tools, the same technologies, it's more difficult to stand out. Ironically, everyone's got access. Everyone can set up the great website, the great podcast, the great YouTube channel, but how do you stand out? The value of that goes down. This is one of the ironies, and I think a lot of people don't think about it. There's a reason why 0.01% of people make money on Spotify or TikTok or YouTube, and that number is going down even more. We've got to be so careful. I call it the curse of sameness, and I write about it a lot in the new book, The Upside of Disruption. So yes, ironically, sharpening your human edge, your in-person edge, is going to be the superpower that differentiates you, that gives you a distinctive quality in the sea of sameness and commoditization that we're in. That's why making the effort to go to in-person events, speaking at them, contributing to them, joining panels, is an important part of the human edge. Yes, use the tools around us; you'd be stupid not to. But don't think it's going to be easy just doing it that way. There'll be a percentage who manage to do it, but my fear is that when everything comes to zero cost and everyone has access to the same incredible tools, it's much more difficult to stand out. So that in-person, human edge, those social skills, emotional intelligence, conversational listening, being fully present, these are going to be important. Human skills, courage skills, these are the skills we need to nurture and sharpen for the next generation.

Adam Outland: What walls have you encountered in building something? I love giving our listeners insight into the fact that when you choose to take a risk, it doesn't necessarily mean a pathway paved in gold; it comes with a lot of potholes.

Terence Mauri: I love that question, because disruption is about humility.
And what I mean by humility is the ability, the capacity, the awareness to know your blind spots, to be aware of the blind spots that you're blind to, but also understanding that failure, setback, and obstacle are one half of success. As Ryan Holiday says so eloquently, the obstacle is the way; disruption is the way. I'm a self-confessed failure pioneer. I failed multiple times in order to get where I am today: multiple setbacks, multiple rejections, book rejections, client rejections. About 30% of what I do is keynotes at conferences around the world, but when you're chosen, you've been chosen out of maybe five or six or eight other great speakers. So you get rejected a lot; 90% of the time it's a rejection. Now, I could choose two ways to respond to that. One, I could give up because my ratio of rejection is so high. Or two, I could recognize that anything worthwhile in life requires resilience, requires persistence, requires grit, and that's been the big lesson for me. I'm here because I've overcome probably more failures than the average person, and that's been painful, but I've made pain part of the process and understood that if I'm not hurting, I'm not growing.

Adam Outland: I just want to spend one more minute on this. People's relationship to risk is such that they don't understand that a lot of life can be a little bit like a game of baseball, and that if you're batting 20 to 30%, that's a good batting average.

Terence Mauri: Yeah.

Adam Outland: Most of life, we're not trained to embrace rejection or misses or missed swings that way. We treat it as an ultimate failure, which then generally means "I'm not good enough."

Terence Mauri: Yes.

Adam Outland: For you, where did this light switch flip? Where could you trace it back to and say, this is the moment where I changed my relationship to failure?
Terence Mauri: I think it goes back to that life disruption, that life-or-death moment with your life flashing past you, where you realize you can have your whole life mapped out in that linear way, become obsessed with success and the avoidance of failure, and that's not real life. Being stripped down and made very vulnerable, nearly losing my life, was the wake-up call for me that life is risk. And here's one of the things to help our listeners and viewers today: when you get to the end of your life, and I hope it's a long life, 80 or 90 years of age, or 100, the number one regret, according to research, is a lack of courage. When you look back at your 90 years, the biggest regret you'll have will be the number of times you didn't step up. The lack of courage: the courage to think a bit bolder, the courage to say no, the courage to walk away from something that wasn't working for you, the courage to start over. This is our number one regret at the end of our lives, and we can use this knowledge to our advantage, in advance of getting to 90 years of age, and do what Steve Jobs or Jeff Bezos used to do very well, which is regret minimization. Imagine that you're 70 years old and ask: what would I regret most not doing when I look back? Is it not having kids? Is it not starting that business? Is it not hitting that C level in my company? Whatever it is, success is very personal. But just remember, at the end of our lives, the number one regret is a lack of courage. And one half of courage is the ability to embrace failure and recognize that it is a stepping stone to where you want to be. Of course, there are different types of failures as well: productive failures, intelligent failures, stupid failures. So we have to be careful here and understand the nuances. But the one thing to take away is that we need to reframe our relationship with productive failure and recognize that it's also an important part of success.
Adam Outland: When you're spending time as an entrepreneur mentor for MIT, or speaking at universities and engaging with this next generation that's coming up, what do you see there? Do you see a group of young women and men who have that new definition of failure? Or do you feel we need to manufacture somehow, for some of these people, maybe not a life-threatening situation, but something that shakes them up in how they perceive what we're discussing?

Terence Mauri: What I love about Generation Alpha is this incredible, aspirational vision. And I think it's very nuanced by cultural context and demographics; whether it's Africa or North America or Europe, there's a different risk appetite out there depending on geography, and that's a big deal. But the good news is it can be unlearned and relearned as well, for sure. What I love about my work at MIT is that every year we host MIT Solve, and the purpose of MIT Solve is to create a generation of solvers. We pose Grand Challenges, global challenges that the world faces, for example climate change, lack of literacy, healthcare challenges, and we give people anywhere in the world the opportunity to pitch tech-based solutions to those global challenges. One of those examples recently was for people with Alzheimer's. There was a young student called Emma Yang whose grandmother was diagnosed with Alzheimer's, a form of dementia. It's a terrible disease; one in seven people will get it in their lifetime. Being an introvert and a mathematician, she framed this as a hypothesis and created an app called Timeless. The purpose of Timeless is to help people with dementia stay reconnected to their memories and their families through geotagging, facial recognition, and gamification, and it has received over a million dollars of investment since inception. This is a great example of a generation of solvers, and a great example of what can happen when costs fall. Imagine the cost of doing this 10 years ago; it would have been prohibitive. It would have cost millions of dollars to set up an app, test it, and scale it. Now, you can do all of this.
You can go from idea to iteration to implementation within hours or days, and obviously that's accelerated even more with AI. So for me, this is not the age of disruption. This is the age of wonder, the age of possibility, and the only limit is our imagination.

Adam Outland: Yeah, I was at a longevity dinner, and it was very interesting to hear people talk about how quickly and exponentially medical and health disruption is occurring. The key takeaway from the speaker was that if you can live 10 more years, you'll solve most of the problems you'll have in the future, and probably add another 20 years to your life.

Terence Mauri: It's so exciting, isn't it? Ray Kurzweil of Singularity University has written a book recently where he really deep-dives into this as well. I was in Doha at a big future-of-tech, future-of-AI summit, and some of the stats, the facts and insights coming out, are so exciting. This idea that we've now got chips that can be scaled down toward the scale of DNA and hold billions of transistors: that's actually happening. It's not science fiction. Science fiction has become science fact. And I think you're right: if we can hold on for at least another 10 years, it's going to be one hell of a ride.

Adam Outland: And stay tuned. We'll continue this conversation with Terence in Episode 479 of The Action Catalyst.