Matt Best: Hello and welcome to the Growth Workshop Podcast with me, Matt Best, and Jonny Adams.
Jonny Adams: Hello!
Matt Best: Today we're absolutely thrilled to be joined by Athena Peppes, and you're going to talk to us a lot around navigating AI adoption through overcoming fear and embracing foresight. We're going to address some of the ethical challenges that come up as part of that today, so we're really looking forward to getting into that conversation. So Athena, welcome and thank you for coming along.
Athena Peppes: Well, it's a pleasure to be here today. Thank you for inviting me.
Matt Best: Brilliant. And just a bit about your background: you're a futurist and speaker, you advise on trends in technology and looking into the future, and you're also the founder of Athena Peppes Consulting and Beacon Thought Leadership. But I wonder, for the benefit of our audience, if you could share a little bit about your background and your journey so far?
Athena Peppes: Yes, absolutely. So I started off my career as an economist, working in insurance and then in shipping and mining, and my role at the time involved forecasting quite a lot, which is very different to foresight, and researching those sectors and understanding a lot about the policy issues and what demand and supply look like in those sectors. Then I moved to Accenture, which is a major consulting and technology company, and I worked in thought leadership there for 15 years, focused on topics related to the future of business. More recently, as you said, I've founded my own consultancies, helping my clients with two aspects of what I'm really passionate about. One is around long-term thinking: really understanding how technology trends are going to shape business and change it, and how to help clients be prepared and actually turn that change into an opportunity and growth for their organization. The other consultancy is a partnership focused on helping clients with their thought leadership initiatives.
Matt Best: Wow, fantastic. So I was going to try and come up with some sort of pun there, but I think this is all going to be very futuristic.
I'm very much looking forward to diving into this with you.
Jonny Adams: Glad that you finished it, because I was like, is he? Is he going to do it? I'm absolutely super, super interested in this topic today, Athena. And really, I think, at the last number of conferences, and in what's going on in the news feed on LinkedIn at the moment, it's all about the future. And, you know, what does technology mean to us? There's a lot more of that personal impact. I'm really super excited to spend some time with you today, so thank you for joining us.
Matt Best: Perfect. And before we jump into the future, though, what's customary on the Growth Workshop Podcast is for us to ask a little bit about what's been going on in your week. What have you learned? What's been interesting? Maybe something that you've experienced in your personal life that you'd be happy to share with us and the audience today.
Athena Peppes: Excellent question. I'll draw on my recent trip. I just came back from Dubai, actually, yesterday. I spent five days there, visiting different sites, and in Abu Dhabi as well. And I was very impressed. It was my first time there, and I was very impressed with the focus they have on long-term thinking and future generations, and how they plan the city and plan their economy with that in mind, kind of working backwards.
Jonny Adams: I've also heard good weather, good food and good shopping. Is that fair to say? Those things may have come into it? What was your favorite out of those three?
Athena Peppes: The weather not so much, because I'm Greek, so I should be used to the heat, but it's actually very humid there, so quite different. Shopping was excellent. I will say I've never seen a shopping mall like the Dubai Mall.
Jonny Adams: Did you go out on the plane with a half-empty suitcase and come back with full cases on the way home? Good on you.
Matt Best: Good on you. Brilliant. Well, thank you for sharing that, Athena. And actually, when we dive into the conversation today, I imagine some of that reflection from your trip to Dubai, that sort of forward-thinking mindset, will come out of it. So Jonny, how about you?
Jonny Adams: This week's been amazing. So, I mean, it's great to have Athena here, who has worked in thought leadership and is a thought leader herself. And one of my favorite authors, Steve Martin, some of you may have read some of his books in the past, I went to his latest book launch. He's just collaborated with The Economist. And Matt, you and I have been lucky, and it also maybe shows how our business works, in that we've worked with his organization for the last five years. They are behavioral scientists, an organization called Influence at Work, and we at SBR, from a business development perspective, have sort of merged together to bring science and art together. For the last five years we've worked with him, and I think some of the tips and techniques that he writes about in influence and persuasion have really helped me as an individual become better at my job and as a person. He talks a lot about the six principles of influence and persuasion, and he's got some great insights around that topic. So yeah, went to the book launch. Really good. Had a beer. It was fantastic. What about you?
Matt Best: Nice. So I went to a slightly different event, actually. It was a sales engagement event recently in London, which was focused on marketing and sales. And a lot of what we talk about on the Growth Workshop Podcast is, of course, around client centricity and that focus on bringing all of that together in organizations. There was a lot of discussion at that event, actually, which was really insightful, so I came away similarly: I feel like I've learned a lot this week, which is always a good place to be.
Jonny Adams: What was the one thing that you took away that was like, yeah, that was good?
Matt Best: You know what, one of the speakers was talking about the importance, when it comes to sales and marketing enablement, of creating a community around it, acknowledging that normally those teams are quite small, so the ability to scale is challenged, right? So yes, it's scaling through enabling leaders, but also creating a community of people to support getting that out into the market.
So selecting individuals from different teams to be sort of champions and change carriers, which I think is really, really important to maximize the impact of that. So Athena, as we look to the future, and again I'm going to try not to say that too many times, we were reading one of your blogs recently, which is actually titled "Could humanoid robots close the gender equality gap?" In that blog you talk about the process of adopting technology, and how it involves overcoming some of those natural, understandable and justified concerns, right? But our challenge as humans is not to avoid this progress, but to make productive, informed and moral use of it. So certainly, whenever I think about the evolution of AI and look around at some of the people sitting in those C-suite jobs, you can sense some fear in that. What do you do, and how do you help those leaders get comfortable with that and maybe be able to make informed decisions around how to effectively and appropriately leverage these technologies?
Athena Peppes: Yeah, it's a very good question. There's a lot to unpack there. You mentioned the word fear, for example. I think perhaps there is a little bit of fear, but actually the main concern is around those leaders being fully prepared, especially in an environment where this technology is just changing so fast. I think we're currently in a technology or innovation supercycle, so one development you see in artificial intelligence quickly brings in another, and that's how I ended up writing the article around humanoid robotics, because there's a lot of emphasis in the AI research community at the moment on how large language models and generative AI can be embedded into the previous generation of robotics, and that can bring about another wave of growth. You know, that's all happened quite fast, and it's only been a couple of years since ChatGPT was released. But actually the field of AI, and the research that's been done around it, has been going on for much longer. Since the release of ChatGPT there's been a lot of hype, a lot of media attention on it, a lot of company investment.
So all of that can make leaders feel like things are changing really fast. What am I doing? Everyone's doing this. What do I do? The reality is, if you look at it, and I was thinking about this the other day, because in the last six months I think I've had more than 200 conversations with different organizations. The topic that always comes up is artificial intelligence, in terms of what's top of mind for those companies, but the pace of adoption is very, very different, right? And that makes sense, because if you are in consumer goods, you will just naturally move a lot faster and adopt and respond to trends a lot faster. But some industries, like pharmaceuticals, are a lot more heavily regulated, so you'd understand that they might be a bit more hesitant or slower, and want to be a bit more prepared before adopting any new technology. There's also the dimension of size. I personally have seen that a lot of the narrative we get around technology trends, including AI, is shaped by very large technology companies, right? But as I just mentioned, the industry makes a difference, and the size of an organization makes a difference. In the UK, if you look at the data, 99% of companies are actually SMEs. By revenue they account for about 50%, and by employment about 60%, so they're the backbone of the economy, right? And so it can feel like everyone's ahead of you in AI. There is a lot of fear, you know, a lack of confidence: how do I go about this? Surely everyone knows what they're doing. And actually that's not the reality, because if you go a bit deeper, you'll see there's a lot of variation in terms of adoption.
Jonny Adams: There's a piece that I think was really interesting, and I know you're going to continue on this trend of asking the question: you reference that you've had many conversations, and we do as well. We did some market analysis, and there isn't really a growth consultancy in the space that's capitalizing on this AI conversation. But what I mean by that, Athena, is, could you give a summary, without putting you on the spot too much, of the common questions you're getting asked, or the common concerns, maybe?
So why I ask that question is, I think the listeners, who are typically the individuals you're talking to, might go, yeah, I've also got those problems or questions. Do you have, like, a trend of what they're asking you at the moment?
Athena Peppes: It's a little bit varied, I would say, and that's why I mentioned the industry and the size of the company. I won't mention names, I'll just give you a couple of examples as illustrations. So for example, I was talking to a marketing agency that works a lot with life sciences and pharmaceutical companies, and they were the ones saying, look, we see the potential to use AI in what we do, but it's very important to take our clients along with us on that journey, and they're a little bit hesitant, because they worry about that regulation. And when you say AI, that could mean a lot of different things. It could be something very simple, right? It can be something super sophisticated.
Jonny Adams: There should be AI for Dummies. You remember those books? I think my mom and dad got...
Matt Best: How to Use a Computer for Dummies?
Jonny Adams: There must be one.
Athena Peppes: I'm sure there is. But it's a little bit like that. And I think the issue is sometimes, if you're a very senior executive, it can feel a little bit like, oh, I can't say that I don't know exactly what's happening in that space. And actually the best way to get around any fears is just better understanding, better knowledge, more conversation around the topic. Another example is how to use it for pricing, how to use AI for pricing.
Jonny Adams: Describe that a little bit more, I'm curious. I've talked to a professional services firm about what the future will look like. There's been a lot of consolidation in financial services over the last number of years, and there's going to be a heavy pivot towards professional services consolidating in the UK market, with a lot of money needing to be invested by investors, as we all know is going on at the moment. That was a bit of insight shared with me this week, but pricing was the key point that investors are looking for at the moment.
So how do you think about AI and pricing? I'm curious about that, because that's very topical.
Athena Peppes: Yeah, that came from another kind of lead, and they're just interested in understanding, basically, how they can get better data, faster, into their sales teams, and how AI can help them do that, so that they can have more competitive pricing and think a little bit differently about their pricing strategy: more dynamic pricing, yes, or modeling different scenarios, right? And being able to understand, okay, if I do this, what if this happens, what would the impact be? Some of that might already be there, but obviously it's becoming a lot more sophisticated. The data sets on which these models are trained are a lot bigger, so in theory they should be better quality. But then you also get issues around that, like, is it really that robust? We know there's an issue with hallucinations, right? So if you create a tool to help your teams with pricing, how can you make sure the data it gives them is also very robust? You're also dealing with very sensitive data, right? How do you ensure that it's secure when it comes to pricing? So those are some of the questions that are coming up around this.
Matt Best: Something that's sort of interesting, and I'd love your perspective on some of those industry trends: you mentioned life sciences and pharmaceuticals, so it's fair to assume they're a fairly highly regulated industry. I think, you know, some of what we'd consider the more straightforward sort of AI technologies are still struggling to get into some of these industries, just as a result of the regulatory environment. Do you see this as much the job of regulators and policy makers as it is the businesses themselves?
Athena Peppes: Yeah, I think there is an element of that, in the sense that sometimes organizations might wait to see what the regulation is before they adopt something, right? I think then culture comes into play, and kind of how fast you can be. Some organizations might say, well, but in the meantime we're still missing out on all this potential, this growth.
I think the other thing with being an early adopter is that, of course, it's potentially higher risk, but you might also be able to illustrate the benefits to the policy makers as they shape that policy. The key thing, I think, around regulation and policy making is that, for sure, organizations have to deal with a patchwork of regulation, because each country, or each region in the case of the EU, takes a very different approach. That's another part of the dynamic you have to think about. In the US they tend to be a bit more relaxed, although I would argue, if you look at the trend over time, that they are actually becoming a lot more focused on regulation. But look at the EU AI Act that's already come into force; that is quite intense. I'm not an expert in the space, I can't comment specifically, but I was on a webinar recently about how many assessments companies now have to do. Because even if you have the same tool, every time you use it, if you use it for sales you have to do a different assessment than if you use the same tool for marketing or internal employee training or different kinds of things. There is a lot of that. But then it comes back to the balance we talked about a little bit before, right? How do you make sure you take advantage of the opportunity and not miss out?
Matt Best: And if we think about it practically, diving into some of the practicalities of the work that you do: when you sit down with a C-suite executive who feels maybe a little bit under-qualified, who is getting pressure from the business to adopt new technology, and who is worried about the risk, because risk is always on the other side of that equation, what's your approach typically? You mentioned that education and building understanding are so important, so is that at the heart of what you do, in terms of helping these executives understand what this looks like and how it could shape the future of their business?
Athena Peppes: Yes. In terms of how to help them, it really depends on where they are on that adoption journey, because some come and say, we know this is a trend, we know we need to do something, we just don't know what to do. Others are experimenting; they've already rolled out some proofs of concept and are thinking about scaling, you know, across different business functions. The way I help them is to think a little bit about the future of their organization and where it is that they want to be, and to also make sure that when they're thinking about those technologies, they're not just thinking about today, but about what AI will look like in five years' time, so that whatever you introduce now can be more resilient. I would always start with the problem. So what's the problem they're facing, right? It's the classic consulting approach of trying to understand. In some cases it's not them, it's their teams: how do we take our teams along with us? In other cases it's, oh, we feel resistance, because people think this is going to take over their jobs. How can we show them there's actually potential to help? It varies a lot.
Matt Best: If you're looking, and again I'm digging into this because I'm really curious about your perspective: we talk a lot, when we're working with leaders, around toward and away motivation of your employees, right? So away motivation being, I'm afraid of something, so I'm going to do something because of the consequences of something else. And toward motivation is, I'm going to do something because I really want to strive for it. So it's sort of proactive versus reactive, carrot versus stick, if you like. If you had to balance that equation, which side is it falling on with the executives and the businesses you work with? Is it fear of being left behind, more than what opportunity could we unlock? Or is it the other way around?
Athena Peppes: I think it's actually almost a little bit above that, in the sense of... maybe I'll give you a stat, and that might help.
So some of the research that I've done in the past has been focused on change: quantifying change, the pace of change, and understanding how the C-suite sees that and how that compares to actual data on change on the ground. Now, 88% of C-suite executives surveyed said that they expect the pace of change to accelerate, but 52% said they are not fully prepared for what that change will mean for their organization.
Matt Best: So there it is, right?
Athena Peppes: So yeah. And you can go deeper into that, but the implications are huge, right? Because they have to make decisions about how to allocate their budgets, their time, their people. So if you're not fully prepared, then how are you going to make those decisions? And the reality is, most of the time they are dealing with the day to day: operational delivery, the next quarter's financial results, that kind of thing. So how can you, or in this case me, help them take some time out of the day to day and think about their organization in the future and where they want to go? And there are different ways to do that, right? You can help them with recognizing the trends that are out there. I always use RISE in my mind as a way to structure that. So the R is for recognizing trends and just understanding what signals are out there, what's happening. The I is around imagining the possibilities. Okay, so yes, I've read about this interesting trend; what does that mean? With the example of humanoid robotics, I was just thinking, who knows whether, in the future, we might have companies that, instead of giving you certain types of benefits like health insurance, gave you a home robot to do all your housework for you, so you wouldn't have to spend your time doing that and you could be more productive.
Jonny Adams: Oh, please, that sounds amazing.
Athena Peppes: I know. But the point is not to predict it or to argue for it. It is just to say "what if", and to pose those kinds of questions that really stretch your thinking.
Jonny Adams: And is that the "I"? Because it's unpacking that quite boxed view, and actually we're opening up our blind spots to think, well, what could...
Athena Peppes: Yes, so imagine different possibilities, right? Then the S is for shaping scenarios, because you can imagine those kinds of possibilities, but people respond differently to seeing a story of what the organization will actually look like. So you can take, say, two dimensions. If we take the example of AI, you can take one dimension around the advancement of the technology, from fast to slow, and another one around regulation, from lenient to tight, and then almost create stories. You can imagine a little quadrant of that, and think about stories of what that world would look like: how do you operate, and what does your organization look like in that future? And then that's where the E comes in, around enabling action. Because once you have those different stories, what you can do is say, but hold on a second, if that came to bear, that would mean we're obsolete; or, if that came to bear, that would mean we'll make a lot more money. How do you influence that? You know? So that's the idea behind using foresight frameworks: to help you think about the future, but see growth today.
Matt Best: I think, as you talk about that with execs, you can imagine the calmness that comes over the boardroom when they all go, oh, okay, so we can put this into those moments, and I can think about this in the big picture without feeling all of that pressure, and I can be conscious of those other stages. So I love that. And I think, actually, for anyone listening to this, regardless of where you sit in your own organization, that's a really important way of having that perspective. Because you mentioned the robot in your house doing the cleaning, and some people will be going...
Jonny Adams: Athena doesn't have a robot in the house just yet.
Matt Best: But some people will be going to the kind of I, Robot view, right? Will Smith. They go, oh my gosh, it's going to, like, destroy us. Black Mirror. No, God, let's not go there.
Athena Peppes: Which is very forward looking and excellent at prediction.
Jonny Adams: Somewhat worrying.
Matt Best: But then, the reality is, if we took ourselves back twenty or thirty years and said "autonomous cars", everyone would be having exactly the same response that people are probably having now to the idea of having a robot doing the cleaning. So I think just being able to see that and put it in perspective is really important.
Athena Peppes: Well, and also to let you know that that already actually exists. I'll send you a video, which I'm sure you'd love to watch. There's a company called 1X; they released a humanoid robot called Neo. The video is of this, I don't know, would I say he, she? I don't know, it's Neo, helping this person get ready for work, right? I mean, it's kind of fascinating, and there's a lot of VC funding and investment, and even the big tech players like Nvidia are really backing a lot of companies in this space. So it feels futuristic, and part of the job is to help organizations see that perhaps it's not as out there as you might think. Some of it is already here. Yes, you could argue about the pace of it, and whether people will want to adopt it or not. At the end of the day, technology is not neutral. It's up to us to decide if we would want a robot around children, for example. On the other hand, if I were to put my economist hat on, I would argue it might be a good thing, because we have an aging demographic, an aging population, in many developed countries. How do we care for them when there's a lack of carers? Should policy makers be investing in this? Should companies be working with them to help address this problem? There's just so much to unpack there.
Matt Best: Athena, thank you for joining us. And to everyone listening, join us for part two as we continue this conversation.