[00:00:00] I would like to acknowledge the Dharawal people, the Aboriginal people of Australia, whose country I live and work on. I would like to pay my respects to their elders, past, present, and emerging, and thank them for sharing their cultural knowledge and awareness with us.
[00:00:40] Trisha: Hi there, everyone. I'm Trisha Carter, an organizational psychologist and explorer of cultural intelligence. I'm on a quest to discover what enables us to see things from different perspectives, especially different cultural perspectives, and why sometimes it's easier than others to experience those moments of awareness, the shifts in our thinking.
[00:01:02] Trisha: Those of you who've listened to some of our earlier episodes will know that cultural intelligence, the capability to be effective in situations of diversity, is made up of four areas. Motivational, the CQ drive. Cognitive, the CQ knowledge. Metacognitive, the CQ strategy. And behavioral, the CQ action. And all four of these capabilities help us operate effectively in situations of diversity.
[00:01:28] Trisha: Today, I'm delighted to welcome back my dear friend and colleague, Sarah Black. Sarah and I have worked together on global teams for almost a decade, I think, Sarah. And our listeners will remember her from Episode 9, where she shared her insights on communication across cultures and helping communications and PR professionals build their cultural intelligence.
[00:01:53] Trisha: As someone who's had the privilege of seeing Sarah in action across numerous projects, I can tell you firsthand that her straight talking Belfast style, combined with deep, real empathy, makes her the most effective communications expert I know. Sarah is the founder of Athru Communications. and brings over 30 years of rich comms experience to our conversation.
[00:02:19] Trisha: Her journey has taken her across multiple countries, from Ireland to Norway, Texas to England, and now in Scotland, giving her a truly global perspective on communication and cultural intelligence. Over the years, I've watched her work her magic in diverse organizations, in diverse settings, from small startups and not for profits to some of the world's biggest organizations.
[00:02:45] Trisha: In 2024, she achieved her CQ certification as a facilitator from the Cultural Intelligence Center, adding to her already impressive expertise, in comms and PR. Since we last spoke, Sarah has been diving deep into CQ and various communications and PR aspects, and most recently, we have each been exploring fascinating new territory, how we can bring our cultural intelligence to working with AI.
[00:03:15] Trisha: Sarah's been working on a white paper on this topic and a presentation for a future event, looking at how CQ can help us approach and work with AI tools more effectively. I'm particularly excited to talk about this with her today because I know how passionate she is about making complex topics accessible and practical.
[00:03:37] Trisha: Welcome back, my friend.
[00:03:39] Sarah: Oh, it's lovely to be back, and I think I should hire you to do all my PR, Trisha, because that was some introduction.
[00:03:46] Sarah: Thank you. I think I'm the first returner, so it's lovely to be back. I'm very honored to
[00:03:50] Trisha: I think, David Livermore has beaten you to that one, but you know, you are following good
[00:03:55] Sarah: yes, I mean, very happy to be number two behind Dave Livermore. That's, that's very happy with that slot.
[00:04:00] Sarah: Thank you very much.
[00:04:02] Trisha: So those of you who've listened often will know that we have standard opening questions and I asked them of Sarah before in episode nine, but I'm going to ask again. So Sarah, tell us about a culture other than the one you grew up in that you have learned to love and appreciate.
[00:04:20] Sarah: This was actually the hardest question when I was thinking about coming back on the podcast. I was like, oh gosh
[00:04:25] Trisha: Which one? Which one?
[00:04:27] Sarah: which one, I know. And I think I talked about Norway the last time, and I would still stand by that. But I lived just outside Houston in Texas for seven and a half years.
[00:04:36] Sarah: And I don't know that I can say that I loved it. I have a very complicated relationship with Texas. I think it's full of really interesting tensions and polarities and complexities, and to be in the center of Houston is very different to being out in the suburbs, which is where we lived. And then again, I worked a little bit further out for a while. But I think it was an experience that profoundly affected me, and in some ways I experienced more culture shock going to Texas than I did going to Norway. And I think that's because, naively, I know this now, but in my head it was like, oh, English, shared language. But certainly going back to work full time in Texas, though I loved my job and I loved the people I worked with,
[00:05:23] Sarah: There was just a kind of a cultural, oh, this isn't like the workplace that I've been in before or experiences that I've had elsewhere. And so that, although it was a wonderful experience, took me a long time to kind of go, right, well, this is very different. And I often tell the story about arriving on my first day with my kettle because they didn't have one and nobody was making tea and that wasn't a thing.
[00:05:43] Sarah: And I'm saying to my boss, people come to meetings and we don't offer them like a beverage. I was mortified because, you know, no self respecting person from the island of Ireland would not offer you a cup of
[00:05:55] Trisha: Exactly.
[00:05:57] Sarah: And maybe a sandwich or a scone. So, there were a lot of things that profoundly, not least of all that I brought two dogs back from Texas and profoundly changed my life, but there were a lot of sort of potential forks in the road in my time in Texas.
[00:06:10] Sarah: And for all the things that I was unsettled by and struggled with, open carry, which is this policy where anybody can carry a gun anywhere, and things like that, which were very confronting to me, there were incredible, incredible opportunities. And I'm really glad that I got to take those. So as well as my certification in CQ, I also, it's not on my LinkedIn, but I do have a whole lot of certifications in dog training. Not, not like big ones.
[00:06:37] Sarah: I'm not a dog trainer, but opportunities to really dive into and learn things about animal behavior that I wouldn't have had anywhere else, and that have very much formed my thinking about a lot of things. So, yeah, complex. I'd say I respect Texas. I don't know if I loved it. I love bits of it,
[00:06:55] Sarah: but it was really complicated for me.
[00:06:57] Trisha: And I wonder if there's echoes back to home in Belfast with some of the polarization between people.
[00:07:04] Sarah: Yeah, I think, well, certainly more so in the kind of last couple of years that we were there. We were there when Trump beat Clinton in that election, so we were there for the last couple of years of Obama. And we certainly felt and experienced nuanced changes
[00:07:21] Trisha: mm.
[00:07:22] Sarah: in how things, how people were talking about things, how things were thought about. Not in our sort of social circle, people that we were close to, who would have had very differing opinions and been of different political shades around those things.
[00:07:35] Sarah: But my husband had an experience one night when he was picking up takeaway pizza. We had friends visiting and he'd gone to pick it up. And because he was going to the pickup place for pre ordered thing, it looked like he was cutting in front. And so the gentleman behind him got very aggressive and told him to go back to where he came from.
[00:07:51] Sarah: And my husband's, like, a six foot Irish white guy, just a very pale color. And he was just stunned. But the person who started this conversation and said that to him became so difficult about it that an off duty police officer intervened.
[00:08:07] Sarah: And it was just that moment of like, he was very disturbed by it.
[00:08:10] Sarah: So was I, when he told me about it afterwards. And I was thinking, gosh, you know, he's this big white guy. What would that have been like if you'd been a person of color? Would that intervention have been there? You know, how terrifying would that be?
[00:08:22] Sarah: So, yeah, there were a lot of very, you know, uncomfortable situations where people would say things about migrants, and I'd go, hello?
[00:08:29] Sarah: That's, you know, hi, it's your mate here on the yoga mat next to you. Hello. So yeah, it was an interesting time, but it enriched our lives in very many ways and changed our lives in very many
[00:08:40] Sarah: ways. So whilst I can't say I loved it unreservedly. I think it's interesting sometimes to stop and unpack its impact.
[00:08:48] Trisha: And I know there are many people there that you still have deep relationships with and that you miss. So that's always a measure of, you know, sort of a love of a place as well, isn't it?
[00:08:58] Sarah: Yes. And you will know that occasionally a y'all slips out.
[00:09:02] Trisha: Yeah, that's right
[00:09:04] Sarah: So y'all need to... yeah, the y'all will never leave me completely, I don't think.
[00:09:08] Trisha: It's quite a handy one, really.
[00:09:11] Sarah: Yeah, and I do love to say y'all means all y'all.
[00:09:13] Trisha: And to our second standard question, can you tell me about a time when you experienced the shift, when you suddenly became aware of a new perspective?
[00:09:23] Sarah: This, yes, and I think this ties into what we're going to go on to talk about, perhaps a little bit. When I did my CQ certification, there was, I think, almost like a tipping point, it was for me anyway, where I was in the session going, uh oh, particularly because I'd done the 360 CQ assessment, which is a really interesting experience.
[00:09:42] Sarah: And thank you, you were one of those people, and thank you to everybody who participated in that with me. You kind of have a moment where you just go, there's so much to learn,
[00:09:49] Sarah: there's just so much that I don't know. And what was interesting was that through the course of those two days, the shift for me was to go from the overwhelm of "there's so much to learn" to the excitement of "there's so much to learn", the curiosity, the, oh, you know, to be stimulated, but excited about learning new things, and to see the lack of knowledge as an opportunity.
[00:10:14] Sarah: And I think that was something to do with, you could definitely analyze this better than I can, but I wondered if my metacognition, my thinking about my own thinking, had shifted in a way that sort of reignited my drive a little bit,
[00:10:30] Sarah: Where it kind of amped up my curiosity and my enthusiasm for learning and being curious again.
[00:10:35] Sarah: And I think I have thought a lot, this may well go back to Texas, about what the opposite of cynicism is. It's very easy, I think, in the industry and the profession that I've been in, to be a little bit cynical sometimes.
[00:10:50] Sarah: And I think, the more I think about it, for me the opposite of cynicism, and the antidote to cynicism, is curiosity.
[00:10:58] Sarah: It's wide eyed curiosity and an openness to new things. And I think more and more, the people that I enjoy working with the most are people who are just openly curious. Not nosy, anybody listening? That's different. But that, how does that work? Why is it like that?
[00:11:15] Sarah: How do we do this? What are the questions we need to ask?
[00:11:18] Sarah: And those kinds of things. That was my shift in the course of doing the qualification and the course of diving deep into CQ: the, oh, there's so much to learn. Isn't it fabulous?
[00:11:30] Trisha: I love that
[00:11:31] Trisha: I love that shift from "there's so much to learn" to "there's so much to learn". Yeah, exactly. I'm wondering, like you say, how that does relate to AI. What do you think typically goes through people's heads, their minds, their thinking, when they think of AI, and how does that impact their ability to work with it?
[00:11:52] Sarah: Oh, we could just talk about this
[00:11:54] Sarah: for the next half an hour. I think it's really interesting. Somebody recommended a piece of research that Slack, those of you that know it, that sort of process management tool, had done, and they've got five personas that they see.
[00:12:09] Sarah: Well, you know me, the answer to this, any question is almost always, it depends, Trisha.
[00:12:13] Trisha: Mm hmm. Mm hmm. Mm hmm.
[00:12:15] Sarah: So I, I think it might depend on, again, where your curiosity sits. But I think there's sort of a continuum, between fear and a sort of resistance to the uncertainty of this, the scale of this, the potential negatives of AI,
[00:12:32] Sarah: and then, at the other end, I think the Slack persona is the maximalist. So they're just head into it, diving into it, applying it to everything, exploring it all, kind of borderline obsessed with it. And I think there's a lot of middle ground. I think the other thing, from a lot of what I've been reading and figuring out and talking to my various LLM friends about, is that quite often we talk about AI and what we really mean is ChatGPT or Claude, other tools are available, but we quite often mean generative AI, these large language models.
[00:13:08] Sarah: And it's interesting to me to sit back and go, wait a minute, this technology has actually been around since the 1950s. It's been growing. And yes, it has kind of boomed. We're recording this, what, two, three days after DeepSeek has emerged into the
[00:13:21] Sarah: world.
[00:13:22] Trisha: Yes indeed
[00:13:22] Sarah: who knows what will happen before people are listening to it.
[00:13:25] Sarah: And so I think there's some trepidation. I think there's a lack of knowledge for a lot of people. And I think I'm also conscious about who's leading the discourse.
[00:13:38] Sarah: I'd say I've been thinking about this a lot. You know, the industry that I come from feels borderline obsessed with it right now.
[00:13:44] Trisha: The comms and PR people?
[00:13:47] Sarah: Yeah, and the creative industries more broadly, I think, because there's concerns about copyright, there's concerns about creativity, there's concerns about us being replaced.
[00:13:54] Trisha: Of course. And that fits with some of the data coming out of things like the WEF Future of Jobs report?
[00:14:03] Sarah: Yeah. And so it's really interesting that, like, among the most desired skills, creative thinking is right up there, I think it's in the top five. And I'm like, hello, who does that best?
[00:14:12] Sarah: Our industry.
[00:14:13] Trisha: We still need the thinking.
[00:14:14] Sarah: So I think there's a lot to grapple with. I suspect a lot of people just feel overwhelmed.
[00:14:19] Sarah: And I'm also conscious that when I talk about people, I am probably in my head talking about a certain sector of people.
[00:14:27] Sarah: And that's probably the educated professionals, the WEIRD. I can't explain the whole of WEIRD, but it's the Western, educated, industrialized. There's a stratum of society that are very worried about AI and thinking about it all the time. And I think there are lots of people who don't think about AI at all, or very little. Whereas, if you're using Netflix, AI is in your life.
[00:14:46] Sarah: If you've got a little Amazon speaking device, Siri, or whatever you want to call her, AI is in your life. If you're using Amazon... I remember Drew Battenhall saying, you know, Facebook's been using AI for years. This was a couple of years ago, saying, you know, that's what's hopefully keeping some of the worst stuff away from us,
[00:15:01] Sarah: weeding some content out. We can debate how well they're doing that at another time. But that's where I worry about people saying, oh, we're going to roll out AI and it's going to improve everybody's life. I'm like, have we skilled people up? Have we asked if they want that?
[00:15:14] Sarah: Asked if they're ready for it?
[00:15:15] Sarah: Like,
[00:15:16] Sarah: are we doing things to people or with people when we talk about this? And it's government policies in the UK and things like that. So.
[00:15:22] Trisha: You're talking about a number of different responses from people here. Some of it is emotional, some of it is capability, some of it is thinking. So as we think about those sorts of terminology, those are often the terminologies that we bring from cultural intelligence as we think about different worlds.
[00:15:41] Trisha: And as we use our curiosity to look at cultures that are different to us. And so you and I have been speaking about doing this with CQ and applying it to AI. So let's, you know, let's think about this. And I know that's part of your white paper. So can you tell us what you've been thinking and how you're approaching this?
[00:16:03] Sarah: So I have to give credit to my friend Trudy Lewis, because I was having a conversation with her before Christmas about lots of things, and Trudy is a leadership coach and a very wise person. And we were talking about lots of things. And I said, but what if AI was like another country? What if it was like a different world?
[00:16:19] Sarah: And in some ways kind of feels like it is,
[00:16:21] Trisha: Absolutely.
[00:16:22] Sarah: and I keep seeing headlines about the age of AI. And what if we, you know, took the cultural intelligence model and applied it as if we were going off to explore a whole new world? That might well be the theme song. And then that sort of sent me down a rabbit hole of, what if? Now, I did ask. Trisha knows that I refer collectively to all the large language models I use.
[00:16:42] Sarah: They're all called Harry in my life, after a journalist, former journalist and communicator, who I worked with at the beginning of my career. And so I asked Harry, and Harry said, but I have no cultural values. AI doesn't have cultural norms. And I felt a bit like saying, oh, mate, you have ours. Like, heaven help us,
[00:16:57] Sarah: We have given them all to you.
[00:16:59] Sarah: So they're probably in there somewhere. You just don't have them uniquely as your own yet. So I think that's a really interesting process. And I'm conscious that people listening to this who are deep into CQ are going, you can't do that. But I think it's a really interesting leap of imagination and creative thinking,
[00:17:16] Sarah: she said, justifying herself, to apply the CQ model. Because I do feel it's like embracing a new culture,
[00:17:23] Sarah: that we're learning a new language. We're constantly having to think about new perspectives.
[00:17:27] Sarah: There's a little bit of translating, because, you know, it's well documented that the LLM, so, large language model, generally hallucinates, makes mistakes.
[00:17:37] Sarah: Says that, like, on the tin: please check.
[00:17:39] Trisha: So we need some checking.
[00:17:41] Sarah: Yes. It bridges gaps, because it goes, well, this is what makes sense to me. We can all do that, actually, to a point. And I said to you in a message this week, we were talking about it and about DeepSeek, and I said, it feels like culture shock. Like, just when you think you've made sense of it, boom, something else, and you're like, whoa, no, I didn't.
[00:18:00] Sarah: How do I, how do I do that? How does that work? Which felt very like waves.
[00:18:04] Trisha: Yeah, how does it fit within the schema that I've built that helps me to understand this place? How does this new piece of knowledge fit? Yeah.
[00:18:12] Sarah: And how do I then layer that across so many aspects? And I think also, you know, a lot of CQ for me is about self reflection and understanding how we interact with the culture. And so I thought, that feels really interesting to apply to AI. How do we interact? Where do our values sit?
[00:18:33] Sarah: There's so much conversation to be had around the ethics of AI. I mean, just yesterday I was in a webinar where we were talking about the impact on the environment and AI's use of water, which, like, I'd heard of but don't know enough about. I was like,
[00:18:47] Trisha: We won't go down that one because,
[00:18:49] Sarah: no, we don't. But oh, goodness.
[00:18:51] Trisha: it is massive and the data centers and what they're going to be doing.
[00:18:55] Sarah: and precious metals and all kinds of things.
[00:18:57] Sarah: So, yes.
[00:18:58] Trisha: I'm not sure, you know, in terms of our listeners: some of them are definitely across CQ and some of them may not be so much. So why don't we break it down according to the model, which is the really helpful way that, in my coaching and in our training, we'll both be thinking about it.
[00:19:17] Trisha: And so if we think about it capability by capability, if you like, and how it applies to AI. Let's start with CQ Drive, a very good place to start, and see how that might apply to working with AI. What are your thoughts?
[00:19:34] Sarah: Ah, one of the things that I think about when I think about Drive is that we do think about what motivates us. And so it's interesting to think about, are we driven to explore AI because we have that genuine, like, oh, this is so interesting, curiosity and fascination? But also, and you've taught me a lot about this, if you don't have that, then what are the blockers?
[00:19:57] Trisha: Mm.
[00:19:57] Sarah: And how can you identify the blockers? And I think fear can be a really powerful blocker,
[00:20:02] Trisha: For sure.
[00:20:02] Sarah: whether that's fear of the technology or, you know, where's the data going, who's watching me, like Big Brother, who's going to use this, is it coming for my job,
[00:20:11] Sarah: a little bit of suspicion perhaps about where it's coming from.
[00:20:13] Sarah: We've seen that in some of the dialogue and commentary on DeepSeek, that maybe wouldn't be coming up if it had come from somewhere else in the world other than China. I think also the thing that stood out for me when I was reflecting on Drive, which is about motivation and confidence, was this idea of persistence.
[00:20:32] Sarah: Because you are going to have to keep learning,
[00:20:34] Trisha: Yes.
[00:20:35] Sarah: that's hard. Like, it's a weight, it's a load. And so keeping pace, on top of everything else in life in an already very complex world, is a lot. And so being able to dig in and use the CQ model to maybe help you build that persistence, and to think about, and reflect on, intentionally building that persistence,
[00:20:58] Sarah: Whatever that looks like, that will be different for everybody.
[00:21:00] Sarah: But I know you've often said to me, like your CQ is not particularly high if you're tired and stressed
[00:21:05] Sarah: and, you know, it can be situational. And I think, you know, there'll be people going, I've just learned this bit and then I've learned that bit. And yesterday I saw a LinkedIn post where somebody had used AI to turn their weekly newsletter into a song. And I was like, I mean, I personally don't need that in my life, but that's fascinating. And I think it's that ability to go, I don't need that in my life. That's not where it's useful to me.
[00:21:31] Trisha: And at the same time, the value of novelty might, for some people, increase the drive, increase the positive feeling towards it. So yeah, it is.
[00:21:39] Sarah: They might go, it's fun. So I think being aware of what motivates you, what's blocking you, thinking about building that confidence and persistence in a way that's right for you. And I think that bit about it being right for you is quite helpful, because it's also about reflecting on what you're good at.
[00:22:00] Trisha: Yeah. Yeah.
[00:22:01] Sarah: I think Ethan Mollick said, whatever you're best at, you will always be better at than AI. I hope I'm not misquoting because I hold that very close to my heart.
[00:22:10] Trisha: Because you've got multiple years of knowledge and experience and, and the ability to see it from lots of different perspectives.
[00:22:18] Sarah: Yeah, so I think that's thinking about digging into and understanding your own motivation. But also, if you are in an organization where you're tasked with implementing AI,
[00:22:29] Trisha: Mm hmm.
[00:22:30] Sarah: or think you should be implementing AI, being able to understand other people's motivations and blockers is going to be so important, because there will be, like, those shades of reactions.
[00:22:43] Trisha: Mm. Mm.
[00:22:44] Sarah: Being able to see other perspectives and, you know, what's holding people back. Whether you've got, like, superfans who just want to get stuck into this, and then how do they not leave behind people who are a bit more apprehensive. So all of those kinds of perspectives in an organization, I think, are really, really interesting. And that's, I think, again, where CQ helps us step back and go, wait.
[00:23:05] Trisha: You're almost stepping now into the metacognition, in terms of thinking about what other people might be thinking, as well as thinking about what you might be thinking as you come to try and help them. And so metacognition, which is really what we focus on in this podcast almost more than anything, is how to
[00:23:24] Trisha: think about what you're doing in a way that will help you to see things from different perspectives. And so, you know, as Sarah shared about her overwhelm with CQ initially, it was, oh, I've got so much to learn. And then recognizing that it changed to, oh, I've got so much to learn, so that it was exciting.
[00:23:44] Trisha: So in the same way, we can recognize how we can shift our perspective about it. And maybe sometimes learning something can help us make that shift. Maybe seeing it from someone else's perspective. One of the things I've been really encouraged to see is how people for whom English is not their first language, and who haven't had all of the benefits that us WEIRD people might have had, are using AI to produce things that look like what you and I, who grew up with the English language, might have produced. And so there's an element that AI can democratize people's capabilities and lift up the people at the bottom. And Ethan Mollick pointed out that those people at the bottom of writing capabilities will improve significantly, and the people at the top won't gain heaps from it, from the writing parts of it anyway.
[00:24:41] Trisha: And so for me, that's, that's a different perspective that has really helped me to sort of think more positively about it as well.
[00:24:47] Sarah: Yeah, and I think that's really true. And I think also looking at AI globally, because I'm conscious that so much of the information I consume about it is from certain voices in certain parts of the world. But when you look at it in different cultures, the applications of AI to something like health and education are vastly different, because they don't have the education system that we might have, maybe it's more complex, maybe the access to medical care is different, and the AI is potentially going to have, you know, a more revolutionary impact there. So, yes, thinking about it. Also, I've been thinking a lot about what voices am I hearing in the conversation and where are they coming from, which is probably, again, a little bit around strategy and also into knowledge.
[00:25:28] Trisha: Yeah. So when you think about knowledge, what, I mean, obviously there's the how-to that we're all busy trying to jump into. What are the other aspects that you think of there?
[00:25:39] Sarah: This could get like a shopping list, so we'll have to contain ourselves. For me, the thing that I'm increasingly coming back to on the knowledge, and this I think is going to be in the white paper in a bit more detail, is: yes, there is the kind of technical capability of, I can write a good prompt, or I can build my own, you know, LLM. And I've been reading a lot about what agencies and the creative industries are doing.
[00:26:05] Sarah: I mean, incredible stuff. The organization called Agency Hackers have an event coming up just about this, and the stories coming out of that are extraordinary. Really, really interesting. But that feels like the first wave of knowledge, and I think increasingly what we need to be thinking about are the questions we need to ask around ethics and values, in the same way as we would with any technology or any solution or any digital transformation: what's the purpose of this for my organization or my community? Because, you know, you can't just have, oh, we're going to use AI. But to what end, and why, and how, and is it going to fit with our system and fit with the way we work?
[00:26:47] Sarah: Or is it actually going to cause us more problems? That idea of being like fit for purpose.
[00:26:51] Sarah: I think it's really interesting. Are we using it in a way that really will build productivity, that will build value, that will help people? You know, what's our intent? A little bit of that self reflection, but it's also the kind of knowing your own organization, or, you know, who are you selling this to?
[00:27:07] Sarah: And there's a lot of thought then around governance. And we're seeing the EU's now got guidance. Should there be governance within your own organization? Whose governance, if you're building something as an AI tool to sell, and you're going to sell it outside the EU? Like, all of those layers.
[00:27:24] Trisha: Yeah, the laws are so different.
[00:27:26] Sarah: And it's also about training people to have the judgment and discernment to go, that's good enough to use.
[00:27:33] Sarah: Or, I need to... To me, there's a big difference. I'm sure it can write a pretty decent press release with the right prompts and the right training, right? Give it the brand values, give it all the things, and it'll probably write an okay press release. But I've seen really good press releases go out into the world without the right handling and not land well. So that ability to know when to release it, who to talk to about it, how to position the story, all of the thought that sits behind it. So if you have no experience in PR and comms, but you can get AI to write a press release, that does not mean you're going to have a successful media relations campaign.
[00:28:07] Trisha: No, that sounds rather risky
[00:28:08] Sarah: Like massive skills gap there.
[00:28:11] Sarah: And it's not always about the press release. So it's really interesting to think about even just how the knowledge will change, and then thinking about the skills. What are the future skills that we need
[00:28:24] Trisha: Mm.
[00:28:25] Sarah: to really embrace AI in, I hope, a way that's positive. And I know ethics can be culturally mediated a little bit.
[00:28:32] Trisha: Mm hmm. Mm hmm.
[00:28:34] Sarah: But
[00:28:34] Trisha: And professionally. So ethics within your profession is different to mine.
[00:28:40] Trisha: Yeah.
[00:28:40] Sarah: yeah, yes. And I think we also often conflate like ethics with our morals and our
[00:28:46] Sarah: values and have to unpack all that a little bit.
[00:28:49] Trisha: And, you know, I would say ethics from a psychological perspective within the Western world is probably different from ethics as a psychologist in other parts of the world. And there are, you know, ways that I operate in some parts of the world which aren't like in the West. In the West, it's very clear where a psychologist should draw boundaries, but it isn't so clear in some other parts of the world, where society is more collective and where it might be more important for you to go to the funeral of somebody who's a client, you know, whose mother has died or something like that.
[00:29:25] Trisha: And it would be inappropriate in the Western world. And so if you take those ethics and you're applying them to create, say, a training scheme for psychologists, and then that piece of training is supposed to apply around the whole world, then it becomes something that you need to be thinking about.
[00:29:45] Sarah: Yeah, and I think about building tools that are global, or where you can tell they've been built to be regional. I mean, a lot of what we were hearing about AI up until last week was coming out of the U.S.
[00:30:01] Sarah: And it's got a bit of the U.S. built into it.
[00:30:04] Trisha: Oh, absolutely.
[00:30:05] Trisha: Yeah.
[00:30:05] Sarah: so thinking about how different cultural values show up in the world, you know, to your point?
[00:30:11] Sarah: And Dave Livermore says this so beautifully in his book: so much of what's written about management and leadership is written from a Western, individualist perspective. And what, 70, 75-plus percent of the world is collectivist. And so the way they think about AI and the way they will use AI could be vastly different from how we're thinking about AI.
[00:30:33] Sarah: And there's a lot of discussion about how AI affects the individual performing their job. I suspect that conversation is very different in other parts of the world. So holding those things and thinking about those things, you know, if that little fact was a huge shock to you, then that's a little knowledge gap that you might want to think about, because I think we just assume that AI is universal.
[00:30:56] Trisha: And it's not just learning how to write a prompt that will tell AI to think more like somebody from another country. That's one element to it, but there's so much more that you need to think about as you read whatever might be churned out in response, and as you think about applying that in your conversations.
[00:31:17] Trisha: Yeah.
[00:31:18] Sarah: And even as I'm saying that, I'm conscious that because of where I sit in the world and my own cultural background, the things I may be selecting to read are not exposing me to perhaps very rich and interesting conversations about AI in other parts of the world.
[00:31:31] Trisha: Mm. Agreed.
[00:31:33] Trisha: Yeah.
[00:31:34] Sarah: If you're in another part of the world and would like to send us stuff, then please do.
[00:31:37] Trisha: Yes, absolutely. There was recently a conference at the UN about AI in the developing world. And, you know, I think I said to you I would love to have been able to be there, just to sort of see how people think AI can be used productively and helpfully. Because the risk is that it could take things away from the developing world, where some of the roles are, for example, in call centers, and that may be, you know, less positive.
[00:32:07] Trisha: So how, where does the positivity sit? And yeah, Sarah and I are not saying we have expertise. We're saying we have curiosity and openness about all of this, and we're tossing a lot of ideas around. And I think that's helpful to do. I think that's part of one of the behaviors, CQ action, that we should be applying: to ask questions, to discuss with people, and to discuss with Harry itself.
[00:32:36] Trisha: Although we don't want to anthropomorphize, because we know that it's a tool and not a person. I think asking questions and talking with others is one of the things that should be at the top of our list for what we should be doing.
[00:32:49] Sarah: Yes. And as in many other things, having a diversity of perspectives and voices is, I think, really helpful. I mean, I listened to a fascinating piece last week about the use of AI in IVF in the UK. It was one of the first clinics that's using AI for all kinds of things, really for better outcomes.
[00:33:09] Sarah: And also they were saying if we can get the technology to work with better data, then there is less, you know, risk to the couple or the individuals. There are, you know, potentially better outcomes. But also there's a lot of waiting and seeing and hoping and agony in the IVF process, and they were saying, if we can do anything to reduce that, then let's do that.
[00:33:30] Sarah: Let's increase the chances of, you know, happy pregnancies, but also decrease the chances of people being very disappointed and heartbroken. And then hopefully spend more time with people, because a lot of the data is there for you. I mean, it's just endlessly fascinating.
[00:33:48] Trisha: Yeah. And if we can do anything to make AI more culturally intelligent, then, you know, we'll be able to help people who don't all have access to the training that we've had, and so they might, because AI can expose them to it, think more deeply. And in the, I don't know how many, months, it certainly hasn't been years, that I've been playing with AI, I have seen a real growth in AI's use of and reference to different cultural values, acknowledging some of the aspects of cultural intelligence and how we can use it to grow and learn more.
[00:34:27] Sarah: Yeah.
[00:34:27] Sarah: And I asked Harry about this in preparation.
[00:34:32] Sarah: And Harry admits, I think it was ChatGPT, but it admits that the nuance of culture, and I recognize this experience in my own life, that it's things like idioms and humor and the subtleties of how values show up, where it's like, I'm not getting it.
[00:34:50] Sarah: So it's those things. And that, again, is what I'm hearing from multiple different sources: for AI to succeed, it needs the human.
[00:35:00] Sarah: And it's the human who can go, wait a minute, you can't tell that joke in Belfast, no, or that's offensive in that part of the world, or that's not what that means. And a lot of it is perhaps the emotional side, but it's that knowing that that doesn't feel right, that doesn't feel appropriate, that doesn't feel inclusive, that language isn't right. I think also the judgment, the creative thinking, the knowing how to brief, the knowing how to use it, is partly the knowledge, but it also sits in the action.
[00:35:40] Sarah: This is why the CQ model is so great, because it's so interconnected that you can't just go, we'll just tackle the knowledge, shall we?
[00:35:46] Sarah: Because, you know, it all comes together beautifully. But yeah, I think keeping the humanity, and that means that we also need to work on our own cultural intelligence and that ability to see other perspectives, so that when AI produces something, we can look at it and go, now, how is that going to land
[00:36:02] Sarah: for insert culture of your choice?
[00:36:06] Sarah: And I think that applies in a number of contexts. In the training that I do, we don't just talk about national and regional cultural differences; it's about applying the CQ model to organizational cultures. So, is my organization like my customers'? Where are the cultural tensions there? The different perspectives on things?
[00:36:25] Sarah: What about functions within an organization, how different teams operate? And there's a lot of interesting research around desk-based workers and those who are deskless.
[00:36:35] Sarah: And it was interesting, for example, the Slack personas are based entirely on people that are sitting at desks.
[00:36:40] Sarah: And that's not everybody
[00:36:42] Trisha: No, definitely.
[00:36:42] Sarah: Obviously.
[00:36:43] Trisha: And let's not forget the generational differences,
[00:36:46] Sarah: Yeah, absolutely. And different identities, you know, there's concern about how certain identities might be represented and presented in AI. And going back a number of years, I think it was Google that had the issues with the images it created? I think it was. Apparently, at that time, all Irish people looked like leprechauns with white beards. Things have moved on enormously, I know, but, you know, it's holding that little bit of nuance.
[00:37:12] Trisha: I'm curious about what you think might be the things, as organizations change and as the future of work changes, that we hold onto and say, this is what we can work on developing and growing, because this is not going to be done by AI.
[00:37:31]
[00:37:32] Sarah: Oh, so much change. And I'm conscious that there are connections between cultural intelligence and our ability to innovate and our ability to handle change. I'm not going to try and talk about that, because David Livermore's written a book and, you know, go read Dave's book. We'll link to that in the show notes.
[00:37:46] Sarah: I think holding on to that ability to take different perspectives is going to be really important, because AI will maybe, depending on how we brief it, how we build it and how we use it, you know, there are gaps in the system. I think we were talking about this before we came on air. The thing is, although we are seeing more learning models and more reasoning models coming through with AI, which I'm just beginning to wrap my head around, AI is built by people, and we are fundamentally flawed as human beings.
[00:38:16] Sarah: So we do have bias, and we do have gaps in our knowledge, and we do have things that we don't see about others. And I don't know that we can expect AI to overcome that, you know, on its own. And so judgment, knowing when to use which tool, for what occasion and in what way, thinking about planning, self-reflection, perspective taking, the ability to connect effectively.
[00:38:40] Trisha: I think those are two critical areas when we think about the metacognitive thinking: the planning, the awareness of self and others, and the checking. I think AI can probably check. I don't think it has awareness of self and others. And from my reading so far, people don't see it as being able to develop that self-awareness to that extent, at least for quite some time. That's sort of the boundary of sentience, really.
[00:39:14] Trisha: So those are the things that I think we will always be good at, those of us who are good at it, and those are things we can always develop, because they will set us apart.
[00:39:24] Sarah: Yeah. And I think, again, from an individual point of view, it's the self-awareness. I love to write. I'm very happy if someone puts me in a corner and just leaves me with a keyboard. And I do a lot of my thinking through writing. Sometimes I don't know what I think till I write it down
[00:39:39] Trisha: Yes.
[00:39:40] Sarah: a little. It probably doesn't make me an ideal podcast guest. So having ChatGPT or another tool do my writing for me is not the best outcome for me; having it edit or repurpose things, yes. There are lots of great things I can use it for, lots of fun things, but also lots of useful things.
[00:39:56] Sarah: And so knowing that about myself and how I work best.
[00:40:00] Sarah: And then thinking about how do I use the tool, and what is the most appropriate tool, and are there other things for which it makes complete sense. And again, I'm conscious as I'm doing this that I'm falling into that trap of thinking about AI as being ChatGPT, when there are so many other applications and ways of doing things.
[00:40:17] Sarah: I think what's interesting was, I read a great LinkedIn post this week from someone called Sharon O'Day, and I will try and put a link to it, but she was reflecting on, I think, really, disconnection.
[00:40:31] Sarah: And so in a lot of workplaces and organizations, we have all this technology for communicating with people, and people are not plugging into it.
[00:40:38] Sarah: And so we were talking about the need for connection, the need for real human connection. And I'm conscious that the kind of wave of post-COVID and the growth of AI are intersecting a little bit for a lot of people. And what we do as humans, brilliantly, is connect.
[00:40:54] Sarah: Well, not always brilliantly. Sometimes we're very mean to each other and very hurtful.
[00:40:58] Trisha: And some people find it easier than others.
[00:41:01] Sarah: Yes, for lots of reasons, there are lots of barriers to connection, but we're hardwired, like, we are built for it.
[00:41:07] Trisha: We're built to want to connect.
[00:41:09] Sarah: Yes.
[00:41:10] Sarah: And that's the thing when I think about the future of work, in an increasingly technologized world.
[00:41:17] Sarah: If we're having a technological revolution with AI, then how do we build in connection
[00:41:22] Sarah: and how do we build in meaningful connection? I think I'm back to my cup of tea and taking the kettle to work here. But how do we do that? And that feels like a leadership skill and a management skill for the future.
[00:41:33] Sarah: And also, thinking about the workplace of the future, you know, I think the nightmare for a lot of people is that we all just become automated things doing our AI tasks. But how do we harness technology in a way that really builds connection,
[00:41:44] Sarah: meaningful connection, for people, when people want community?
[00:41:47] Trisha: I think there's a risk there that some of those skills have been lost as people have come into the workforce through the pandemic and, you know, have fewer interpersonal skills, especially perhaps for dealing with difficult conversations. And so their first port of call might not be to go and have a conversation with someone.
[00:42:09] Trisha: And so I think the need to step back and perhaps help people develop those skills is really critical. I mean, this has made me think so much about who we really are as people and how we come to work. And when we think about theory of mind, sort of the self-determination basis, you know, there are basically three characteristics. One is autonomy, that sense that I can do things and I can make decisions and I can choose to have a career.
[00:42:43] Trisha: And then the next one is competence, that I can be good at things. And then the last one is connection. And so the risk is that we feel like each of these aspects might be taken away in a workplace where AI increases. And so it's thinking: how can we ensure that, as people, as human beings, we lead, we teach, we get together in ways that enhance all three aspects of self-determination theory, so that people have that sense of joy and satisfaction in their lives, that they feel they have some autonomy and control, that they feel they are competent, and, most of all, that they are connected.
[00:43:27] Sarah: And I think one of the things that I've been sort of reflecting on a little bit is, certainly in parts of the world, not everywhere, but in parts of the world, we have a kind of a badge of busyness, that you're successful if you're working hard, if you're busy. You know, it's a thing. I have a sort of a working theory.
[00:43:45] Sarah: That's quite strong in a lot of PR agencies, you know, I've got a lot on, I'm busy. But if the best of AI relieves us of some of the things that we can automate, then are we going to be less busy? Like, what do we fill that time with? I think that's one of the really interesting questions. What does that look like?
[00:44:06] Sarah: And there's a lot of discussion in the industry, particularly in PR agencies, not all comms professionals are in PR agencies, about hourly rates, and a lot of how you measure and run businesses is to do with capacity and hourly rates and all that kind of thing. And if AI is able to take work away from people, so instead of, you know, in the bad old days, I would have spent an hour writing a contact report after a meeting.
[00:44:28] Sarah: Well, I don't need to now, because I just plug, you know, a bit of kit in on the table, or I'll just run it on my Zoom call, and it'll do it and I'll check it and tidy it up and off it'll go. You know, if I can find one that's good and doesn't have lots of mistakes in it,
[00:44:39] Sarah: but in theory, that's a lot of time saved.
[00:44:42] Sarah: You know, we were looking at another tool yesterday, which will trawl academic research, but that's days of somebody's life gone, just like, poof!
[00:44:50] Trisha: I know, we got very excited,
[00:44:51] Sarah: Oh, we were very excited. So then what, you know, what replaces that culture of busyness? And do we have the leadership skills if, you know, we're mandating that people must come back to the office because remote work's not real work, to quote a headline in the UK last week?
[00:45:06] Trisha: Oh, goodness,
[00:45:07] Sarah: Then what does real work look like in the age of AI?
[00:45:11] Sarah: I think there's a lot of deep reflection and thought about that. And I don't think it's all bad, and I don't think it's all good, but I think it's a lot of change. And I think one of the core skills, I'm talking about, I suppose, leaders and managers and people who are hopefully leading people through this, is change management. And I think the way cultural intelligence helps us perspective-take is really important for change management.
[00:45:36] Trisha: Yeah, a hundred percent.
[00:45:38] Trisha: I'm aware of time and don't want to
[00:45:41] Sarah: Be here for weeks,
[00:45:43] Trisha: Yeah, no, we could go on for ages, which is going to be your challenge as you work out your presentation for this. I'm wondering what advice you'd give to someone who wants to develop both their cultural intelligence and their AI capabilities.
[00:45:57] Sarah: Oh, in one sentence or less. I think curiosity: be curious about yourself and others
[00:46:04] Trisha: hmm. Yeah.
[00:46:07] Sarah: and about AI. And I think one of the pieces of advice, maybe more than advice, and it comes from any kind of CQ teaching that you do, it's baked into the CQ model, is that you flex and adapt, but you don't change your values. If you're stepping into a new culture, or in this case AI, yes, you need to adapt, you need to flex, but not to the point that you become somebody else. And I think that's a good thing to hold on to right now if you're scared, or if you're a bit worried. The cultural values that you hold are very difficult to shift, as we know, but you shouldn't suddenly mask who you are or become somebody different in order to take the good in AI and use it appropriately.
[00:46:49] Sarah: I think the other thing I would say, and it's a learning from both for me anyway, is embrace learn and fail, learn and fail, learn and fail.
[00:46:59] Sarah: When it comes to other cultures, you're not going to get it right all the time.
[00:47:04] Sarah: I'm not even sure what right is anymore.
[00:47:05] Sarah: But there will be times where you take a step and go afterwards, ohhh.
[00:47:09] Trisha: I know. I know. And that can really hit your drive, because it hits that, you know, self-efficacy. But you can regain it, by reminding yourself you're not perfect.
[00:47:20] Trisha: It's okay to make mistakes.
[00:47:22] Sarah: Yeah, and as we experiment with and embrace AI and build new things, sometimes it'll be brilliant.
[00:47:28] Sarah: But I think there's probably going to be an, oh, or maybe not a fail, but the technology will move forward as we're building the thing we're building, and we'll be like, oh, well, that's not going to work, because it's accelerated.
[00:47:37] Sarah: And so that also, I think, speaks to the importance of psychological safety if you're in a workplace.
[00:47:42] Trisha: Mm hmm.
[00:47:43] Sarah: And the importance of that,
[00:47:45] Sarah: and we know that embracing cultural intelligence across a team can really boost psychological safety because it's people centered. You're thinking about the impact on people.
[00:47:54] Sarah: You're thinking about their perspectives and considering those.
[00:47:57] Sarah: So that self reflection.
[00:47:58] Trisha: And it's safe to express different perspectives. Yeah.
[00:48:01] Trisha: Yeah.
[00:48:02] Trisha: As you think about, you know, you, your life and the people you've worked with in your profession and your family and the community, and you think about the future of human AI interaction, what are you hoping for?
[00:48:15] Sarah: I think I'm hoping, I think things like that IVF application I listened to, it was Woman's Hour, I'll find the link and we'll put it in the show notes, really excite me.
[00:48:25] Sarah: The potential to, I hope that AI makes us better humans
[00:48:31] Trisha: Yeah, that's great. Yeah.
[00:48:32] Sarah: in the sense that I hope we have more time for each other.
[00:48:36] Sarah: And so to hear conversations where they say, this is freeing up us to spend more time with patients. This is freeing us to spend more time in conversation. I think that's the best possible outcome is that we use it wisely.
[00:48:50] Sarah: And part of that is making knowledge more accessible to more people. You know, democratizing that, I think, is potentially very exciting. So that's the optimism. That's the hope.
[00:49:00] Sarah: Yeah,
[00:49:00] Trisha: That's lovely. I love it. Thank you. Listeners, I will put Sarah's contact information in the show notes, which includes her LinkedIn profile, and on LinkedIn she has a newsletter that you can opt in for. I'll also put a link to her Quick Dip, quick dip, whatever accent I say it with, sorry, Kiwis who are listening to me and think I'm being very Aussie, podcast. And when she has that white paper produced, I will be sharing that on LinkedIn also.
[00:49:33] Trisha: So what is available now will be in the show notes, and we'll put some of the things that we've referenced there as well, so that you can keep learning and so that you can be connected to us as a community as well.
[00:49:48] Sarah: Yeah. Thank you for listening to what was really an extension of the WhatsApp conversation that Trisha and I have on a daily
[00:49:53] Sarah: basis,
[00:49:54] Trisha: yes, that's right.
[00:49:55] Sarah: where she sends me a note going, look what I found.
[00:49:58] Trisha: And both the positives and the negatives that we often share with each other. Thank you so much, Sarah. Really appreciate your time here.
[00:50:06] Sarah: Thank you, Trisha. And I look forward to continuing
[00:50:09] Sarah: the conversation with you on our next WhatsApp messages.
[00:50:11] Trisha: Indeed.
[00:50:12] Trisha: Thank you for being part of The Shift today. These conversations become even more valuable when we share them with others. If today's discussion offered you practical insights, pass it along to someone in your network who's on their own cultural journey. And you can find us on your favorite podcast platform for more perspectives on cultural intelligence through The Shift.
[00:50:35]