Michael Conner: [00:00:00] Good morning, good afternoon, good evening. Welcome to another episode of Voices for Excellence. I am your host, Dr. Michael Conner, CEO and founder of the Agile Evolutionary Group, and also the proud host of VFE. And today's guest: I am absolutely biased and subjective when I say this, I love this man to death.

Michael Conner: I love him. He is by far, I would say, brilliant, right? Brilliant. With the vigor and the rigor that he's bringing into the education ecosystem, I love the tactics of the knowledge, core knowledge, but being able to bring that level of objectivity in the context of disrupting historical systems and structures that have been concretized within our education ecosphere, specifically around [00:01:00] the status quo. And he challenges the status quo with depth and breadth.

Michael Conner: Kunal Dalal is absolutely, I like to say, someone I look up to, and, you know, the stuff that he presents, the innovations around AI that he underscores, have to be heard. We've been talking for a while now, and to absolutely have him on VFE, this is kind of my excuse to give to you, the audience, a brain dump from one of the smartest individuals I know, specifically in that vertical of AI.

Michael Conner: So without further ado, Kunal. Good brother. How are you, man? Welcome to the show. And to my audience: he is the proud CEO of AI Parentology, which he's going to unpack, but he also brings a level of expertise from the K-12 vertical, having been in positions that focus on [00:02:00] AI innovation at a systems and a county level.

Michael Conner: He'll be able to bring those experiences, all his educational experiences, to this podcast. But Kunal, man, look, I'll tell you, it is an absolute honor not only to call you a brother and a good, good friend of mine, but just to have you on VFE to share some knowledge and expertise. What's going on in the sunny state of California?

Kunal Dalal: Man, I feel like I should retire now after that introduction. Like, that was... I've done my job. I can move on now. When Dr. Conner introduces you like that, you can go ahead and step off the stage. You've done your work. I'm doing well. I'm doing well.

Michael Conner: Kunal. This, this is, this will be the quickest episode of VFE.

Michael Conner: Just the introduction and that's it.

Kunal Dalal: We're moving on. We're done. No, you know, I'm gonna tell you this. I was laid out in bed, like, the whole day yesterday, not feeling well. [00:03:00] And I was sort of worried that I wouldn't be my best self coming into this podcast episode this morning.

Kunal Dalal: Uh, but I'm gonna tell you, I woke up and I realized that my body was telling me something yesterday, and my body was telling me that I need to slow down for a second. And I'm gonna use this moment here with you to really sort of think big picture, think granular, but also really consider the larger and more systemic ramifications of what we're talking about here.

Kunal Dalal: Uh, and of course there's the nitty gritty of getting everything done every day, of what's the strategy and what it is that we need to do on the ground, on the day to day. But also, you know, I'm also gonna call you Former Superintendent Conner: this is big picture stuff, and this is fundamental, [00:04:00] foundational stuff, the work that you're doing.

Kunal Dalal: Uh, and my hope is that the work that I'm doing feeds into that. So, you know, thank you for that introduction. I'm not sure how to follow that up.

Michael Conner: But I'm gonna do my best. To my audience: the last time Kunal and I saw each other, we were at ASU+GSV in San Diego, and Kunal, I'll never forget, we were walking from the convention center back to the main hotel, and the conversation that we had was on how we can improve the education model in totality.

Michael Conner: That was when I said, I gotta get you on my podcast. Right there. It was hot that day, I remember, because it was cold back home in New York, and coming to California, to San Diego, I was profusely sweating as I was walking. But the intent, the intent.

Michael Conner: So I was so focused on you. I was like, man, I need to go back and take a shower; I didn't even know I was sweating like that. And [00:05:00] literally, to my audience, it was literally like a seven or eight minute walk up the street. But the nuggets that I got from Kunal, just from that eight minute walk.

Michael Conner: Wow. Now we have 60 minutes to unpack this in totality for my audience. Again, this was purely selfish, uh, getting you on VFE, so they can hear that reciprocal conversation that we had and really go deep now into some of those threads from April at the ASU+GSV Summit.

Michael Conner: But the first question, Kunal. I always like to use this as a fun question, so my audience can get to know you. You are, and this is factual and statistically grounded, a national voice in AI, to me and my lens objectively, and objectively to many others. You are regarded as a thought leader in this space when it comes to this topic in education.

Michael Conner: But for those, whether it be [00:06:00] domestically here in the United States or internationally, who tune in to Voices for Excellence and do not know who Kunal is: what song unpacks your leadership signature and impact in the ecosystem?

Kunal Dalal: Well, I, I hope this isn't too unserious, but it is serious.

Kunal Dalal: Lupe Fiasco. Go, go. Gadget flow. Is what I, what when I need to get started. I put that on and I feel like flow go go, and it's just, it hit, it has a, it has a step to it that tells me to move. And so when I saw that question, you sent that question to me ahead of time. I was like, do I say this out loud or do I make up something more inspirational?

Kunal Dalal: But no, I'm just gonna tell you from the heart, I put, I put Lupe on. That's where I go.

Michael Conner: I'll tell you, Lupe Fiasco. Wow. Kunal, I love Lupe, right? I love his [00:07:00] album. That is a classic. I consider Lupe kind of in that family with The Roots, you know? That's in there. But Lupe, yes, absolutely.

Michael Conner: But the song you talk about specifically, "Go Go Gadget Flow." When I think about that and the AC stage of education, I always like to say there's that level of strategic urgency, right? We're working in this urgent manner where we have to have this go-go-gadget flow, but then also, at the same time as the strategic urgency, there's that level of strategic intent.

Michael Conner: As we flow, we're accelerating, moving fast, because AI is revitalizing and revolutionizing education as we speak, and it is so fast paced, in this flow, that it's just natural, when we think about the AC stage of education after COVID-19, that we do need to [00:08:00] have that strategic urgency and intent with this go-go-gadget flow.

Michael Conner: But we are on the move, and Kunal, to be honest with you, and this is probably gonna make your skin crawl when you hear this, I still hear: we're not going to adopt AI. We're not going to exponentially expand with this level of intentionality with technology and various machine learning platforms.

Michael Conner: Moreover, that this confluence of AI is going to end at a certain point in the future. But people don't realize that this is gonna be the mainstay in education, and we can't keep on kicking the can down the road. "Go Go Gadget Flow," such a good song. And Lupe, I didn't know you were gonna bring out Lupe.

Michael Conner: Now that was a first on VFE. That is a first on VFE, my brother. But here's the thing. I want to keep tapping your work, because I'm just absolutely fascinated by your work and [00:09:00] also a fan of it as well. Kunal, keep doing that, keep impacting and influencing. But you are on the forefront of AI innovation in this ecosphere.

Michael Conner: So with regards to your expertise, right, I want to part and parcel this, isolated in a linear context: your expertise with micro schools. First of all, what is this model, for my audience, and how do you see it being leveraged broadly by 2030? And Kunal, the rationale for saying 2030: 2035 is really when my theory of action takes effect, where the 22nd century education model has to be implemented to support Generation Alpha and Generation Beta.

Michael Conner: Right. So that's the first question, about these micro schools. The second part is: from a disruptive innovation lens, how will AI frame the model of micro schools in this AC stage of education?

Kunal Dalal: This is, I mean, I appreciate this conversation. We're going into [00:10:00] the real meat of the matter here.

Kunal Dalal: We're not in the fluff talk here, and I appreciate that. I think, let me step back and let's just sort of unpack micro schools in general, right? I think the idea of micro schools in general has been around for a long time. Uh, and we tended to group them with homeschooling.

Kunal Dalal: Up until, I think, more recently. And homeschooling meaning, here in the US specifically, that parents take on the responsibility of, you know, doing the work the student would have been completing in a public school, but they do it at home. Uh, and then they check in with their resident public school district with evidence that their student has made the progress, or completed whatever they're gonna complete.

Kunal Dalal: Sometimes the school itself will provide materials to the parent and the home, and then the parent will shepherd that learning. And so that used to be homeschooling. And then with COVID, what you started to see was [00:11:00] these pods forming. These learning pods, right? Which, you know, we still sort of thought of as homeschool learning pods, but really we're talking about micro schools now, where we have a community-built system or community-built space, right?

Kunal Dalal: It could have been someone's home. It probably was someone's home, because it was COVID; we probably weren't going into public buildings at the time. Now it might be at a community center. It might be at a faith-based organization, something like that.

Kunal Dalal: But these little pods of parents, little pods of students, would go through whatever curriculum they needed to go through, but then also, I think, experienced something that students in a larger setting were missing, which was the intimacy of being in that small community. And we have cliques forming in large schools anyway, which sort of mimic those pods, right?

Kunal Dalal: In a way. [00:12:00] But those cliques often form almost antagonistically, right, to the larger group sometimes. And you and I both, with the skin color we walk around with, we know that often we had defined cliques that were in opposition to sort of the main narrative that's going on out there.

Kunal Dalal: And sometimes it was in defense of it. And so I noticed during COVID there were a lot of parents who started to really, for the first time, I think, ask the question of: is school the best place for my kid to spend most of their day? Now, this question wasn't being asked in a legitimate sense, I think, mainstream-wise; it wasn't being asked yet, uh, because we were on autopilot. And it's hard to go back to 2019, because the world has changed so much.

Kunal Dalal: But in 2019, we weren't thinking about it. And I know, because I was, and nobody was listening, and nobody cared about what I was [00:13:00] saying. And everybody thought I was crazy, and everybody thought I was anti-public schools. Which I was not, and I never have been. But then with COVID, finally, I started to see the first sort of inkling of parents being like, hold on a minute.

Kunal Dalal: Like, I'm seeing what's happening. They're sitting on a computer and they're "learning." They're not really learning. They're just going through these exercises. And then: wait, is this what you do when you're at school? Yeah, this is what we do when we're at school too. You're just kind of going in this sequential order and doing things that, basically, we were doing when we were kids in school as well.

Kunal Dalal: And so right on the heels of that comes the public release of generative AI. So we've got these two monumental things that happened back to back. I have a theory of what would've happened had the order been reversed, which I'll get to in a second. But with the order of COVID and then generative AI making its public release, parents are [00:14:00] in this stage. And first of all, let me say, there were a lot of Black and brown parents specifically who were asking this question of: do I need to send my kid to this place where they're not made to feel good about themselves every day, where they have to essentially, in a way, defend their existence every day?

Kunal Dalal: Do I need to send my kids to that school? Of course, in many cases, those parents are also potentially the least resourced to go out and find some private alternative, or find some, you know, whatever it is, some logistical alternative. And so: I have no choice, I'm gonna send my kid to this school.

Kunal Dalal: Um, but even before COVID, I was noticing, in a charter school that I was partnered with, a lot of parents, some wealthier parents, saying, you know what? I don't think my kid needs to finish senior year. I don't think my kid needs to go to college, because they're not learning anything. They're just gonna go into this internship with this person I know.

Kunal Dalal: And then they're gonna go from there. They'll do their high school stuff, you know, online, and they'll get it done. And these were [00:15:00] privileged parents, right, that were doing that. They had the chance to do that. But anyway, getting back to what I see as an evolution now with AI: not only do a lot of parents see an opportunity for their kids to be a little bit more independent in their learning, but also independently collaborative in their learning. I know that sounds contradictory, but I mean that kids have an opportunity to determine their own collaborative space and build their learning, and what they see as their efficacy in the world, in their own supportive, collaborative space.

Kunal Dalal: And micro schools were really hard to do in the past. You had to create all your materials on your own. You had to jump through all these hoops around whatever it was the district was looking for. It was hard work. It's not such hard work [00:16:00] now.

Kunal Dalal: With AI, you have a partner now. This AI partner can help you deal with some of this more bureaucratic stuff that frankly does not help learning, right? It's just compliance stuff that you have to do, because bureaucracies run on compliance. But learning is not a monastic, bureaucratic space.

Kunal Dalal: Learning is a space of change, of efficacy, of value to yourself and to your community. I don't need to fill out paperwork to do that. No, we've never filled out paperwork to do that. Um, and so in this moment right now, in this post-COVID, this AC world that you're talking about, micro schools have an opportunity to be sort of the laboratories of a new model of self-efficacy.

Kunal Dalal: That's what I see education as: an exercise in providing our young folks [00:17:00] an unequivocal look into the self-efficacy they have. And if the world was a great place to live in right now, and everything was sustainable and people were joyful, then I'd say, you know, they don't need to have any efficacy in the world.

Kunal Dalal: We could just learn for the sake of learning. But this world needs all the help we can get, and kids have beautiful, brilliant ideas, and they have not been soured by the world the way that many of us adults have, and they can solve this world. I often say the only people who can see a new world are the ones that have no stake in the current one.

Kunal Dalal: And no matter how much I want to change this world, I've found some success in it. I'm on this podcast right now, right? And so I might still say, oh yeah, you know, let's chase that, let's chase that, but this part isn't so bad. But that's not how we wanna brainstorm. I want kids to brainstorm. I want them to have no connection to the world we have right now.

Kunal Dalal: Imagine a good one. And now, with the help of AI, we might actually be able to get there. But we're not gonna get there [00:18:00] if we continue to build this compliance-based, mass-model, age-stratified system. You know, standards are good, but not when they're tyrannical, because we're at a point where they're almost tyrannical.

Michael Conner: Thank you for that. Because when I think about this now, with this whole micro school evolution, the model, you're absolutely correct: we really haven't talked about disruptive and radical model designs in the context of teaching and learning, specifically learning where we're shifting the autonomy to the students.

Michael Conner: Even in what I always like to say is the DC stage of education, during COVID-19, we truly had to challenge the status quo of education. And when we think about what evolved during that specific timeframe: generative AI, as you talked about, right? The [00:19:00] continuous questioning and the continuous unpacking of the standardized learning model, and asking those very hard, essential questions to challenge the status quo model that you and I, and pretty much every other educator, have been educated in: sequential activities with delineated cut scores. And if you don't meet the specific cut scores, you are categorized, or even classified, into a specific quadrant, and sometimes that's where, we know, biases and stereotypes are elicited.

Michael Conner: But this whole micro school model, where it shifts the autonomy to the students, where we're looking at this elevated self-efficacy of a student, and, like you said, I like the dichotomy between independence and being collaborative. I think we have to explore these models. I mean, Kunal, if you think about it, people used to think we were crazy when we were talking about [00:20:00] AI in '17 and '18, having the same level of conversations that we're having in 2025.

Michael Conner: So when people said you were crazy talking about micro schools: they also thought we were crazy when we were talking about AI back in '17 and '18. With that, Kunal, I just want you to expand on this before we get into your critical work. Where do you see micro schools in the next five years, and do you see parents shifting their choices toward micro schools as opposed to some of the traditional brick-and-mortar programs, i.e., school as we know it?

Kunal Dalal: I mean, I think the shift is in some ways going to be inevitable. The question is, do districts want to be a part of that or not? I've met multiple superintendents now who have said: I've talked to my board, [00:21:00] and I've told my board, if we don't build micro schools within our district, we're not gonna have a district.

Kunal Dalal: Um, and multiple superintendents have said that to me, and so they know. And that may not be all superintendents, but, and you know this well, in order to get to be a superintendent, you have to have spent a lot of time in public education, and you have to have a deep, deep belief in what you're building, right?

Kunal Dalal: And when someone at that level, with that much experience, says that. Especially the one superintendent who has spent his entire career in the same district: he started as a para in that district, then a teacher, and is now the superintendent. An incredible story. And he himself said, look, this district is done if we don't have micro schools as part of our offerings as a district.

Kunal Dalal: And so I think [00:22:00] districts actually have an incredible opportunity, because they have an infrastructure that isolated pods would not have. But you can't take away the autonomy of the micro schools if you start to, you know, district-ize them, or use the district model to build them. We had the small school movement, but the small school movement didn't exist when generative AI was public.

Kunal Dalal: It was hard to do. We can't, we can't, like, yes, we can learn from failures from the past, but we can't dismiss. Opportunities that we have because we say, Hey, we tried this before. No, we didn't try it before. NVH generative AI or in, in, in an AC world, right? We, no, we didn't try, we have no idea. And so sure small, you know, breaking up big schools into smaller things that didn't work in a lot of places.

Kunal Dalal: Building small charter schools, that didn't work in a lot of places. It did work in a lot of places. I lived in Boston for a while, where there is an incredible [00:23:00] array of small charter schools. And so that model can work in a pre-AI world; certainly in an AI world, if we figure this out, this can work beautifully, potentially. But we as adults, you and me, teachers, leaders, superintendents, boards, we have to believe in this model, because kids can't build this on their own.

Kunal Dalal: Kids are the clients and the customers of education, yet they're the most powerless. And so if we don't listen to them like they are the most powerful people in the room, then we're not gonna get this right. And so that's what we have to do. We have to listen to them as if they're the most powerful people in the room.

Kunal Dalal: And what are they saying? This is what I talk about with followership, and I talk about this a lot. There are a bajillion things on leadership. We don't talk about followership. And the last book I found on followership was from, like, 1988 or something. Um, but followership to me sort of mimics [00:24:00] servant leadership. Not mimics, necessarily, but there are a lot of elements of servant leadership, I think, that are embedded in followership. But followership to me is: I'm gonna see what the students are saying, and I'm not gonna just get outta the way and let them do it. No, I'm gonna use what expertise I have, what connections I have, what networks I have, and if I need to break down some barriers, I will.

Kunal Dalal: If I need to offer some support here, I will. If I need to gather some people to help you realize something, or help you with whatever it is, I will do that. I'm gonna lean into followership. I cannot lead you to your future. That's absurd. So I am gonna lead the adults around me to try to understand that you all are the folks we need to follow. It's your world.

Kunal Dalal: And we've done enough to make it hard for them, let's be honest. Our generation has. So I'm just like, let's get outta the way a little bit, man.

Michael Conner: And let them lead. And Kunal, I'll tell [00:25:00] you this: if I was sitting in the seat right now, I would be thinking about designing a micro schools vertical pathway, K through 12, within my learning organization, where students would be able to opt into it at whatever grade level.

Michael Conner: With that, that is just me thinking about this, just hearing you speak. I loved how you separated the functions, or the functionalities, because one of the misnomers is that micro schools are an enhanced model implementation of the small school movement, when there are, I like to say, differences in functionality between micro schools and the small school movement.

Michael Conner: Thank you for clearing that up. But Kunal, man, you, you...

Kunal Dalal: And I'll add in real quick, I think, to really nail in the distinction: the small school movement was determined by districts, by boards. Top down. Micro schools are determined by students, by parents. [00:26:00] Bottom up.

Michael Conner: Unbelievable. Unbelievable. And I'm sure I, I'm sure my audience is gonna reach out to you with regards to that.

Michael Conner: Clearly the distinction and what micro schools are. But Kunal, man, I'll tell you this. And I've been around, I've been at different ends of the country, different ends of the world, and I have not met an AI-ologist. Okay, I'm just letting you know, I have not met an AI-ologist in education.

Michael Conner: That's why I consider you one of the extremely brilliant innovators in the ecosystem. First of all, it's cutting edge, right? You're gonna have to unpack for my audience what you're doing with regards to innovation and AI and parents. So my first question is: define and unpack what an AI parent is.

Michael Conner: It's so necessary. I've had many conversations with you about [00:27:00] this and know your work extensively, but I wanna get the word out there about what you're doing, specifically in the context of this AI parent. And I've seen your work in the UAE, where you have some work that's rooted in the GCC area, which is phenomenal, that this work is getting recognized internationally.

Michael Conner: But with this 22nd century focus in the AC stage of education, how does your AI ology institutes, right, that has already been launched in the Middle East, how does that work deep in the scope of supports? With generation alpha, generation beta, our most important customers in the ecosystem.

Kunal Dalal: Yeah. Yeah. I mean, this is the heart of what I do.

Kunal Dalal: This is the stuff that, you know, I've been a teacher and an educator my whole career, and I'm a more recent parent, but I didn't realize that I was a parent my whole life; I just didn't have kids yet. And so this is [00:28:00] something that's been a joy for me to realize. And so, you know, this started, and I'll sort of tell the story of how it started.

Kunal Dalal: I'll try not to be long-winded, but it started when I had been laid off during the COVID layoffs, and I had also had my first child at that time, right before COVID started. And so as I got laid off, I said, you know what? This is actually a great opportunity. I got a decent severance, and then we had enhanced unemployment during COVID, and financially we were okay. So I was like, I get to be a stay-at-home dad. I wanna be a stay-at-home dad; I've been wanting to do this. I was like, lemme do this for a year, lemme do it for a year and a half. It ended up being three years, longer than I thought, but a great three years. But during that time, ChatGPT became public.

Kunal Dalal: At the time, I was shocked, because I'd been following AI, I'd been following generative AI, and I definitely thought we were still like 10, [00:29:00] 20 years from something like this going public. And not only did it go public, but it was somewhat decent. It wasn't trash. Um, and so then at some point, you know, I was looking at it, whatever, and then there was a moment where my little boy, he's two at the time, got a cut on his hand, and so he put a bandaid on, and his cousin went and put a bandaid on her hand as well. And so he asked me, why is she wearing a bandaid? She doesn't have a cut. I have a cut. And so then I, in my infinite wisdom, decided I'm gonna try to explain solidarity to a 2-year-old.

Kunal Dalal: And so I'm sitting here, uh, you know, fumbling around, trying to help him understand solidarity. Um, and so then I go, I'm like, you know what? There's this ChatGPT thing. Lemme go ask. And so I'm like, how do I explain solidarity to a two-year-old? And the answer it gave me was so good.

Kunal Dalal: It was so good. I was [00:30:00] like, wait, what? And so then I went in and started asking a few other questions. I was like, hey, this is his favorite TV show, can you make parallels with Shakespearean characters? And it did. And you know, of the five, two of them were made up, but three of them were real. And I was like, what is happening right now? Because at that point we were still in the stage of: there has to be a paper written about this somewhere, something has to exist, otherwise it can't figure that knowledge out. Uh, now we've gotten more accustomed to it making stuff up, but at the time, you know, we still didn't even understand this thing.

Kunal Dalal: And so in that moment, two things happened. First: education is never gonna be the same. I had been away from public education for 10 years at that point; I'd been in Silicon Valley companies, startups in the Bay Area. And I was like, I need to get back into public education, 'cause this is going to change everything.

Kunal Dalal: And the second thing I thought was: this is gonna make me an insanely awesome parent. I need to just figure out [00:31:00] how to do it in the right way. And so the solidarity thing, A, it taught me that. But the second thing I did was, I was like, oh, we can also make images. Let me try to make one. And it made this image of these two kids kindly looking at each other. And I was like, there's solidarity again.

Kunal Dalal: Uh, and both of these examples are in what then led me to write my first book, called The AI Parent. It was the first one, March 2023. It was a thin little volume. There wasn't much in it, but it was really sharing how I used AI almost every day.

Kunal Dalal: Less so these days, but at the time, almost every day, with my child. It was that kind of stuff, learning solidarity, but also: he was asking me about my childhood, when I was a kid growing up in India, and I don't have images, I don't have pictures, I don't have a whole lot of visual context for him to look at. So I go online, to YouTube, and [00:32:00] everything looks different, right? Like, you know, of course things in the nineties look different from today, but certainly if you're talking about a developing country, things look nothing like they did in the nineties. Here in America, maybe you can see some similarities, but in a developing country, nothing today looks like it did in the nineties.

Kunal Dalal: And so I went in and I started asking, Hey, pretend it's 1993 and you're in Calcutta. And the images it came up with we're so good. I mean, they weren't super accurate, but they, they gave me the feelings. And so I shared that with my son and I was like, Hey look, this is what it felt like to ride a train with my uncle and eat food.

Kunal Dalal: In Calcutta. And he could sort of feel me being there. And so that led to my first book, The AI Parent. It made it to the New York Times bestseller... oh wait, sorry, sorry, sorry. I misread that. The New York Times worst-selling list. I definitely thought people would be so into it, and nobody even [00:33:00] noticed, because...

Kunal Dalal: well, who was thinking about AI and parenting? That was not a thing, and I realize that now. But I went in, I started my work at the Orange County Department of Education, and then I wrote my second one last year, a second edition. Much, much better response, a much more dense volume, talking about music that I make with my son, art that we make together.

Kunal Dalal: Dream journals that we do. And I'll give you an example of what it means to be an AI parentologist: I have developed a two-year window into my son's dreams, how his dreams have evolved, because we have been doing AI image generation of his dreams for two whole years now. We started when he had a dream and he was describing it, and I was like, should we do this?

Kunal Dalal: Should we make an AI image of it? And he was like, you know, sure, whatever. And so we do. And sometimes it's accurate, sometimes it's not. But I've noticed, for example, and to me this is such an incredible insight to have [00:34:00] of your child: his dreams two years ago were always about other things, right?

Kunal Dalal: It was about monster trucks or dinosaurs or, you know, whales or whatever, spiders. And now he himself is in all of the dreams. Like, he is running through a field. The other day we did an image generation of him running through the field in one of his dreams, and he discovered how to iterate.

Kunal Dalal: With AI. I did not teach him this, by the way. This is something he just discovered on his own. And what did he do? Now, you know, he can just talk to ChatGPT. Before, he had to tell me and I had to type it in. Now he can just talk. So he talks, it creates an image, and he looks at it and he's like, nah, that's not right.

Kunal Dalal: So he goes to his drawing corner, uses pen and paper, draws a little pencil diagram, brings it back, and says, take a picture of that. I took a picture of it, and then he describes that picture to Chat [00:35:00] GPT, and then a new image pops up, and now he's like, yep, that looks a lot closer to my dream.

Kunal Dalal: He just figured that out on his own as a 4-year-old. And so, you know, we were walking across the street the other day. I share this story because I think it's just the most incredible thing. We were walking across the street the other day and he says, Papi, the world is changing so fast.

Kunal Dalal: I'm like, what are you talking about? You're a 5-year-old. What do you mean the world is changing so fast? But then, you know, the world has changed a lot in five years, so he's not wrong. But then I was like, so what do you mean by that? And he was like, well, Papi, AI is getting so strong.

Kunal Dalal: I was like, damn, kid. Like, where's this coming from? Okay, so then I was like, yes, you're right, but what do you mean by that? And he was like, well, the images have gotten so much better than they used to be. And I was like, see? So, through the lens that he can understand, he is [00:36:00] visualizing our world changing.

Kunal Dalal: And to me as an AI parentologist, that is my role. My role is to help shepherd my little boy to a space where he can understand that the ground underneath him is moving and shifting, and for that to not be frightening, to not be scary.

Kunal Dalal: And to know that he's got me, he's got his mom, he's got his sister, he's got his grandparents, he's got his friends. That we are all constant. This whole other thing, we don't know what's happening, and we're gonna have to try to keep up or do whatever. But we are constant.

Kunal Dalal: I am constant. My humanity here does not change. And as an AI parentologist, I'll say that term, by the way, was given to me by someone [00:37:00] else you absolutely need to have on your pod: Wakanyi Hoffman, who you may know from her LinkedIn work. She is a leader in indigenous AI. She's originally from Kenya, currently lives in the Netherlands, and she is the lead of the Inclusive AI Lab at Utrecht University in the Netherlands.

Kunal Dalal: And she travels the globe talking about, basically, the incorporation of indigenous thought into AI, and how we can rethink AI away from an efficiency perspective and rather think of it from a more indigenous, community perspective. An incredible human being.

Kunal Dalal: But she's the one. I had her as a guest on a webinar and she actually was like, you're an AI parentologist. And I was like, I'm using that, and that's just gonna be what I do. And now I have a website, now I'm doing the work. And I'll say, just a side little vertical that I have is the corporate lens to this.

Kunal Dalal: And so if you go to the website, you'll see that [00:38:00] I'm offering, to companies who want to be a part of this: hey, do you know that three out of four of your employees who are parents are losing sleep over AI for their kids? Three out of four, 75% of parents, are losing sleep because they're worried about their kids' futures in this AI world. And is your

Kunal Dalal: employee wellness program addressing the most immediate fear that your parent employees have? Because if you're not, then you're just putting paint on this really, really rattly building. But otherwise, hey, I have a program. Bring me in for a day.

Kunal Dalal: I have a two-day workshop too. Bring me in for two days. Have your parent employees, and others who are not parents but also wanna understand this moment from that lens, bring them in and let's talk. Let's talk [00:39:00] this out. Let's build something together. Let's uplift the work that the parents are doing, and the pain and confusion and uncertainty that they're all feeling. Because that is a productivity linchpin: that your employees feel like you as an organization

Kunal Dalal: care about the things that they care about. Not just putting paint on, but actually caring about the things they care about. So that's how I've sort of evolved my work. Then there's the personal side. I had my first Lighthouse Homes event two weeks ago, and it's so awesome that I got to have that.

Kunal Dalal: We had 15 parents from around the community come together, and we ate. I cooked for everyone. I cooked delicious Indian food, I must say. And we all sat around and we talked about AI. We talked about our kids. We talked about what this might mean for the future. And we tried to start making a plan about how we were gonna stay [00:40:00] connected and create an intentional community within ourselves.

Kunal Dalal: So I'm doing that on a personal, community level. And then my hope is I can get some folks at a corporate level who see the value in having this conversation and having these sorts of sessions.

Michael Conner: Great, great response, because I think that this can't just be limited to education. I think that it has to be scaled, and it has to have more of a cataclysmic impact, from this positive notion, of course, because I just love the fact of how this model that you constructed was built through the eyes of your son.

Michael Conner: Right, and that's where I think that now more education leaders, more, I like to say, education architects, have to understand that we have to design with our most important customers' [00:41:00] lens as the primary factor, and use generation alpha and generation beta to be able to construct

Michael Conner: some of the most important next steps, because this is going to impact the 22nd century. And believe it or not, I tell people this, and I'm glad that people aren't looking at me as if I was crazy, but the focus is that now we have to be able to have this shift to the 22nd century, even though it's 75 years away.

Michael Conner: But Kunal, I want to go to a topic that's very near to your heart, and I want you to focus this question for our AI developers, right? I want you to display your expertise, but I want you to unpack this from the context of promulgating impact. You and I know, and a lot of people know, about algorithmic

Michael Conner: bias, and that is a universal concern with the development of artificial intelligence. Now, when I speak about this [00:42:00] specifically, these concerns from many of the cadres that we talk to at the executive level, at the cabinet level within our school districts, they do exist. So what are some of your broad concerns with AI?

Michael Conner: One of the data models that I was taking a look at suggests that by 2030, 60% of our student enrollment demographics will be black, brown, or two or more races, right? So diversity within our classrooms, diversity within our learning organizations, is gonna continue to grow. But when we think about these economic shifts, right?

Michael Conner: One of the contributing factors is who we just welcomed in: generation beta, one of the most diverse generations of mankind, superseding generation alpha, 'cause at the time, generation alpha was one of the most diverse generations of mankind. But first, Kunal, because you can speak this technical language, I want you to speak [00:43:00] to our software engineers, to our junior engineers, to our analysts, our UX designers.

Michael Conner: What are the key considerations, Kunal, that they have to take into consideration in the context of design? What are those non-negotiables to eliminate biases within the models that we're designing? And second, how can we ensure that these AI models are trained on diverse data sets

Michael Conner: that accurately represent the cultural backgrounds and perspectives of generation alpha and generation beta? Because we know, Kunal, we sit in the back in all these conversations about the degree of accuracy within the models. We're continuing to test and train these models until we hit an F1, F2 accuracy level.

Michael Conner: But we're not taking into consideration the diverse data sets. We're not taking into consideration that [00:44:00] some of the rigidity with the outputs of these models is because we're factoring in student demographics and groups that haven't historically been integrated, from this input-throughput perspective of the design of AI models.

Michael Conner: But what would you say to them, Kunal?
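[Editor's note: the "F1, F2 accuracy level" Dr. Conner references is typically the F1 score, the harmonic mean of precision and recall. One way teams surface the kind of embedded prejudice discussed in this conversation is to disaggregate that score by demographic subgroup instead of reporting a single aggregate number. A minimal sketch in Python, with hypothetical labels and group names:]

```python
from collections import defaultdict

def f1_score(y_true, y_pred):
    # F1 = harmonic mean of precision and recall, for binary 0/1 labels.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def f1_by_group(y_true, y_pred, groups):
    # Disaggregate F1 by subgroup: a model can look accurate in
    # aggregate while quietly underperforming for one group.
    buckets = defaultdict(lambda: ([], []))
    for t, p, g in zip(y_true, y_pred, groups):
        buckets[g][0].append(t)
        buckets[g][1].append(p)
    return {g: f1_score(t, p) for g, (t, p) in buckets.items()}
```

[A gap between the per-group scores, even when the overall F1 looks strong, is exactly the rigidity in outputs that the question above points at.]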

Kunal Dalal: I almost feel like this is an infinite question, because I don't think we are ever gonna get to a resolution per se. I think it's gonna reach an asymptote, where we slowly get closer and closer, but we gotta keep working and we gotta keep working, and we're gonna always keep getting closer.

Kunal Dalal: But I don't think we're ever gonna solve it. So let me start by saying I don't even use the term AI bias. I use the term embedded prejudice, because I think sometimes when we use the term AI bias or algorithmic bias, we sort of offload the bias onto the machine rather than appreciating that it is just us.

Kunal Dalal: It has nothing to do with the machine. The [00:45:00] machine is just doing math based on what we fed it. And so step one is to recognize where it comes from. It's not coming from AI. AI didn't make any of this stuff up. AI didn't exist, right? And so that's step one. Step two, and I say step two, but I don't mean this to be sequential.

Kunal Dalal: Step two might be sideways to step one, I don't know. But I think another step here is to appreciate these language models, no matter how much we try to diversify their training data. I just did a little exercise with one of the deep research models on this, because I was thinking, you know, these training data models, they're only trained on written things, on words that have been written down.

Kunal Dalal: 99% of human history did not have written language, [00:46:00] and up to 99% of what we do isn't written down. We just talk, right? And so we're only capturing 1% of the human experience. Even if we capture 100% of everything that's ever been written down in the history of humankind, we're still probably only capturing 1% of what we are.

Kunal Dalal: And so there's no way that these models are gonna represent us as humans in the way that we would hope, in a complete way. They just can't. And so that, to me, is an understanding of the limitations of what these models are. They're just models. I always use the word: these are reflections, right?

Kunal Dalal: These are just reflections of what we put into it. My view of generative AI, and I sort of came up with this analogy maybe two months ago, is this reflective galaxy [00:47:00] of words, right? I put words out in the form of a prompt, and then those words go out into this AI galaxy and swirl around in its

Kunal Dalal: galactic soup of words. And then, based on what my words were and the words that it runs into in its swirl of galaxies, it spits out a new set of words. And those words are now what I see as my reflection, through this AI model, of what it is I'm trying to accomplish, the things I'm trying to do, what I'm trying to understand.

Kunal Dalal: So it's this reflection of me. Now, if I don't understand that that reflection is going through a galaxy of words that is not a galaxy of human experience, but rather a galaxy solely of written and digitized human knowledge, then I'm not understanding the value of what this process is, and what is written, and what is digitized.

Kunal Dalal: That, to me, is the step [00:48:00] where we start to ask questions of whose voices are being written down, whose voices are actually being codified, whose voices are we gonna be able to embed for future generations. And then I'll take another step back, and this might make you feel a little bit weird too, but I did this provocation the other day in one of my sessions, where I said, you know, Moana, she's from an indigenous Polynesian

Kunal Dalal: culture. They didn't have a written language. Every single solitary thing that they learned, did, understood: none of that's in any AI model, is it? Because it wasn't written down. These are large language models. They only work on what's written. And so I can complain that the indigenous Polynesian lens is missing.

Kunal Dalal: But what am I complaining about? It's something that's non-existent. It doesn't exist, and [00:49:00] the step where it doesn't exist, that's the step that we need to reflect on. That's the step where we say, hey, okay, the Portuguese colonizers came in and they removed all indigenous things. So now let me go back there, let me go study that, and let me go think about what that means for today.

Kunal Dalal: Right? This is, to me, an incredibly complex view. And to me, that is one of the biggest powers that AI has given us: it's given us this reflective lens where we don't get to bullshit ourselves anymore, pardon my language. We just don't get to mess with ourselves anymore, because this statistical, algorithmic robot is sending this stuff back to me.

Kunal Dalal: I can't, I mean, how am I gonna say that you've got some sort of agenda? You don't have an agenda. You don't care. You're just an algorithmic, reflective robot. And because of that, I am forced to look at my own, our own as a world, gaps and silences, right? My partner Wes, who you know well, he talks about the gaps and silences all the time.

Kunal Dalal: And that's what we are forced to look at. And so when we're talking from a technical perspective, I think the most important technical perspective is actually on our end: to understand the limitations that, frankly, these model builders are working with. They don't actually have a whole lot to work with.

Kunal Dalal: They only have, at most, 1% of anything useful that we've ever done at hand to even feed the models, let alone then work from there. And so it comes on us to sophisticate ourselves, I don't know if that's a word, but I'll use it, to sophisticate ourselves to make sure that we understand that this thing we are using gives us really cool information.

Kunal Dalal: And it sounds really interesting sometimes, and I use it in that way [00:51:00] all the time, but also it's coming from a knowledge base that is fractional to who we are. And it's an infant. It's barely been out in the world. I can't expect AI to be a sage at two years old, at three years old. It's a little baby right now. So that would be it. I hope that's not a sidestepping of your question, but that's sort of my focus on it.

Michael Conner: This is good, because I like the reclassification of algorithmic biases as embedded prejudices, and I think we need to be able to expand on that and unpack that with more depth and breadth. The example that you provided with regards to these trained models is compelling, and it has enhanced us to have this reflective lens, right?

Michael Conner: And. Absolutely. Absolutely. Because one thing that you talked about and highlighted is that these, this level of complexities that [00:52:00] we need to continue to investigate and interrogate, and that you stated it perfectly, that. We are still in the infancy stages of this level of implementation and you're so right about this.

Michael Conner: This algorithmic bot is not, you know, operating with strategic intent with regards to this embedded prejudice. But again, you know, we have to continue to look at some of these trained models and some of these data sets that, as you stated, are fractional. And to my audience, this is one of the answers.

Michael Conner: We use this as an asynchronous tool for professional learning, self-directed learning, i.e., pedagogy. And in this last statement or answer that you provided, there could be roughly six or seven different strategies that can be unpacked in high-level discussions and conversations. Because this conversation around bias, or embedded prejudice, within these specific [00:53:00] models, large language models, and the outputs from these generative tools is something that I'm seeing a lot of educators get

Michael Conner: frustrated with, right? And the frustration, as you're saying, is coming from where these models are pulling, extracting data from multiple areas as they're being trained. And the hallucinations that come up in the specific outputs of some of these tools are frustrating educators.

Michael Conner: And they're asking me, how do we eliminate bias? I'm like, that is a daunting task. That is a large-scale, daunting task that is going to take a multitude of years. That's like asking, how do we eliminate biases within instruction? How do we leverage culturally responsive practices? That was a part of my dissertation, and that was 15 years ago.

Michael Conner: We're still having the same discussion. I'm inferring that we're probably gonna have the same discussions when we talk about [00:54:00] embedded prejudices and biases within algorithmic models. But Kunal, last question, good brother. Last question. Now, I try not to contain individuals to only three words when they come on VFE.

Michael Conner: But I want you to be able to take this question as it is, and I'm sure you're gonna break the rules with this, because you're a disruptive innovator just like myself. But good brother, what three words, right, limiting you and I to three words, what three words do you want our audience to leave with regarding innovation and AI in the AC stage of education?

Kunal Dalal: The one word that I constantly think about is resilience. And this is coming [00:55:00] from sort of the tail end of what you just said, right before this, where teachers are asking you, how do we eliminate the bias? How do we eliminate the bias? Well, if we eliminated the bias, then we could also eliminate our critical thinking too, because there's no bias anymore.

Kunal Dalal: And so now we don't have to think critically, because there's nothing bad out there. So in a way, the biases are keeping our antenna up, right? Making sure that we're aware of what we're looking at, and it's more real that way, right? And so to me, that's resilience. It is knowing

Kunal Dalal: you might see something that's gonna make you feel like hell, or you might see something or read something that demonizes somebody else, or, you know, whatever it is, right? But then finding that resilience within you to be able to hang on. And also resilience in this moment. It is so uncertain. We are not built for this level of rapid pace and uncertainty. [00:56:00] So lean into resilience, that's step one.

Kunal Dalal: Don't run from it. Lean into resilience. Two, and I don't know if this is a single word or not, but open-minded. We are not in a world that resembles our world even five years ago. And listen to the words of my little five-year-old boy when he said, Papi, the world is changing so fast. If a five-year-old can see what is happening,

Kunal Dalal: this is like that Reddit thing: explain it like I'm five. Well, my five-year-old explained it to us. If my five-year-old can explain to us that this world is moving so fast, then let's all pause for a minute and take stock of that. It doesn't mean stop, it doesn't mean whatever, but take stock.

Kunal Dalal: And if you feel uncertain, if you're feeling unsteady, if you're feeling queasy, that's because that's what's happening. And so be open-minded [00:57:00] to the challenges and how much is shifting, but also be open-minded to the potential positives, the potential upsides, what's possible.

Kunal Dalal: That's it. And the last one, this is a word I think we all just always have to have, and that's joy. Joy, joy, joy, joy, joy. This is not a time to overanalyze things, because things are too uncertain. This is not a time to get mired in fear and angst and all of that. There's a little bit of grieving that we need to do, because there is a world that we're losing, and I don't think there's any way around that. But there's a world that we're gaining that we don't know.

Kunal Dalal: And we get to make that world how we wanna make it. And if we do it with joy, then our kids, our generation alpha, our generation beta, and us as the [00:58:00] stewards of that world, we all get to be joyful. So lean into joy. Don't be so serious all the time, folks. Lean into joy.

Michael Conner: Absolutely. Resilience, open-minded, and joy. Kunal Dalal, the first AI parent in the world. Thank you for your service, sir. Listen, if my audience wants to contact you, to have you in their district or corporation, what's the best way to get in touch with you? Your email, Kunal?

Kunal Dalal: kunal, K-U-N-A-L, at aiparentology.com. Just send me an email, get me connected.

Kunal Dalal: Go to my site, aiparentology.com. You can fill out a contact form there as well. You can see a little bit of what I'm offering and what my hopes are. You can see a little bit about the UAE workshop that we're doing in a couple of weeks. But yeah, go to [00:59:00] aiparentology.com or email me directly at kunal@aiparentology.com.

Kunal Dalal: That's it.

Michael Conner: Kunal, it was such an honor to have you on VFE. Appreciate you. Listen, I'm coming out to the left coast, coming to see you. Like I said, I've been busy, as you know, with AEG and this 22nd century model and my platform and AI and all that. But I gotta break bread with you, brother. And listen, let's make sure we connect on a weekly basis, just because iron sharpens iron.

Michael Conner: Trust me. You know, I learned so much from you. So from the bottom of my heart, brother, thank you. Thank you for coming on VFE.

Kunal Dalal: It has been my joy and my pleasure. Can't wait to talk again soon.

Michael Conner: Absolutely. And on that note, onward and upward, everybody have a great evening.