March 5, 2025

Episode 316: Ken Patterson

Spotify podcast player badge
Goodpods podcast player badge
Apple Podcasts podcast player badge
Amazon Music podcast player badge
Pandora podcast player badge
RSS Feed podcast player badge
Spotify podcast player iconGoodpods podcast player iconApple Podcasts podcast player iconAmazon Music podcast player iconPandora podcast player iconRSS Feed podcast player icon

 Why Transparency Matters Now More Than Ever with Ken Patterson 

In this hard-hitting episode of My EdTech Life, I sit down with Ken Patterson, a former principal turned AI consultant, who pulls no punches on how AI is really being used in schools. Ken saw firsthand how AI transformed a struggling school into a top performer, but the system pushed back hard instead of celebrating.

💡 Here’s what we dive into:
✅ How AI helped Ken’s school thrive—before the system shut it down.
✅ Why so many EdTech companies are misleading teachers about AI.
✅ The truth about "wrapper" AI sites—what they don’t want you to know.
✅ Why teachers should be leading AI adoption, not just tech departments.
✅ The missing piece in AI-driven education that no one is talking about.

Ken doesn’t hold back, and this episode is a must-watch for educators, administrators, and anyone who cares about the future of AI in education.

Timestamps:

00:00 Celebrating 100,000 downloads! Huge thanks to our community. 🎉
01:00 Shoutout to our amazing sponsors! Thank you, Yellowdig and Book Creator, for supporting this mission.
02:00 Meet Ken Patterson—his journey from music teacher to AI-driven principal.
04:00 Turning a failing school into a top performer using AI.
07:00 Why the system wasn’t ready for AI success—and how it pushed back.
10:30 The shady side of EdTech—Ken calls out misleading AI claims.
15:00 Should teachers use AI for instruction or just for admin work?
20:00 The problem with "wrapper" AI sites—what they’re not telling educators.
25:00 How AI can either empower or exploit teachers—depending on transparency.
30:00 Ken’s unfiltered take on the future of AI in education.
40:00 Final thoughts and a challenge to educators everywhere.

Special Thanks to Our Sponsors!

💡 This episode is powered by Yellowdig and Book Creator—thanks for supporting real conversations in EdTech!

👇 WATCH NOW and join the conversation!
💬 Do you trust AI in education? Why or why not? Drop your thoughts in the comments!

🔗 Follow Ken Patterson: LinkedIn
🎧 More Episodes: https://www.myedtech.life/

🚀 Stay Techie!

Peel Back Education exists to uncover, share, and amplify powerful, authentic stories from inside classrooms and beyond, helping educators, learners, and the wider community connect meaningfully with the people and ideas shaping education today.

Authentic engagement, inclusion, and learning across the curriculum for ALL your students. Teachers love Book Creator.

Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:00 - Introduction and gratitude for support

01:00 - Welcoming Ken Patterson to the podcast

02:00 - Ken's background in education and passion for kids

10:00 - Challenges faced as a first

15:37 - The role of AI in successful school management

25:37 - Initial hesitance towards AI in education

36:29 - Insights on teacher perspectives and AI utilization

46:29 - Integrity issues in educational technology

56:29 - Ken's vision for the future of education with AI

WEBVTT

00:00:30.077 --> 00:00:32.963
Hello everybody and thank you for my bad.

00:00:32.963 --> 00:00:33.326
Here we go.

00:00:33.326 --> 00:00:37.691
Hello everybody, and welcome to another great episode of my EdTech Life.

00:00:37.691 --> 00:00:45.862
Thank you so much for joining us on this wonderful day and, as always, wherever it is that you're joining us from around the world, thank you for all of your support.

00:00:45.862 --> 00:00:52.634
We appreciate all the likes, the shares, the follows, and, if you know, you know that you've seen that we hit a hundred thousand downloads already.

00:00:52.634 --> 00:00:58.731
So thank you all, as always, from the bottom of my heart, and to all our sponsors, thank you for your support as well.

00:00:58.731 --> 00:01:05.210
So big shout out to EduAid, to Yellowdig, to Book Creator and many more.

00:01:05.210 --> 00:01:11.709
Thank you all for believing in our mission and today I'm excited, as always, to bring you an amazing conversation.

00:01:11.709 --> 00:01:15.968
So today I would love to welcome to the show Mr Ken Patterson.

00:01:15.968 --> 00:01:17.691
Ken, how are you doing this evening?

00:01:18.319 --> 00:01:20.966
I am wonderful man, I'm wonderful.

00:01:20.966 --> 00:01:21.929
Thank you for having me.

00:01:21.929 --> 00:01:24.382
Thanks for what you do, Really thank you for what you do.

00:01:24.963 --> 00:01:25.766
Thank you so much.

00:01:25.766 --> 00:01:32.950
I really appreciate it, ken, and I'm just really excited to get to talk to you and just to hear a little bit more about the work that you are have you?

00:01:32.950 --> 00:01:51.808
You have been doing, I should say, because I know I've been following you on LinkedIn for a while now, over the past, you know, kind of year and seeing you grow, seeing you do a lot of great things and then, obviously, you know, bringing some great conversation into the LinkedIn space and really just getting a lot of people to kind of think about things differently, see things differently.

00:01:51.808 --> 00:01:54.040
So I'm really excited to have you here today.

00:01:54.040 --> 00:02:09.425
But before we get started with the meat of the conversation, for our audience members that may not be familiar with your work yet, can you give us a little bit of a background story and what your context is within the education space?

00:02:10.448 --> 00:02:14.264
Yes, so I'm going to make it short, but I am essentially.

00:02:14.264 --> 00:02:25.794
Most people think about me as I'm crazy about kids, right, I'm absolutely crazy about kids just because I think as long as they are here, we have hope, right, and so I'm relentlessly passionate about them.

00:02:25.794 --> 00:02:34.667
My email signature when I was in education was always unreasonably committed to kids, because I think we all should be, and so that's my framework.

00:02:34.667 --> 00:02:38.449
The last position I held in education was principal.

00:02:38.449 --> 00:02:42.187
So I started off as a music teacher and then I was living life and enjoying kids, having fun with kids.

00:02:42.187 --> 00:02:45.978
And then it, you know, I was living life and enjoying kids, right, having fun with kids.

00:02:45.978 --> 00:02:56.748
And then it's kind of like I was called up, like you know, you would run into some great administrators who said, man, you know, you got to do more, you got to do more and I was afraid of not being able to be around kids.

00:02:56.748 --> 00:03:05.902
So I became a principal, right around, like that was the highest I was going to go, like I needed to see kids every day.

00:03:05.902 --> 00:03:06.417
So I became a principal.

00:03:06.417 --> 00:03:12.608
I knew I wasn't going to go higher than that, but what happened was I got promoted from the classroom to assistant principal the year of COVID.

00:03:12.608 --> 00:03:15.026
So my promotion was online.

00:03:15.026 --> 00:03:29.014
I interviewed in my pajamas and then the next year, when we opened back up, I fell in love and that passion in that district I was in a very large district in love and that passion in that district I was in a very large district in Maryland and that passion they promoted me really quickly.

00:03:29.014 --> 00:03:37.703
So I was essentially an assistant principal for a year and so I became a principal and principal late, because you know, some things happened.

00:03:37.703 --> 00:03:38.366
It didn't happen there.

00:03:38.366 --> 00:03:39.572
So I got the gig.

00:03:39.572 --> 00:03:41.981
So it doesn't matter what happened, I got the gig.

00:03:42.021 --> 00:03:49.812
So I got the gig and my first year, uh, they were 18 empty classrooms out of 30 classrooms out of 18 empty classrooms.

00:03:49.812 --> 00:03:53.747
And I don't know about if you've been in education, right, but it's like a shortage.

00:03:53.747 --> 00:04:00.487
So I had to grab teachers off of um, uh, facebook basically, where I got like a teaching force from um.

00:04:00.487 --> 00:04:05.971
But that little red tag group man, we've done a little gritty group man I just said just come, show up every day for kids, we'll make it happen.

00:04:05.971 --> 00:04:11.187
Chat GPT came out that year and, like I was like this is it?

00:04:11.187 --> 00:04:17.526
This is it Like for principal, data analysis and speed is really what it's about, so I didn't use it.

00:04:17.526 --> 00:04:21.064
Long story short, the end of that year we got second out of the entire district of 125 schools.

00:04:21.064 --> 00:04:24.329
First year principal got second out of the entire district of 125 schools.

00:04:24.848 --> 00:04:35.043
First year principal 18 out of 30 teachers had two of them worked at Panera before I hired them, like we should not have had any success.

00:04:35.043 --> 00:04:39.682
But the AI kind of kicked in and I thought I said this is, we have found the golden spoon, like you know.

00:04:39.682 --> 00:04:56.334
So I was passionate about it and then I just think that education wasn't ready at the time, because you can't tell I'm passionate, ok, you can't tell I'm passionate, I don't know that the system was ready at the time for someone who was genuinely passionate and genuinely in love with kids and generally passionate about this new technology.

00:04:56.334 --> 00:05:08.872
And so as it stood, you know the system, I think honestly, I think they do their best, but you know, when you have this type of growth, it was problematic and so I became a consultant.

00:05:08.872 --> 00:05:11.329
I became a consultant reluctantly.

00:05:11.620 --> 00:05:17.411
I was not trying to consult, so I was trying to just tell people hey, man, this thing is it, man, this thing is it.

00:05:17.411 --> 00:05:20.699
And that's how I got started.

00:05:20.699 --> 00:05:24.021
And so I'm here, man, just really I like what you do.

00:05:24.021 --> 00:05:27.226
It's funny that you mentioned EduAid man, because Thomas is the one that told me about you.

00:05:27.226 --> 00:05:31.374
I don't know, I don't know if I ever told you that, but Thomas said, man like you gotta, you gotta, you gotta see my ed tech life.

00:05:31.374 --> 00:05:42.165
And you were the first podcast I listened to.

00:05:42.165 --> 00:05:46.327
Like when I was in my conference and spoke to a.

00:05:46.327 --> 00:05:52.752
It was a national conference and I just was speaking about passion for education, right, and like people were like, hey, you know, come talk to me.

00:05:52.752 --> 00:05:55.175
And I was like, oh crap, I got to have a business.

00:05:55.175 --> 00:06:52.139
So like that's why I say consultant, this kind of came in reverse, if that makes sense no-transcript.

00:06:52.220 --> 00:06:55.552
We help the students succeed, because that is your goal in mind.

00:06:55.552 --> 00:06:59.283
What was that reaction from people around your district?

00:06:59.302 --> 00:07:20.415
So I don't know that anybody expected that level of success and, I'll be honest with you, I don't think anybody expected I wasn't know that anybody expected that level of success and, I'll be honest with you, I don't think anybody expected I was expecting it, but I was willing to try anything right, like I felt like if I'm in the position, particularly as someone who has had great experiences and has had challenging experiences in school, if I'm in the position to make some sort of a difference, I'm going to do that.

00:07:20.415 --> 00:07:22.127
And so I used it.

00:07:22.127 --> 00:07:27.889
And what happened was I started experimenting with ChatGPT, really as a first year principal.

00:07:27.889 --> 00:07:31.483
Having only been a assistant principal for one year, you have to understand I'm like I had to learn it.

00:07:31.483 --> 00:07:34.071
And so I said, well, let me get ChatGPT to help me.

00:07:34.071 --> 00:07:34.880
I knew that it was.

00:07:34.880 --> 00:07:35.562
They were.

00:07:35.601 --> 00:07:42.935
When the internet first came in, when AI first came out, they were advertising it as like optimization, you know, and all that.

00:07:42.935 --> 00:07:47.105
So I said, well, I can optimize, like my home businesses or something like that, so I can learn principle.

00:07:47.105 --> 00:07:51.625
And then when I clicked, I was like, whoa, wait a minute, wait a minute, I can bring it here.

00:07:51.625 --> 00:08:02.262
And so I started like just doing all the data analysis and my staff is on board, like if you're a first year teacher and you have never taught you like, and all I got to do is show up every day and you can do all my data work.

00:08:02.262 --> 00:08:04.423
So you're like, and all I got to do is show up every day, you can do all my data work.

00:08:04.423 --> 00:08:05.165
All of my leaders.

00:08:05.165 --> 00:08:16.314
I had to use them and I was doing their work, essentially because I was getting success and I was early reaching out to like other principals and other friends and they were like oh, oh.

00:08:16.314 --> 00:08:22.024
At first it was really like you don't touch that, we don't know, we don't get in trouble.

00:08:22.024 --> 00:08:23.447
So I was like let's go use it.

00:08:23.447 --> 00:08:29.608
So if you know about AI, it's use cases, right, and it's actually technology in reverse.

00:08:29.608 --> 00:08:34.642
I literally needed like a fifth year principal to just tell me some stuff right, like I could have.

00:08:34.642 --> 00:08:38.432
They could wipe the floor clean with me on AI, but I'm here just like.

00:08:38.432 --> 00:08:41.349
So I had to borrow use cases from my own team.

00:08:41.349 --> 00:08:44.583
So, like all the leaders the math leader, the reading leader I'm just.

00:08:44.583 --> 00:08:46.486
I said I'm doing data analysis.

00:08:46.486 --> 00:08:48.490
They're like are you?

00:08:48.490 --> 00:08:51.674
Yes, I am, I'm learning AI, right.

00:08:51.674 --> 00:09:03.546
So I would do the data analysis and we flipped it to where, instead of them coming together in that collaborative planning and just doing data analysis, no, when they came together in collaborative planning, the answers were already the data was already analyzed.

00:09:03.546 --> 00:09:05.308
Now it's like, how do we apply this?

00:09:05.308 --> 00:09:09.321
And they were like, oh my gosh, this was so like you know.

00:09:09.380 --> 00:09:16.866
So that developed and developed to where nobody was doing anything but being with kids If it took you away from kids, give it to me is what I said.

00:09:16.866 --> 00:09:18.171
I had a secret weapon.

00:09:18.171 --> 00:09:20.562
I mean, in that time it was a free chat at GBT.

00:09:20.562 --> 00:09:32.701
You had to prompt it, you had to give it an identity and all that, and so that that did it, and so so, to fast forward, we were I'm a process guy, right, like I knew that data and all that in the school systems.

00:09:32.701 --> 00:09:35.328
Like I don't, I'm an integrity and process guy.

00:09:35.328 --> 00:09:39.063
You just show up every day, work hard and do right, do absolutely right by kids.

00:09:39.063 --> 00:09:40.547
Everything else will take care of itself.

00:09:41.368 --> 00:09:51.344
And so it happened and I was sharing some stuff with other principals because, you know, I'm like use this, use this, use this.

00:09:51.344 --> 00:09:52.327
Everybody was afraid, everybody was afraid.

00:09:52.327 --> 00:09:57.006
So the initial reaction was great job, like I got emails from the State Department and, like you know, superintendent had like shouted me out and everything.

00:09:57.006 --> 00:10:07.841
But when it kept going, like I think people were like wait a minute, like this guy's excited and he won't shut up about it and he's just because I couldn't understand why people just wouldn't use it Like it's free.

00:10:07.841 --> 00:10:08.804
It's free, I will.

00:10:08.804 --> 00:10:10.931
And then the custom GPTs came out.

00:10:10.971 --> 00:10:11.130
Right.

00:10:11.200 --> 00:10:11.321
Like.

00:10:11.321 --> 00:10:13.147
I'm like I will create you what it's free.

00:10:13.147 --> 00:10:14.740
It's free, no.

00:10:14.740 --> 00:10:16.743
So that was it.

00:10:16.743 --> 00:10:23.792
That was a crisis for me because I think at that time I realized education wasn't ready and there was a lot that went bad.

00:10:23.792 --> 00:10:26.975
Honestly, that I don't know.

00:10:26.975 --> 00:10:33.804
I don't blame anybody, but I think we are so accustomed to I mean, it's a hundred year institution, right Like we.

00:10:33.804 --> 00:10:39.143
We are not even aware of what we have been ingrained in and what we push out and what we do.

00:10:39.143 --> 00:10:39.844
It's just autumn.

00:10:39.844 --> 00:10:45.461
So a lot of it was that, but it was a brutal like it was, you know.

00:10:45.480 --> 00:10:49.568
So I became a consultant, you know, and then it was weird because I'm trying to tell principals to use it.

00:10:49.568 --> 00:10:52.187
It's like he's selling free stuff, like that's I was.

00:10:52.187 --> 00:10:53.561
It was like it's not free.

00:10:53.561 --> 00:10:55.988
It's chat, tvt, it's free.

00:10:55.988 --> 00:10:58.701
Well, if it's free, you're probably trying to get money.

00:10:58.701 --> 00:11:01.063
On the other end, we know about free samples.

00:11:01.063 --> 00:11:04.628
We go to Costco's and we know what a free sample is.

00:11:04.628 --> 00:11:05.369
You're trying to get rid of it.

00:11:05.369 --> 00:11:07.291
I'm not Like, how do you sell?

00:11:07.692 --> 00:11:09.734
I did not, I did not sell it.

00:11:09.734 --> 00:11:47.552
I just I wanted you to have it, I wanted you to have, I wanted to see what.

00:11:47.552 --> 00:11:55.504
Honestly, okay, if I'm being honest, I wanted to see what somebody who had been a principal for longer could do with it, like I had run out of stuff to do, so I wanted just to see what other people could do with it so I could get more ideas.

00:11:55.504 --> 00:11:58.874
I wasn't thinking about money, so I was like make money, make money.

00:11:58.874 --> 00:11:59.533
I was like, no, no, no.

00:11:59.533 --> 00:12:01.277
So that was the crisis.

00:12:01.277 --> 00:12:02.860
And then I said, well, let me charge a little bit.

00:12:02.860 --> 00:12:05.764
And then when I charged, it was like, oh, he's trying to make money.

00:12:05.764 --> 00:12:26.750
We knew he was just trying to make money no-transcript.

00:12:26.950 --> 00:12:29.254
Sometimes I feel that it's a mixed bag.

00:12:29.254 --> 00:12:46.774
Where it's, the teachers are driving it, but the tech departments maybe don't even know that this is being used, and so it's all over the place, many places, and I know that there's a lot of people with that enthusiasm that are like, well, if you haven't been using it, you're missing out and you're hurting kids and you're doing this.

00:12:46.774 --> 00:12:51.232
And I was like, well, there's also the other side that's like, hey, you know what?

00:12:51.232 --> 00:12:59.110
We just want to be very cautious, we want to make sure that what is being put out and what is being used is something that is going to be beneficial and ethical.

00:12:59.110 --> 00:13:07.143
And, of course, you take all of those considerations and then you've got, like you said, the speedboats that are no, no, no, move fast, break things, don't worry about it, we'll take care of it.

00:13:07.569 --> 00:13:17.144
And I'm more kind of like in the middle and like I people on the show, you know, just being a very cautious advocate, where sometimes I'll see the news and I'm like, all right, like yes, okay, this is going to help us move forward.

00:13:17.303 --> 00:14:00.078
And then all of a sudden, something happens and it's like, oh, you know, let me kind of like, okay, let's, let's slow it down and maybe it's just me overanalyzing things, but that's why I love having these conversations, because I get to amplify so many voices and so many experiences here, including your own, which I think is something that is very attractive to have a principal or former administrator using this and showing the success that they had, but then also getting your perspective that when you go out there and you visit with administrators people that were in the same roles, doing very similar things and are not you know that they're a little, very cautious, or maybe overly cautious, in seeing what this might be able to do for them, like it did for you.

00:14:00.178 --> 00:14:11.014
So I want to ask you now, from that moment on in 2022 to 2025, how are you feeling about AI and what it is that you're seeing through now that you're a consultant?

00:14:11.014 --> 00:14:22.052
In that consulting perspective, are there still a lot of people holding out and, through your experience, what might be some of the main reasons that they're holding out?

00:14:23.396 --> 00:14:28.391
So I'm going to, I'm going to be, I'm going to shock you, right, like I'm going to be honest, right, so you.

00:14:28.391 --> 00:14:45.141
So, so it's in reverse, the people who are holding out and they're holding out because they don't understand it or don't really see what it is, and all that If they are genuinely doing that, they are in the perfect position, and I mean this is going to be the wildest thing.

00:14:45.141 --> 00:14:57.476
Educators as a, as an industry, and not even industry education as as all of us, right, we are the most beautiful humans, like the absolute most beautiful humans.

00:14:57.476 --> 00:15:01.880
We internally care about kids and want to protect them.

00:15:01.880 --> 00:15:17.899
So something, and I will tell you this, something on the like, some intuition if you really do care about kids and you really are good at what you do, you, you Did, you would not have jumped on, or, if you did, it would have been reluctantly because it was presented wrong.

00:15:17.899 --> 00:15:19.389
And this is the issue.

00:15:19.389 --> 00:15:22.710
This is what I was afraid of, right, the dirty, dark, little secret.

00:15:22.710 --> 00:15:43.307
If I'm being honest with you, the year after I had all that success, it was total failure, right, let me be honest, right, and that's what made drove my guilt to be honest with you, made drove my guilt, to be honest with you, and that's what drove me like on this mad rush to make sure that everybody because I felt like I had I knew what the issue was and I didn't want it to to progress.

00:15:43.307 --> 00:15:44.572
And here's what the issue was.

00:15:44.873 --> 00:15:48.144
The first year, ai, I saw it as an adult thing.

00:15:48.144 --> 00:15:55.284
Right, I saw it as data analysis because I saw I still see teachers as the gifts of the world to education.

00:15:55.284 --> 00:15:58.646
So I use it to get everything out of the way of teachers.

00:15:58.646 --> 00:15:59.649
Teachers did not use ai.

00:15:59.649 --> 00:16:01.093
Like we had some ai stuff.

00:16:01.093 --> 00:16:02.687
They would come and like do lessons here and there.

00:16:02.687 --> 00:16:12.968
The older teachers like they actually the seasoned teachers, like the five or six I had, like that were really seasoned, were actually using it to bring the younger teachers up to speed, like so it was an adult thing.

00:16:12.968 --> 00:16:14.513
Definitely the kids don't have it.

00:16:14.513 --> 00:16:24.715
The next year, like we did get a grant from the district and they gave us a bunch of money to use it and so I gave it to the teachers to put inside the classrooms and they were doing all the wonderful things.

00:16:24.715 --> 00:16:27.529
They were up, you know, they were fortnighting lessons and things like that.

00:16:27.971 --> 00:16:31.644
But my data went way in the trash and I was I panicked.

00:16:31.644 --> 00:16:34.126
I said what is what is this?

00:16:34.126 --> 00:16:39.390
It dawned on me at the time at least, that AI is very abstract.

00:16:39.390 --> 00:16:40.150
It's math.

00:16:40.150 --> 00:16:41.972
It's basically math with words, right.

00:16:41.972 --> 00:16:47.697
So it's very abstract in how it presents things, but it does not present as pedagogy.

00:16:47.716 --> 00:16:55.001
As a teacher, our standards are aligned and they're like things that are connecting standards that we are not even aware of.

00:16:55.001 --> 00:17:06.798
So if you are a new teacher, yes, you could punch in something and a lesson comes out in 17 seconds, but it may be missing a key that went to the standard two days before or standard back there.

00:17:06.798 --> 00:17:08.265
It's basically a hotspot.

00:17:08.265 --> 00:17:27.239
So the kids knew that one plus one was two, but because the common core and because some of this other stuff is more conceptual in learning, while AI is abstract, my kids knew one plus one was two, but when the test came out and they had to draw the relationship between numbers, they did not do it and my teachers did not do it.

00:17:27.239 --> 00:17:30.855
So it was a mismatch with kids, at least in elementary.

00:17:30.855 --> 00:17:33.692
I was like crap and then I figured I know why it is.

00:17:33.692 --> 00:17:34.426
It is about.

00:17:34.426 --> 00:17:37.056
It's about collaboration and synthesis.

00:17:37.056 --> 00:17:41.150
So they were too young, they were still learning, like the kids have to learn the thing.

00:17:41.150 --> 00:17:44.394
The synthesis, without AI, for us, comes in middle school.

00:17:44.394 --> 00:17:45.650
Right, we're working in groups in middle school.

00:17:45.650 --> 00:17:49.192
High school, we're putting papers together by college, we're doing research.

00:17:49.192 --> 00:17:58.747
That's the synthesis adds as you go up Elementary, I was like oh no, so that really was the danger.

00:17:58.767 --> 00:18:02.136
And so I saw that that's where the entirety of K-12 started.

00:18:02.136 --> 00:18:12.513
So in the summertime I was like no, I was crashing out, as the kids would say on Facebook, and the principals groups like hey, you guys got to use this.

00:18:12.513 --> 00:18:22.810
No, no, I saw the teachers run into it and it to me that the teachers are kids at all, but it's like they're going to do ed tech stuff with it.

00:18:22.810 --> 00:18:28.956
Like, principals, please, superintendents, I was yelling, I was screaming, anybody would listen and everybody was silent.

00:18:28.956 --> 00:18:32.875
But the teachers were running with and I was like no, no, no, no.

00:18:32.875 --> 00:18:36.288
So that was the painful part for me in the summertime, like.

00:18:36.288 --> 00:18:39.030
But what I had to realize is it was really an issue of control.

00:18:39.030 --> 00:18:47.879
I can't control that, right, like, I wanted to be one man that like had the answer, you know, but I couldn't control it, so I just tried.

00:18:47.879 --> 00:18:49.501
So then I kept trying to get into systems.

00:18:49.501 --> 00:18:52.248
Right, edna Bay came and I'm helping systems out.

00:18:52.268 --> 00:18:59.051
But what was happening is the other thing about ai it has to land for all of us at the same time and then we apply it differently.

00:18:59.051 --> 00:19:00.316
It's not core.

00:19:00.316 --> 00:19:04.269
So if edivate was getting these, we were getting contracts right.

00:19:04.269 --> 00:19:17.313
But it's like if I just go and work with your special education department and just use ai there, it's going to offset, it's going to throw everything off, like it's I'm actually hurting adoption, like at some point it's going to come back and hurt, and so I.

00:19:17.633 --> 00:19:28.198
So I had that crisis and, and what happened recently, which is great is is now a couple of school districts are like hey, we get it, let's go, let's do it as a district.

00:19:28.198 --> 00:19:30.150
Everybody does it together.

00:19:30.150 --> 00:19:33.498
Now we apply it differently, but it makes teachers humans again.

00:19:33.498 --> 00:19:36.193
Right, because now teachers it's basically knowledge.

00:19:36.193 --> 00:19:37.869
In reverse, they have everything they need.

00:19:37.869 --> 00:19:39.869
So the teachers have always been a gift.

00:19:39.869 --> 00:19:41.174
They've always been a gift.

00:19:41.174 --> 00:19:43.092
Now it's us getting out of their way.

00:19:43.092 --> 00:19:52.205
It's what I did with it the first year Get out of the teacher's way, shut the door, let them do what they do, and we use AI to do all the data analysis and all that stuff like that.

00:19:52.205 --> 00:19:55.849
So now I'm full circle because I see I think it's picking up.

00:19:55.849 --> 00:19:58.673
I think it's picking up, I think people are getting it.

00:19:59.153 --> 00:20:00.875
Ed tech is not out of the picture.

00:20:00.875 --> 00:20:06.260
It's just that ed tech, like teachers, kind of moved to aiding.

00:20:06.260 --> 00:20:16.472
Right, we're aiding, assisting more so than dominating, and I think that identity is what ed tech has to take in order to really, like you know, be relevant in the new era, right?

00:20:16.472 --> 00:20:20.098
So we want teachers to really be autonomous and we really believe in teachers.

00:20:20.098 --> 00:20:24.006
We've got to let them be them.

00:20:24.006 --> 00:20:25.645
That that that's that to me, that's.

00:20:25.645 --> 00:20:26.406
I hope that helped a little bit.

00:20:26.406 --> 00:20:27.987
I know, yeah, yeah, no, no, no, of course.

00:20:28.346 --> 00:20:57.738
Which kind of leads me to to my next question and, just to be open and honest, I follow you on LinkedIn and, of course, on LinkedIn I did see a post you know you posted up recently I saw it yesterday and it gained a lot of traction and it really just highlights the concerns about edtech platforms just repackaging, you know, themselves, you know, and just really taking those chat, GPT, AI models and again talking about those wrapper platforms and so on.

00:20:57.738 --> 00:21:01.658
So I want to ask you, you know what are your thoughts on that?

00:21:01.658 --> 00:21:12.083
You know, right now we're talking, you know how you use ChatGPT on its own, but, as you know, you know, there are several platforms out there, very popular ones, that are just really full on.

00:21:12.143 --> 00:21:12.824
Can I call them out?

00:21:12.824 --> 00:21:13.804
I'm going to call them out.

00:21:13.804 --> 00:21:15.165
I'm going to call them out.

00:21:15.165 --> 00:21:16.548
I don't owe nobody anything to kids.

00:21:16.607 --> 00:21:21.016
I'll kids all right, let me just say right, I'm sorry.

00:21:21.016 --> 00:21:25.328
I let me just say rapper sites, that the okay.

00:21:25.328 --> 00:21:28.954
Let me, let me say it this way, that to me, integrity is first.

00:21:28.954 --> 00:21:42.852
Integrity is first I, I, when I say rapper sites, it is okay to say we use chat, gbt and we're making it easier for you as a teacher, because it's a lot of learning curve.

00:21:43.092 --> 00:21:57.096
But here's what water is, water is H2O, like that's water, we're ice, right, we're the ice company, right, and if that's how we teach, so if I don't make that absolutely clear, there's a little bit of deception to me.

00:21:57.096 --> 00:21:59.309
Like you've got to be absolutely clear, especially for teachers.

00:21:59.309 --> 00:22:00.855
They're exhausted.

00:22:00.855 --> 00:22:01.786
I've been a principal.

00:22:01.786 --> 00:22:04.315
Like they're exhausted, I did everything I could do to help them out.

00:22:04.315 --> 00:22:06.126
They're exhausted.

00:22:06.126 --> 00:22:09.625
They don't deserve to have to look at fine print, they don't deserve to have to do that.

00:22:09.625 --> 00:22:11.088
So you've got to be honest.

00:22:11.088 --> 00:22:38.664
So in me I was giving for free, I was literally for free, no-transcript.

00:22:38.664 --> 00:22:55.132
But when you don't tell teachers who are being introduced for the first time to an LLM what an LLM actually factually, without question, is, to me that's the beginning of deception and I can't trust nothing else that happens.

00:22:55.744 --> 00:22:55.925
Now.

00:22:55.925 --> 00:23:00.163
Rapper sites are great if it's like you know, like I go to 7-Eleven.

00:23:00.163 --> 00:23:02.712
I don't know, you know I go to 7-Eleven because I don't feel like cooking.

00:23:02.712 --> 00:23:09.552
So you know, I know that, like they, probably, you know, I'm probably paying three times or whatever, but they're honest about it, right?

00:23:09.552 --> 00:23:14.173
So, uh, educate, I know that you, I know that I'm not.

00:23:14.173 --> 00:23:16.657
I get paid nothing from nobody.

00:23:16.657 --> 00:23:22.444
I want to be very, very clear.

00:23:22.444 --> 00:23:23.768
I am talking about integrity versus not integrity.

00:23:23.768 --> 00:23:26.112
Thomas I talked to, like I call him T squared, t squared, tom Tom.

00:23:26.112 --> 00:23:27.775
So they.

00:23:28.155 --> 00:23:38.199
What I loved about them, what warmed my heart, was that they are very clear about what it is.

00:23:38.199 --> 00:23:43.096
They are like this is an LLM, this is chat GBT or this is whatever.

00:23:43.096 --> 00:23:45.691
This is what we do to help you.

00:23:45.691 --> 00:23:51.094
We are in, we are in education and we're going to make it easy, we're going to make the bridge easier for you to move into this.

00:23:51.094 --> 00:23:54.193
There is no, no question as to what's going on.

00:23:54.193 --> 00:24:02.424
Every time I was on the internet, people who were part of like magic school I'm going to call them out magic school and I'm only being fair because I reached out to them several times.

00:24:02.424 --> 00:24:11.310
I went off the last couple of days because somebody inboxed me and said I was doing it for clicks and likes and that is low.

00:24:11.310 --> 00:24:11.992
That is low.

00:24:13.025 --> 00:24:17.615
So EduAid spends a lot of time being very honest about what they are and what they do.

00:24:17.615 --> 00:24:20.589
And I can, I don't when I figure that out.

00:24:20.589 --> 00:24:24.446
I just I left them alone because I can trust anything that they do at this point, right, like what.

00:24:24.446 --> 00:24:35.288
On the flip side, if you obscure that, because teachers don't know what LLMs are, they don't know what AI is, they don't understand that, like you're really just prompt, pre prompting, I'm just literally pre prompting.

00:24:35.288 --> 00:24:39.432
They don't understand it, but they could appreciate it if you're honest.

00:24:39.432 --> 00:24:41.292
So there are a lot of sites.

00:24:41.292 --> 00:24:53.701
I don't even know all the AI sites, but if AI sites are honest about what they are and what they do and they are really giving you their expertise to unlock your expertise and they are closing that bridge and fixing that bridge for you, that's perfect.

00:24:53.701 --> 00:24:54.525
I think that's perfect.

00:24:54.525 --> 00:24:58.016
I think that's perfect, like because I think it's perfect because AI is personalized.

00:24:58.016 --> 00:25:06.977
But if I have to hide or pretend or obfuscate or say, well, sometimes we use this, sometimes we use that, we can't do that Like, we can't do that, right, you can't do that.

00:25:06.977 --> 00:25:09.244
So that to me that hurts teachers.

00:25:09.244 --> 00:25:11.807
That assumes that they are not intelligent.

00:25:11.807 --> 00:25:18.692
It takes advantage of what they don't know instead of helping them understand, and we don't have any place for that in the field of education.

00:25:19.313 --> 00:25:23.757
Ed tech is going to be strong in this new era right, because ed tech still has this place.

00:25:23.757 --> 00:25:29.801
But if ed tech wants to slide over it and pretend to be something, that's going to be a problem, right, but there may be some AI.

00:25:29.801 --> 00:25:36.849
Ai gives you an opportunity to be more specialized, even right.

00:25:36.849 --> 00:25:41.071
So I see AI sites for, like, the revolutionary war right, like you know, you might go to this.

00:25:41.071 --> 00:25:42.609
You know it gives you a chance to be specialized.

00:25:42.629 --> 00:25:43.713
But you have to be honest.

00:25:43.713 --> 00:25:49.497
We are missing integrity and it was not our fault, it was allowed.

00:25:49.497 --> 00:25:51.673
Our entire nation is going through the same thing right now.

00:25:51.673 --> 00:25:53.629
Right, we're just a microcosm.

00:25:53.629 --> 00:25:54.753
Be honest.

00:25:54.753 --> 00:25:59.709
We have to be true, because all of that has torn us apart.

00:25:59.709 --> 00:26:02.214
It has been things that have been hidden and the bureaucracy and all that stuff.

00:26:02.214 --> 00:26:04.346
Thankfully, I think it's going away.

00:26:04.528 --> 00:26:07.695
But if we're in ed tech, it's got to be about honesty and truth.

00:26:07.695 --> 00:26:10.828
No one should be confused about what we have behind the scenes.

00:26:10.828 --> 00:26:20.917
A rapper's side is fine, as long as they know that, hey, I can go right to magic school and get a recipe for my dinner tonight right in that second grade English slot.

00:26:20.917 --> 00:26:25.412
Teachers didn't know that and when I would say that, they would tell me that I'm wrong.

00:26:25.412 --> 00:26:32.266
So I said you don't, that means they're not, it's not being clear to them or it will be deleted, like, like there was some other stuff going on.

00:26:32.626 --> 00:26:37.434
So I reached out, cause I said you know, I know that a deal wouldn't do this, like maybe he doesn't know whatever.

00:26:37.434 --> 00:26:39.960
I reached out, I reached out and I got it's the swarm.

00:26:39.960 --> 00:26:42.911
The swarm came and the swarm to me, honestly, are.

00:26:42.911 --> 00:26:54.754
I think those educators are, are great people, like, like, anybody who wants to innovate, to fix education is a great human.

00:26:54.754 --> 00:26:57.858
But I think that they may have been misled.

00:26:57.858 --> 00:26:59.000
Also, they don't know.

00:26:59.000 --> 00:27:07.529
I'm waiting for the day that we're just all honest with each other and our teachers are able to be special with kids and we can get out of their way.

00:27:07.529 --> 00:27:09.531
That's what it is, yeah.

00:27:10.066 --> 00:27:17.989
And that brings me also to listening to you to where I have been, even since 2022,.

00:27:17.989 --> 00:27:27.361
What happened for me was that I was taking a doctoral course and the 2022, that November, had just come out Chad GPT.

00:27:27.361 --> 00:27:28.849
I was playing around with it.

00:27:28.849 --> 00:27:30.285
I was like, oh my gosh, this is amazing.

00:27:30.285 --> 00:27:38.146
And then I had to do research on AI and I'm talking about, you know, from 2020 on, you know even prior to that.

00:27:38.647 --> 00:27:50.086
And then, when I got to the section of learning what data rentiership is, is all that data, everything that they collect, what they do really essentially is they're you know, that's how they're making money.

00:27:50.086 --> 00:27:56.965
You know a lot of the platforms and that's why they said if it's free, then you are the product.

00:27:56.965 --> 00:28:08.115
And many times we do not get a lot of clarity in the terms of service as far as how this is going to be used, how those clicks are going to be used, how the student data is going to be used.

00:28:08.115 --> 00:28:15.935
They always you know they and they do their due diligence in wording it, and I know that you know many say like, well, we put guardrails on this.

00:28:15.935 --> 00:28:34.491
My stance sorry, real quick my stance has always been how can you put guardrails on something that you don't own, you're plugging in to something else other than your programmers, saying, if this, don't do that and then just to you know kind of control that output.

00:28:34.491 --> 00:28:37.598
So yeah, that's where I'm at on that.

00:28:37.704 --> 00:29:31.100
And the knowledge cutoff dates too on a lot of platforms that I've seen it in some, where their knowledge cutoff date is, you know, 2023.

00:29:31.100 --> 00:29:32.872
We're already 2025.

00:29:32.872 --> 00:29:39.067
Now they either haven't updated their terms of service and privacy or maybe that's really where they're at, you know.

00:29:39.067 --> 00:29:48.218
So there's still a lot of stuff there that for me, I want to cheer them on, but when I read those things I'm like I don't know if I'm there yet.

00:29:48.218 --> 00:29:49.989
You know I'm seeing it.

00:29:49.989 --> 00:29:58.833
But again, that transparency and being open about it sometimes, that that is where I'm kind of like in the middle all the time and just very cautious.

00:29:59.654 --> 00:30:01.076
And you should be like.

00:30:01.076 --> 00:30:04.061
So I think, I think so.

00:30:04.061 --> 00:30:06.402
We need the like, the ones that run out front.

00:30:06.402 --> 00:30:07.042
Right, that's me.

00:30:07.042 --> 00:30:08.042
Right, I'll.

00:30:08.102 --> 00:30:13.710
The first time I hear about something, I'm running after it, I don't care, right, like, you need people like that, but you also need to listen to people like that.

00:30:13.710 --> 00:30:14.874
That that's the other issue.

00:30:14.874 --> 00:30:19.673
Right, you got to listen to them because if they they've had some success, I started realizing something was off.

00:30:19.673 --> 00:30:26.457
When you're asking me for data, and I gave you the data that it didn't matter, right, like, like, like, what, what else do you?

00:30:26.457 --> 00:30:29.573
I said I started with the data that the school system is failing.

00:30:29.573 --> 00:30:30.456
That's the data I started with.

00:30:30.456 --> 00:30:33.398
Like, I don't need any other data, then we're failing, right, and so.

00:30:33.398 --> 00:30:38.631
So there was a little bit more like I was starting to learn a little bit that didn't know.

00:30:39.553 --> 00:30:44.060
And education, and you know this, in education, the layers are thick in between.

00:30:44.060 --> 00:30:47.715
So teachers really don't know what goes on in the second layer.

00:30:47.715 --> 00:30:53.034
Principals, really, I'll tell you, do not know what's going on above them, right, like I owe no one but kids.

00:30:53.034 --> 00:30:54.650
So when I talk, I'm telling the truth.

00:30:54.650 --> 00:30:56.154
So they don't know.

00:30:56.154 --> 00:30:58.086
And that's by design, right, that's by design.

00:30:58.086 --> 00:30:58.307
So.

00:30:58.307 --> 00:31:01.535
So what was happening was now.

00:31:01.535 --> 00:31:05.739
If you see me looking down at teachers, looking at them, now I see, knowing what I know, ai.

00:31:05.739 --> 00:31:08.954
They're not getting direction and they are trying their hardest.

00:31:08.954 --> 00:31:12.067
Like I'm not trying to be like emotional, but like these teachers are trying their hardest.

00:31:12.067 --> 00:31:18.865
Man like and I know what it is and every teacher that has left these teachers are not bad people like.

00:31:18.865 --> 00:31:20.927
They left because they cared.

00:31:20.927 --> 00:31:26.951
So if you have any like hesitation, it's probably because something in your soul says this is not it.

00:31:26.951 --> 00:31:28.731
This something is off about this.

00:31:28.731 --> 00:32:00.335
Right are lots of things that can be in the way of success that are intentionally placed there Data privacy if you know anything about AI, you just cut and paste the internet policy that you had, right.

00:32:00.595 --> 00:32:02.159
You don't need to do much more than that.

00:32:02.159 --> 00:32:15.656
And, as a matter of fact, if you don't clarify that you're using a wrapper site, then you get the opposite, because then teachers trust an AI site that's third party because they think in their brain we've been looking at the educational site for our entire careers.

00:32:15.656 --> 00:32:21.950
You don't know, if you don't tell them the truth, they'll put more information in AI just because they.

00:32:21.950 --> 00:32:27.671
I hope that makes any sense, but it's just by doing it the opposite, you're hurting the teachers.

00:32:27.671 --> 00:32:29.150
They have already gone through enough.

00:32:29.150 --> 00:32:30.835
They've gone through enough, like, stop it.

00:32:30.835 --> 00:32:38.027
So what I think we have to have a discussion about and I think it's fair is, if we're honest, that's how I know when that magic school thing happened.

00:32:38.027 --> 00:32:39.428
That's how I know when that magic school thing happened.

00:32:39.428 --> 00:32:40.710
That's how I know people are not being honest.

00:32:40.710 --> 00:32:43.353
If we're honest about data privacy, you would have checked that already.

00:32:43.353 --> 00:32:55.517
If you're marching around here as an AI ethicist, ai governance, whatever you would have known, or unless you don't know AI, either you don't know AI or you would have known that that is not a safeguard.

00:32:55.517 --> 00:32:59.096
And why are we guarding anything?

00:32:59.096 --> 00:33:01.072
We should be guarding things away from kids.

00:33:02.305 --> 00:33:04.192
Today I saw for the first time and I'm going to shout after I say this.

00:33:04.192 --> 00:33:09.371
Today I saw for the first time and it warmed my heart and I'm trying not to be emotional man, like, like.

00:33:09.371 --> 00:33:10.133
I'm passionate.

00:33:10.133 --> 00:33:21.762
I'm passionate because because I know in my community, as a minority I'll be honest with you I know what not having access can do and perpetuity, right.

00:33:21.762 --> 00:33:23.465
I know what not having access can do in perpetuity, right.

00:33:23.465 --> 00:33:28.495
I know what not having access can do, and so I felt that this was intentional.

00:33:28.884 --> 00:33:35.133
And so today I saw it was an educator in Africa and he said, like he's a proud gatekeeper.

00:33:35.133 --> 00:33:38.990
And I immediately was like, no, no, the gatekeeper, because I had been trying.

00:33:38.990 --> 00:33:40.691
Nobody was giving Ken any attention.

00:33:40.691 --> 00:33:42.451
Right, because I know what it is now.

00:33:42.451 --> 00:33:45.393
Now, it's because Ken was not in it for anything wrong.

00:33:45.393 --> 00:33:46.726
I was in it for you, right so like.

00:33:46.726 --> 00:33:50.807
You can't have Ken around if you're trying to steal money from people, right so like.

00:33:50.807 --> 00:33:51.993
But this guy called gatekeeper.

00:33:51.993 --> 00:33:54.454
I was like, no, the gatekeeper's kept Ednavate out.

00:33:54.454 --> 00:34:06.282
So I almost like, in that world they see gatekeeping as protection, protecting the kids from the bad stuff.

00:34:06.282 --> 00:34:08.572
That's the gate they keep In America.

00:34:08.572 --> 00:34:12.782
It seems like the gate we're keeping is protecting the bad stuff from the good people.

00:34:12.782 --> 00:34:16.579
And so when he said that he says I'm a proud gatekeeper, I got it.

00:34:16.579 --> 00:34:18.616
I got it.

00:34:19.090 --> 00:34:24.842
We need gatekeepers, the real ones, the ones that make sure that teachers should not have to read fine print.

00:34:24.842 --> 00:34:28.230
They should not have to feel like they're going to get in trouble.

00:34:28.230 --> 00:34:31.840
If they do, they don't even need, they don't need the ed tech.

00:34:31.840 --> 00:34:53.050
They need, like, the honest ed tech because you need kids, you need to be able to see if kids are learning, yes, but they don't deserve to have to sift through more paperwork to figure out what they, what they could put in there, what they can't they, they don't do that.

00:34:53.070 --> 00:35:00.594
That's why I said and this is it enterprise when I, when I put out the um, the evidence of things unseen, or the evidence of things not, saying that was a a new era of framework for education based on what I know is truth.

00:35:00.594 --> 00:35:00.795
I, I don't.

00:35:00.795 --> 00:35:02.358
I know it's truth, I don't, I know it's truth.

00:35:02.358 --> 00:35:03.960
And what it does is it it?

00:35:03.960 --> 00:35:05.143
It builds the teacher.

00:35:05.143 --> 00:35:15.215
We have so much talent outside of the field of education that just simply could not stand one more day of their dignity being ripped apart.

00:35:15.215 --> 00:35:18.532
Right, and we've, we've pushed them out and we've made them look bad.

00:35:18.532 --> 00:35:21.858
They may be the best, the best force.

00:35:21.858 --> 00:35:23.782
Right, they may be the best force out there.

00:35:23.782 --> 00:35:30.396
So the ones that are remaining, that are holding on tight, you don't lie to them, you do not take advantage of them and stop treating them like children.

00:35:30.789 --> 00:35:38.930
So, when the AI policies and governance and all these people are marching around with it, that was I had been with kids as a principal.

00:35:38.930 --> 00:35:45.056
I, every day, I had to shield my teachers from stuff I knew did not make sense.

00:35:45.056 --> 00:35:46.681
I would get in trouble.

00:35:46.681 --> 00:35:48.755
Actually, I won't get in trouble because this is all fake.

00:35:48.755 --> 00:35:49.980
I will not get in trouble.

00:35:49.980 --> 00:35:50.831
You're not getting rid of me.

00:35:50.831 --> 00:35:56.054
I knew that all I had to do was love my babies, and if I loved them babies, the parents would stand up for me.

00:35:56.054 --> 00:35:56.615
I knew that.

00:35:57.197 --> 00:36:02.414
So if there was something that didn't make sense, that robbed teachers of dignity, I did not bring it to my building.

00:36:02.414 --> 00:36:08.764
I did not bring it to my building, and all I want is that for the entirety of education.

00:36:08.764 --> 00:36:20.563
And so when I see these things, when I see things like data privacy and this and that, and oh, we got to do governance, we got to put guardrails, we got to do all that, I feel like it's limiting.

00:36:20.563 --> 00:36:30.624
I feel like it's intentionally limiting, because none of these words occur in other countries and none of these words are priorities in private schools in our own nation.

00:36:30.624 --> 00:36:39.262
Let teachers be human, use AI to clear the forest and let them shut their doors and be great.

00:36:39.262 --> 00:36:40.250
That's what AI is for.

00:36:40.250 --> 00:36:41.231
I'm sorry.

00:36:42.554 --> 00:36:45.039
That is very well stated and very well said.

00:36:45.039 --> 00:36:47.603
I'm passionate man.

00:36:47.603 --> 00:37:02.097
No, no, no, and that's great, and that's what we love here, and that's what I love about doing this show is the fact that it's I am right in the middle and I give equal time to both sides, and, again, it's just to continue these conversations.

00:37:02.097 --> 00:37:47.052
Sometimes, you know, maybe there's somebody there like like I reached out to you because of what you shared and I was like I gotta have ken, because I need somebody that is again like you right now, that it has nothing to fear, and you're just being open and honest with your observations because, like we were talking right now, even myself, in my current role and in any role, it's like there are very few and many people know this, and I've talked to those people and I said that there are probably just a handful of platforms or apps that I will directly stand behind because of their openness, transparency, genuineness, authenticness and authenticity, I should say, and about them and what they're doing, and that they're very open about it.

00:37:47.052 --> 00:37:52.695
There are many other apps, too, where I'm just like, well, I'll say I've seen stuff that comes up.

00:37:52.735 --> 00:37:55.346
You know, sometimes it's like, hey, I got this weird answer.

00:37:55.346 --> 00:38:00.159
This student got this weird answer and I was like, well, this shouldn't be happening.

00:38:00.159 --> 00:38:13.239
You know why should a teacher have something else to worry about on their plate when they put this application on there and the student gets a wrong answer or something that is harmful or gets, you know, even a lot of the image creators.

00:38:13.239 --> 00:38:35.541
When you type in certain words there still hasn't been a fix for that images, we, even when I try to create myself and then I put their stocky Hispanic male, it always puts me with a beard and a mustache, you know, and I'm like, and that's because I put like, no beard, no mustache, and I always look the same, you know.

00:38:35.541 --> 00:38:42.911
So it's a lot of those things that I'm concerned about, and worried about because, yeah, maybe some people, and sometimes I think like, am I overthinking this?

00:38:42.911 --> 00:38:49.110
But no, it's like you're saying that intuition that I feel like, wait a minute, like this isn't where it needs to be yet.

00:38:49.451 --> 00:38:55.764
So maybe we need to hold off a bit on that and let's see which ones are really working and are doing like you said.

00:38:56.470 --> 00:38:58.057
So you're right, you're right.

00:38:58.057 --> 00:38:59.293
But here's where we lean in.

00:38:59.293 --> 00:39:01.400
It's opposite, it's everything in reverse.

00:39:01.400 --> 00:39:08.041
This is I tell teachers and I tell educators, literally everything in reverse, everything in reverse.

00:39:08.041 --> 00:39:09.326
That's my, that's my thing.

00:39:09.326 --> 00:39:10.594
That slipped your thinking.

00:39:10.594 --> 00:39:12.420
Just just try, just flip your thinking.

00:39:12.420 --> 00:39:17.295
They're large, I call them large learning models, but they're large language models.

00:39:17.295 --> 00:39:21.315
I'll use L for learning just to kind of give teachers understanding that it does learn right.

00:39:21.755 --> 00:39:23.239
So it also wants to be corrected.

00:39:23.239 --> 00:39:24.061
It can be corrected.

00:39:24.061 --> 00:39:25.664
I had the same issue you had.

00:39:25.664 --> 00:39:27.213
I typed in on a Father's Day.

00:39:27.213 --> 00:39:30.422
I said good father tying a tie for a black son.

00:39:30.422 --> 00:39:36.409
And it was a white dad tying a tie for a black child, Not that white, that's not what I'm saying.

00:39:36.409 --> 00:39:45.021
It's just assume that good father meant white male, and if a son needed a tie to be tied, it's a black male.

00:39:45.021 --> 00:39:49.619
So because at first I said father tying, you know, good father tying tie, and it was too white.

00:39:49.619 --> 00:39:51.016
Then I said no, black son.

00:39:51.016 --> 00:39:52.476
Then it was a white father, black son.

00:39:52.476 --> 00:39:55.451
I said it didn't even put the two together, you know.

00:39:55.471 --> 00:40:00.418
So as you use it and you fix it and you do that, we need to lean in, because the more we lean in, the more it's learning.

00:40:00.418 --> 00:40:07.291
It learned what we had put in in the beginning.

00:40:07.291 --> 00:40:08.753
So everything that we gave it was a snapshot of who we were.

00:40:08.753 --> 00:40:12.802
If we want to be better now, we should see it being better.

00:40:12.802 --> 00:40:21.514
Like you may put something in Los Angeles, I may put something in Maryland, if we're both fixing chat, gpt and correcting no, not that, no, not that.

00:40:21.574 --> 00:40:25.021
This it's learning and collectively we're building it.

00:40:25.021 --> 00:40:38.336
So, really, if you think about it in reverse and this is this sounds really crazy, I know, but we almost can tell how much better we are as a society if everybody's using it and we're seeing better outputs, because it's going to do what it's learning from us right now.

00:40:38.336 --> 00:40:38.898
It's it's.

00:40:38.898 --> 00:40:40.181
It has what we had.

00:40:40.181 --> 00:40:41.371
It has what we were.

00:40:41.371 --> 00:40:45.501
That's why we have to use it, and educators especially have to use it, because we might have it.

00:40:45.501 --> 00:40:48.099
We're actually the ones you think the robots are teaching the kids.

00:40:48.099 --> 00:40:49.396
No, we need to teach the robots.

00:40:49.396 --> 00:40:50.829
They're not robots, by the way.

00:40:50.829 --> 00:40:57.141
We need to teach them and we need to teach the kids right so that when they use it, it's building the LLM the right way.

00:40:57.141 --> 00:40:59.090
It's building the right way.

00:40:59.912 --> 00:41:04.456
And that kind of brings me to, kind of my last question, as we kind of start wrapping up.

00:41:04.456 --> 00:41:21.615
I want to ask you, you know, for you, how can we better this education scape, you know, as far as the collaboration with, maybe, teachers and ed tech companies, you know, should there be?

00:41:21.615 --> 00:41:36.322
You know, I think many times it's there, there's companies that have been started by educators, but then, at the same time, it's like there's some that haven't and are still being put out there and maybe they don't have that experience and they give you what they think the teacher wants.

00:41:36.322 --> 00:41:41.623
So, for you, what might your suggestions be as far as your perspective?

00:41:41.623 --> 00:41:43.597
What do we need to do to make this better?

00:41:43.597 --> 00:41:45.854
So, what?

00:41:45.934 --> 00:41:47.217
I've thought a lot about this, right?

00:41:47.217 --> 00:41:56.835
So the reason why and you've got to look at that framework the reason why that framework is so complete and so pure is I sat as a principal.

00:41:56.835 --> 00:41:59.820
I, that is a.

00:41:59.820 --> 00:42:03.892
That is a very unique position because you know everything, like.

00:42:03.892 --> 00:42:04.854
You know how everything works.

00:42:04.854 --> 00:42:14.521
You don't know what they're doing because things are hit, but I, I can 100 tell you what a friday school lunch in elementary school has to do with the budget at the top, right?

00:42:14.521 --> 00:42:17.351
So that's what ai actually does.

00:42:17.452 --> 00:42:22.081
So, to begin, there cannot be any silo discussions, right?

00:42:22.081 --> 00:42:27.657
That's the issue that started it, where teachers ran to it because they heard it's not going to take your job.

00:42:27.657 --> 00:42:29.215
Ai is going to take your job.

00:42:29.215 --> 00:42:30.512
It's somebody who uses AI.

00:42:30.512 --> 00:42:31.237
Whatever the saying is.

00:42:31.237 --> 00:42:33.537
I was looking like no teachers.

00:42:33.537 --> 00:42:34.773
You guys deal with kids.

00:42:34.773 --> 00:42:37.367
You're never losing your job.

00:42:37.427 --> 00:42:39.597
Anybody that tries to put robots in a classroom is going to fail.

00:42:39.597 --> 00:42:42.675
It's going to fail, right, they tried something somewhere else is going to fail.

00:42:42.675 --> 00:42:47.282
So what we need to do is it needs to be, always need to be diverse groups.

00:42:47.282 --> 00:42:48.144
That's the point of AI.

00:42:48.144 --> 00:42:53.378
Ai is trying to pretend to be collaboration.

00:42:53.378 --> 00:42:54.981
That's what it's pretending to be.

00:42:54.981 --> 00:42:58.276
Every piece of technology is pretending to be something.

00:42:58.276 --> 00:43:05.811
We just did not know what AI was pretending to be, because we do not collaborate, we don't have like, we don't naturally do that anymore.

00:43:05.811 --> 00:43:09.117
The internet is fast communication between computers.

00:43:09.117 --> 00:43:13.335
Ai is like internet in 3D it's collaborating with information.

00:43:13.335 --> 00:43:14.838
Ai is what you don't even need.

00:43:14.838 --> 00:43:16.731
I kept saying AI is that tech and people are like.

00:43:16.731 --> 00:43:17.293
This guy is weird.

00:43:17.293 --> 00:43:18.255
It's not tech.

00:43:18.255 --> 00:43:19.199
I bet you're figuring it out now.

00:43:19.199 --> 00:43:21.251
Ai is what happens when we collaborate.

00:43:21.251 --> 00:43:26.278
When three people get together and start sharing ideas, all of a sudden, it seems like we all get smarter.

00:43:26.278 --> 00:43:34.351
Or when I was a ap, my joke was I was a seventh grade assistant principal and I had three kids in detention that were like I was like, no, they should not be together.

00:43:34.351 --> 00:43:35.615
They're going to be very creative.

00:43:35.615 --> 00:43:36.338
What happened?

00:43:36.938 --> 00:43:41.936
So we have to start by being um, collaborative, but everything in reverse.

00:43:41.936 --> 00:43:43.780
The teacher is now the boss.

00:43:43.780 --> 00:43:45.383
We've, we've, we've hurt them enough.

00:43:45.383 --> 00:43:50.021
We've hurt them enough and I am unapologetic about that and anybody who does not see it that way.

00:43:50.021 --> 00:43:56.512
You can absolutely probably work backwards and see that they are still waiting to maintain control and treat teachers like kids.

00:43:56.512 --> 00:43:59.300
The teachers are the bosses now, right?

00:43:59.300 --> 00:44:02.133
So what we need to do is figure out what they want right.

00:44:02.213 --> 00:44:03.075
Edtech has tried.

00:44:03.075 --> 00:44:06.583
Edtech has been in the worst situation.

00:44:06.583 --> 00:44:09.539
Right, they pay for it, but the teachers use it and they don't.

00:44:09.539 --> 00:44:11.876
Edtech knows where the gaps are right.

00:44:11.876 --> 00:44:14.095
So EdTech is like you know.

00:44:14.095 --> 00:44:17.782
We built this thing but the teacher needs to be the boss.

00:44:17.782 --> 00:44:20.778
Whatever the teacher wants is what the teacher gets.

00:44:20.778 --> 00:44:26.454
Everybody else needs to move out of the way.

00:44:26.454 --> 00:44:27.931
And if we don't do that, I will leak all of the secrets going up, because none of that really matters.

00:44:27.931 --> 00:44:35.059
If the teacher isn't in charge, anything up there that we're doing that doesn't serve the teacher means that we don't want them with the kid.

00:44:35.059 --> 00:44:38.617
We gotta move everything out of the way so the teacher and the child can be together.

00:44:39.150 --> 00:44:49.829
And our start in conversation is with teachers say what do you need?

00:44:49.829 --> 00:44:50.025
What can I do?

00:44:50.025 --> 00:44:50.739
They know more than we ever will.

00:44:50.739 --> 00:44:51.374
I learned that in my classroom.

00:44:51.374 --> 00:44:52.226
I told that to my teachers when I was a principal.

00:44:52.226 --> 00:44:53.163
I don't know fifth grade ELA.

00:44:53.163 --> 00:44:54.197
I won't even pretend to know fifth grade ELA.

00:44:54.197 --> 00:44:54.572
You are the fifth grade ELA.

00:44:54.572 --> 00:44:55.413
You got a master's in reading.

00:44:55.413 --> 00:44:56.637
I'm not trying to be that.

00:44:56.637 --> 00:44:58.481
I can help you connect with that kid.

00:44:58.481 --> 00:44:59.923
What can I do to serve you?

00:44:59.923 --> 00:45:01.103
What can I do?

00:45:01.103 --> 00:45:04.846
What data can I give you that will help you out in your classroom?

00:45:04.846 --> 00:45:05.846
That's what happened.

00:45:05.846 --> 00:45:06.807
We need to start with the teacher.

00:45:06.807 --> 00:45:08.126
They are the boss.

00:45:08.126 --> 00:45:08.867
We owe them.

00:45:08.867 --> 00:45:09.548
We have hurt them.

00:45:09.548 --> 00:45:10.251
We have hurt them.

00:45:10.251 --> 00:45:10.733
We have hurt them.

00:45:10.733 --> 00:45:11.253
We have hurt them.

00:45:11.253 --> 00:45:17.179
They are the boss, and then we ask them what they want and when that happens, we all get around them and we figure out.

00:45:17.179 --> 00:45:19.130
So they leave and go back with kids.

00:45:19.130 --> 00:45:24.554
Then we sit around and figure out how to put into action what they asked us to do.

00:45:26.876 --> 00:45:27.997
That is fantastic.

00:45:27.997 --> 00:45:30.338
Thank you so much for that perspective, ken.

00:45:30.338 --> 00:45:50.971
Thank you so much for today too, as well, like everything that you shared, your passion, your authentic voice, your genuineness I mean, like I said, this is I'm thankful that we were able to make this conversation happen, and I really do appreciate your shares, ken, thank you so much.

00:45:50.971 --> 00:45:52.934
But before we wrap up, I always love to end the show with the last three questions.

00:45:52.934 --> 00:45:53.796
So, ken, hopefully you are ready.

00:45:53.796 --> 00:45:58.365
So, as you know, every superhero has a point of weakness or something that weakens him.

00:45:58.365 --> 00:46:01.916
So, for example, superman, kryptonite kind of weakened him.

00:46:01.916 --> 00:46:11.099
So I want to ask you, ken, in the current state of education, what would be your current edu kryptonite?

00:46:12.461 --> 00:46:16.130
So I'm going to flip it, man, what was a kryptonite?

00:46:16.130 --> 00:46:20.938
And the old may be our strength and the new right.

00:46:20.938 --> 00:46:24.555
So, before I would say, my kryptonite is this passion, this excitement.

00:46:24.555 --> 00:46:32.271
Like I get riled up because I love kids and so, you know, in the old system we're supposed to be controlled, maintained, right.

00:46:32.271 --> 00:46:33.313
Can't be like that.

00:46:33.313 --> 00:46:38.409
Now, I don't know, I got to be me, like, let me be me Right.

00:46:38.409 --> 00:46:45.005
So my kryptonite was, was really that I get passionate and that passion sometimes goes before me.

00:46:45.005 --> 00:46:47.552
So so that is something I still have to work on, right.

00:46:47.552 --> 00:46:49.276
So, like you know, I do have to.

00:46:50.820 --> 00:47:01.691
Just because I grabbed AI and ran with it and it was the best thing since sliced bread the first day I touched it, I have to know that, just like I want everybody to be themselves, I have to be patient with people who are being themselves.

00:47:01.691 --> 00:47:04.498
I can't interpret that because they're not.

00:47:04.498 --> 00:47:08.874
You know, after I did it long enough, I realized that they're actually being wise about it.

00:47:08.874 --> 00:47:09.536
Right, I'm not.

00:47:09.536 --> 00:47:10.432
That's when I think I'm not.

00:47:10.432 --> 00:47:16.735
I'm going to jump into it and jump out of it if it hurts, but I think for me it's learning just what I want.

00:47:16.735 --> 00:47:25.963
Seeing that I am also not doing that sometimes when I'm driving AI because of the benefits, when you've got to give people a chance to it's an adjustment.

00:47:25.963 --> 00:47:27.334
So that's for me.

00:47:27.334 --> 00:47:28.418
That's my education at Crypt Knight.

00:47:28.418 --> 00:47:31.300
So if I'm excited, it's not because I'm passionate, that's all.

00:47:31.300 --> 00:47:34.255
I'm not trying to push anybody down the road anymore.

00:47:34.376 --> 00:47:35.197
Love it, love it.

00:47:35.197 --> 00:47:36.159
Great answer, ken.

00:47:36.159 --> 00:47:37.483
I appreciate it All right.

00:47:37.483 --> 00:47:48.076
Question number two Ken, if you could have a billboard with anything, on it.

00:47:48.097 --> 00:47:48.697
What would it be and why?

00:47:48.697 --> 00:47:50.222
Tell the truth, tell the truth, truth wins.

00:47:50.222 --> 00:47:51.807
Tell the truth.

00:47:51.807 --> 00:47:52.530
And the reason?

00:47:52.530 --> 00:47:57.123
Because once truth is here, then we all can live free, right.

00:47:57.123 --> 00:47:58.253
We all can be free, right.

00:47:58.253 --> 00:47:59.157
Nobody's hiding anything.

00:47:59.157 --> 00:48:01.518
We can trust, because if you trust, you move fast, right.

00:48:01.518 --> 00:48:08.175
And no one needs us to trust more than people who work with kids, because because they are dependent on us.

00:48:08.175 --> 00:48:13.922
So I'm going to say tell the truth, um, be truthful, be open, be honest, um, even if people don't agree with it.

00:48:13.922 --> 00:48:17.934
But we've got to be able to trust each other because kids are at stake love it.

00:48:18.075 --> 00:48:18.956
Great answer, all right.

00:48:18.956 --> 00:48:27.858
And last question if you could trade places with anyone for a day, anyone, who would it be and why?

00:48:38.771 --> 00:48:40.177
I didn't think you would ask me this man.

00:48:40.177 --> 00:48:52.642
It was a five-year-old kid that I had I was a principal and it was a five-year-old kid that I had that I was a principal, and it was a five-year-old kid, and the system was trying to make me not give the kid what they needed.

00:48:52.642 --> 00:49:02.117
And if I could trade places because I haven't seen him since I've been gone I would trade places with him so I could be five again and have fun away from all this.

00:49:02.117 --> 00:49:03.702
That's why I do this.

00:49:03.702 --> 00:49:07.697
I'm sorry that I I'm sorry, but if I could take places, it would be with him.

00:49:07.697 --> 00:49:10.733
They wanted me to.

00:49:10.733 --> 00:49:13.297
It wasn't anybody individually.

00:49:13.297 --> 00:49:13.980
It was not designed.

00:49:16.070 --> 00:49:16.813
He has autism.

00:49:16.813 --> 00:49:17.797
He had autism.

00:49:17.797 --> 00:49:20.014
He was tearing the room up.

00:49:20.014 --> 00:49:21.179
He's a little black boy.

00:49:21.179 --> 00:49:27.548
All I saw was the other 19 kids see a black boy mad.

00:49:27.548 --> 00:49:28.894
They didn't understand he had autism.

00:49:28.894 --> 00:49:30.059
They just saw him throwing chairs.

00:49:30.059 --> 00:49:32.650
So so this was a psyops nightmare.

00:49:32.650 --> 00:49:39.092
I I didn't see it as just a kid being a problem and I tried to move him and they did try this.

00:49:39.092 --> 00:49:45.043
You try to try that like I got him moved, risking my life and career, basically.

00:49:45.043 --> 00:49:45.704
But I don't care.

00:49:45.704 --> 00:49:48.135
But I want to see him again.

00:49:48.135 --> 00:49:49.200
I want to see him again.

00:49:49.200 --> 00:49:52.878
I would trade places with him to see if he's okay All right.

00:49:53.079 --> 00:49:54.815
Thank you, ken, I really appreciate that.

00:49:54.815 --> 00:49:59.141
And again, just the passion, I love it.

00:49:59.141 --> 00:50:04.360
Thank you so much for just really opening up and sharing your heart, sharing your experience.

00:50:04.360 --> 00:50:15.134
Like I said, this is why we do what we do, man just to bring some great, honest conversations, and sometimes the conversations may be a little different than what other people expect, but sometimes we got to speak the truth.

00:50:15.134 --> 00:50:17.820
That's right, you got to ask Ken about it.

00:50:19.532 --> 00:50:21.074
No, my kindergarten team bought me the shirt.

00:50:21.074 --> 00:50:21.836
It's a private gift.

00:50:23.197 --> 00:50:26.162
Oh okay, there you go, I love it.

00:50:26.162 --> 00:50:27.202
I love it.

00:50:27.202 --> 00:50:33.528
Well, ken, before we wrap up, can you tell our audience members who might be interested in reaching out to you how it is that they can connect with you?

00:50:34.110 --> 00:50:35.776
If you're on LinkedIn, hit me on LinkedIn.

00:50:35.776 --> 00:50:36.499
That's the best place.

00:50:36.499 --> 00:50:39.570
And we are building the edge of Renaissance.

00:50:39.570 --> 00:50:40.952
It's called the edge of Renaissance.

00:50:40.952 --> 00:50:42.014
So innovate is really.

00:50:42.014 --> 00:50:47.271
We're here to help people make it make sense for their districts.

00:50:47.271 --> 00:50:51.771
Right, and in a real way, an honest way, Like we have no product but we will direct you to products.

00:50:51.771 --> 00:50:54.920
Who are telling the truth, we will help you align things.

00:50:54.920 --> 00:50:55.849
That's what we do, right?

00:50:55.849 --> 00:51:03.873
So the Ednavate is kind of like outlines that, but hit me on LinkedIn or go to wwwednovate LinkedIn or go to wwwednovate.

00:51:03.873 --> 00:51:05.673
That's like educational innovation.

00:51:05.673 --> 00:51:09.335
So E-D-N-O-V-A-T-E ednovateai.

00:51:09.335 --> 00:51:11.936
Or you can email me at ken at ednovateai.

00:51:11.936 --> 00:51:18.239
So we are here, in truth, to help people move forward, Like let's do this thing right.

00:51:18.239 --> 00:51:18.699
That's all I want.

00:51:18.699 --> 00:51:19.479
That's all I want.

00:51:19.479 --> 00:51:20.059
That's all I want.

00:51:20.059 --> 00:51:20.980
That's all I want.

00:51:21.000 --> 00:51:22.019
Perfect, Piper.

00:51:22.019 --> 00:51:23.019
Thank you so much.

00:51:23.019 --> 00:51:24.601
I really appreciate it, my friend.

00:51:24.941 --> 00:51:26.481
And for all our audience members.

00:51:26.481 --> 00:51:31.764
Please make sure that you visit our website at myedtechlife, where you will watch this episode.

00:51:31.764 --> 00:51:35.865
We'll definitely make sure that we link all of Ken's information his website.

00:51:35.865 --> 00:51:40.907
That way you can go ahead and connect with him, reach out to him if you have any questions or if you need anything.

00:51:40.907 --> 00:51:51.416
Just feel free to reach out, because he definitely has a heart to help people, and especially if it's something that's going to get down to the students as well.

00:51:51.416 --> 00:52:04.847
So please make sure you reach out to him and make sure that you check out the other 315 episodes that we have, where I promise you 315 other episodes where I promise you you will find a little something that you can sprinkle onto what you are already doing.

00:52:04.847 --> 00:52:05.090
Great.

00:52:05.471 --> 00:52:08.699
Like I mentioned to you guys at the very beginning, we love you.

00:52:08.699 --> 00:52:11.474
We thank you so much for all of your support.

00:52:11.474 --> 00:52:13.925
Thank you so much for all the downloads.

00:52:13.925 --> 00:52:16.773
Thank you so much for the follows to our sponsors.

00:52:16.773 --> 00:52:19.941
Thank you to book creator, who is our newest partner.

00:52:19.941 --> 00:52:20.802
Thank you to you.

00:52:20.802 --> 00:52:24.096
Big shout out to you, to Edu8, to Yellowdig.

00:52:24.096 --> 00:52:26.380
Thank you for believing in our mission.

00:52:26.380 --> 00:52:27.764
You will appreciate.

00:52:27.764 --> 00:52:29.235
Yes, that's right.

00:52:29.235 --> 00:52:31.335
Thomas Thompson, thomas, text me back.

00:52:31.335 --> 00:52:35.260
Thomas, thomas Thompson and Thomas Hummel Great people.

00:52:35.260 --> 00:52:37.577
Make sure that you follow them too as well.

00:52:37.577 --> 00:52:39.096
So thank you, my friends.

00:52:39.096 --> 00:52:44.652
I really appreciate all of your support, and until next time, my friends, don't forget, stay techie.

00:52:44.652 --> 00:53:14.505
Thank you.
Ken Patterson Profile Photo

CEO

Envisioning the Future of Education | Championing Human Flourishing for a New Era of Learning.

At the core of my journey lies an unshakable mission: to reimagine education as a catalyst for human potential. From my days as a school principal to leading as CEO of EdNovate, I’ve committed my life to transforming how we teach, lead, and innovate for the benefit of every student.

With over two decades of experience across the education spectrum, I’ve cultivated a perspective that bridges the classroom and the boardroom. I’ve stood on the frontlines of education, navigating its challenges and celebrating its triumphs, and now I lead at the intersection of technology and humanity—ensuring artificial intelligence isn’t just a tool but a transformative force in education.

As CEO of EdNovate, I partner with school districts and educational leaders to harness human potential and emerging tech opportunities to solve systemic challenges, streamline operations, and create environments where every child can thrive. But my mission goes deeper: I work to humanize learning, to bring back the connection, creativity, and curiosity that education has long risked losing to bureaucracy and burnout. Every decision I make is rooted in a singular vision—empowering students by empowering those who shape their futures.

In an era of unprecedented technological change, I see AI not as a disruptor but as an amplifier of humanity’s potential. My goal is to ensure leaders, educators, and communities embrace its possibilities, crafting vision-driven systems that honor t…Read More