WEBVTT
00:00:11.119 --> 00:00:14.960
Hello, everybody, and welcome to another great episode of My Ed Tech Life.
00:00:14.960 --> 00:00:17.920
Thank you so much for joining us on this wonderful day.
00:00:17.920 --> 00:00:23.359
And wherever it is that you're joining us from around the world, thank you as always for all of your support.
00:00:23.359 --> 00:00:26.960
As always, we appreciate all the likes, the shares, the follows.
00:00:26.960 --> 00:00:34.640
Thank you so much for interacting with our content, for your messages, and just for your overall listenership.
00:00:34.640 --> 00:00:35.280
Thank you.
00:00:35.280 --> 00:00:41.119
It really means the world to us that we can bring you a quality podcast so you can continue to learn.
00:00:41.119 --> 00:00:47.439
And that's our goal: to keep giving you conversations that'll help us all continue to grow.
00:00:47.439 --> 00:00:52.240
And before we dive in, I definitely want to thank our wonderful sponsors.
00:00:52.240 --> 00:00:57.920
Thank you so much to Book Creator, Eduaid, and Yellowdig for sponsoring our show.
00:00:57.920 --> 00:01:02.880
Without you believing in our mission, we wouldn't be doing what we're doing.
00:01:02.880 --> 00:01:04.959
So thank you so much for that support.
00:01:04.959 --> 00:01:11.280
And if you're interested in being a sponsor, please feel free to reach out to us and we can definitely set that up.
00:01:11.280 --> 00:01:14.159
But I am excited about today.
00:01:14.159 --> 00:01:19.439
Today's guest is somebody that I have been following for a very long time.
00:01:19.439 --> 00:01:25.840
And I have watched not just her account grow, but her whole glow-up.
00:01:25.840 --> 00:01:32.959
She's been doing so many great things, and she's going to tell us about some exciting things that are happening that are coming soon.
00:01:32.959 --> 00:01:40.560
She has been speaking not only at school districts but also participating at places like Stanford University.
00:01:40.560 --> 00:01:47.519
She's working with so many people in so many different ways and talking to them about AI.
00:01:47.519 --> 00:01:55.040
So I am excited to welcome to the show our wonderful guest today, Marissa Sadler Holder.
00:01:55.040 --> 00:01:57.599
Thank you so much for joining us this evening.
00:01:57.599 --> 00:01:58.959
How are you, Marissa?
00:01:59.200 --> 00:01:59.760
I'm great.
00:01:59.760 --> 00:02:01.599
I'm so excited to be here.
00:02:01.920 --> 00:02:03.840
Well, I am excited to have you here.
00:02:03.840 --> 00:02:13.599
It was great talking in the pre-show about the way we've connected on LinkedIn for a good while now.
00:02:13.599 --> 00:02:21.199
And obviously, we move in the same circles as far as AI and education conversations go.
00:02:21.199 --> 00:02:24.719
So it's just been great to see your input.
00:02:24.719 --> 00:02:36.639
It's been great, like I mentioned, seeing your journey and the wonderful opportunities that you have had in helping educators and also just bringing your knowledge to, like I mentioned, wonderful places.
00:02:36.639 --> 00:02:41.919
I know Stanford, and you've got some great things coming up, but I'll let you announce that.
00:02:41.919 --> 00:02:44.000
But I'm really excited about that for you.
00:02:44.000 --> 00:03:02.240
But before we dive into our conversation, Marissa, for any of our audience members listening right now who may not be familiar with your work just yet, can you give us a little background on your context within the education and ed tech space?
00:03:02.639 --> 00:03:03.520
Yeah, sure.
00:03:03.520 --> 00:03:07.919
So I have a background in teaching.
00:03:07.919 --> 00:03:11.120
For 13 years, I was actually a French teacher.
00:03:11.120 --> 00:03:20.560
And during COVID, I hopped into getting a master's in instructional design and technology for education.
00:03:20.560 --> 00:03:22.719
And I really dived in.
00:03:22.719 --> 00:03:29.840
I've always been one of those people who like to tinker with education and tinker with technology and see where there's an intersection there.
00:03:29.840 --> 00:03:38.879
And after COVID, I was feeling like I wanted to do something with the two together.
00:03:38.879 --> 00:03:45.680
And lo and behold, it kind of just happened, with AI being opened up to the masses.
00:03:45.680 --> 00:03:50.639
And I thought, oh my gosh, this is, you know, this is gonna make a wave in education.
00:03:50.639 --> 00:03:52.879
I think this is something that I can dive into.
00:03:52.879 --> 00:03:56.960
And educators are, you know, gonna be looking for help.
00:03:56.960 --> 00:04:16.480
So I decided to go ahead and create Teaching with Machines, which is really about helping educators learn about these new technologies that are out there, see how they can apply their expertise to them, and see what they can create.
00:04:16.480 --> 00:04:32.800
My ultimate goal with Teaching with Machines is to really have teachers feel empowered and excited, kind of like that shot in the arm in education that we all need to get us excited about what we're doing again.
00:04:32.800 --> 00:04:39.040
And I speak a lot about AI and education integration.
00:04:39.040 --> 00:04:50.879
I work with schools, I've worked with the Orange County Department of Education as an AI consultant, and I speak at Stanford.
00:04:50.879 --> 00:04:56.240
I'm speaking at conferences, and yeah, that's kind of what I'm doing right now.
00:04:56.240 --> 00:05:02.639
And again, it's all about going out there, learning the thing, and then sharing whatever I learn.
00:05:02.879 --> 00:05:03.920
Yeah, and that's great.
00:05:03.920 --> 00:05:08.959
And one of the things that I love that you mentioned is empowering teachers.
00:05:08.959 --> 00:05:16.079
And as we know, we've seen so many things in the news about how education has changed.
00:05:16.079 --> 00:05:27.680
And it's very important that we support our educators in every way possible, from pedagogy to including the tech in pedagogy and finding that balance.
00:05:27.680 --> 00:05:46.720
And one of the things that I love, following your page and seeing Teaching with Machines, which we will link in the show notes, guys, so you can visit Marissa's page, is the message that technology is important, but it will never compare to the impact that teachers have.
00:05:46.720 --> 00:05:54.240
And I think it's so important that you help teachers learn, like you mentioned, a new technology.
00:05:54.240 --> 00:06:04.560
And I don't know if we can still say relatively new; since November 2022 we're already headed to November 2025, and we've seen how it has evolved.
00:06:04.560 --> 00:06:26.079
But I think that's great, and it's going to be continuous, because as the tech changes and progresses, there will still have to be people, such as yourself, myself, and many of the guests I've had on the show, sharing their experiences with educators to help them as we continue to move forward.
00:06:26.079 --> 00:06:28.079
So that's something that's very exciting.
00:06:28.079 --> 00:06:43.759
So I want to ask you, Marissa: when was it that you made that jump, that choice to go from educator to saying, hey, I'm gonna go all in on Teaching with Machines?
00:06:43.759 --> 00:06:47.199
What was that aha spark moment for you?
00:06:47.199 --> 00:06:48.639
Gosh.
00:06:48.879 --> 00:06:49.680
I don't know.
00:06:49.680 --> 00:06:59.040
Sometimes I go, am I crazy? Because the thing is, it really is a huge career shift.
00:06:59.040 --> 00:07:05.040
13 years in my position, and you know, it's a stable position in a great high school.
00:07:05.040 --> 00:07:09.439
And to jump into this, it was a big decision.
00:07:09.439 --> 00:07:17.360
But like one of my friends said, it's now or never. This is the moment.
00:07:17.360 --> 00:07:26.480
And if you can help educators kind of navigate this so they can in turn help their students navigate this, this is the time.
00:07:26.480 --> 00:07:29.279
This is the only moment in time you can do that, you know.
00:07:29.279 --> 00:07:32.240
And so I just did it.
00:07:32.240 --> 00:07:44.079
But the funny thing is that when I left the profession and started exploring what I could be doing, none of the jobs that I wanted to do existed.
00:07:44.079 --> 00:07:54.399
So I said, all right, I guess I'll create Teaching with Machines so I have something to put on my resume to say that I'm doing something.
00:07:54.399 --> 00:07:57.040
But honestly, I ended up subbing.
00:07:57.040 --> 00:08:02.160
I subbed for almost a year while I built out Teaching with Machines.
00:08:02.160 --> 00:08:06.480
And through that, it was a humbling experience.
00:08:06.480 --> 00:08:08.399
It was a very interesting experience.
00:08:08.399 --> 00:08:16.959
It gave me insight into classrooms beyond my own experience, which was incredible.
00:08:16.959 --> 00:08:29.199
It also gave me the ability to have conversations with students who have no fear of sharing their insight or input on how they use this technology.
00:08:29.199 --> 00:08:32.000
There are no repercussions, right, with a sub.
00:08:32.000 --> 00:08:38.080
So I was able to do a little recon during that subbing situation.
00:08:38.080 --> 00:08:46.559
But as I progressed, I kept putting out a weekly newsletter, sharing what I'd learned.
00:08:46.559 --> 00:09:02.000
I tell this story because it's the moment I realized that teachers needed somebody out there, and not just myself; there are many, many people too.
00:09:02.000 --> 00:09:11.519
I was sitting down writing an email in response to your typical angry parent email, if you will.
00:09:11.519 --> 00:09:12.879
And I was done.
00:09:12.879 --> 00:09:15.679
I mean, it was the end of the day, it was like in April.
00:09:15.679 --> 00:09:23.519
And my friend said, I don't think you should respond to that email right now, because I was a little heated.
00:09:23.519 --> 00:09:26.000
So I said, okay, I won't.
00:09:26.000 --> 00:09:30.799
She's like, but I think you should run it through ChatGPT and adjust it.
00:09:30.799 --> 00:09:38.879
So I did, and I just sat back and went, oh gosh, okay.
00:09:38.879 --> 00:09:51.840
This is not a technology that we'll simply adapt to; rather, it will have a profound impact on education.
00:09:51.840 --> 00:10:09.679
Especially since I taught French, and Google Translate has had a huge impact on how we do assessments and how we teach in language classes. Many of us are still struggling with that, and it's been 15 years.
00:10:09.679 --> 00:10:12.720
Or maybe 10, I'm sorry.
00:10:12.720 --> 00:10:19.840
But I just kept thinking, okay, this is where teachers are gonna need support from the outside.
00:10:19.840 --> 00:10:44.799
Now, I started following you on LinkedIn, which was amazing, and I quickly realized there just aren't a lot of teachers on LinkedIn. A lot of admin, maybe, and thought leaders, but the chalk-in-hand teacher has no need for a profile on LinkedIn because we don't use it to network, right?
00:10:44.799 --> 00:10:46.159
So what's the point?
00:10:46.159 --> 00:11:04.799
So I got on LinkedIn and realized there's a need to make sure that the teacher voice is amplified in these spaces where people are making decisions and thought leaders are giving their opinions on where education is going.
00:11:04.799 --> 00:11:14.879
I just wanted to make sure that the chalk in hand teacher has somebody who can amplify their voice out into these spaces to make sure that they are being heard.
00:11:14.879 --> 00:11:18.639
And so that's the decision I made.
00:11:18.639 --> 00:11:21.440
And it's been an amazing experience.
00:11:21.440 --> 00:11:27.120
It's been a journey with ups and downs, of course, but it's been a lot of fun.
00:11:27.360 --> 00:11:27.600
Yeah.
00:11:27.600 --> 00:11:32.879
And yeah, like I said, I've seen that beautiful glow-up.
00:11:32.879 --> 00:11:43.840
And I always tell my friends, I see you, I see the great things, and it gets me so excited that people within the space are continuing to grow and getting all these opportunities.
00:11:43.840 --> 00:11:51.039
And to see what you're doing now, from when we first started connecting on LinkedIn.
00:11:51.039 --> 00:11:57.600
And I'm like, oh my gosh, look at what Marissa's doing now, and look at where she's at, and look at where she's headed and all those opportunities.
00:11:57.600 --> 00:12:09.600
I think that's valuable too, in the sense that not only are you amplifying your experience, but like you mentioned, you're amplifying the voice of those educators as well.
00:12:09.600 --> 00:12:24.000
You're bringing their voices to the table, because oftentimes stakeholders say, well, we're the decision makers up at the top, but they don't include the actual users in those conversations.
00:12:24.000 --> 00:12:26.240
And I think that's something that's very important.
00:12:26.240 --> 00:12:40.159
But I want to shift a little bit now, because I know we talk a lot about this, and it's something I've seen on LinkedIn for the past couple of years.
00:12:40.159 --> 00:12:53.519
Well, I say a couple of years, but mainly within the last year, year and a half, there's been a strong push toward talking about human-centered AI professional development.
00:12:53.519 --> 00:13:18.559
So, with your experience in professional development, what does human-centered mean to Marissa, especially in a world where every AI tool seems to be the star or the magic bullet, and we kind of forget about that human aspect?
00:13:18.559 --> 00:13:24.480
So tell me a little bit about how you find that balance and how you really put the human first.
00:13:55.610 --> 00:14:06.409
Yeah, I have this belief that when we use AI, it should really be an extension of your own expertise.
00:14:06.409 --> 00:14:11.370
And it should reflect your voice.
00:14:11.370 --> 00:14:20.809
It's not going to replace us; we're not going to use it for things that we wouldn't normally know how to do ourselves.
00:14:20.809 --> 00:14:29.049
I think that's very important, because there's this big fear that nobody really talks about.
00:14:29.049 --> 00:14:44.409
Whether you're an artist, a teacher, or in another profession, we're all sitting here going, if AI can do what I do, then what value do I bring?
00:14:44.409 --> 00:14:53.210
It's one of those things we don't talk about, but that's the fear.
00:14:53.210 --> 00:14:54.970
That's what's driving the fear.
00:14:54.970 --> 00:15:00.409
And I think when we sit down and we say, okay, what value do I bring?
00:15:00.409 --> 00:15:05.450
Well, the AI cannot do anything without your direction.
00:15:05.450 --> 00:15:19.289
So if it is reflecting your input and your expertise, then we have much more control over the AI than we really think we do, right?
00:15:19.289 --> 00:15:35.450
When I'm doing sessions with students, I even have this conversation about the skills we need to develop in this AI world.
00:15:35.450 --> 00:15:45.529
And there are hard and fast skills that we've been trying to impart on them: critical thinking, creativity, all of those things, right?
00:15:45.529 --> 00:15:46.889
And collaboration.
00:15:46.889 --> 00:15:57.529
But one of them, tapping into the literary world, the English teacher's world, is developing that authentic voice.
00:15:57.529 --> 00:16:00.169
And what does that look like for you?
00:16:00.169 --> 00:16:01.129
Because you know what?
00:16:01.129 --> 00:16:02.809
There's only one of you.
00:16:02.809 --> 00:16:06.330
Only you get to develop that authentic voice.
00:16:06.330 --> 00:16:08.490
Nothing else can do that except for you.
00:16:08.490 --> 00:16:25.850
And in a world that's becoming more and more inundated with AI and artificially created things, that authentic voice, which cannot be duplicated or replicated, is yours.
00:16:25.850 --> 00:16:29.769
So really work on developing that.
00:16:29.769 --> 00:16:32.169
Can you use AI to help you develop it?
00:16:32.169 --> 00:16:35.289
Yes, but it will never replace it.
00:16:35.289 --> 00:16:53.210
And I think that is so important. I don't know if you've been seeing this, but on TikTok and social media, I swear there's been a shift away from the curated selfies and the perfectly curated content.
00:16:53.210 --> 00:16:57.210
Everybody's life is perfect, but we know, you know, behind the scenes it's not perfect, right?
00:16:57.210 --> 00:17:11.450
Whereas the more popular content creators are coming in disheveled, looking like a normal, everyday person, and telling their story, because they're capturing this authentic voice.
00:17:11.450 --> 00:17:18.730
And we are so drawn to it and we have such a need for it that I think we're gonna want to really start developing that even more.
00:17:18.730 --> 00:17:23.049
And I think it's gonna be really important for our students' future.
00:17:23.370 --> 00:17:23.610
Yes.
00:17:23.610 --> 00:17:28.330
One of the things that I love that you mentioned is that authentic voice.
00:17:28.330 --> 00:17:33.130
There's only one of you in this world, and we really want to hear your voice, your thoughts.
00:17:33.130 --> 00:17:44.490
And like you mentioned, there are many tools out there that can help you work through your message or what you're trying to say.
00:17:44.490 --> 00:17:51.930
But there's also a way of delivery that is you, that is the way that your true voice really comes out.
00:17:51.930 --> 00:18:06.730
And like you mentioned, in a world that is becoming very synthetic very quickly, because now with Sora 2 dropping, you've got that video aspect of it too.
00:18:06.730 --> 00:18:10.329
Obviously, along with the large language models and so on.
00:18:10.329 --> 00:18:17.369
So I do agree with you that now people want to see that authenticity, who you really are.
00:18:17.369 --> 00:18:24.170
And yes, you do notice that shift; the videos are not as overproduced as they once were.
00:18:24.170 --> 00:18:28.809
You know, people are coming in just being more natural because that's what people are craving.
00:18:28.809 --> 00:18:32.410
They're craving and wanting that authentic voice.
00:18:32.410 --> 00:18:48.890
And going along that line, I wanted to share something. I know I've done it in another episode, but it's very pertinent to what we're talking about, especially with voice. A good friend of mine also does speaking engagements.
00:18:48.890 --> 00:18:58.250
And obviously, in the world of LLMs, creating presentations is a lot easier and a lot quicker.
00:18:58.250 --> 00:19:03.450
You give it your idea and it'll pop something out for you.
00:19:03.450 --> 00:19:13.289
So she said she was doing a little keynote and that she'd already had the presentation done a month ago.
00:19:13.289 --> 00:19:18.809
But of course, she used AI to help in creating it.
00:19:18.809 --> 00:19:31.529
But she said that when she was presenting, even though she has the content knowledge, she just felt like, this isn't me, this isn't really my voice.
00:19:31.529 --> 00:19:49.849
So she actually said, I'm gonna take a little break from all of this and kind of find myself again. Because oftentimes you also see videos and content that are very heavily dependent on LLMs.
00:19:49.849 --> 00:19:56.250
And sometimes you can pick it out, especially if you use these tools yourself.
00:19:56.250 --> 00:19:59.529
And sometimes I'll just scroll right past it.
00:19:59.529 --> 00:20:04.410
And then I'll see somebody that maybe has a lot of spelling errors or something like that.
00:20:04.410 --> 00:20:11.129
And I'm like, ooh, I want to see that, I want to read that, because it feels authentic.
00:20:11.129 --> 00:20:20.170
So I really like that you help teachers see that there's only one of you and it's your voice, and that you work with students on that too.
00:20:20.170 --> 00:20:22.009
And I think that's something that's very important.
00:20:22.009 --> 00:20:43.849
And as you know, with the release of Sora and the technology getting better and better, it seems like every week there's some new improvement or some new model coming out, and you really want to help your students think critically about these outputs and about these large language models as well.
00:20:43.849 --> 00:21:55.650
But going back to the work you do with schools, Marissa, I want to ask you: in your experience, a lot of schools are having a hard time reconciling the great use cases of AI while also battling with data privacy and ethics, those barriers.
00:21:55.650 --> 00:22:03.090
What are some best practices that you've seen along your travels within school districts?
00:22:03.090 --> 00:22:11.090
And what are some suggestions of your own that you might share with educators or districts?
00:22:11.810 --> 00:22:13.650
Yeah, so there's two things.
00:22:13.650 --> 00:22:18.210
So there's the AI coherence cycle, and I'll tell you about that in a second.
00:22:18.210 --> 00:22:26.529
But first and foremost, I feel like everybody has to go through their own journey with it before they can actually address any of that, right?
00:22:26.529 --> 00:22:28.450
They have to first learn about it.
00:22:28.450 --> 00:22:33.650
We have to model that critical thinking, excitement, and innovation.
00:22:33.650 --> 00:22:37.970
So the first step is learning, being lifelong learners, and looking into it.
00:22:37.970 --> 00:22:49.970
I think a lot of times schools, given the time that they're given, just want to front-load all of this onto educators.
00:22:49.970 --> 00:22:54.450
Here's your hour-and-a-half or two-hour training, and that's it.
00:22:54.450 --> 00:22:57.090
And then, you know, expect greatness.
00:22:57.090 --> 00:22:59.810
And the thing is, we don't do that to our students.
00:22:59.810 --> 00:23:03.330
We'd get fired if we tried doing that, just moving on to the next lesson.
00:23:03.330 --> 00:23:11.970
But I think it's this idea that you have to have two light bulbs go off.
00:23:11.970 --> 00:23:15.730
And I'm not gonna swear, although I do say it in my head.
00:23:15.730 --> 00:23:18.610
The first light bulb is like, oh, this is cool.
00:23:18.610 --> 00:23:19.730
This can do this for me.
00:23:19.730 --> 00:23:20.850
This is amazing.
00:23:20.850 --> 00:23:23.570
And then there's the "oh, beep" light bulb.
00:23:23.570 --> 00:23:27.730
The second light bulb is where it's like, oh, this is going to have an impact.
00:23:27.730 --> 00:23:30.050
This is going to do XYZ.
00:23:30.050 --> 00:23:36.769
This is going to affect the things that I do and how my students do things, how my school could possibly do things.
00:23:36.769 --> 00:23:40.370
That "oh, beep" light bulb has to go off.
00:23:40.370 --> 00:23:43.009
And we have to have space to be able to do both.
00:23:43.009 --> 00:23:45.810
The first one has to happen before the second one happens.
00:23:45.810 --> 00:23:58.370
And then we can start addressing those really big picture ideas and going after how this can be done ethically, safely, um, and with intention.
00:23:58.370 --> 00:24:09.009
Now, what I've seen a lot of success with in working with schools is the AI coherence cycle, which I work on with Danelle Almaras.
00:24:09.009 --> 00:24:21.970
We modeled this cycle off of what successful businesses have done to integrate AI.
00:24:21.970 --> 00:24:33.410
Instead of sitting down with their employees and saying, here, go do a two-hour training and off you go, what they first do is ask: what's on fire?
00:24:33.410 --> 00:24:35.090
Where are we bleeding?
00:24:35.090 --> 00:24:36.370
What is going on?
00:24:36.370 --> 00:24:42.850
Because we can't possibly think of anything else until we address those huge, huge challenges that we have.
00:24:42.850 --> 00:24:49.250
And this AI coherence cycle follows the same model.
00:24:49.250 --> 00:24:52.529
It's saying, what are our biggest challenges right now?
00:24:52.529 --> 00:24:55.570
What are some initiatives we have to address?
00:24:55.570 --> 00:25:01.570
You know, AI is the last thing on my mind when it comes to all of this.
00:25:01.570 --> 00:25:13.730
But when we look at these big challenges and ask, where can AI possibly help with these things?
00:25:13.730 --> 00:25:16.289
Then we're really leveraging the tool.
00:25:16.289 --> 00:25:23.009
And along with that comes learning and exploration of the tool and professional development.
00:25:23.009 --> 00:25:25.410
But now we're doing that first light bulb.
00:25:25.410 --> 00:25:33.570
Oh, hey, this can help me with things. And then you're getting the professional development in a way that is not, here, let me sit you down for two hours.
00:25:33.570 --> 00:25:35.650
It's cyclical, it keeps on going.
00:25:35.650 --> 00:25:39.090
And we're addressing the challenges that we have at hand.
00:25:39.090 --> 00:25:45.410
And I think that's going to be more reflective of how we're going to be seeing AI in education in the future.
00:25:45.410 --> 00:25:48.610
But I've seen some success with it.
00:25:48.610 --> 00:26:02.850
I'll tell you, I worked with a school that decided to do that, probably mid-year, where we focused on getting their curriculum mapped.
00:26:02.850 --> 00:26:14.450
About 30% of the staff had their curriculum mapped and uploaded onto, what is it, a program that documents it, right?
00:26:14.450 --> 00:26:16.850
And this was a part of their accreditation process.
00:26:16.850 --> 00:26:17.090
Okay.
00:26:17.090 --> 00:26:18.610
And they really wanted to get it done.
00:26:18.610 --> 00:26:27.009
Within about two months of leveraging AI and working with teachers, 80% of the teachers had all of their curriculum mapped.
00:26:27.009 --> 00:26:32.130
And it's just like, look, it's your expertise that did this with this tool.
00:26:32.130 --> 00:26:35.890
This is how we could be addressing and using and leveraging AI.
00:26:35.890 --> 00:26:41.810
It doesn't always have to be a sit-down, here's-your-two-day professional development.
00:26:41.810 --> 00:26:46.370
And along with that comes learning about the limitations.
00:26:46.370 --> 00:26:52.050
And that's where making sure you're using it ethically comes in; that's why professional development is important.
00:26:52.050 --> 00:26:58.130
So we are careful, and we're protecting ourselves, our students, and our learning community.
00:26:58.529 --> 00:26:59.170
I love that.
00:26:59.170 --> 00:27:15.970
And that actually covers a lot of things I wanted to ask as a follow-up, because as we know, with the plethora of platforms available now, sometimes it's like, hey, we just want the next shiny tool.
00:27:15.970 --> 00:27:27.009
And we just want it because we want to do this faster, as opposed to, okay, how can we reimagine what we're learning with these tools and maybe take it to that next level?
00:27:27.009 --> 00:27:33.490
And I know I'm a big fan of the SAMR model: substitution, augmentation, modification, and redefinition.
00:27:33.490 --> 00:27:51.810
So oftentimes, as with any adoption, you'll initially get that substitution factor: before, I used handouts; now I can use something like Kami to annotate and do the same thing digitally.
00:27:51.810 --> 00:28:02.769
But then, what can we do to augment? What are some things we couldn't do before that we can do now? And then modify, and then redefine?
00:28:02.769 --> 00:28:11.650
And I feel that many times, as educators, we just want the next shiny thing.
00:28:11.650 --> 00:28:18.930
And then when it loses its allure and its glamour, it's like, all right, let's move on to the next thing, because we're just like that.
00:28:18.930 --> 00:28:21.570
We get so excited and we want to try those things.
00:28:21.570 --> 00:28:47.009
But I think with the steps you described, having teachers work through it is not only going to help them see that they can do their work more efficiently, but also be more effective and think of new ways to enhance their lessons as they put their curriculum together, thinking, wow, this is something I would never have been able to do before.
00:28:47.009 --> 00:28:50.210
And now I can enhance those lessons for my students.
00:28:50.210 --> 00:28:55.330
So that is great that you shared all of that because I think that that really plays well into this.
00:28:55.330 --> 00:29:15.009
And so, my next question: like I mentioned, as educators we get so excited and want to use the next tool because we see what's hot and trending on educator TikTok, X, or Instagram.
00:29:15.009 --> 00:29:20.050
So, as we know, there's a lot of hype around these tools.
00:29:20.050 --> 00:29:30.610
So, how do you help administrators discern what would be a good tool versus something that might just be snake oil?
00:29:31.970 --> 00:29:33.009
Oh my gosh.
00:29:33.009 --> 00:29:42.610
I think we should always be centered on effectiveness, right?