Oct. 13, 2025

Episode 339: Merissa Sadler-Holder



In this episode of My EdTech Life, Dr. Fonz Mendoza talks with Merissa Sadler-Holder, founder of Teaching with Machines. Merissa shares her journey from French teacher to AI consultant and shows how educators can embrace AI without losing their voice, creativity, or purpose.

They discuss what human-centered AI really means in the classroom, how to build meaningful professional development, and ways to keep teaching focused on people instead of platforms.

Merissa also breaks down her AI Coherence Cycle, explains why teachers should stop chasing every new tool, and highlights the power of authenticity in a world full of automation.

This episode will help you rethink AI in education and remind you that the real value is you.

Timestamps

00:00 — Intro & Sponsors
 01:00 — Meet Merissa Sadler-Holder
 03:00 — From French Teacher to Teaching with Machines
 06:00 — The Aha Moment: Seeing AI’s Potential
 10:00 — What Human-Centered AI Really Means
 14:00 — Helping Students Find Their Authentic Voice
 20:00 — The AI Coherence Cycle Explained
 24:00 — Making PD Meaningful and Ongoing
 28:00 — Avoiding the “Shiny Tool” Trap
 33:00 — Ethics, Data Privacy & District Challenges
 36:00 — Involving Parents and Communities
 38:00 — Teaching is Messy—And That’s Good
 41:00 — Preview: Speaking at EdTech Week NYC
 44:00 — The Future of Teacher-Built Tools
 49:00 — The Value is YOU
 55:00 — Closing Thoughts + Stay Techie

Connect with Merissa

Website: teachingwithmachines.com

Facebook Group

Special Thanks to Our Sponsors

Huge thanks to Book Creator, Eduaide.AI, and Yellowdig for supporting our mission to amplify educator voices and spark meaningful conversations about the future of learning.

Stay Techie! ✌️



Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:00 - Warm Welcome & Sponsors

01:47 - Meet Merissa: Teacher to AI Advocate

04:05 - Founding Teaching With Machines

07:16 - Authentic Voice in an AI World

11:23 - Ethics, Privacy, and PD Realities

14:59 - The AI Coherence Cycle Explained

17:46 - Beyond Shiny Tools: Measuring Impact

21:12 - Community, Parents, and Modeling Innovation

23:42 - Spicy Take: Teachers Building Tools

26:39 - EdTech Week Preview with Bonnie Neves

28:34 - Teacher Value, Fear, and Mindset Shifts

31:49 - Lightning Round & Closing CTAs

WEBVTT

00:00:11.119 --> 00:00:14.960
Hello, everybody, and welcome to another great episode of My EdTech Life.

00:00:14.960 --> 00:00:17.920
Thank you so much for joining us on this wonderful day.

00:00:17.920 --> 00:00:23.359
And wherever it is that you're joining us from around the world, thank you as always for all of your support.

00:00:23.359 --> 00:00:26.960
As always, we appreciate all the likes, the shares, the follows.

00:00:26.960 --> 00:00:34.640
Thank you so much for interacting with our content, for your messages, and just for your overall listenership.

00:00:34.640 --> 00:00:35.280
Thank you.

00:00:35.280 --> 00:00:41.119
It really means the world to us that we can bring a great quality podcast for you to continue to learn.

00:00:41.119 --> 00:00:47.439
And that's our goal to make sure that we continue to give you conversations that'll help us continue to grow.

00:00:47.439 --> 00:00:52.240
And before we dive in, I definitely want to thank our wonderful sponsors.

00:00:52.240 --> 00:00:57.920
Thank you so much to Book Creator, Eduaide.AI, and Yellowdig for sponsoring our show.

00:00:57.920 --> 00:01:02.880
Without you and without you believing in our mission, we wouldn't be doing what we're doing.

00:01:02.880 --> 00:01:04.959
So thank you so much for that support.

00:01:04.959 --> 00:01:11.280
And if you're interested in being a sponsor, please feel free to reach out to us and we can definitely set that up.

00:01:11.280 --> 00:01:14.159
But I am excited about today.

00:01:14.159 --> 00:01:19.439
Uh today's guest is somebody that I have been following for a very long time.

00:01:19.439 --> 00:01:25.840
And I have just seen not just her account, but just her and this glow-up.

00:01:25.840 --> 00:01:32.959
She's been doing so many great things, and she's going to tell us about some exciting things that are happening that are coming soon.

00:01:32.959 --> 00:01:40.560
But she has been talking not only at school districts, she'll participate, you know, at Stanford University.

00:01:40.560 --> 00:01:47.519
She's working with so many people in so many different ways and talking to them about AI.

00:01:47.519 --> 00:01:55.040
So I am excited to welcome to the show our wonderful guest today, Merissa Sadler-Holder.

00:01:55.040 --> 00:01:57.599
Thank you so much for joining us this evening.

00:01:57.599 --> 00:01:58.959
How are you, Merissa?

00:01:59.200 --> 00:01:59.760
I'm great.

00:01:59.760 --> 00:02:01.599
I'm so excited to be here.

00:02:01.920 --> 00:02:03.840
Well, I am excited to have you here.

00:02:03.840 --> 00:02:13.599
It was great just talking in the pre-show, just getting to just the way that we've connected and we've connected on LinkedIn and you know, for a for a good while now.

00:02:13.599 --> 00:02:21.199
And obviously, uh we connect with the same circles as far as AI, AI and education conversations and so on.

00:02:21.199 --> 00:02:24.719
So it's just been great to see your input.

00:02:24.719 --> 00:02:36.639
It's been great, like I mentioned, seeing your journey and the wonderful opportunities that you have had in helping educators and also just bringing your knowledge to, like I mentioned, wonderful places.

00:02:36.639 --> 00:02:41.919
I know Stanford, you know, and you've got some great things coming up, but I'll make I'll let you announce that.

00:02:41.919 --> 00:02:44.000
But I'm really excited about that for you.

00:02:44.000 --> 00:03:02.240
But before we dive in into our conversation, Merissa, for any of our audience members that are listening at this very moment that may not be familiar with your work just yet, can you give us a little bit of background and what your context is within the education ed tech space?

00:03:02.639 --> 00:03:03.520
Yeah, sure.

00:03:03.520 --> 00:03:07.919
So um I have a background in teaching.

00:03:07.919 --> 00:03:11.120
Uh 13 years, I was actually a French teacher.

00:03:11.120 --> 00:03:20.560
And during COVID, I hopped into getting um a master's in um instructional design and technology for education.

00:03:20.560 --> 00:03:22.719
And I just I really dived in.

00:03:22.719 --> 00:03:29.840
I've always been one of those people who like to tinker with education and tinker with technology and see where there's an intersection there.

00:03:29.840 --> 00:03:38.879
And um after COVID, you know, I just I was kind of feeling like I wanted to do something with the two together.

00:03:38.879 --> 00:03:45.680
And, you know, lo and behold, I mean, it kind of just happened with AI kind of being open to the masses.

00:03:45.680 --> 00:03:50.639
And I thought, oh my gosh, this is, you know, this is gonna make a wave in education.

00:03:50.639 --> 00:03:52.879
I think this is something that I can dive into.

00:03:52.879 --> 00:03:56.960
And educators are, you know, gonna be looking for help.

00:03:56.960 --> 00:04:16.480
And um, so I kind of just decided to go ahead and create Teaching with Machines, which is really about helping educators kind of learn about these new technologies that are out there, see how they can apply their expertise to this technology and see what they can create.

00:04:16.480 --> 00:04:32.800
And um, my ultimate goal with Teaching with Machines is to really have the teacher um feel empowered and excited, kind of like that new shot in the arm in education that we all kind of need to get us excited about what we're doing again.

00:04:32.800 --> 00:04:39.040
And um I speak a lot about AI and education, the integration.

00:04:39.040 --> 00:04:50.879
Um, I've uh work with schools and uh I worked with Orange County Department of Education as well as an AI consultant and speak at Stanford.

00:04:50.879 --> 00:04:56.240
I I am speaking at conferences, and yeah, that's kind of what I'm doing right now.

00:04:56.240 --> 00:05:02.639
And again, like it's all about just sharing, going out there, learning the thing, and then sharing whatever I learn.

00:05:02.879 --> 00:05:03.920
Yeah, and that's great.

00:05:03.920 --> 00:05:08.959
And I, you know, one of the things that I love too that you mentioned is, you know, empowering teachers.

00:05:08.959 --> 00:05:16.079
And as we know, you know, we've seen so many things, you know, in the news, how things have changed in education and so on.

00:05:16.079 --> 00:05:27.680
And, you know, it's very important that we do help support our educators in every which way possible, you know, from pedagogy to including the tech in pedagogy and finding that balance.

00:05:27.680 --> 00:05:46.720
And so one of the things that I love uh, you know, following your page and seeing um Teaching with Machines, which we will make sure that we link in the show notes, guys, so you can go ahead and visit Merissa's page, but how important that technology is, but it will never compare to the impact that teachers have.

00:05:46.720 --> 00:05:54.240
And I think that that's so important that you not only help teachers, you know, learn, like you mentioned, a new technology.

00:05:54.240 --> 00:06:04.560
And and I don't know, we can still say relatively new, even though I mean, since November 2022, we're already headed to November 2025, and we've seen how it has evolved.

00:06:04.560 --> 00:06:26.079
But I think that that's something that is great, and it's something that is gonna be continuous because as the tech changes, as the tech progresses, there's still gonna have to be uh people such as yourself, myself, and many of the guests that I've had on the show to be able to share their experiences with educators to help them as we continue to move forward.

00:06:26.079 --> 00:06:28.079
So that's something that's very exciting.

00:06:28.079 --> 00:06:43.759
So I want to ask you, uh, Merissa, well, when was it that you made that jump or that choice to go from educator to say, hey, I'm gonna go ahead and just go all in on Teaching with Machines?

00:06:43.759 --> 00:06:47.199
What was that aha spark moment for you?

00:06:47.199 --> 00:06:48.639
Gosh.

00:06:48.879 --> 00:06:49.680
I don't know.

00:06:49.680 --> 00:06:59.040
I sometimes I go, am I crazy for even to because you know the thing is, is it really is it's a huge career shift, you know.

00:06:59.040 --> 00:07:05.040
13 years in my position, and you know, it's a stable position in a great high school.

00:07:05.040 --> 00:07:09.439
And to jump into this, it was a big decision.

00:07:09.439 --> 00:07:17.360
But like one of my friends said, you know, it's now or never, you know, I mean, this is this is this moment.

00:07:17.360 --> 00:07:26.480
And if you can help educators kind of navigate this so they can in turn help their students navigate this, this is the time.

00:07:26.480 --> 00:07:29.279
This is the only moment in time you can do that, you know.

00:07:29.279 --> 00:07:32.240
And so I just I did it.

00:07:32.240 --> 00:07:44.079
But you know, the funny thing is is that when I when I left the profession and I started exploring what I could be doing, none of the jobs that I wanted to do existed.

00:07:44.079 --> 00:07:54.399
And so I kind of just said, all right, well, I guess I'll create Teaching with Machines so I have something that I can put on my resume to say that I'm doing something.

00:07:54.399 --> 00:07:57.040
But I honestly, I ended up subbing.

00:07:57.040 --> 00:08:02.160
I ended up subbing for almost a year while I built out Teaching with Machines.

00:08:02.160 --> 00:08:06.480
And um through that, it was a humbling experience.

00:08:06.480 --> 00:08:08.399
It was a very interesting experience.

00:08:08.399 --> 00:08:16.959
It is one that allowed me insight into other classrooms than my own experience, which was incredible.

00:08:16.959 --> 00:08:29.199
And also the ability to have conversations with students who have no fear of sharing their insight or input on how they use this technology.

00:08:29.199 --> 00:08:32.000
Um, there's no repercussions, right, with a sub.

00:08:32.000 --> 00:08:38.080
So um I was able to do a little recon during that subbing situation.

00:08:38.080 --> 00:08:46.559
But, you know, as I progressed and I continued, and I have a weekly newsletter that I just kept on putting out, sharing what I've learned.

00:08:46.559 --> 00:09:02.000
Um, I think part of it too is that while I saw this as, I mean, I I tell this story basically, and it's the moment I realized that they needed somebody out there, and not just myself, but there's many, many people too.

00:09:02.000 --> 00:09:11.519
But this idea of like I was sitting down and I was writing an email in response to kind of, you know, your typical angry parent email, if you will.

00:09:11.519 --> 00:09:12.879
And I was done.

00:09:12.879 --> 00:09:15.679
I mean, it was the end of the day, it was like in April.

00:09:15.679 --> 00:09:23.519
And my friend said, I don't think you should respond to that email right now, because you know, I was a little, you know, heated or something.

00:09:23.519 --> 00:09:26.000
And so um, I said, Okay, I won't.

00:09:26.000 --> 00:09:30.799
She's like, But I think you should run it through ChatGPT and adjust it, you know.

00:09:30.799 --> 00:09:38.879
And so I did, and I just kind of sat back and was just like, oh gosh, oh, okay.

00:09:38.879 --> 00:09:51.840
This is not a technology that is just gonna be something that we just uh simply adapt to, but rather will have a profound impact on education.

00:09:51.840 --> 00:10:09.679
Um, especially, you know, like part of the thing is is like I taught French and Google Translate has had a huge impact on how we do assessments, how we teach in language classes, and we're still struggling with that, many of us, and it's been 15 years, you know.

00:10:09.679 --> 00:10:12.720
So it's one of the, or maybe 10, I'm sorry.

00:10:12.720 --> 00:10:19.840
Um, but I just kept on thinking, okay, this is where teachers are gonna need support on the outside.

00:10:19.840 --> 00:10:44.799
Now, I started following you, which was amazing on LinkedIn, and I quickly realized there's just not a lot of teachers on LinkedIn, you know, um, a lot of admin, maybe, um, you know, just kind of thought leaders, but like the chalk in hand teacher, just there's no need to kind of have that profile on LinkedIn because we don't use it to network, right?

00:10:44.799 --> 00:10:46.159
So what's the point?

00:10:46.159 --> 00:11:04.799
It, you know, and so I just kind of, you know, I started getting on LinkedIn and I realized that there's a need to make sure that the teacher voice is amplified in these spaces where there are people who are making decisions, thought leaders who are giving their opinions on where education is going.

00:11:04.799 --> 00:11:14.879
I just wanted to make sure that the chalk in hand teacher has somebody who can amplify their voice out into these spaces to make sure that they are being heard.

00:11:14.879 --> 00:11:18.639
Um, and so that's kind of the decision I made.

00:11:18.639 --> 00:11:21.440
And I it's been an amazing experience.

00:11:21.440 --> 00:11:27.120
It's been a journey, it's been ups and downs, of course, but it's been, it's been a lot of fun.

00:11:27.360 --> 00:11:27.600
Yeah.

00:11:27.600 --> 00:11:32.879
And yeah, I've seen, like I said, seeing you, like I said, uh beautiful glow up.

00:11:32.879 --> 00:11:43.840
And I always tell my friends, like, I see you, like I see the great things, and it just gets me so excited, you know, that people within the space are continuing to grow and getting all these opportunities and so on.

00:11:43.840 --> 00:11:51.039
And so just to see you, and from when, you know, we first started connecting on LinkedIn to see what you're doing.

00:11:51.039 --> 00:11:57.600
And I'm like, oh my gosh, look at what Merissa's doing now, and look at where she's at, and look at where she's headed and all those opportunities.

00:11:57.600 --> 00:12:09.600
I think that's something that's valuable too, in the sense that you're amplifying and you're also not only are you amplifying your experience, but you like you mentioned, it's you're amplifying the voice of those educators too, as well.

00:12:09.600 --> 00:12:24.000
Like you're bringing their voices to the table because oftentimes, you know, stakeholders usually are the ones that say, well, we're the one, the decision makers up at the top, but we don't include the actual users within those conversations as well.

00:12:24.000 --> 00:12:26.240
And I think that's something that's very important.

00:12:26.240 --> 00:12:40.159
But I want to kind of shift up, you know, a little bit now because I know that we talk a little a lot about this, and I know I've seen it on LinkedIn so much, and it's been something that I've seen for the past couple of years.

00:12:40.159 --> 00:12:53.519
Well, I and I know say a couple of years, but mainly maybe within the last year, year and a half, there's a strong push, and we're always going and talking about human-centered AI professional development.

00:12:53.519 --> 00:13:18.559
So, with your experience in professional development, and of course, using that phrase, human-centered AI professional development, what does human-centered mean to Merissa, especially in a world where it just seems like every AI tool seems to be the star or that magic bullet, and we kind of forget about that human aspect.

00:13:18.559 --> 00:13:24.480
So tell me a little bit about how you find that balance and how you really put that the human first.

00:13:55.610 --> 00:14:06.409
Yeah, um I have this belief about when we use AI that really AI should be an extension of your own expertise.

00:14:06.409 --> 00:14:11.370
And it should reflect um your voice.

00:14:11.370 --> 00:14:20.809
And so when it's not going to replace, we're not going to put out there or use it for things that we wouldn't normally know what to do and how to do.

00:14:20.809 --> 00:14:29.049
You know, I think it's very important that, because I think a lot of us and the big fear and that the thing that nobody really talks about.

00:14:29.049 --> 00:14:44.409
And I mean, whether you're an artist, whether you are a teacher or another profession, we all are sitting here going, if AI can do what I do, then what value do I bring?

00:14:44.409 --> 00:14:53.210
And it's very, it's it's a it's one of those things we don't, we don't talk about it, but that's the fear.

00:14:53.210 --> 00:14:54.970
That's what's driving the fear.

00:14:54.970 --> 00:15:00.409
And I think when we sit down and we say, okay, what value do I bring?

00:15:00.409 --> 00:15:05.450
Well, you know, the the AI cannot do anything without your direction.

00:15:05.450 --> 00:15:19.289
So if it is something that is reflecting your input, is reflecting your expertise, then we have much more control over the AI than we really are, we we really think we do, right?

00:15:19.289 --> 00:15:35.450
So um that and and and I tell like when I'm doing sessions with students, even I have this conversation about making sure that you know, there's these skills that we need to develop in this AI world.

00:15:35.450 --> 00:15:45.529
And and then they're they're you know, hard and fast skills that we've been trying to do for, you know, impart on them and you know, critical thinking, creativity, all of those things, right?

00:15:45.529 --> 00:15:46.889
And collaboration.

00:15:46.889 --> 00:15:57.529
But one of the ones and in kind of tapping into like the literary world or you know, the English teacher's world is developing that authentic voice.

00:15:57.529 --> 00:16:00.169
And what does that look like for you?

00:16:00.169 --> 00:16:01.129
Because you know what?

00:16:01.129 --> 00:16:02.809
There's only one of you.

00:16:02.809 --> 00:16:06.330
You only get to develop that authentic voice.

00:16:06.330 --> 00:16:08.490
Nothing else can do that except for you.

00:16:08.490 --> 00:16:25.850
And in a world where it's becoming more and more um inundated with AI and artificially created things, having that authentic voice that cannot be duplicated and replicated is yours.

00:16:25.850 --> 00:16:29.769
And um really work on developing that.

00:16:29.769 --> 00:16:32.169
Can you use AI to help you develop it?

00:16:32.169 --> 00:16:35.289
Yes, but it will never replace it.

00:16:35.289 --> 00:16:53.210
And I think that is so important, especially, and I don't know if you've been seeing this either, but like when we're on TikTok or we're looking at social media, I swear there's been like a shift from going away from the curated selfies and the curated content perfectly.

00:16:53.210 --> 00:16:57.210
Everybody's life is perfect, but we know, you know, behind the scenes it's not perfect, right?

00:16:57.210 --> 00:17:11.450
Whereas the more popular content creators are coming in disheveled, looking like a normal, you know, everyday human person and telling their story because they're captivating this authentic voice.

00:17:11.450 --> 00:17:18.730
And we are so drawn to it and we have such a need for it that I think we're gonna want to really start developing that even more.

00:17:18.730 --> 00:17:23.049
And I think it's gonna be really important for our students' future.

00:17:23.370 --> 00:17:23.610
Yes.

00:17:23.610 --> 00:17:28.330
No, I and one of the things that you that I love that you mentioned is that authentic voice.

00:17:28.330 --> 00:17:33.130
There's only one of you in this world, and we really want to hear your voice, your thoughts.

00:17:33.130 --> 00:17:44.490
And like, you know, like you mentioned, there are many tools that are out there that can help, you know, at least, you know, work through your message or what you're trying to say and things of that sort.

00:17:44.490 --> 00:17:51.930
But there's also a way of delivery that is you, that is the way that your true voice really comes out.

00:17:51.930 --> 00:18:06.730
And like you mentioned, in a world that is becoming very synthetic very quickly, because now with uh Sora 2 dropping and all of these videos, now you've got that video aspect of it.

00:18:06.730 --> 00:18:10.329
Uh, obviously, you know, with the large language models and so on.

00:18:10.329 --> 00:18:17.369
So I do agree with you that now it's people want to see that authenticity, who you really are.

00:18:17.369 --> 00:18:24.170
And yes, you you do notice that shift where, you know, the videos are not as overproduced as they once were.

00:18:24.170 --> 00:18:28.809
You know, people are coming in just being more natural because that's what people are craving.

00:18:28.809 --> 00:18:32.410
They're craving, they are wanting, you know, that authentic voice.

00:18:32.410 --> 00:18:48.890
And uh kind of going along that line, I kind of wanted to share, and I know I've done it in another episode, but it's very, you know, pertinent to what we're talking about, especially with voice, where a good friend of mine, she also does speaking engagements and so on.

00:18:48.890 --> 00:18:58.250
And, you know, obviously with the world of of LLMs and uh, you know, creating presentations is a lot easier, a lot quicker.

00:18:58.250 --> 00:19:03.450
You kind of give it your idea and it'll go ahead and uh pop something out for you.

00:19:03.450 --> 00:19:13.289
So she said that she was gonna do a little keynote and she said, Man, I already had this presentation, you know, done, you know, a month ago and everything like that.

00:19:13.289 --> 00:19:18.809
But of course, with the use of AI and and helping in creating that.

00:19:18.809 --> 00:19:31.529
But she said that when she was presenting, even though she has that content knowledge, she just felt like this isn't me, like this isn't really my voice, and so on.

00:19:31.529 --> 00:19:49.849
So she actually said, like, I'm gonna take a little break from all of this a little bit and just kind of see, like, like kind of guess you find yourself again, because I think oftentimes too, you do also see that side of videos and content where you can see that it is very heavily dependent on LLMs.

00:19:49.849 --> 00:19:56.250
And sometimes like you could you can kind of pick it out, you know, especially if we use it and so on, you can kind of pick it out.

00:19:56.250 --> 00:19:59.529
And sometimes it's like, uh, I'll just kind of scroll right past it.

00:19:59.529 --> 00:20:04.410
And uh like then I'll I see somebody that maybe has a lot of spelling errors or something like that.

00:20:04.410 --> 00:20:11.129
And I'm like, ooh, I want to see that, I want to read that, you know, because it just feels like like, oh, this is authentic and stuff like that.

00:20:11.129 --> 00:20:20.170
So I I really like that you said that you really help teachers also see that, that there's only one of you and it's your voice, but also that you work with students on that.

00:20:20.170 --> 00:20:22.009
And I think that's something that's very important.

00:20:22.009 --> 00:20:43.849
And as you know, like now with the release of Sora and the way that the technology is getting better and better, you know, it just seems like every week there's some kind of new improvement, there's some new model that's coming out, and you really want to help your students understand and critically think about these outputs and critically think about these uh large language models as well.

00:20:43.849 --> 00:21:55.650
But kind of going back to the work that you do with schools, Merissa, I want to ask you in your experience too, as well, you know, a lot of schools that uh are having a hard time reconciling maybe the the great use of AI, the use case of it, but maybe also now battling with that data privacy and ethics, you know, those barriers.

00:21:55.650 --> 00:22:03.090
What are some things that you may have seen or some best practices that you've seen, you know, along your travels within school districts?

00:22:03.090 --> 00:22:11.090
And what might be some suggestions that you yourself too have come with that you might be able to share with educators or districts?

00:22:11.810 --> 00:22:13.650
Yeah, so there's two things.

00:22:13.650 --> 00:22:18.210
So there's the AI uh coherence cycle, and I'll tell you about that in a second.

00:22:18.210 --> 00:22:26.529
But first and foremost, I feel like everybody has to go through their own journey with it before they can actually address any of that, right?

00:22:26.529 --> 00:22:28.450
Um, they have to first learn about it.

00:22:28.450 --> 00:22:33.650
We have to, we have to model those critical thinking and excitement and innovation.

00:22:33.650 --> 00:22:37.970
So first learning and being lifelong learners is looking into that.

00:22:37.970 --> 00:22:49.970
Um, I think a lot of times schools, given the time that they're given, they just, you know, kind of want to front load all of this onto educators.

00:22:49.970 --> 00:22:54.450
And, you know, here's your one and a half hours, two hour training, and then that's it.

00:22:54.450 --> 00:22:57.090
And then, you know, expect greatness.

00:22:57.090 --> 00:22:59.810
And the thing is, is, you know, we don't do that to our students.

00:22:59.810 --> 00:23:03.330
We get fired if we try doing that, you know, moving on to the next lesson.

00:23:03.330 --> 00:23:11.970
But um I think um it's this idea that, you know, you have to have two light bulbs that go off.

00:23:11.970 --> 00:23:15.730
And I'm not gonna swear, although I do say it in my head.

00:23:15.730 --> 00:23:18.610
The first light bulb is like, oh, this is cool.

00:23:18.610 --> 00:23:19.730
This can do this for me.

00:23:19.730 --> 00:23:20.850
This is amazing.

00:23:20.850 --> 00:23:23.570
And then the it's the oh beep light bulb.

00:23:23.570 --> 00:23:27.730
The second light bulb is where it's like, oh, this is going to impact.

00:23:27.730 --> 00:23:30.050
This is going to do XYZ.

00:23:30.050 --> 00:23:36.769
This is going to affect the things that I do and how my students do things, how my school could possibly do things.

00:23:36.769 --> 00:23:40.370
There is going to be the oh beep light bulb that has to go off.

00:23:40.370 --> 00:23:43.009
And we have to have space to be able to do both.

00:23:43.009 --> 00:23:45.810
The first one has to happen before the second one happens.

00:23:45.810 --> 00:23:58.370
And then we can start addressing those really big picture ideas and going after how this can be done ethically, safely, um, and with intention.

00:23:58.370 --> 00:24:09.009
Now, what I've seen a lot of success in working with schools is the AI coherence cycle that I also work with Danelle Almaras.

00:24:09.009 --> 00:24:21.970
And what we've created with this cycle is let's not, you know, we kind of modeled it off of what business, successful businesses have done to integrate AI.

00:24:21.970 --> 00:24:33.410
And instead of sitting down with their employers and our employees and saying, here, go do a two-hour training and then off you go, what they first do with AI is say, what's on fire?

00:24:33.410 --> 00:24:35.090
Where are we bleeding?

00:24:35.090 --> 00:24:36.370
What is going on?

00:24:36.370 --> 00:24:42.850
Because we can't possibly think of anything else until we address those huge, huge challenges that we have.

00:24:42.850 --> 00:24:49.250
And I think if we with this AI coherence cycle, it's kind of doing the same model.

00:24:49.250 --> 00:24:52.529
It's saying, what are our biggest challenges right now?

00:24:52.529 --> 00:24:55.570
What are some initiatives that we have to do to address?

00:24:55.570 --> 00:25:01.570
You know, AI is the last thing on my mind when it comes to all of this.

00:25:01.570 --> 00:25:13.730
But instead, when we look at these big challenges we have and say, where can AI possibly come and help me with these things?

00:25:13.730 --> 00:25:16.289
Then we're really leveraging the tool.

00:25:16.289 --> 00:25:23.009
And along with that comes learning and exploration of the tool and professional development.

00:25:23.009 --> 00:25:25.410
But now we're doing that first light bulb.

00:25:25.410 --> 00:25:33.570
Oh, hey, this can help me with things, you know, and then you're getting the professional development in a way that is not here, let me sit you down for two hours.

00:25:33.570 --> 00:25:35.650
It's cyclical, it keeps on going.

00:25:35.650 --> 00:25:39.090
And we're addressing the challenges that we have at hand.

00:25:39.090 --> 00:25:45.410
And I think that's going to be more reflective of how we're going to be seeing AI in education in the future.

00:25:45.410 --> 00:25:48.610
But I've seen some success with it.

00:25:48.610 --> 00:26:02.850
I worked with a school that decided to do that mid-year, where we decided to focus on getting their curriculum mapped.

00:26:02.850 --> 00:26:14.450
And we had about 30% of the staff who had their curriculum mapped and uploaded into a program that documents it, right?

00:26:14.450 --> 00:26:16.850
And this was a part of their accreditation process.

00:26:16.850 --> 00:26:17.090
Okay.

00:26:17.090 --> 00:26:18.610
And they really wanted to get it done.

00:26:18.610 --> 00:26:27.009
Within about two months of leveraging AI and working with teachers, 80% of the teachers had completed all of their curriculum mapping.

00:26:27.009 --> 00:26:32.130
And it's just like, look, it's your expertise that did this with this tool.

00:26:32.130 --> 00:26:35.890
This is how we could be addressing and using and leveraging AI.

00:26:35.890 --> 00:26:41.810
It doesn't always have to be a sit-down, here's-your-two-day professional development.

00:26:41.810 --> 00:26:46.370
And along with that comes learning about the limitations.

00:26:46.370 --> 00:26:52.050
And that's where making sure you're using it ethically comes in; that's why professional development is important.

00:26:52.050 --> 00:26:58.130
So we are careful, and we're protecting ourselves, we're protecting our students, and we're protecting our learning community.

00:26:58.529 --> 00:26:59.170
I love that.

00:26:59.170 --> 00:27:15.970
And you know, that actually covers a lot of things I wanted to ask as a follow-up, because with the plethora of platforms available now, sometimes it's like, hey, we just want the next shiny tool.

00:27:15.970 --> 00:27:27.009
And we just want it because we want to do this faster, as opposed to, okay, how can we reimagine what we're learning with these tools and maybe take it to that next level?

00:27:27.009 --> 00:27:33.490
And I'm a big fan of the SAMR model, you know: substitution, augmentation, modification, and redefinition.

00:27:33.490 --> 00:27:51.810
So oftentimes, as with any adoption, what we initially see is that substitution factor: well, before I used to do handouts, and now I can use something like Kami to annotate and do the same thing digitally.

00:27:51.810 --> 00:28:02.769
But then, what else can we do to augment it? What are some things we couldn't do before that we're able to do now? And then modify, and then redefine?

00:28:02.769 --> 00:28:11.650
And sometimes I feel that, as quick as we are as educators, we just want the next shiny thing and we just kind of say, ah, okay.

00:28:11.650 --> 00:28:18.930
And then when it loses its luster and its glamour, it's like, all right, let's move on to the next thing, because we're just like that.

00:28:18.930 --> 00:28:21.570
We get so excited and we want to try those things.

00:28:21.570 --> 00:28:47.009
But I think with the steps that you described, having teachers work through it as they go, it's not only going to help them see that they can do their work more efficiently, but also be more effective, and think of new ways to enhance those lessons as they put their curriculum together, thinking, wow, this is something I would never have been able to do before.

00:28:47.009 --> 00:28:50.210
And now I can enhance those lessons for my students.

00:28:50.210 --> 00:28:55.330
So that is great that you shared all of that because I think that that really plays well into this.

00:28:55.330 --> 00:29:15.009
And so, my next question to you: like I mentioned, as educators we get so excited and want to use the next tool because we see what's hot, what's trending on educator TikTok, or X, or Instagram, and so on.

00:29:15.009 --> 00:29:20.050
So, as we know, you know, there's a lot of hype that gets put onto a lot of these tools.

00:29:20.050 --> 00:29:30.610
So, how do you help administrators discern what would be a good tool versus something that might just be snake oil?

00:29:31.970 --> 00:29:33.009
Oh my gosh.

00:29:33.009 --> 00:29:42.610
I think we should always be centered on effectiveness, right?

00:29:42.610 --> 00:29:45.250
Because there's a lot of these tools out there.

00:29:45.250 --> 00:29:51.410
First of all, you have to give them time to explore the tools to see if they are effective.

00:29:51.410 --> 00:29:57.570
And what I like to tell the administrators is: just pick three.

00:29:57.570 --> 00:30:05.250
You don't need to pick the 500 that are out there. Just pick three and get good at them.

00:30:05.250 --> 00:30:06.690
Get good at it.

00:30:06.690 --> 00:30:12.210
And once you're done getting good at it, share out what it is that you did with it.

00:30:12.210 --> 00:30:26.769
Make sure you're creating that community of innovation and sharing and excitement, so people can actually bring that PLC alive, if you will, right?

00:30:26.769 --> 00:30:30.210
I mean, PLC could be a dirty word for some of your listeners.

00:30:30.210 --> 00:30:36.610
I don't know, but it's that idea, in the true sense of a learning community, right?

00:30:36.610 --> 00:30:46.610
I think it's very important to remember that you don't have to know everything all at once.

00:30:46.610 --> 00:31:00.289
And I think as educators, that's really, really hard for us because we've always been told that the value we bring is the expertise in the XYZ subject.

00:31:00.289 --> 00:31:03.330
We are expected to know it all.

00:31:03.330 --> 00:31:07.650
And the fact of the matter is that we don't.

00:31:07.650 --> 00:31:12.610
And we're all learning and exploring this new technology.

00:31:12.610 --> 00:31:15.009
So pick three and get good at them.

00:31:15.009 --> 00:31:25.170
Get some data back from the users, see how they're doing with it, see what they like and what they don't like, and then you can continue on.

00:31:25.170 --> 00:31:29.730
But again, you have to create this community of asking, is this effective?

00:31:29.730 --> 00:31:43.650
And I'm a big believer in teaching educators how to fish versus sending them to the market, and really getting into the weeds.

00:31:43.650 --> 00:31:45.730
How is this tool working?

00:31:45.730 --> 00:32:02.769
So in the future, if I'm using a tool that is safe and approved by our school, and I know how to use it and manipulate it for a certain output, I don't need XYZ tool over there.

00:32:02.769 --> 00:32:09.170
I don't need this 555th tool that's out on the market, right?

00:32:09.170 --> 00:32:11.250
Like I just need my own intelligence.

00:32:11.250 --> 00:32:14.850
I think I think that's what we have to keep in mind.

00:32:14.850 --> 00:32:17.330
Like, we don't always have to go to the market.

00:32:17.330 --> 00:32:22.130
And also, I'll just add in there's a lot out there.

00:32:22.130 --> 00:32:33.250
Let's also use discernment when we're looking at these tools, in the sense of who's putting it out there, what is it saying, and are they really educators?

00:32:33.250 --> 00:32:37.570
Because I'll see TikToks out there where they've just hired somebody to say that they're a teacher.

00:32:37.570 --> 00:32:41.730
And it's just like, no, that teacher doesn't exist.

00:32:41.730 --> 00:32:43.330
That is inauthentic.

00:32:43.330 --> 00:32:44.769
You know what I mean?

00:32:45.009 --> 00:32:52.210
Yes, and that's sound advice, definitely, especially with how saturated the market has become.

00:32:52.210 --> 00:33:09.810
And I'm not talking simply about education. Even in the professional world, so many things are happening, systems in place, SaaS products, all these vendors, and everybody's really just selling on the buzzwords, you know.

00:33:09.810 --> 00:33:17.330
"This is gonna provide synergy, this is gonna build up these particular skills," and so on.

00:33:17.330 --> 00:33:19.490
But are the results there?

00:33:19.490 --> 00:33:22.210
You know, is there data to back that up?

00:33:22.210 --> 00:33:23.250
Exactly.

00:33:23.250 --> 00:33:37.730
And so I think it's sound advice you shared: if you are a district deciding to move forward with this, it's really great to just say, okay, let's have a community.

00:33:37.730 --> 00:33:40.050
First, I know Dr.

00:33:40.050 --> 00:34:07.730
Nika McGee, when she started working with this in one of the school districts close to where I live, brought in a group of teachers, kind of like teachers in the loop, and they tried everything out, put it through its paces, saw the results they were getting, and then said, okay, this is something that would be useful for X, Y, and Z content, and took it from there.

00:34:07.730 --> 00:34:14.449
Or just like you said, pick those three and just really take a deep dive and see where you go.

00:34:14.449 --> 00:34:38.289
The only thing with me, being a digital learning coordinator, is that although we say, okay, we're going to level set and use these three platforms, you know there are those teachers, I call them the speedboats, who are going to hop on and use the freemiums of every other app out there, and then say, no, we want this one and we want this one.

00:34:38.289 --> 00:34:41.969
And then a month later, it's like, hey, I want this one now.

00:34:41.969 --> 00:34:49.889
But wait a minute: you haven't given the tool enough time to give us that data to see if it's being effective.

00:34:49.889 --> 00:34:59.969
And I think that's very important: checking the efficacy, and obviously checking the price on a lot of these apps, because it is quite an investment.

00:34:59.969 --> 00:35:09.170
And then if you don't get the results they promised, you feel like, oh my gosh, did we make a mistake?

00:35:09.170 --> 00:35:11.730
So it's important to take those deep dives.

00:35:11.730 --> 00:35:18.610
And I think also going back to the model of how you do professional development, I think that's something that is great.

00:35:18.610 --> 00:35:29.570
And for PLCs to become true PLCs, not "please learn compliance" meetings. I've never heard that one!

00:35:29.570 --> 00:35:43.250
Yes, because a lot of the time, instead of teachers actually talking about lessons and what they could do and planning, PLCs are just, here's another list that you have to check off for compliance.

00:35:43.250 --> 00:35:45.170
So it's really please learn compliance.

00:35:45.170 --> 00:35:48.530
And so that's what I think that they've turned into many times.

00:35:48.530 --> 00:36:01.809
But it's so important to build that community, and it's important to have that communication and collaboration across the district, across grade levels, across content, and see what works and go with that, you know.

00:36:01.809 --> 00:36:04.769
I think it's even getting... yeah, sorry.

00:36:04.769 --> 00:36:06.050
Oh no, no, go ahead.

00:36:06.050 --> 00:36:06.930
Go ahead.

00:36:07.250 --> 00:36:10.610
I was just gonna say, even like just getting the community involved too.

00:36:10.610 --> 00:36:15.809
Like it doesn't, you know, you have all these different stakeholders that should be involved in all of this.

00:36:15.809 --> 00:36:16.849
Because guess what?

00:36:16.849 --> 00:36:31.170
The kids are watching. They're watching how we're approaching this, how we're doing this, and we need to be modeling what innovative thinking looks like and how we approach this technology ethically and with intention.

00:36:31.490 --> 00:36:32.369
Yes, most definitely.

00:36:32.369 --> 00:36:33.090
And you know what?

00:36:33.090 --> 00:36:35.490
And that's also a great point that you mentioned.

00:36:35.490 --> 00:36:37.650
It's having everybody at the table.

00:36:37.650 --> 00:36:39.250
And that's even including parents.

00:36:39.250 --> 00:36:43.730
Parents should also know what it is and what's happening and what the school is trying to do.

00:36:43.730 --> 00:36:47.090
And, you know, obviously it's all for the good of the students.

00:36:47.090 --> 00:36:48.050
We want those results.

00:36:48.050 --> 00:37:00.450
We want our students to be successful, but we can also include that parent community, and that will definitely help your school be very successful as well, because parents stay informed.

00:37:00.450 --> 00:37:04.530
So I want to ask you now, just kind of uh changing things up a little bit.

00:37:04.530 --> 00:37:08.769
And you know, I want to talk about the great event that you'll be speaking uh at.

00:37:08.769 --> 00:37:31.329
But before we get to that, just to round out the conversation a little bit, I want to ask you: through all of your travels and all the work you've done with teachers, what is the one thing that still surprises you, as far as misconceptions they still have about AI?

00:37:38.690 --> 00:37:53.730
I think it goes back to what I was talking about earlier, about where they're finding their value in all of this.

00:37:53.730 --> 00:38:02.610
I think the biggest misconception is that they don't have value if something like AI can do what they do.

00:38:02.610 --> 00:38:08.610
And the fact is that their expertise is what makes them so valuable.

00:38:08.610 --> 00:38:14.369
Their ability to have human connection with their students is what makes them so valuable.

00:38:14.369 --> 00:38:19.490
And the nuances that they bring into the classroom makes them so valuable.

00:38:19.490 --> 00:39:13.010
And while the technology is important to learn about, and we need to model what that looks like, it's still so important to keep in mind that it is technology. What we do in education, the science of learning: learning is messy and beautiful, sometimes fast, sometimes slow, but it is this incredible process that we curate for our students. And that simply should not be chalked up to something that can optimize away the messiness, because it's in the messiness that students learn.

00:39:13.409 --> 00:39:13.970
Yes.

00:39:14.289 --> 00:39:15.490
Oh, I love that.

00:39:15.490 --> 00:39:16.769
That is wonderful.

00:39:16.769 --> 00:39:18.130
That is wonderful.

00:39:18.130 --> 00:39:19.490
Oh, that is great.

00:39:19.490 --> 00:39:28.210
I love that. It really hits and resonates, because even when I was in the classroom, it wasn't always perfect.

00:39:28.210 --> 00:39:41.409
And I think for a lot of teachers, the pressure is on: they feel they have to be on 100%, that the lessons have to be perfect, and that they have to know everything.

00:39:41.409 --> 00:39:55.409
But somewhere along the line in my 11 years in the classroom, I slowly started figuring it out, because I was, in a way, kind of burning out from trying to always know everything.

00:39:55.409 --> 00:39:59.650
But then I figured, you know what, it's okay not to know everything.

00:39:59.650 --> 00:40:04.050
You know, I am here as a learning engineer.

00:40:04.050 --> 00:40:11.329
I'm engineering and creating these learning experiences for the students, where we can all learn together at the same time.

00:40:11.329 --> 00:40:13.490
And I think that that was something that was great.

00:40:13.490 --> 00:40:31.170
And now, with the tools that are available, you can definitely take that learning to the next level, going deeper and understanding how to leverage those tools in a way that you're still engineering a great experience and enhancing the learning for students.

00:40:31.170 --> 00:40:32.849
I think that that's something that is great.

00:40:32.849 --> 00:40:42.930
And I think that should alleviate some of that pressure, at least for some teachers, and make them feel at peace and at ease.

00:40:42.930 --> 00:40:47.090
Because, like you said, some may feel like, well, I mean, this thing's gonna replace me.

00:40:47.090 --> 00:41:10.610
But it's not, because it doesn't know the students. It can't read their faces; it doesn't know where the students are coming from, what side of town they live on, what they may need, and how you as a teacher can help shape not only that student's day, but that year, by being present and being there for them.

00:41:10.610 --> 00:41:38.849
And this is just a tool on the side to help with the learning process, but that teacher contact, like you mentioned earlier, is irreplaceable. What a teacher can do is a life-changing experience for a lot of students; your classroom could be the safest place they'll be all year, and you as a teacher have that. But now we have those tools for the learning as well.

00:41:38.849 --> 00:41:40.930
So I think that's something that is very powerful.

00:41:40.930 --> 00:41:46.530
And yes, it it's teaching is messy, and it's okay that it's messy.

00:41:46.530 --> 00:41:52.130
And I figure, if you don't feel that messiness, are we really teaching?

00:41:52.130 --> 00:42:01.090
Well, I'm just throwing that out there, because I figure, like myself, man, if I'm not being messy, then what am I doing?

00:42:01.090 --> 00:42:10.050
Like, we need to get messy, we need to get hands-on, and sometimes it looks a little different for a lot of us, but I think that's great. Awesome.

00:42:10.050 --> 00:42:13.889
All right, so Merissa, I want to ask you now, because I'm really excited.

00:42:13.889 --> 00:42:24.769
I saw the news a couple of days back, but I want to know a little bit about this wonderful experience you're going to have pretty soon: you are going to be presenting at EdTech Week.

00:42:24.769 --> 00:42:25.970
Is that correct?

00:42:26.530 --> 00:42:27.170
Yes.

00:42:27.170 --> 00:42:34.130
I will be at EdTech Week in New York, at Columbia University.

00:42:34.130 --> 00:42:36.450
I will be presenting, sorry, that's a mouthful.

00:42:36.450 --> 00:42:41.250
I will be presenting with Bonnie Neves, um, who has been a guest on the podcast as well.

00:42:41.250 --> 00:42:42.849
So that's really exciting.

00:42:42.849 --> 00:42:47.329
And we are, you know, I went last year and it was my first experience.

00:42:47.329 --> 00:42:53.409
You know, again, coming from the classroom, I didn't even know these existed, right?

00:42:53.409 --> 00:43:00.450
And so I found out and I went, and I kind of realized that I was one of the only teachers there.

00:43:00.450 --> 00:43:04.289
Or, you know, my experience was just as a teacher.

00:43:04.289 --> 00:43:11.250
And I mean, they had superintendents and bigwigs and all these people, but then there's, you know, me.

00:43:11.250 --> 00:43:23.250
And I thought, we need to hear more from those teachers who have that chalk-in-hand insight.

00:43:23.250 --> 00:43:32.210
Because frankly, there's all this ed tech out there that we're experiencing.

00:43:32.210 --> 00:43:37.889
And as teachers, as soon as it finally comes to our classroom, we are looking at whether or not it is effective.

00:43:37.889 --> 00:43:42.690
Because, you know, frankly, we could use 50 tools in one year.

00:43:42.690 --> 00:43:46.369
And, you know, what makes your tool special?

00:43:46.369 --> 00:43:47.650
Well, is it effective?

00:43:47.650 --> 00:43:49.570
Is it doing what it should be doing?

00:43:49.570 --> 00:43:57.010
And I think sometimes there is a disconnect between the promise and the reality.

00:43:57.010 --> 00:44:15.809
And so this year I am so excited that they reached out to have us come and speak and kind of amplify that teacher voice. I call myself the spicy apple, because I kind of tell it like it is: this is really what's going on.

00:44:15.809 --> 00:44:33.889
Because the reality is this: soon enough, we as educators, if we're not already doing it now, will have access to a tool that allows us to create our own tools.

00:44:33.889 --> 00:44:38.130
What will happen to ed tech then?

00:44:38.130 --> 00:44:44.050
What will happen to those companies then once we actually just start building it ourselves?

00:44:44.050 --> 00:44:55.409
So I'm just putting it out there: look, this is a very real possibility.

00:44:55.409 --> 00:45:01.650
And somebody has to say it, and I guess it's gonna be us.

00:45:02.050 --> 00:45:04.930
Yeah, no, and you know what, that is a wonderful message.

00:45:04.930 --> 00:45:11.889
I really love that because up until this point, we've depended on what others think that we need.

00:45:11.889 --> 00:45:24.050
But as a teacher, you are the one in the classroom; you know what your obstacles and barriers are, you know what your wins are, you know what your students need.

00:45:24.050 --> 00:45:25.889
And so you're absolutely right.

00:45:25.889 --> 00:45:39.090
Once that tool comes out where you yourself can create what you need, then where does that leave everybody that up until this point were seemingly the experts in telling us what we need?

00:45:39.090 --> 00:45:44.849
And I think that is a very powerful and very spicy hot take, Merissa.

00:45:44.849 --> 00:45:45.730
And I love it.

00:45:45.730 --> 00:45:55.090
And I'm all here for it, because, like you said, it's about amplifying that voice, hearing more teachers out, and saying, hey, you know what?

00:45:55.090 --> 00:46:01.409
For right now, yes, we may need you and depend on you, because obviously you have the infrastructure.

00:46:01.409 --> 00:46:21.730
But when that moment comes, I know what my students need, and I can easily build something for that specific use case, for that small group, that large group, for the varying levels I may be teaching. I can easily make it, have it done, and it's there and it's mine.

00:46:21.730 --> 00:46:40.610
Or I can see it going even further. What would be great is, as a collective, being able to interchange our tools with one another across districts and say, hey, I like what Merissa's doing over there, and I like what that tool is doing.

00:46:40.610 --> 00:46:42.769
I want to see if I can go ahead and use that here.

00:46:42.769 --> 00:46:46.690
And all of a sudden, it's like, whoa, Merissa, your tool really helps me out here.

00:46:46.690 --> 00:46:50.050
And then, hey, Fonz, you've got that cool podcasting tool.

00:46:50.050 --> 00:47:12.690
I want to try that with my students, and it just becomes a true community where we all share the tools we build for ourselves. Then we can actually say it is teacher-built, as opposed to many other platforms that claim to be teacher-built but may never have had a teacher on their staff at all.

00:47:12.690 --> 00:47:14.849
So, spicy take there, Merissa.

00:47:14.849 --> 00:47:18.849
I love it, I love it, and the fact that you're presenting with Bonnie too.

00:47:18.849 --> 00:47:21.170
Oh, Bonnie is such a love.

00:47:21.170 --> 00:47:25.730
She is wonderful, she has a wonderful heart and such a passion for education.

00:47:25.730 --> 00:47:27.970
So I'm very excited and happy for you.

00:47:27.970 --> 00:47:33.409
And so, please let us know, for our listeners who may be attending EdTech Week.

00:47:33.409 --> 00:47:36.210
When is your talk going to be?

00:47:36.530 --> 00:47:38.450
Oh gosh, I was supposed to know that.

00:47:38.450 --> 00:47:43.970
Um it will be, I know, it will be on Wednesday in the afternoon.

00:47:43.970 --> 00:47:47.170
I believe it's at four.

00:47:47.170 --> 00:47:51.010
And I'll double check and um maybe you can put it in the show notes.

00:47:51.010 --> 00:47:53.490
Yes, because I did not come prepared.

00:47:53.809 --> 00:48:03.650
No, I definitely will put that in the show notes. And definitely, guys, make sure you visit Merissa's website at teachingwithmachines.com.

00:48:03.650 --> 00:48:08.050
That will also be in the show notes, and you can probably connect also.

00:48:08.050 --> 00:48:11.409
I think the best place to connect with Merissa is also going to be LinkedIn.

00:48:11.409 --> 00:48:15.730
But Merissa, tell our audience members where else they might be able to connect with you.

00:48:16.050 --> 00:48:24.130
Yeah, you can just email me directly at Merissa, that's M-E-R-I-S-S-A, at teachingwithmachines.com.

00:48:24.130 --> 00:48:25.650
I mean, I'm open.

00:48:25.650 --> 00:48:26.930
My emails are open.

00:48:26.930 --> 00:48:33.889
I'm more than happy to, you know, meet with you, zoom with you, whatever it is that you guys want to talk about.

00:48:33.889 --> 00:48:41.409
I just really want teachers to know that there is somebody out there who will help them navigate this.

00:48:41.409 --> 00:48:44.289
So um email me.

00:48:44.289 --> 00:48:48.130
And then there's also a Facebook group called Teaching with Machines.

00:48:48.130 --> 00:48:49.570
You can just kind of go find that.

00:48:49.570 --> 00:48:54.130
And then I'll also send you that link for Facebook and LinkedIn.

00:48:54.130 --> 00:48:57.010
LinkedIn's kind of where I've been hanging out.

00:48:57.010 --> 00:49:00.530
Um, but I also have a weekly newsletter as well.

00:49:00.530 --> 00:49:06.289
And you can find that on my um website, teachingwithmachines.com.

00:49:06.289 --> 00:49:12.849
Also on the website, just know that there's a bunch of templates, ready to go, free downloads.

00:49:12.849 --> 00:49:14.130
Everything's free.

00:49:14.130 --> 00:49:37.010
It's really just to help you either set up AI classroom norms with your class, or build what I call a flexible toolbox, where you add AI into your class, invite it in with boundaries, and allow your students to start building their own literacy and fluency with AI,

00:49:37.010 --> 00:49:40.369
all while supporting your own curriculum.

00:49:40.369 --> 00:49:43.570
So um all of that is free for you guys.

00:49:43.570 --> 00:49:47.329
It's a pleasure, and it was great talking to you today for sure.

00:49:47.650 --> 00:49:48.610
So awesome.

00:49:48.610 --> 00:49:52.849
And again, I did just hop on to LinkedIn and go to your profile.

00:49:52.849 --> 00:49:59.409
So it'll be October 22nd, that Wednesday, that Merissa will be talking.

00:49:59.409 --> 00:50:02.450
So, and again, we'll definitely put that in the show notes too as well.

00:50:02.450 --> 00:50:10.210
So if anybody listening is going to be at EdTech Week, please make sure you check out that talk with Merissa and Bonnie.

00:50:10.210 --> 00:50:15.970
And you're gonna hear Merissa's spicy takes, which I'm all here for, which is great.

00:50:15.970 --> 00:50:17.970
And all of that will be in the show notes.

00:50:17.970 --> 00:50:23.409
But Merissa, before we wrap up, as you know, I always love to end the show with the following three questions.

00:50:23.409 --> 00:50:25.570
So hopefully you are ready to go.

00:50:25.570 --> 00:50:26.530
So here we go.

00:50:26.530 --> 00:50:32.210
Question number one: as we know, every superhero has a pain point or a weakness.

00:50:32.210 --> 00:50:35.650
For Superman, kryptonite weakened him.

00:50:35.650 --> 00:50:49.010
So I want to ask you, Merissa: in the current state of education, or AI in education (you can pick either one, or give both), what would you say is your current edu-kryptonite?

00:50:49.730 --> 00:50:51.250
Oh my gosh.

00:50:51.250 --> 00:51:28.130
I know it's mindset. It's really difficult when you can see where you think things are moving, and you believe things are moving, and you know it takes quite a bit to get there, but then you look at your surroundings and see where education is actually staying put and stagnant, and you're just hoping for so much more to come out of it.

00:51:28.130 --> 00:51:33.090
And I think a lot of it is mindset, and I know that Dr.

00:51:33.090 --> 00:51:41.809
Mark Isaacs, the gentleman you had on last, talked about this. By the way, if you're listening, go back and listen to that episode.

00:51:41.809 --> 00:51:43.650
It is absolutely amazing.

00:51:43.650 --> 00:51:45.329
I'm gonna make this man my friend.

00:51:45.329 --> 00:51:46.289
I don't know how.

00:51:46.289 --> 00:51:47.809
He's so great.

00:51:47.809 --> 00:51:49.250
Everything he said was amazing.

00:51:49.250 --> 00:51:50.369
That was an amazing episode.

00:51:50.369 --> 00:51:56.930
But it's this idea that we're stuck, that we're going back to the way we've always done things because it's always worked.

00:51:56.930 --> 00:52:02.289
But the problem is, it's just not working, and everybody knows that, but nobody's doing anything to make changes.

00:52:02.289 --> 00:52:19.650
So I see the change and the demand, and I think we need to pair that up with what our students actually need. We just need to make that mindset shift, even if slowly.

00:52:20.050 --> 00:52:20.930
No, I love it.

00:52:20.930 --> 00:52:27.090
I actually put out a clip earlier today about our conversation, and he called it cognitive entrenchment.

00:52:27.090 --> 00:52:36.769
It's where you just do the same thing every year: your calendar basically looks the same, your lesson plans, your tests, and so on.

00:52:36.769 --> 00:52:40.369
And then you wonder, why is innovation not working?

00:52:40.369 --> 00:52:41.730
Why are we not innovating?

00:52:41.730 --> 00:52:51.010
Well, because you're stuck in that loop of continually doing the same things, instead of asking, like you said, what is working for the students and what is not?

00:52:51.010 --> 00:52:52.930
And let's kind of make those changes.

00:52:52.930 --> 00:52:57.090
So, yes, definitely a great conversation with Dr.

00:52:57.090 --> 00:52:59.889
Mark Isaacs, and that's fantastic.

00:52:59.889 --> 00:53:08.130
Question number two, Merissa: if you could have a billboard with anything on it, what would it be and why?

00:53:13.329 --> 00:53:15.329
I should have prepared for this question.

00:53:15.329 --> 00:53:25.650
I think I would say: the value is you.

00:53:27.889 --> 00:53:29.170
I like that.

00:53:29.409 --> 00:53:29.889
Yeah.

00:53:29.889 --> 00:53:30.930
And why?

00:53:30.930 --> 00:53:32.210
Because it's true.

00:53:32.210 --> 00:53:42.050
You know, I mean, again, I feel like we can be so creative with AI.

00:53:42.050 --> 00:53:47.970
And so I think now we have this opportunity to do that, but only you bring that in.

00:53:47.970 --> 00:53:50.289
Only you decide what is actually good.

00:53:50.289 --> 00:53:57.409
Only you decide if it represents your creativity, only you decide if it's reflective of your expertise.

00:53:57.409 --> 00:54:02.930
So the value is you. Without you, it simply does not exist.

00:54:03.409 --> 00:54:05.170
Oh, that is powerful.

00:54:05.170 --> 00:54:06.930
That is a wonderful billboard.

00:54:06.930 --> 00:54:10.690
You got me thinking, I'm like, yes, that is a great message.

00:54:10.690 --> 00:54:11.490
I love it.

00:54:11.490 --> 00:54:12.050
All right.

00:54:12.050 --> 00:54:14.130
Question number three, last one.

00:54:14.130 --> 00:54:20.690
If you could trade places with a single person for a day, who would that be and why?

00:54:20.930 --> 00:54:22.930
He's gonna laugh at this.

00:54:22.930 --> 00:54:28.530
I would totally trade places with Kunal.

00:54:28.530 --> 00:54:30.289
Kunal Dalal.

00:54:30.289 --> 00:54:32.210
I don't know if you know him.

00:54:32.210 --> 00:54:47.490
He's quite the character, but when we talk about creativity, I would love to just be in his boots, in his brain, for a day, and just see the sparks flying and see where it takes us.

00:54:47.490 --> 00:54:49.490
If you don't follow him, he's amazing.

00:54:49.490 --> 00:54:50.769
He's on LinkedIn.

00:54:50.769 --> 00:54:56.930
He is such a forward-thinking visionary when it comes to AI and education.

00:54:56.930 --> 00:55:05.170
And while I like to say that I'm kind of a couple steps ahead of the game, this man is lapping me.

00:55:05.170 --> 00:55:15.809
So it's hard to keep up with him sometimes, but his excitement, his creativity, his human-centered approach to everything, and he's just super authentic.

00:55:15.809 --> 00:55:17.090
Everybody should follow him.

00:55:17.090 --> 00:55:19.730
He's a lovely person, and I would love to be him for a day.

00:55:19.730 --> 00:55:20.130
Yes.

00:55:20.690 --> 00:55:21.650
And I can vouch for that.

00:55:21.650 --> 00:55:22.690
He definitely is.

00:55:22.690 --> 00:55:27.730
And I'll even put his info in the show notes too.

00:55:27.730 --> 00:55:31.170
If you follow him, I promise you you will not regret it.

00:55:31.170 --> 00:55:42.610
And just like Merissa says, so much energy, so much passion. He's somebody wonderful that you should definitely connect with and make part of your network, for sure.

00:55:42.610 --> 00:55:47.090
Merissa, thank you so much for a wonderful conversation this evening.

00:55:47.090 --> 00:55:48.849
It has been an honor and a pleasure.

00:55:48.849 --> 00:55:58.450
And like I said from the very beginning, since we first connected, it's been wonderful to see the work that you're doing, where you have been, and where you are going.

00:55:58.450 --> 00:56:00.530
And I think it's something fantastic.

00:56:00.530 --> 00:56:03.250
And I love to see that within the community.

00:56:03.250 --> 00:56:04.289
So much growth.

00:56:04.289 --> 00:56:07.409
Keep doing you, keep owning your shine.

00:56:07.409 --> 00:56:17.490
And I'm just excited for what the future will bring for you because I can definitely see you speaking at many more conferences, bigger conferences, and you just keep doing it, my friend.

00:56:17.490 --> 00:56:18.530
I appreciate you.

00:56:18.530 --> 00:56:24.050
And as always, any guest of my show is a forever guest.

00:56:24.050 --> 00:56:34.050
So next time you've got something major going on, your next book or anything else you're gonna be doing, you're always welcome back, my friend.

00:56:34.050 --> 00:56:35.570
It's an honor to have you.

00:56:35.570 --> 00:56:36.450
Thank you.

00:56:36.610 --> 00:56:38.210
Thank you so much for having me.

00:56:38.530 --> 00:56:39.490
Appreciate you.

00:56:39.490 --> 00:56:40.369
All right, guys.

00:56:40.369 --> 00:56:43.570
And for all our listeners, thank you as always for all of your support.

00:56:43.570 --> 00:56:54.690
Please make sure you check out this episode and the other 338 episodes. I promise you will find a little something just for you that you can sprinkle onto what you are already doing.

00:56:54.690 --> 00:56:55.010
Great.

00:56:55.010 --> 00:57:00.530
So please make sure you visit our website at myedtech.life, myedtech.life.

00:57:00.530 --> 00:57:07.090
And again, thank you so much to our wonderful sponsors, Book Creator, Yellowdig, and Edu8.

00:57:07.090 --> 00:57:09.170
I appreciate all of your support.

00:57:09.170 --> 00:57:20.450
Like I said, we definitely want to put out some wonderful conversations within our education space so we can continue to amplify wonderful voices and continue to grow and learn from one another.

00:57:20.450 --> 00:57:21.650
So thank you very much.

00:57:21.650 --> 00:57:26.130
And my friends, until next time, don't forget, stay techie.

Merissa Sadler-Holder is an AI & Emerging Tech Strategist helping K–12 schools and organizations integrate technology into teaching practice with purpose and precision. She brings 13 years of classroom experience and a systems-level lens to her consulting work, ensuring innovation is always grounded in what educators and students actually need.

As the founder of Teaching with Machines, Merissa supports a global community of educators and organizations through strategic consulting, implementation planning, and the development of award-winning tools that translate AI and emerging technology into meaningful classroom practice. Her work centers on systems-level integration, resource design, and thought leadership and is always grounded in equity, simplicity, and sustainability.

Merissa is a two-time ASU+GSV Leading Woman in AI (2024 & 2025) and has presented at major events including Stanford’s AI x Education, ASU+GSV Air Show, and LACOE’s AI Symposium. Teaching with Machines was a proud 2024 ASU+GSV AI Show partner alongside Code.org, Digital Promise, and aiEDU.

Her mission: help educators, districts, and edtech orgs close the gap between powerful emerging tools and practical instructional use, turning complexity into clarity, and curiosity into real-world change.