Dec. 30, 2025

AI Literacy Isn’t “One More Thing” ft. Lindy Hockenbary | My EdTech Life 350

In Episode 350 of My EdTech Life, Dr. Alfonso “Fonz” Mendoza sits down with Lindy Hockenbary (LindyHoc), a K–12 EdTech advisor, strategist, and professional learning leader known for helping teachers make technology work for real learning.

This conversation goes straight to the issues educators actually face: the “one more thing” overload, AI misconceptions, and how to move beyond AI detectors toward authentic assessments students can’t fake. Lindy breaks down how AI literacy fits inside core instruction (not as a separate add-on), why we must redesign assessment to emphasize process over product, and how tools with guardrails + teacher dashboards change what “safe classroom AI use” can look like.

You’ll also hear why Lindy’s work is especially grounded in small and rural schools, where staffing, compliance review, and budget constraints make AI adoption harder, but also more urgent.

Chapters
00:00 Introduction and Guest Introduction
02:05 Lindy Hockenbary's Educational Journey
04:57 The Impact of Technology on Education
09:19 Changing Mindsets in Education
11:00 Integrating AI into Core Curriculum
16:42 Redesigning Assessment in the Age of AI
23:37 Authenticity in Learning and AI Challenges
24:24 Misconceptions About AI in Education
25:52 AI Literacy and Compliance in Teaching
31:58 The Impact of ChatGPT on Educators
36:00 Challenges in Rural Education and AI Adoption
40:39 Final Thoughts and Future Directions

Lindy’s website: https://www.lindyhoc.com/
Make EdTech 100 podcast page: https://www.lindyhoc.com/podcast

Sponsor Shoutouts
Thank you to our sponsors: Book Creator, Eduaide.AI, and Peel Back Education for supporting My EdTech Life.

Peel Back Education exists to uncover, share, and amplify powerful, authentic stories from inside classrooms and beyond, helping educators, learners, and the wider community connect meaningfully with the people and ideas shaping education today.

Authentic engagement, inclusion, and learning across the curriculum for ALL your students. Teachers love Book Creator.

Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:00 - Welcome And Sponsor Shoutouts

02:35 - Meet Lindy Hockenbary

05:35 - Montana Lab Days And Early Edtech Lessons

11:54 - The QuickBooks Pivot And Relevance

18:49 - Research Is Changing In The Age Of AI

24:22 - One More Thing Mindset And AI Literacy As Core

31:53 - Redesigning Assessment For Authentic Learning

39:09 - The Assessment Puzzle Framework

45:59 - Fighting AI Slop With Voice And Annotation

50:51 - Common AI Misconceptions In Schools

WEBVTT

00:00:30.000 --> 00:00:33.759
Hello, everybody, and welcome to another great episode of My EdTech Life.

00:00:33.759 --> 00:00:36.560
Thank you so much for joining us on this wonderful day.

00:00:36.560 --> 00:00:41.840
And wherever it is that you're joining us from around the world, thank you as always for all of your support.

00:00:41.840 --> 00:00:44.320
We appreciate all the likes, the shares, the follows.

00:00:44.320 --> 00:00:52.320
Thank you so much for engaging with our content and sharing our content around social media that really means the world to us.

00:00:52.320 --> 00:00:58.399
As you know, we do what we do for you to bring you some amazing conversations with amazing guests.

00:00:58.399 --> 00:01:05.920
So that way we may continue to learn from one another, both professionally and personally as well.

00:01:05.920 --> 00:01:11.760
And of course, this mission wouldn't be made possible if it weren't for our amazing sponsors.

00:01:11.760 --> 00:01:20.959
So thank you so much to Book Creator, Eduaide.AI, Yellowdig, and Peel Back Education for your support in our mission.

00:01:20.959 --> 00:01:39.120
And I am excited about today's show, as of course I always am excited about every show because I get to talk to some amazing educators, creators that are out there doing the work, that are out there helping educators and are just everywhere at conferences.

00:01:39.120 --> 00:01:45.519
And of course, today's guest is pretty much all of those because it seems like she is everywhere.

00:01:45.519 --> 00:01:47.359
And that is something that is fantastic.

00:01:47.359 --> 00:01:51.439
I've been following her on LinkedIn for a very long time.

00:01:51.439 --> 00:01:56.000
And I would love to welcome to the show today Lindy Hockenbary.

00:01:56.000 --> 00:01:57.680
Lindy, how are you doing?

00:01:58.319 --> 00:01:59.200
I am great.

00:01:59.200 --> 00:02:00.319
I'm so excited to be here.

00:02:00.319 --> 00:02:03.840
I'm a longtime listener of the My EdTech Life podcast.

00:02:03.840 --> 00:02:07.439
And it's great to be on the other side and be on the podcast.

00:02:07.680 --> 00:02:08.000
Yes.

00:02:08.000 --> 00:02:09.599
Well, I'm excited that you're here too.

00:02:09.599 --> 00:02:17.840
And of course, I know we talked a little bit pre-chat, and I know you've got some exciting things that you are working on, and that really gets me excited as well.

00:02:17.840 --> 00:02:21.199
So maybe we'll share a little bit of that later on in the conversation.

00:02:21.199 --> 00:02:37.759
But before we get started, Lindy, for our audience members, if there are any that are out there that may not be familiar with your work yet or haven't connected with you yet, can you give us a little brief introduction and what your context is within the education space?

00:02:38.639 --> 00:02:39.360
Absolutely.

00:02:39.360 --> 00:02:41.199
So my name is Lindy Hockenbary.

00:02:41.199 --> 00:02:42.560
Hockenbary is a lot.

00:02:42.560 --> 00:02:45.599
So shorten that down if you think of the Lindy Hop.

00:02:45.599 --> 00:02:47.199
I am the Lindy Hoc.

00:02:47.199 --> 00:02:48.240
You can just shorten that down.

00:02:48.240 --> 00:02:50.400
Everybody's always like, well, do you do the Lindy Hop?

00:02:50.400 --> 00:02:51.199
I'm like, no, I don't.

00:02:51.199 --> 00:02:53.199
So don't ask me to dance.

00:02:53.199 --> 00:02:56.240
But it is a really good way to remember my name.

00:02:56.240 --> 00:02:58.479
I have spent my career in education.

00:02:58.479 --> 00:03:02.560
I started off as a middle school, high school CTE teacher.

00:03:02.560 --> 00:03:05.360
I taught business and family consumer sciences.

00:03:05.360 --> 00:03:08.879
And when I taught business, my classroom was a computer lab.

00:03:08.879 --> 00:03:14.800
And this was in the mid to late 2000s, before laptop carts were a thing; Chromebooks didn't exist.

00:03:14.800 --> 00:03:19.439
So when I say my classroom was a computer lab, it was literally a computer lab.

00:03:19.439 --> 00:03:32.319
Like 25 old school desktop computers with the huge towers and the really deep monitors that took up almost my entire classroom in little tiny rural Montana.

00:03:32.319 --> 00:03:41.120
So using technology as a learning tool always came natural to me because I always had that quote unquote one-to-one environment.

00:03:41.120 --> 00:03:48.240
I had never heard that term at the time, but um that's now, you know, the term where every kid had a device in my classroom.

00:03:48.240 --> 00:03:50.479
And in fact, I love this little tidbit.

00:03:50.479 --> 00:03:58.000
The only device access that the seventh through 12th graders had was my classroom.

00:03:58.000 --> 00:04:06.000
Because the other end of the building was the K-6, and they had a computer lab, but the seventh through 12th graders weren't allowed to use the K-6 computer lab.

00:04:06.000 --> 00:04:08.400
So the only computer lab they had was my classroom.

00:04:08.400 --> 00:04:23.360
So I constantly had the seniors coming in and out of my classroom, typing their scholarship applications and their college applications and doing their AP English work and such, because that was the only option that they had.

00:04:23.360 --> 00:04:27.199
And that then led me into a career in ed tech.

00:04:27.199 --> 00:04:29.600
I worked as a technology integration specialist.

00:04:29.600 --> 00:04:43.920
And then for the last 11 years, I've been a K-12 ed tech advisor and strategist, helping schools, educators unpack the effects of emerging technologies on their curriculum and their instruction.

00:04:43.920 --> 00:04:50.000
So I do a lot of professional development, a lot of thought leadership around the K-12 ed tech space.

00:04:50.319 --> 00:04:51.040
That's excellent.

00:04:51.040 --> 00:04:51.839
And you know what?

00:04:51.839 --> 00:04:57.360
What a great experience from when you first started and just being within a lab the whole time.

00:04:57.360 --> 00:05:07.199
And I mean, you've seen it all, you've seen everything, you know, and the rate of change that everything is going at, it's just been amazing.

00:05:07.199 --> 00:05:14.720
And the fact that you were there also helping students or seeing them just, you know, type in their scholarship applications and so on.

00:05:14.720 --> 00:05:24.399
I mean, a lot has changed now, and I know that you know that for a fact because I know that you are at pretty much almost every major conference and a speaker.

00:05:24.399 --> 00:05:32.639
And again, if you do not follow Lindy on LinkedIn, please make sure you do so, because she really shares some important things.

00:05:32.639 --> 00:05:42.720
She shares obviously the work that she's doing at conferences, but also the thought leadership aspect of it, you know, just uh profound questions that are out there that are definitely very engaging.

00:05:42.720 --> 00:06:00.079
But Lindy, I want to go back a little bit here, you know, to your experience that you mentioned, going from the classroom to tech integration specialist, and of course now professional development and being able to help other districts, other campuses, and at that conference level.

00:06:00.079 --> 00:06:13.439
I want to ask you, what was one moment in your early classroom days and teaching that kind of shaped the way you think about educational technology?

00:06:14.480 --> 00:06:16.959
Oh, I love that question.

00:06:16.959 --> 00:06:19.279
I have when I'm teaching business.

00:06:19.279 --> 00:06:26.800
Business education in the US can be very businessy, like marketing and accounting, or it can be very tech heavy.

00:06:26.800 --> 00:06:28.160
It can be a mix of both.

00:06:28.160 --> 00:06:32.079
Mine was very tech heavy, hence the computer lab being my classroom.

00:06:32.079 --> 00:06:40.240
And this was at the time where, you know, smartphones were just starting to become a thing.

00:06:40.240 --> 00:06:50.319
Like I remember my kids had their hands in their pocket, like texting on their flip phones using the old T9 text method, you know, during this time.

00:06:50.319 --> 00:06:56.319
And I just remember looking at them, and my accounting class was a great example of this.

00:06:56.319 --> 00:07:03.680
We were doing the old school pull out the paperwork books and do the t-charts and follow the textbook.

00:07:03.680 --> 00:07:08.800
And I looked at this group of kids from rural Montana and I went, you know what?

00:07:08.800 --> 00:07:20.560
No, they need to learn QuickBooks, because several of them were going to, and to this day do, run their family ranch, right?

00:07:20.560 --> 00:07:22.639
And they need to understand QuickBooks.

00:07:22.639 --> 00:07:34.079
So I literally bought QuickBooks, and I had to put sticky notes on the computers that had QuickBooks on them, and I just totally changed course mid-year, it might even have been a quarter of the way through the school year.

00:07:34.079 --> 00:07:38.800
I was like, nope, we're changing gears, we're gonna learn QuickBooks because this is what you guys need.

00:07:38.800 --> 00:07:47.040
And that to me was like a moment where I was like, these kids, and honestly, I was really young when I started teaching.

00:07:47.040 --> 00:07:54.079
I was only four or five years older than some of my students that were seniors in high school, right?

00:07:54.079 --> 00:07:59.439
So I was in the elder millennial group; they were solid millennials.

00:07:59.439 --> 00:08:07.040
So I had experienced that huge shift of like in high school, I was taught how to use a card catalog in the library.

00:08:07.040 --> 00:08:16.160
And then when I got to college, I was expected to know how to do a Google search and use Google Scholar and use these online databases to find research.

00:08:16.160 --> 00:08:19.439
And I had never been taught that when I was in school.

00:08:19.439 --> 00:08:20.720
And it's nobody's fault.

00:08:20.720 --> 00:08:24.160
It's just the sign of the times and the change.

00:08:24.160 --> 00:08:26.079
And I looked at them and I was like, you know what?

00:08:26.079 --> 00:08:28.000
They are in the exact same position.

00:08:28.000 --> 00:08:37.200
Let's throw these paper workbooks out the window and let's give them a 21st century, whatever, use some jargon there, right?

00:08:37.200 --> 00:08:38.879
Like learning experience.

00:08:38.879 --> 00:08:42.720
Um, and that was just a key moment for me as a teacher to be like, nope, you know what?

00:08:42.720 --> 00:08:44.000
The world is changing.

00:08:44.000 --> 00:08:45.759
This is no longer relevant.

00:08:45.759 --> 00:08:47.679
We have to change and switch.

00:08:47.679 --> 00:08:54.240
And that has really guided my education and ed tech career ever since then.

00:08:54.559 --> 00:08:54.879
Nice.

00:08:54.879 --> 00:09:07.440
And that is great to hear, you know, especially that you know that transition and your story of going into college and learning how to use, of course, the computer, uh, you know, Google Scholar, and then of course, databases and everything.

00:09:07.440 --> 00:09:14.879
And for myself, I mean, I'm gonna age myself too, in the sense that, you know, getting out of high school, you know, we were using card catalog.

00:09:14.879 --> 00:09:22.240
And then when I got into university, you know, we did have access to the databases and things of that sort uh there at their university.

00:09:22.240 --> 00:09:29.440
But I was still, you know, for some courses, I was still using um the microfiche machine.

00:09:29.440 --> 00:09:32.720
And for a lot of you may hear microfiche, you're like, what is that?

00:09:32.720 --> 00:09:42.639
It's essentially like film that you're putting into this projector, and you're still seeing articles from magazines or newspapers and things of that sort.

00:09:42.639 --> 00:09:46.399
And that was more for my history class and courses and so on.

00:09:46.399 --> 00:09:58.480
So definitely, like you said, a sign of the times as far as how much has changed from even now, you know, being out of university for a while now, everything is changing continually.

00:09:58.480 --> 00:10:15.840
And I think, you know, the fact that you looked at this through that lens, I must commend you on that, because I think sometimes, you know, we can get so busy with our pedagogy and what we're doing and the curriculum that we just kind of stick to that.

00:10:15.840 --> 00:10:29.759
And sometimes educators may not look forward to what is coming or what is already there, and that our students might need just a little bit of change in the way that they're receiving those learning experiences.

00:10:29.919 --> 00:10:32.960
And uh yeah, we're going through that right now.

00:10:32.960 --> 00:10:37.279
We're going through that exact scenario, and actually, I use this example.

00:10:37.279 --> 00:10:41.279
I have a training idea that's all about how AI is changing research, right?

00:10:41.279 --> 00:10:52.960
And I use that exact same example: I tell them my experience as a high school and college student and how I didn't go into college prepared, and it wasn't my high school teachers' fault.

00:10:52.960 --> 00:10:56.320
And I went to school in rural Montana, I taught in rural Montana.

00:10:56.320 --> 00:10:58.240
If you didn't get the hint, I live in Montana.

00:10:58.240 --> 00:11:08.159
And you know, my rural Montana teachers, they didn't have the resources for somebody to tell them and teach them that research was changing.

00:11:08.159 --> 00:11:14.159
And in fact, literally, another full circle moment in my education career happened this summer.

00:11:14.159 --> 00:11:19.120
I was able to do some sessions at the Montana after school summit.

00:11:19.120 --> 00:11:22.240
And so it was all Montana teachers that run these after school programs.

00:11:22.240 --> 00:11:28.240
And one of the sessions I did at their summit was all about how research skills are changing in the age of AI.

00:11:28.240 --> 00:11:33.759
And so one of the very first questions I asked is, who is using Perplexity?

00:11:33.759 --> 00:11:38.720
I think maybe one hand of 35, 40 teachers in the room went up.

00:11:38.720 --> 00:11:41.440
And I said, who has never even heard of Perplexity?

00:11:41.440 --> 00:11:44.320
Almost every single hand in the room went up.

00:11:44.320 --> 00:11:46.799
And I said, that's that pivotal shift.

00:11:46.799 --> 00:11:49.360
We're literally living through that right now.

00:11:49.360 --> 00:12:08.639
If you're a high school teacher and you're not, even if you don't have access to the technology, talking and having the conversations with your high school kids about how research is changing and how to use tools like Perplexity and Elicit for research, you're not preparing them. It doesn't matter if they don't go to college.

00:12:08.639 --> 00:12:12.320
You're not preparing them for whatever path they take when they leave your classroom.

00:12:42.730 --> 00:12:54.330
Which is a nice segue right now, and you kind of hit on it a little bit, because the next question that I did want to ask is, you know, for me, my philosophy has always been the KISS philosophy.

00:12:54.330 --> 00:12:56.009
Keep it simple and streamlined.

00:12:56.009 --> 00:13:05.129
I know some people put something else on that last S, but yeah, there's another version out there that's floating around, and that's not mine.

00:13:05.129 --> 00:13:10.970
I have always been one of those that I want to keep it simple and streamlined for all my educators.

00:13:10.970 --> 00:13:30.330
And so I want to ask you, you know, with that example in mind that you gave, and I know you kind of hit on it a little bit, but in your experience, at least for us that are out there doing the research and, you know, always trying to stay at least 10, 15 steps ahead of everything that is happening.

00:13:30.330 --> 00:13:38.169
We know that essentially the technology is supposed to help us make things at least a little bit more streamlined, a lot easier.

00:13:38.169 --> 00:13:50.330
But sometimes in some ways, it's still making things a little bit harder for educators, and mainly because there might be some educators still that may feel like, okay, this is something else that I got to guard against.

00:13:50.330 --> 00:13:52.889
This is just another thing added to my plate.

00:13:52.889 --> 00:14:05.049
So I want to ask you, in your experience, when you go talk to educators and at conferences and so on, how do you flip it on them to kind of maybe help them change that mindset?

00:14:05.529 --> 00:14:06.330
I love that.

00:14:06.330 --> 00:14:12.970
I call it the one more thing mentality, and it is really ingrained in education, for good reason.

00:14:12.970 --> 00:14:14.889
Educators are filled to the brim.

00:14:14.889 --> 00:14:32.490
I use the analogy of a jar of marbles, and education is like a jar of marbles that is overflowing, and we keep throwing marbles in, and then they're just falling out and rolling all over the floor, and people are tripping over them and stepping on them and falling and breaking their legs, and we just keep throwing more marbles at it.

00:14:32.490 --> 00:14:32.889
Right.

00:14:32.889 --> 00:14:37.850
So, like there's very much this mentality of not one more thing, not one more marble.

00:14:37.850 --> 00:14:39.689
Don't throw one more marble at me.

00:14:39.689 --> 00:14:56.730
So, literally, like that is my one of my main missions in the work that I do is to help educators and get them to see and give them strategies and tools and templates and toolkits that help them make it so it isn't one more thing.

00:14:56.730 --> 00:14:59.529
One of my big initiatives right now is this idea.

00:14:59.529 --> 00:15:06.730
I'm working a lot in AI and education, as you can imagine, a big focus around AI literacy and the importance of AI literacy.

00:15:06.730 --> 00:15:09.689
Well, of course, when you go to a teacher and they're like, hey, guess what?

00:15:09.689 --> 00:15:11.370
You got to teach AI literacy now.

00:15:11.370 --> 00:15:12.490
One more thing.

00:15:12.490 --> 00:15:14.250
And I don't have the time; when would I do it?

00:15:14.250 --> 00:15:20.649
So my perspective is this idea that AI literacy isn't extra, it's core.

00:15:20.649 --> 00:15:28.889
And what I mean by that is technology does not exist outside of core curriculum.

00:15:28.889 --> 00:15:35.850
Humans create technology because they went to school and they learned and they understand core curriculum.

00:15:35.850 --> 00:15:41.450
They know math, they know science, they know English, they know social studies, music, the arts, right?

00:15:41.450 --> 00:15:42.730
Like on and on and on and on.

00:15:42.730 --> 00:15:44.730
CTE, of course, all of them.

00:15:44.730 --> 00:15:52.330
It's because they know and understand these things, they're able to take that knowledge and create these really amazing technologies.

00:15:52.330 --> 00:15:54.090
But then what do we do in school?

00:15:54.090 --> 00:16:01.769
We take and we pull that completely out of our curriculum and don't embed it, right?

00:16:01.769 --> 00:16:04.569
Where so let's use the example of AI literacy.

00:16:04.569 --> 00:16:10.569
AI, when you boil it down to the basics, is literally math.

00:16:10.569 --> 00:16:12.730
It's pattern recognition.

00:16:12.730 --> 00:16:18.490
AI is looking for patterns, it does it very, very quickly, way faster than human brains, right?

00:16:18.490 --> 00:16:23.450
In order to make predictions based upon its training data set.

00:16:23.450 --> 00:16:24.970
That's math.

00:16:24.970 --> 00:16:36.330
So I show teachers how you can actually take and embed AI literacy concepts within the standards and learning goals that you're already teaching.

00:16:36.330 --> 00:16:37.289
Does that make sense?

00:16:37.289 --> 00:16:38.889
So it doesn't have to be one more thing.

00:16:38.889 --> 00:16:42.330
It doesn't have to be an add-on, it doesn't have to be its own class.

00:16:42.330 --> 00:16:48.889
You don't have to have an AI literacy class now that we know how important it is to teach AI literacy, right?

00:16:48.889 --> 00:16:52.409
Which is typically what we've done with the concept of digital citizenship.

00:16:52.409 --> 00:16:53.769
We pull it out.

00:16:53.769 --> 00:16:57.850
I have a whole other initiative about that, but I won't go there right now.

00:16:58.090 --> 00:16:58.409
Yeah.

00:16:58.409 --> 00:16:59.850
Yeah, and I agree with you.

00:16:59.850 --> 00:17:02.970
Sometimes it just feels like we silo all of these things.

00:17:02.970 --> 00:17:09.210
And that's why it just feels so overwhelming and overbearing on teachers that it's one more thing.

00:17:09.210 --> 00:17:35.369
But having someone like yourself be that awesome bridge from the world of tech into the classroom, bridging that tech and those initiatives and generative AI and digital literacy and all that good stuff, and showing them how this is already or can be embedded into their core content, is something that is fantastic.

00:17:35.369 --> 00:17:37.930
And I think sometimes we do miss the mark.

00:17:37.930 --> 00:17:44.250
And I'm talking as far as at the district level, because I think it's the this is the way we've always done it.

00:17:44.250 --> 00:17:59.930
Any new initiative is just an add-on, but why not let it be something that is built in, or that they can see is already built in, and that you're just, like I always say, sprinkling a little extra onto what they are already doing great?

00:17:59.930 --> 00:18:01.529
It's already sprinkled in there.

00:18:01.529 --> 00:18:03.529
You can, you know, build off of that.

00:18:03.529 --> 00:18:23.849
So I really like that you're sharing that with us, because I think oftentimes what we hear and what I see on LinkedIn, and you and I are very active on social media, is initiative after initiative, and then you see, for example, this organization has got, you know, digital literacy, AI literacy.

00:18:23.849 --> 00:18:28.970
Then this other uh, you know, foundation has, you know, AI literacy.

00:18:28.970 --> 00:18:41.849
And I remember your question on LinkedIn; it's almost like saying AI literacy so many times that it kind of loses its meaning, and we end up not knowing what it is, because everybody has a different definition of it.

00:18:41.849 --> 00:18:51.210
But I love the way that you are framing this for the teachers, that it doesn't have to feel like an add-on.

00:18:51.210 --> 00:18:55.289
It's already something that you're doing, but you're making that connection.

00:18:55.289 --> 00:18:59.129
That's fantastic, which kind of leads me again to my next question.

00:18:59.129 --> 00:19:00.730
This is fantastic the way it's working out.

00:19:00.730 --> 00:19:08.650
But a full disclaimer, I did not send her any questions, but it's just working out great as far as you know our conversation.

00:19:08.650 --> 00:19:15.529
But you know, Lindy, again, I sing your praises because you have been doing so many great things.

00:19:15.529 --> 00:19:25.690
And I know once people connect with you on LinkedIn and we share all your links, they're gonna see you know your experience and and the wealth of knowledge and the wealth of work that you're putting out there.

00:19:25.690 --> 00:19:38.089
But I know that you do help teachers and schools trying to make, I guess, learning real, especially with technology.

00:19:38.089 --> 00:19:50.170
So I want to ask you, you know, what does real learning now in 2025 look like, especially with generative AI in the mix?

00:19:51.369 --> 00:19:52.730
I love that question.

00:19:52.730 --> 00:20:00.170
If you follow me, you know one of the things I love to talk about is assessment and redesigning assessment.

00:20:00.170 --> 00:20:12.329
I've been talking about this for years, and now generative AI especially, and agentic AI, really is forcing us to redesign assessment.

00:20:12.329 --> 00:20:16.970
And let's not lie, like, we've needed to redesign assessment for a long time in K 12 education.

00:20:16.970 --> 00:20:24.170
Of course, I'm very much generalizing and stereotyping a bit, but in general, there's pockets of success out there.

00:20:24.170 --> 00:20:27.609
But in general, we really need to rethink assessment.

00:20:27.609 --> 00:20:32.970
So, in terms of how we make learning real in the age of AI.

00:20:32.970 --> 00:20:42.089
I think there is a huge reason that we need to move from focusing on products to focusing on process, right?

00:20:42.089 --> 00:20:52.490
How much more real can we make learning than assessing the whole process of learning, rather than just that final test or essay that is written?

00:20:52.490 --> 00:20:54.009
So that's number one.

00:20:54.009 --> 00:21:03.049
And then part of that too, adding in the authentic student experience: voice reflection is huge.

00:21:03.049 --> 00:21:06.569
I like to call it X-ray vision for teachers.

00:21:06.569 --> 00:21:07.450
Because think about it.

00:21:07.450 --> 00:21:18.569
When you hear a student explain something, what do you understand about photosynthesis or mitosis or um I just watched the new Frankenstein movie on Netflix?

00:21:18.569 --> 00:21:26.490
What are the undercurrents of the Frankenstein book, or whatever it is, whatever topic you're teaching, right?

00:21:26.490 --> 00:21:35.289
What better way to know if a student is truly understanding the learning outcome, the standard, than to hear them explain it?

00:21:35.289 --> 00:21:36.089
Right?

00:21:36.089 --> 00:21:47.210
And there's no way that you can fake or use AI to inauthentically produce or somehow copy and paste a voice reflection.

00:21:47.210 --> 00:21:52.329
Like a true, riffed voice reflection, right?

00:21:52.329 --> 00:21:56.009
So I actually created this framework, I call it the assessment puzzle.

00:21:56.009 --> 00:22:14.329
I created it in the last year or two around this idea that one way, and I want to stress that this is by far not the only way to redesign and rethink assessment in the age of AI, but one method, one strategy that you can use, is this idea of thinking about different puzzle pieces.

00:22:14.329 --> 00:22:16.970
So I have a puzzle piece that is text.

00:22:16.970 --> 00:22:18.410
Text is still very important.

00:22:18.410 --> 00:22:28.569
There's a puzzle piece that is um video, there's a puzzle piece that's voice reflections, there's a puzzle piece that's collaborations with AI, annotations is a big one, right?

00:22:28.569 --> 00:22:34.569
And the whole idea of the framework is that you need at least three puzzle pieces to make a puzzle.

00:22:34.569 --> 00:22:37.289
Like two pieces, like that's not a puzzle.

00:22:37.289 --> 00:22:40.490
It requires no critical thinking about how to put those pieces together, right?

00:22:40.490 --> 00:22:42.890
But once you add a third piece in, you now have a puzzle.

00:22:42.890 --> 00:22:51.769
So you need at least three of these different puzzle pieces, and you put them together to create this assessment.

00:22:51.769 --> 00:23:03.289
And it doesn't have to include voice reflection, but I won't lie, I feel like there are some puzzle pieces that have more oomph than other puzzle pieces, and voice reflection is one of them.

00:23:03.289 --> 00:23:11.690
And again, like what better way to make learning real than to talk through and explain your learning process?

00:23:12.250 --> 00:23:12.890
I love that.

00:23:12.890 --> 00:23:22.490
I love especially that framework, you know, really just the fact that you have those pieces and then, you know, use at least three of them to assess; I think that's fantastic.

00:23:22.490 --> 00:23:25.769
And I do agree with you, you know, uh all of those are all great.

00:23:25.769 --> 00:23:30.170
And but like you said, you know, sometimes there's some that have a little more oomph to them.

00:23:30.170 --> 00:23:32.569
And for me, I'm with you on the voice.

00:23:32.569 --> 00:23:45.609
I am a big proponent, and even while I was in the classroom, of just having students record their voices, record their presentations, you know, doing voiceovers, and just getting their thought process.

00:23:45.609 --> 00:23:53.049
There's just something so genuine about it, and it's very true the way that you describe it, as far as being that X-ray vision.

00:23:53.049 --> 00:23:57.450
Because one thing is for them to, and again, nothing against writing.

00:23:57.450 --> 00:24:09.769
You know, writing is an important component too; they can go ahead and write. But there's just something about when you hear their inflections when they speak, their facial expressions.

00:24:09.769 --> 00:24:12.569
They're still conveying a message there.

00:24:12.569 --> 00:24:17.529
And again, like you said, you know, being able to tell, you know, what their thought process is.

00:24:17.529 --> 00:24:25.609
And even though they may do some research and they may um, you know, get generative AI to help with an outline or anything like that.

00:24:25.609 --> 00:24:31.289
But when they present it to you in such a way, I think that that's something that is so valuable there.

00:24:31.289 --> 00:24:35.690
And that's why for me, just podcasting in the classroom is something that has been great.

00:24:35.690 --> 00:24:40.490
And little micro podcasts that are like 90 seconds where I would say, okay, here's what you need.

00:24:40.490 --> 00:24:41.849
This is what I need you to explain.

00:24:41.849 --> 00:24:43.609
Here's the topic: World War II.

00:24:43.609 --> 00:24:46.170
Uh, so that's the main I need a main idea.

00:24:46.170 --> 00:24:53.690
I need you to describe to me or tell me what started World War II, who the main characters were, and how everything ended.

00:24:53.690 --> 00:25:05.609
And you have 90 seconds to do that, and you have to do it in the third person, you know. And so I would give them and implement different ways for them to be able to do these podcasts.

00:25:05.609 --> 00:25:20.970
But the fact that now they have to think about the process, they have to do the research, now they're writing, now they have to trim the fat and keep it lean so it's within that 90 seconds and make sure that it hits all of those expectations there that were in the rubric.

00:25:20.970 --> 00:25:24.009
And it was just amazing what they were able to produce.

00:25:24.009 --> 00:25:38.009
And one of the gains that I did see is that my emergent bilingual students were acquiring proper vocabulary, and also their English was a lot more fluent.

00:25:38.009 --> 00:25:42.250
From the beginning of the year to the end of the year, you saw a noticeable difference.

00:25:42.250 --> 00:26:12.809
And the fact of the matter is that now, as teachers, and I always tell them that, you have a digital learning artifact, so that if you ever need to go into a 504, IEP, ARD, or whatever other alphabet soup meeting you need to go into, you have evidence of learning, evidence of where the child started to where they're at now, maybe mid-year or at the end of the year, and there's documentation there rather than just showing up and saying, well, they're getting 70s, they're getting 60s.

00:26:12.809 --> 00:26:14.009
Well, what can the student do?

00:26:14.009 --> 00:26:15.129
Well, study more.

00:26:15.129 --> 00:26:16.970
Well, no, I mean, look at what I have.

00:26:16.970 --> 00:26:20.170
I have them, look at how they're speaking, look at the growth.

00:26:20.170 --> 00:26:21.529
We need to work on this.

00:26:21.529 --> 00:26:23.049
This is how they're gonna improve.

00:26:23.049 --> 00:26:25.690
And now you have a plan of action that you can share.

00:26:25.690 --> 00:26:43.849
And so those are some of the things, too, that I mean, uh that I always say that with what you're doing, those are wonderful artifacts that can be uh set aside in a folder and portfolio that they can continue to take along with them as they move on from year to year and show that progress, but also evidence of learning.

00:26:43.849 --> 00:26:45.450
And I think that's so powerful.

00:26:45.450 --> 00:26:54.970
I mean, even in my dissertation courses, talking about principals, I had assistant superintendents in some of those courses.

00:26:54.970 --> 00:27:01.930
And I remember my professor saying, Okay, guys, this is what we're gonna do for this uh semester.

00:27:01.930 --> 00:27:07.369
You have a choice board, you get to choose how you want these 22 contact hours.

00:27:07.369 --> 00:27:21.930
You can either read a book or do a book study, you can go ahead and write an essay, whatever way or combination thereof. And just the look on the adults' faces when they're like, what do you mean?

00:27:21.930 --> 00:27:23.529
And I was so excited.

00:27:23.529 --> 00:27:26.650
I was like, this is amazing because this is what I do with my kids.

00:27:26.650 --> 00:27:29.690
And I was, I was like, this is this is my gem.

00:27:29.690 --> 00:27:31.609
Like, I'm totally gonna ace this class.

00:27:31.609 --> 00:27:36.650
And of course, there's me raising my hand saying, uh, Professor Jewett, uh, can I do a podcast?

00:27:36.650 --> 00:27:37.690
Sure, go ahead.

00:27:37.690 --> 00:27:39.210
I was like, that's it, I'm done.

00:27:39.210 --> 00:27:40.970
I got an A, I'm good.

00:27:40.970 --> 00:27:51.769
But everybody else in there, what shocked me was it was more of, just tell me what to do so I can mimic what you're giving me to get the A.

00:27:51.769 --> 00:28:03.369
And so I love your framework; it's not about mimicry, it's about actually putting thoughtful effort into the learning and showing the real learning.

00:28:03.369 --> 00:28:04.809
So, man, that's great.

00:28:04.809 --> 00:28:06.089
That is great, Lindy.

00:28:06.170 --> 00:28:14.410
Yeah, your example of a podcast is literally one of the examples I have in my assessment puzzle toolkit that expands on this framework.

00:28:14.410 --> 00:28:23.529
So one of the ELA examples is: record a podcast formally reviewing a book, and include a visual cover image for the episode too.

00:28:23.529 --> 00:28:26.890
So again, that's bringing in those different puzzle pieces, right?

00:28:26.890 --> 00:28:31.049
We've got the visuals, that's important, but not just the visuals.

00:28:31.049 --> 00:28:38.569
There's a text layer there, but then adding that audio, that voice reflection, over the top of it, like, perfect.

00:28:38.569 --> 00:28:42.009
And you said it: it's genuine, it's genuine.

00:28:42.009 --> 00:28:44.809
And what's the conversation around AI right now?

00:28:44.809 --> 00:28:47.369
Not just AI and education, but AI in general.

00:28:47.369 --> 00:28:49.930
AI slop, right?

00:28:49.930 --> 00:28:59.769
It's the new term, if you haven't heard it. I actually named my assessment puzzle toolkit "Tired of Student AI Slop? Solve It with the Assessment Puzzle Toolkit."

00:28:59.769 --> 00:29:19.690
Because it's literally such an issue, both in education and the workforce right now, that we've come up with this pop culture term to describe this inauthentic, non-genuine production of information that you can now do, mostly with generative AI, right?

00:29:19.690 --> 00:29:22.089
Voice reflections, annotations.

00:29:22.089 --> 00:30:11.730
That's genuine, that's authentic, that's real learning.

00:30:12.050 --> 00:30:12.769
I love it.

00:30:12.769 --> 00:30:13.490
I love it.

00:30:13.490 --> 00:30:14.930
That is fantastic.

00:30:14.930 --> 00:30:30.610
All right, so here's my next question to you, Lindy, because I'm always curious too, especially with somebody like you that gets to visit so many schools there in your area, but also is obviously presenting many times, and you have different teachers that come in.

00:30:30.610 --> 00:30:40.370
So, from your vantage point, what might still be some of the biggest misconceptions that are out there that teachers are holding on to about AI?

00:30:41.650 --> 00:30:44.210
Oh my gosh, where do I start?

00:30:44.210 --> 00:30:51.090
The amount of misinformation and misconceptions out there is huge.

00:30:51.090 --> 00:30:53.570
We're chipping away at it slowly.

00:30:53.570 --> 00:31:04.529
Culture doesn't help, pop culture doesn't help, social media doesn't help, right, in furthering that misinformation, those misconceptions.

00:31:04.529 --> 00:31:13.330
Um, I think one of the big things I'm trying to get across is that we lump it all together, and I'm guilty of it because it's easy, right?

00:31:13.330 --> 00:31:15.170
We say AI in education.

00:31:15.170 --> 00:31:18.289
And that's a lot that encompasses a lot.

00:31:18.289 --> 00:31:21.090
There's so many different branches.

00:31:21.090 --> 00:31:32.930
So one of the things I'm working on right now is trying to pull those branches out and be like, okay, in AI and education, we're both teaching with and about AI.

00:31:32.930 --> 00:31:39.170
We're also learning with and about AI and defining what those mean, right?

00:31:39.170 --> 00:31:44.610
So when we say AI literacy, we're talking about we're learning about AI.

00:31:44.610 --> 00:31:56.850
We understand how the technology works so that we have foundational knowledge to be able to critically evaluate the outputs of this technology and the goods and bads of the technology.

00:31:56.850 --> 00:32:09.250
But then you have like this idea of teaching with AI, and that's where teachers can use it to save time, produce greater, faster, better instructional materials, right?

00:32:09.250 --> 00:32:18.289
Um, then you have the idea of like using AI as a learning tool, like leveraging the power of AI to be a learning tool.

00:32:18.289 --> 00:32:26.930
That is where the most misconceptions, well, I shouldn't say that because there's a lot of AI literacy misconceptions out there, which then trickle down, right?

00:32:26.930 --> 00:32:37.410
When you don't have that foundational knowledge... whenever I have school administrators contact me, like, hey, I want you to come do a training with our teachers on AI, my number one question is: do they have foundational knowledge?

00:32:37.410 --> 00:32:45.730
And I have an infographic I send them that has four sections that outline the things that they need to understand: compliance basics, right?

00:32:45.730 --> 00:32:52.370
Data privacy, that's kind of under the compliance part; how to critically evaluate, and why it's so important to critically evaluate.

00:32:52.370 --> 00:33:07.009
And I tell them, I'm like, if your teachers don't have this, then we can't move forward and do any like teaching with AI or learning with AI until they have that foundational knowledge.

00:33:07.009 --> 00:33:07.970
So that's number one.

00:33:07.970 --> 00:33:14.769
But then the misconceptions, when you get into using AI as a learning tool, are huge.

00:33:14.769 --> 00:33:26.450
Um, there's the misconception that if we're gonna do that, then that means that students are staring at screens all day and we're just getting rid of every other pedagogical best practice out there.

00:33:26.450 --> 00:33:28.130
And that's not the case at all.

00:33:28.130 --> 00:33:31.810
In fact, in one of the trainings I do, I model how to teach it.

00:33:31.810 --> 00:33:35.570
I'm big on modeling and putting teachers in the student shoes.

00:33:35.570 --> 00:33:42.850
And I actually have them experience a lesson where we don't start with AI at all.

00:33:42.850 --> 00:33:55.570
We actually start by looking at a Google Arts and Culture gallery, with a piece of paper, good old pen and paper, and them writing down the things that they see in the Google Arts and Culture collection.

00:33:55.570 --> 00:33:58.529
Then we do collaborations with humans.

00:33:58.529 --> 00:33:59.890
We talk to each other.

00:33:59.890 --> 00:34:01.170
What were your observations?

00:34:01.170 --> 00:34:03.170
Oh, I didn't notice that you're right, though.

00:34:03.170 --> 00:34:06.690
Oh yeah, I did notice that. And they edit, right?

00:34:06.690 --> 00:34:10.769
Then we go into like the collaborations with AI.

00:34:10.769 --> 00:34:20.690
Like, hey, now that you've had this initial information, this is an optional piece that can expand it. It's not necessary, you don't have to do it, but guess what?

00:34:20.690 --> 00:34:27.489
It can further expand your learning to go have this one-on-one conversation with an AI chatbot, right?

00:34:27.489 --> 00:34:33.969
To be able to translate it into Spanish if you're a multilingual learner or, you know, right?

00:34:33.969 --> 00:34:41.329
Like take your prior knowledge as well as the collaborations and reflections that you've just written on your paper and build on that.

00:34:41.329 --> 00:34:43.250
But then again, we don't stop there.

00:34:43.250 --> 00:34:45.969
Then from that point on, then we evaluate.

00:34:45.969 --> 00:34:58.529
They go through and they fact-check everything because they have to fact-check what they saw, they have to fact-check the conversations they had with their fellow learners, they have to fact-check the outputs from the AI, right?

00:34:58.529 --> 00:35:05.809
So really, in this model AI lesson I'm doing, a teeny little piece of it is actually using AI.

00:35:05.809 --> 00:35:15.489
And then one more thing, and this is a lengthy one, you opened Pandora's box for me: you're not just using any tool as a teacher, right?

00:35:15.489 --> 00:35:17.650
Like we're not going to ChatGPT.

00:35:17.650 --> 00:35:23.170
We're not, we're using a tool that is school approved, that's compliant.

00:35:23.170 --> 00:35:30.690
And two more really important factors there in choosing a tool to use AI as a learning tool.

00:35:30.690 --> 00:35:37.570
That's where there's a huge misconception: you need a tool that is guardrailed and that has a teacher dashboard.

00:35:37.570 --> 00:35:38.930
So the teacher can see.

00:35:38.930 --> 00:35:52.210
So the guardrails make sure that the AI isn't going off the rails or giving outputs that are deemed not school appropriate or giving misinformation, right?

00:35:52.210 --> 00:36:06.690
Like it has those guardrails around it, and then the teacher dashboard allows the teacher complete control and visibility into everything that's going on with the students having those interactions.

00:36:07.329 --> 00:36:31.490
You know, one of the things that I want to go back to is just the way that you describe this. And I think a lot of teachers really need to hear this podcast and hear the way that you describe the work and the work that goes into it, that it's not just putting students in front of the screen, where they get on a generative AI platform and, just like you mentioned, get the slop and just submit that.

00:36:31.490 --> 00:36:45.250
But the fact that not only are you having them, let's say, visit a website, like you said, the Arts and Culture gallery, they're getting some information, they're writing this down, they're having that, you know, thought partner or peer-to-peer discussion.

00:36:45.250 --> 00:36:55.329
There's discourse, there's all of that going on, before you're even using an AI tool or a generative AI tool.

00:36:55.329 --> 00:37:08.530
I think that is something fantastic, because I think honestly, one of the biggest misunderstandings out there is that teachers just say, well, I mean, we don't want to use AI because they're just gonna get on the screen and get whatever's on ChatGPT.

00:37:08.530 --> 00:37:26.210
But the way that you modeled this and explained it, and going back even to when you were we were talking about AI literacy, like some of these components, some of these things are already embedded within that content as far as the way we do things that can tie into the technology.

00:37:26.210 --> 00:37:33.170
And so I was really taken aback, just like, this is amazing, and a lot more people need to hear this.

00:37:33.170 --> 00:37:39.889
So I know that that's definitely gonna be a great sound bite for me to share, because it's just fantastic.

00:37:39.889 --> 00:38:01.010
So now, one of the things too that I did like is that you said, you know, it's not just any tool that you're gonna use, but that you do provide, I guess, the major suggestions: using a tool where obviously there is compliance, there are guardrails, something that would have a teacher dashboard just to be able to see those things.

00:38:01.010 --> 00:38:12.450
So I want to ask you now: as you know, ChatGPT is now available to educators, and it's supposedly just not gonna be training on any of your data at all whatsoever.

00:38:12.450 --> 00:38:23.329
Um, I did go and sign up just to see, you know, and what happened is I was prompted; it said, okay, you know, your email works, check, it's an education email.

00:38:23.329 --> 00:38:27.650
Then of course I was prompted to uh would you like to connect your Google Drive?

00:38:27.650 --> 00:38:28.849
Would you like to connect this?

00:38:28.849 --> 00:38:30.130
Would you like to connect that?

00:38:30.130 --> 00:38:32.050
And that's where I got a little iffy.

00:38:32.050 --> 00:38:40.530
Where I was like, I know you're telling me that you're not gonna be training on my data, but why would I want to connect?

00:38:40.530 --> 00:38:46.289
Or you're asking me, like, for a better workflow, connect this and connect this and connect this.

00:38:46.289 --> 00:38:55.650
And I don't know, I just got really iffy about it because I know that they're also saying this is gonna be free through 2027 for teachers.

00:38:55.650 --> 00:39:01.090
What do you think that this is gonna look like once we get into 2027?

00:39:01.090 --> 00:39:11.250
Do you think that now, you know, ChatGPT has entered that market already within education, where now the teachers won't be able to live without it?

00:39:11.250 --> 00:39:13.730
And so, what are your thoughts on that?

00:39:14.530 --> 00:39:15.730
Oh, I love that question.

00:39:15.730 --> 00:39:18.450
So, the announcement was last week.

00:39:18.450 --> 00:39:23.010
I shared it out in my November newsletter: hey, this is brand new.

00:39:23.010 --> 00:39:25.730
This is huge, in my opinion.

00:39:25.730 --> 00:39:32.450
I have been preaching, and by the way, this week is the third anniversary of ChatGPT.

00:39:32.450 --> 00:39:34.610
2022 is when it came out.

00:39:34.610 --> 00:39:44.530
So three years I've been preaching that teachers, in order to teach AI literacy to their kids, have to have AI literacy.

00:39:44.530 --> 00:39:53.170
And an important part of that is having access to frontier models that have the capability, the full capabilities.

00:39:53.170 --> 00:39:59.809
Because oftentimes the tools that are compliant are not those frontier models.

00:39:59.809 --> 00:40:02.210
They're older models for many, many reasons.

00:40:02.210 --> 00:40:05.809
A lot of it has to do with costs, it has to do with integration, like so many things.

00:40:05.809 --> 00:40:08.369
I won't go down the technical rabbit hole there.

00:40:08.369 --> 00:40:19.250
But when you as a teacher don't have access to those models that are sometimes significant improvements, you can't see the full capability of the technology.

00:40:19.250 --> 00:40:21.250
So then you don't fully understand.

00:40:21.250 --> 00:40:33.090
One of the things I do in my um my assessment puzzle conference session when we're talking about redesigning assessment in the age of AI, is I give them a preview about where the technology is.

00:40:33.090 --> 00:40:48.530
And most of the time I have gasps and big wide eyes and jaws dropping to the floor, because most people have no idea, because what's available at the consumer level is limited compared to what's available at the enterprise level.

00:40:48.530 --> 00:40:57.970
And then what's usually available for schools and teachers that's compliant is usually a step below what's available at the consumer level, right?

00:40:57.970 --> 00:41:14.130
So if you haven't seen what this technology is capable of and how fast it's moving and changing, you might not, as an educator, fully understand and embrace that reason and why you have to change and you have to learn yourself.

00:41:14.130 --> 00:41:19.329
And you can't just bury your head in the sand and ignore the technology.

00:41:19.329 --> 00:41:20.369
Does that make sense?

00:41:20.369 --> 00:41:20.930
Yes.

00:41:20.930 --> 00:41:27.250
So in terms of that, that's where I was like, oh, this ChatGPT for Teachers is huge, right?

00:41:27.250 --> 00:41:34.050
Like this could potentially be huge, to give educators unlimited access to the frontier models.

00:41:34.050 --> 00:41:35.970
And that's that was the key difference for me.

00:41:35.970 --> 00:41:45.010
So up until last week, you had to pay $20 a month for ChatGPT to get unlimited access to the frontier models.

00:41:45.010 --> 00:41:47.490
And the free version of ChatGPT, that's another thing.

00:41:47.490 --> 00:42:00.450
If you only ever work in the free version of ChatGPT, you don't even understand that you're limited, or what a frontier model is, because there's no way to toggle between different large language models in the free version of ChatGPT.

00:42:00.450 --> 00:42:05.970
So from that basic AI literacy perspective, I was like, oh wow, this is huge.

00:42:05.970 --> 00:42:09.730
But I made sure to say in my newsletter, this is brand new.

00:42:09.730 --> 00:42:11.730
I'm still exploring it.

00:42:11.730 --> 00:42:16.369
They're saying they're compliant, but this has to be checked.

00:42:16.369 --> 00:42:32.530
So one of the things that I do not love that they did with ChatGPT, or I should say that OpenAI did with ChatGPT for Teachers, is they made it so that any person from a school can go sign up and verify and basically, like, claim that school's domain, right?

00:42:32.530 --> 00:42:53.490
Where really it should be required that like a tech director or someone from the school is doing that and is going through those compliance checks and then adding teachers from the school in with their school email addresses because that's a huge potential compliance breach that's now out of the school's control.

00:42:53.490 --> 00:42:57.250
So I really don't like that part of that.

00:42:57.250 --> 00:43:06.690
Um, I am digging into, like, all right, they're saying they're compliant, but at what level? Like, what are their privacy policies actually saying?

00:43:06.690 --> 00:43:08.930
Are they actually FERPA compliant?

00:43:08.930 --> 00:43:17.329
Um, that's really, really tricky to determine, and it takes diving in and actually looking at their privacy policies to determine that.

00:43:17.329 --> 00:43:28.690
So, moral of the story is, I think it's a huge potential win, but there are still a lot of unanswered questions, only being a week into the release.

00:43:28.690 --> 00:43:29.090
Yeah.

00:43:29.090 --> 00:43:30.450
Does that answer your question?

00:43:30.450 --> 00:43:31.730
Yes, absolutely.

00:43:32.050 --> 00:43:33.490
Yeah, I mean, great answer.

00:43:33.490 --> 00:43:42.849
And again, to me, it just seemed like, well, they're encouraging you to, oh, go ahead and, you know, connect your Google Drive to this and connect this right there.

00:43:42.849 --> 00:43:46.130
I was like, I just really don't feel comfortable with that, you know.

00:43:46.130 --> 00:43:49.730
Yes, exactly.

00:43:49.730 --> 00:43:52.530
You know, it's for teachers, it's not gonna train on my data.

00:43:52.530 --> 00:43:55.730
And so for any teacher, it's like, well, it says it's not gonna train on anything.

00:43:55.730 --> 00:44:04.130
Sure, let me just go ahead and add my Google Drive to it. Where I was like, well, wait a minute, you know, do you own that Google Drive?

00:44:04.130 --> 00:44:08.050
That really belongs to the district as well, you know, because it's part of their domain.

00:44:08.050 --> 00:44:15.650
And like you said, there are other things out there that we may not know about, and maybe there might be some data that is still being used.

00:44:15.650 --> 00:44:18.050
So again, just proceed with caution.

00:44:18.050 --> 00:44:19.730
That's what Lindy and I say.

00:44:19.730 --> 00:44:28.130
So just make sure that, you know, you do your own thorough research before using it, just to make sure that everything is good.

00:44:28.130 --> 00:44:42.769
Now, Lindy, as we start wrapping up a little bit, I did have a couple of questions here. I know you did mention, you know, you are coming from Montana, so I know that there are probably a lot of rural school districts out there in those areas.

00:44:42.769 --> 00:44:57.809
So I want to ask you, because obviously I live in a different demographic here than a rural area, but what do you see as some unique challenges for the rural communities, the educator communities?

00:44:57.809 --> 00:45:05.490
What are some things that they face as far as adopting AI, maybe beyond the budget, things of that sort?

00:45:05.490 --> 00:45:07.650
Where do they stand?

00:45:07.650 --> 00:45:08.930
What do you see?

00:45:09.730 --> 00:45:11.090
Oh my gosh, where do I start?

00:45:11.090 --> 00:45:12.530
This is my life.

00:45:12.530 --> 00:45:14.610
This is how I got into ed tech.

00:45:14.610 --> 00:45:16.289
Well, not exactly, but kind of.

00:45:16.289 --> 00:45:22.210
I got into the work I'm doing right now with ed tech to support those small rural schools.

00:45:22.210 --> 00:45:23.490
Because I was there.

00:45:23.490 --> 00:45:32.769
I was a teacher in a rural school, and I knew how it's sometimes impossible to even get technical help, let alone instructional help.

00:45:32.769 --> 00:45:33.970
Um, so there's that.

00:45:33.970 --> 00:45:44.769
But, B, the biggest thing with the small rural schools is they don't have the manpower to work through things like compliance.

00:45:44.769 --> 00:45:50.450
I mean, honestly, that's the number one problem right there: compliance.

00:45:50.450 --> 00:46:04.369
So then you've got teachers that are like, hey, I want to do these things, but I have to have access to technology tools, and they're going, well, you know, either we don't have the staff to review them and determine if they're compliant, or we don't have the budget for them.

00:46:04.369 --> 00:46:07.250
And then the teacher feels like their hands are tied, right?

00:46:07.250 --> 00:46:09.010
And then that kind of leads to burnout.

00:46:09.010 --> 00:46:15.250
When you as a teacher feel like, hey, I'm not doing right by my students, I'm not preparing them for their future.

00:46:15.250 --> 00:46:22.930
And like I know that, but I feel like my hands are tied, that's a really hard place to be as a teacher.

00:46:22.930 --> 00:46:26.769
So honestly, that's number one for me.

00:46:26.769 --> 00:46:32.610
Um, I also think the misinformation and the fear mongering is another one.

00:46:32.610 --> 00:46:35.889
And don't get me wrong, like I am very anti-fear mongering.

00:46:35.889 --> 00:46:46.690
I think that fear-mongering is very unproductive, but sometimes people take that to mean that I am just pro-tech-everything and pro-AI-everything.

00:46:46.690 --> 00:46:50.849
And I tend to take the cheerleader perspective because that's my job.

00:46:50.849 --> 00:46:56.690
I want teachers to feel, like we were talking about at the beginning, that tech isn't one more thing.

00:46:56.690 --> 00:46:59.809
It can be easy, it can be seamless, right?

00:46:59.809 --> 00:47:03.329
We can do this and we have to do this for the sake of our kids.

00:47:03.329 --> 00:47:13.970
But that does not mean that I'm not still making sure that I am recommending tools that are compliant, that are safe, that are secure, that are guardrailed, right?

00:47:13.970 --> 00:47:16.530
That have teacher dashboards, etc.

00:47:16.530 --> 00:47:25.570
And that's the part where, again, when you don't have the resources, manpower and budget being the two, right?

00:47:25.570 --> 00:47:39.809
Time and money, the two major resource limitations in education, then you don't even have the ability to touch the technology and interact with it.

00:47:39.809 --> 00:47:53.329
And then you've got this misinformation and fear-mongering, and there's nobody there to help push back and be like, actually, no, that's not true, or actually that's not how this tech works, you know what I mean?

00:47:53.329 --> 00:48:07.010
Or not only push back, but go, oh no, you're right, that is a legitimate concern, because there are a lot of legitimate concerns that we need to discuss when it comes to AI in particular and tech in general.

00:48:07.409 --> 00:48:07.889
Excellent.

00:48:07.889 --> 00:48:09.490
All right, great answer.

00:48:09.490 --> 00:48:18.050
Well, Lindy, it has been an amazing pleasure to have you on the show and just talk all things ed tech and generative AI with you.

00:48:18.050 --> 00:48:27.730
And I definitely learned so much, and I know that our audience members listening to this show will definitely take away a lot of the great gems that you shared with them today.

00:48:27.730 --> 00:48:37.730
And of course, I encourage all our listeners to please make sure that you click on the links in the show notes so that you can follow Lindy and visit her webpage as well.

00:48:37.730 --> 00:48:46.690
But Lindy, let us know for our audience members that are wanting to connect with you, or maybe there are some school district leaders that are listening and say, hey, you know what?

00:48:46.690 --> 00:48:51.170
We need Lindy to come and really work with us and help our teachers.

00:48:51.170 --> 00:48:55.490
What might be the easiest way for them to get a hold of you?

00:48:56.130 --> 00:48:57.329
Yes, my website.

00:48:57.329 --> 00:49:00.690
My website is the key to everything, lindyhoc.com.

00:49:00.690 --> 00:49:05.409
So that's l-i-n-d-y-h-o-c dot com.

00:49:05.409 --> 00:49:10.530
That will give you access to all of my social channels, my blog posts.

00:49:10.530 --> 00:49:14.369
I do a lot of events and webinars and podcasts, and they're all linked from there.

00:49:14.369 --> 00:49:17.490
I have a lot of really great resources in my blog.

00:49:17.490 --> 00:49:20.210
You can also get my assessment puzzle toolkit.

00:49:20.210 --> 00:49:22.690
It's 20 pages, it's a free download.

00:49:22.690 --> 00:49:24.930
It's on the website as well.

00:49:24.930 --> 00:49:28.289
Um, yeah, you can reach out to me, you can contact me.

00:49:28.289 --> 00:49:29.809
I'm active on socials.

00:49:29.809 --> 00:49:31.250
That is the key.

00:49:31.409 --> 00:49:32.690
Lindyhoc.com.

00:49:32.690 --> 00:49:33.809
All right, excellent.

00:49:33.809 --> 00:49:35.730
So we'll definitely make sure we link that.

00:49:35.730 --> 00:49:37.170
And you heard it here, guys.

00:49:37.170 --> 00:49:40.210
Make sure you go to that website so you can connect with Lindy.

00:49:40.210 --> 00:49:43.809
And please make sure that you do connect with her on all socials.

00:49:43.809 --> 00:49:49.650
You do not want to miss out on all the great things that she puts out, and that way you'll also know where she'll be.

00:49:49.650 --> 00:49:56.130
She may be in a city near you, and hey, maybe you can stop by that conference and catch her presentations.

00:49:56.130 --> 00:49:57.090
All right, Lindy.

00:49:57.090 --> 00:50:01.409
But before we wrap up, I always love to end the show with these last three questions.

00:50:01.409 --> 00:50:08.369
But because you are a very special guest, I'm throwing in a bonus question that wasn't shared with you.

00:50:08.369 --> 00:50:10.690
So hopefully you're ready for that one.

00:50:10.690 --> 00:50:14.610
But let's go ahead and do the ones that we are for sure familiar with.

00:50:14.610 --> 00:50:19.409
And so, as we know, Lindy, every superhero has a pain point or a weakness.

00:50:19.409 --> 00:50:22.530
And for Superman, kryptonite was his weakness.

00:50:22.530 --> 00:50:36.289
So I want to ask you, Lindy, in the current state of education, and this encompasses AI in education as well, what would you consider to be your current edu-kryptonite?

00:50:37.329 --> 00:50:51.250
Oh, definitely the act-like-it-doesn't-exist culture in education when it comes to technology, like pulling the wool over your eyes, acting like, outside these classroom walls, the world isn't changing and technology doesn't exist.

00:50:51.250 --> 00:50:52.369
And I'll tell you why.

00:50:52.369 --> 00:50:55.889
It's because it removes you from the conversation.

00:50:55.889 --> 00:51:05.889
And in order to shape the future of technology and make sure it goes down the good path and not the bad path, you have to be a part of the conversation.

00:51:05.889 --> 00:51:15.409
And "you" means everyone in the school: your teachers, your leadership, your staff, and of course your students all have to be part of that conversation.

00:51:15.809 --> 00:51:17.250
Fantastic answer.

00:51:17.250 --> 00:51:18.050
I love it.

00:51:18.050 --> 00:51:19.809
Thank you so much for sharing that one.

00:51:19.809 --> 00:51:27.409
All right, question number two: if you could have a billboard with anything on it, what would it be and why?

00:51:28.450 --> 00:51:32.690
Oh, so this is, I told you I kind of got into this early, and I was like, I won't go there now.

00:51:32.690 --> 00:51:37.650
But one of my initiatives is this idea that we need to reboot civics.

00:51:37.650 --> 00:51:56.690
So my billboard would say: reboot civics, there's no such thing as digital citizenship, it's just citizenship. Because AI is pushing us into this realm where it's not just digital, and the mix of being a good citizen and digital citizenship is getting very intertwined.

00:51:56.690 --> 00:52:07.409
And think about it: the way that you interact with AI is so grounded in your values and how you consider your civic responsibility as a member of society.

00:52:07.970 --> 00:52:08.849
Fantastic.

00:52:08.849 --> 00:52:10.369
That is a great billboard.

00:52:10.369 --> 00:52:11.010
Love it.

00:52:11.010 --> 00:52:11.730
All right.

00:52:11.730 --> 00:52:19.730
Next question: if you could trade places with a single person for a day, who would that be and why?

00:52:20.690 --> 00:52:21.570
Oh, I love that.

00:52:21.570 --> 00:52:24.690
My husband and I actually talk about this fairly often.

00:52:24.690 --> 00:52:31.250
It would be someone in a philanthropic position, especially a philanthropic woman.

00:52:31.250 --> 00:52:38.289
And the reason is, one, I just want to know what it feels like to be able to just give, right?

00:52:38.289 --> 00:52:45.250
And two, I would use that day if I had a day to switch places to give to kids.

00:52:45.250 --> 00:52:52.289
I would give to schools, I would give to kids, I'd give to makers, but all of it would go to kids.

00:52:53.250 --> 00:52:55.250
Nice, love it, love it.

00:52:55.250 --> 00:53:02.369
All right, and here's my last question for you, and I think that this is something that's very fitting, you know, based on the conversation that we just had.

00:53:02.369 --> 00:53:22.610
So I want to ask you: if you had the opportunity, or actually I should say the power, to give every teacher in America one tech superpower that they don't currently have, and I'm not talking about a tool, I'm talking more about a mindset or skill.

00:53:22.610 --> 00:53:25.570
What would that be and why?

00:53:26.450 --> 00:53:55.409
Oh, it would totally be what I was talking about at the beginning of the episode: being able to take technology and everything, the idea of algorithms, social media and social media algorithms, and how AI works, and embed that into what we're teaching, because it fits so perfectly and it's so relevant to all of our lives, especially Gen Z and Gen Alpha, and all K-12 kids are either Gen Z or Gen Alpha right now, right?

00:53:55.409 --> 00:54:00.210
And it would add so much more relevance to learning.

00:54:00.210 --> 00:54:14.930
And, again, I'm generalizing, there are pockets of success, but overall, K-12 education is lacking relevance, and I think that has a trickle effect on so many of the other challenges we have in K-12 education.

00:54:14.930 --> 00:54:25.570
So adding relevance and teaching those information literacy concepts as part of the core curriculum and the standards that we're already teaching.

00:54:25.570 --> 00:54:27.650
To summarize that in one second.

00:54:28.369 --> 00:54:28.769
Love it.

00:54:28.769 --> 00:54:30.769
Great, thank you so much, Lindy.

00:54:30.769 --> 00:54:34.769
I really appreciate you being here today and being just an amazing guest.

00:54:34.769 --> 00:54:40.289
I had a wonderful time here learning more about you, learning more about the work that you're doing.

00:54:40.289 --> 00:54:52.050
So I definitely wish you the best in the remainder of this year and wish you a successful 2026, you know, as you make it out there, traveling and going to conferences and all that great stuff.

00:54:52.050 --> 00:54:56.690
And as always, once you are a guest of My EdTech Life, you always have an open invite.

00:54:56.690 --> 00:55:03.570
So whenever you've got your next book, whenever you've got your next big project or anything, or, oh, we didn't even mention this.

00:55:03.570 --> 00:55:09.010
Or, I don't even know if we should or not, but I know that you are working on something that should be coming up.

00:55:09.170 --> 00:55:09.889
But I don't know.

00:55:09.889 --> 00:55:11.730
Yeah, we can talk about it.

00:55:11.970 --> 00:55:12.450
Okay, yeah.

00:55:12.450 --> 00:55:14.690
So let's add that little segment here.

00:55:14.690 --> 00:55:23.409
Because I was gonna say, you know, maybe when you start your, now I can officially say it, your podcast, then you can definitely come back.

00:55:23.409 --> 00:55:24.450
So I'm really excited.

00:55:24.450 --> 00:55:28.610
So, but before we wrap up, tell us a little bit more about that project, Lindy.

00:55:29.250 --> 00:55:30.610
Yes, I'm super excited.

00:55:30.610 --> 00:55:33.090
I've been working on this for a long time.

00:55:33.090 --> 00:55:34.849
I am starting a podcast.

00:55:34.849 --> 00:55:36.930
It's called Make EdTech 100.

00:55:36.930 --> 00:55:42.769
If you follow me, that's kind of my tagline: this idea of let's make ed tech real.

00:55:42.769 --> 00:55:49.170
Let's talk about real strategies, let's talk about what's happening in real classrooms, real stories from educators.

00:55:49.170 --> 00:55:55.170
So, yeah, if you go to my website, lindyhoc.com, it's not up there right now, depending on when you're listening to this.

00:55:55.170 --> 00:55:58.849
This is Thanksgiving week, November 2025, but soon.

00:55:58.849 --> 00:56:03.889
Like I should have it up there in December by the end of 2025 for sure.

00:56:03.889 --> 00:56:11.250
You'll be able to see my Make EdTech 100, and it'll be on all of the podcast platforms too.

00:56:11.250 --> 00:56:12.530
So you can always search there.

00:56:12.849 --> 00:56:13.889
Love it, love it.

00:56:13.889 --> 00:56:15.250
You heard it here first.

00:56:15.809 --> 00:56:20.849
You're already on tap to be on there, so I'm gonna get to interview you.

00:56:20.849 --> 00:56:22.369
We're gonna switch positions.

00:56:22.610 --> 00:56:23.090
All right.

00:56:23.090 --> 00:56:24.130
I love it, I love it.

00:56:24.130 --> 00:56:25.490
Well, you heard it here, guys.

00:56:25.490 --> 00:56:29.809
I mean, is there really anything that Lindy can't do, you know?

00:56:29.809 --> 00:56:31.570
So this is fantastic.

00:56:31.570 --> 00:56:32.530
So thank you, Lindy.

00:56:32.530 --> 00:56:33.650
I really appreciate it.

00:56:33.650 --> 00:56:38.130
And again, for all our audience members, please make sure you click on those show notes.

00:56:38.130 --> 00:56:44.130
Make sure that you go to lindyhoc.com, make sure that you follow Lindy on all social media.

00:56:44.130 --> 00:56:51.170
I promise you, you're gonna definitely enjoy all the content that gets put out there, and you're gonna definitely learn from it and engage with it.

00:56:51.170 --> 00:56:53.250
So, again, she is fantastic.

00:56:53.250 --> 00:57:09.570
And again, for our audience members also, please visit our website, myedtech.life, where you can check out this amazing episode and the other 340-plus episodes, where I promise you will get a little golden nugget that you can sprinkle into what you are already doing.

00:57:09.570 --> 00:57:09.970
Great.

00:57:09.970 --> 00:57:13.409
And again, this wouldn't all be possible if it weren't for our sponsors.

00:57:13.409 --> 00:57:18.690
Thank you, Book Creator, Eduaide.AI, Yellowdig, and Peel Back Education for your support.

00:57:18.690 --> 00:57:22.690
And until next time, my friends, don't forget, stay techie.
Lindy Hockenbary

K-12 EdTech Advisor

Lindy Hockenbary—aka “LindyHoc”—is the bridge between education and technology. With experience in instructional technology, professional development, and curriculum design, she helps educators make sense of emerging technologies—especially artificial intelligence—and turn them into practical, classroom-ready learning experiences.

Her journey began in a technology-equipped classroom, where she first blended instruction with innovation. Since then, she has led more than 1,300 hands-on trainings for over 30,000 educators, authored A Teacher’s Guide to Online Learning, and supported schools around the world as they navigate digital learning, AI literacy, and instructional change. In recognition of her leadership, she was named a 2025 Leading Woman in AI honoree by ASU+GSV.

Lindy’s work is grounded in helping kids, which fuels her passion for making technology literacy accessible and meaningful for every classroom. Her motto is “Make EdTech 100,” and her mission is to empower educators with the confidence and clarity to navigate the evolving intersection of technology and pedagogy.