Aug. 8, 2025

Episode 332: Becky Keene

AI Optimism with Becky Keene

In this episode of My EdTech Life, I sat down with author, educator, and thought leader Becky Keene to unpack her new book AI Optimism. We tackled everything from AI literacy and data privacy to the SAMR model, edtech skepticism, and what it really takes to shift classroom practice forward, not with hype, but with intention.

Becky shares her “AI Optimism” framework, talks candidly about the realities of edtech implementation, and reminds us all that if AI can do your job, maybe it’s time to teach differently.

Whether you're a classroom teacher, district leader, or edtech developer, this conversation is for you. Dive in and walk away with clarity, strategy, and a renewed sense of agency.

Timestamps
00:00 – Intro & Becky’s background
03:10 – Why she wrote AI Optimism
07:00 – The SAMR model & AI’s role
11:30 – From substitution to redefinition
15:00 – Empowering student agency with AI
17:30 – Rethinking PD for meaningful AI use
23:30 – Higher ed vs. K–12 AI needs
28:30 – Data privacy concerns
31:30 – Equity, access & the freemium problem
35:45 – Can AI replace teachers?
40:00 – Encouraging teacher reflection
42:00 – Final questions & takeaways 

📚 Grab Becky’s book AI Optimism + free study guide:
🌐 https://beckykeene.com

🎉 Shoutout to our amazing sponsors:
📘 Book Creator – Create, read, and publish student work with ease
🤖 Eduaide – AI tools designed to empower teachers, not replace them
🟠 Yellowdig – Social learning built for real engagement

Visit our site for more great episodes!

🔗 Website: https://www.myedtech.life
💛 Support the show: Become a Podcast Partner

👇 Like, comment, and share this episode with your network. Let's keep growing together.


 🎤 And as always... Stay Techie!

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:30 - Welcome and Introduction

02:02 - Meet Becky Keene: Educator & Author

03:13 - The AI Optimism Framework Explained

08:40 - SAMR Model in AI Education

15:50 - Building Critical AI Decision-Making Skills

21:56 - Effective Professional Development Strategies

28:42 - Data Privacy in Education vs Operations

33:43 - Addressing the AI Equity Gap

38:28 - AI Won't Replace Teachers Who Teach Right

44:43 - Closing Thoughts and Connections

WEBVTT

00:00:30.396 --> 00:00:33.798
Hello everybody and welcome to another great episode of my EdTech Life.

00:00:33.798 --> 00:00:42.649
Thank you so much for joining us on this wonderful day and, wherever it is that you're joining us from around the world, thank you, as always, for all of your support.

00:00:42.649 --> 00:00:44.969
We appreciate all the likes, the shares, the follows.

00:00:44.969 --> 00:00:48.651
Thank you so much for engaging with our content and sharing our content.

00:00:48.651 --> 00:00:52.189
Thank you so much to all our new listeners as well.

00:00:52.189 --> 00:00:55.450
We definitely appreciate all of that love.

00:00:55.450 --> 00:00:56.865
Thank you, thank you, thank you.

00:00:56.865 --> 00:01:11.796
As you know, we do what we do for you, to bring you some amazing conversations and amazing guests, and, of course, as always, today is no different, because we have an amazing guest, an author whose new book release we're going to be talking about.

00:01:11.796 --> 00:01:15.868
So I'm really excited for you to meet Becky Keene.

00:01:15.868 --> 00:01:17.813
Becky, how are you doing today?

00:01:18.617 --> 00:01:20.724
I'm great, thanks. Happy Friday.

00:01:20.724 --> 00:01:21.707
We're recording on Friday.

00:01:22.230 --> 00:01:27.049
Yeah, we're recording on Friday, so hopefully you guys are having a wonderful, wonderful Friday as well.

00:01:27.049 --> 00:01:28.837
So thank you, Becky, for being here.

00:01:28.837 --> 00:01:30.646
I'm really excited to connect with you.

00:01:30.646 --> 00:01:46.034
I know I've been a longtime follower, and scrolling through TikTok, I've just been loving the content that you're putting out with your TikTok walks, talking about AI, and just kind of everything leading up to your book, and, you know, so many great things there.

00:01:46.034 --> 00:01:54.004
So I'm really excited for all our listeners, current listeners and any new listeners that are joining, to get to connect with you.

00:01:54.004 --> 00:02:01.368
And before we dive into our great conversation, can you give us a little brief introduction and what your context is within the education space?

00:02:02.171 --> 00:02:02.933
Oh, absolutely.

00:02:02.933 --> 00:02:05.852
I am an educator first and foremost.

00:02:05.852 --> 00:02:10.348
I still identify as an educator, although I'm not in a classroom full-time anymore.

00:02:10.348 --> 00:02:13.362
I'm not in a district full-time, but that's my heart.

00:02:13.362 --> 00:02:19.520
I spent 15 years working for a large school system south of Seattle, Washington.

00:02:19.520 --> 00:02:35.876
I've taught everyone from little tiny eight-year-olds through adults, and I've spent the last 10 years specifically focused on adult professional learning in education and primarily ed tech, although my background is in literacy.

00:02:35.876 --> 00:02:37.811
So it's been a journey.

00:02:38.818 --> 00:02:39.060
Excellent.

00:02:39.060 --> 00:02:46.514
So now we're going from that literacy, now in the ed tech space, to, you know, AI literacy and AI optimism.

00:02:46.514 --> 00:02:51.920
Like I mentioned to our audience members, we're definitely going to be talking about your new release that we have right here.

00:02:51.920 --> 00:02:58.741
AI Optimism, a guide to... yeah, in my hand, and so I'm really excited for us to talk about this.

00:02:58.741 --> 00:03:11.955
So I want to ask you, Becky, just to kind of start off the conversation: what inspired you to write AI Optimism, and how do you personally define AI optimism within the education context?

00:03:13.300 --> 00:03:22.775
So I have been involved in the AI movement for the last few years, like many of us, and I started to notice some trends that really bothered me.

00:03:27.324 --> 00:03:33.704
I started to notice that over and over on the social media groups I was a part of, at conferences, I was seeing this trend toward one of two directions.

00:03:33.704 --> 00:03:37.733
One was oh no, the world is ending right, we're all going to die.

00:03:37.733 --> 00:03:41.004
The robots are taking over. Just a lot of fear-based decision-making.

00:03:41.004 --> 00:03:47.342
And the other was the oversimplification of this powerful tool we have in front of us.

00:03:47.342 --> 00:03:59.925
So ed tech companies, I'm not going to blame educators for this, but ed tech companies are really focusing on these very, very low-level uses of AI, you know, just constantly pushing:

00:03:59.925 --> 00:04:10.634
You can make a worksheet, you can make a quiz, you can make a lesson plan and unfortunately, that is just really not where I hope this tool ends in education.

00:04:10.634 --> 00:04:25.225
So for me it was kind of more about I felt that I had something to say and I wanted to capture everything that I had, you know, kind of been swirling around in my head as I go on walks and walk my dog and do different things.

00:04:25.286 --> 00:04:34.081
I kind of had all these thoughts brewing and I finally thought, okay, I want to write something about that, and I had been familiar with the SAMR model for what?

00:04:34.081 --> 00:04:39.391
The last 20 years, and so I just started really thinking through.

00:04:39.391 --> 00:05:00.807
You know, it's okay to use AI at that S, substitution, level in many cases, and that's where most of us got started using AI, but let's not end the journey there. And so one little fun fact you might be interested in: I wrote the entire book before I had a title, because the title was not the point.

00:05:00.807 --> 00:05:08.802
The book was the point, and so, after everything was written, my editor and I had some conversations.

00:05:08.802 --> 00:05:15.382
I had a couple other close friend conversations and said, okay, what do I really want to call this?

00:05:15.382 --> 00:05:23.827
And that's when we came up with AI Optimism as the title, but it's the content that I wanted to talk about first.

00:05:24.430 --> 00:05:43.214
Love it and you know there's a lot to unpack there in that statement, because I, along with you, obviously, you know we have, you know, very similar circles and you know, in a lot of the chats and a lot of the social media platforms and so on.

00:05:43.235 --> 00:05:44.920
And so, you know, even as part of my dissertation study, I found exactly what you mentioned.

00:05:44.920 --> 00:05:45.944
Congratulations.

00:05:45.944 --> 00:05:48.107
By the way, we have to stop for just a minute.

00:05:48.107 --> 00:05:50.975
You just made a major announcement.

00:05:50.975 --> 00:05:52.848
I'm so incredibly happy for you.

00:05:52.848 --> 00:05:56.889
That is obviously a huge journey and congratulations.

00:05:57.350 --> 00:05:57.831
Thank you.

00:05:57.831 --> 00:05:59.524
Thank you, Becky, I really appreciate it.

00:05:59.524 --> 00:06:07.555
But it really lines up with what you said because, as you said, there it was like a split into two factions.

00:06:07.555 --> 00:06:16.579
You know the oh no, like you know let's resist this because of you know, just the unknown, the fear of the unknown, losing control.

00:06:16.579 --> 00:06:26.872
And one of the things that I've learned in doing PD, and especially when you're going to implement something, a lot of the pushback that you get it's not so much that you're adding something to the teacher.

00:06:26.872 --> 00:06:36.002
A lot of times it's really they feel like they're losing that control and so maybe for a lot of them it was that loss and fear of that control that they once had.

00:06:36.002 --> 00:06:44.230
And then, of course, you've got the other speedboat side where it's like, hey, let's do this, let's go, but, like you mentioned, at that very low level.

00:06:44.230 --> 00:06:59.968
And so I really love what you said about the SAMR model, because I know I had talked to you a little bit, asking you about this model and saying, because of the speed and how fast platforms are evolving and moving, is this still a great model to continue using?

00:06:59.968 --> 00:07:03.762
And I'm thinking, yes, it still is, because it doesn't over.

00:07:04.184 --> 00:07:26.721
I feel, and this is my opinion, and other guests that I have interviewed that were part of the study were stating it as well, that the use of AI has been so low level, where it's just substitution. And I think you and I can agree that we've seen a lot of technology that has come through, and you know, okay, we've got Chromebooks now.

00:07:27.725 --> 00:07:31.220
Okay, so now, instead of a physical worksheet, I'm going to give it to you digitally, but you're still annotating.

00:07:31.220 --> 00:07:44.564
So I was like, okay, so we're just substituting the physical for that digital and it just seems like we get stuck on that S and there's some teachers that are great at augmenting so they go to that next step.

00:07:44.564 --> 00:07:54.766
I recently had a guest on, Joe Christiansen, who pretty much mentioned it's like we pretty much just stay stuck on the S and we never fully go into the R.

00:07:54.867 --> 00:08:03.189
So I love what you mentioned about your ideas and thinking on how the ed tech companies are kind of selling it into:

00:08:03.189 --> 00:08:05.853
Yes, we're going to help you do your lesson plans faster.

00:08:05.853 --> 00:08:11.952
We're going to help you answer or create emails faster, do these you know newsletters a lot faster.

00:08:11.952 --> 00:08:16.108
But how is that going to augment what you're doing?

00:08:16.108 --> 00:08:21.685
How is it going to modify, how is it going to, you know, reinvent or redefine education?

00:08:21.685 --> 00:08:42.404
So I want to ask you here, I know that you emphasize the SAMR model in your book, so tell us a little bit about how you see that we can take this to that really next level, or how ed tech companies can do better in getting us into the A, M, and R: the augmentation, modification, and redefinition of the content.

00:08:43.847 --> 00:08:47.234
Well, that is totally what the book is about.

00:08:47.234 --> 00:09:05.147
So the book starts with the AI optimism framework, which is three core principles privacy, praxis and prompting and that really the whole beginning of the book talks about educators taking a step back from you know, oh, I want to try an AI tool.

00:09:05.147 --> 00:09:09.471
That's my least favorite comment in the whole world.

00:09:09.471 --> 00:09:19.811
It's like I want to try this tool and as an instructional coach someone with 10, a decade of instructional coaching experience I would always say no, no, you know, we're not here to try a tool.

00:09:19.811 --> 00:09:24.072
We're here to do best practice and to think about pedagogy first.

00:09:24.072 --> 00:09:30.914
So the practice part of that conversation is really that Think about what you want your students to achieve.

00:09:30.914 --> 00:09:35.649
What do you envision this learning experience looking like for your students?

00:09:35.649 --> 00:09:40.231
Do you envision they all sit down and fill out a worksheet, in which case we can end the conversation?

00:09:40.231 --> 00:09:51.754
But if you envision them being engaged and empowered and excited about what they're doing, then that's something that we have to set aside and then consider tools that keep student data safe.

00:09:51.754 --> 00:09:53.245
That's the privacy component.

00:09:53.245 --> 00:09:59.253
And also, how am I using the tools available to really prompt richly?

00:09:59.253 --> 00:10:01.783
And we see a lot of these.

00:10:01.783 --> 00:10:21.643
I call them easy button tools that are helpful to get started, absolutely, but they shouldn't be where we end with AI, because all they're doing is churning out predictive, pattern-based you know very didactic content in many cases when we could be using our prompting skills to do more.

00:10:22.163 --> 00:10:37.133
So it's starting with those three core principles in mind, and then assessing (I have a wheel framework, and there are six categories around the edge of the wheel) what we want to do, and it's probably one of those six categories.

00:10:37.679 --> 00:10:43.273
I'm trying to design something, or create something, or support my learners, or analyze data, right.

00:10:43.312 --> 00:10:55.903
So it's thinking about the task next and then having a little self-check on how much do I want to invite AI to be my assistant for this task, for my pedagogical purpose.

00:10:55.903 --> 00:11:10.869
So AI Optimism is this decision-making process that starts with what I want to achieve, in the middle is what I'm trying to do, and at the end is the tool that's going to support me and my students the best.

00:11:10.869 --> 00:11:32.729
And so, to answer your question, with the SAMR model in mind, I then have a choice: do I want to use a tool that meets my needs, that sits at a substitution level, or do I want to use that incremental innovation process to move forward into A, M, or R, and that might change what my entire assessment looks like.

00:11:32.729 --> 00:11:50.707
That's the point of the R, right? Redefinition: things that weren't previously possible. But it might also just be some small tweaks along the way that help me build more student voice, more engagement, more autonomy, more productivity, more creativity, more students as creators.

00:11:50.707 --> 00:11:57.129
You know whatever I'm trying to achieve, so for me it's about that decision along the way.

00:11:57.831 --> 00:11:58.192
Excellent.

00:11:58.192 --> 00:12:03.062
I really love that, you know, and I think that's something that a lot of teachers do have in mind.

00:12:03.062 --> 00:12:19.135
But I think sometimes it's just the excitement of the tool, and working as a digital learning coordinator for many years, I saw it: we bring a platform in and then, all of a sudden, the hype is gone.

00:12:19.135 --> 00:12:36.192
But then what I notice is they never, or we never, really go a lot deeper than that superficial level that got them all excited, and it's like, well, okay, we already used it; the glitz and the glamour and the shine is gone.

00:12:36.192 --> 00:12:41.692
And now it's like, hey, well, I want to use that app because it just has like one additional component to it.

00:12:41.692 --> 00:12:42.341
But wait a minute.

00:12:42.341 --> 00:12:46.302
I mean, this does the exact same thing, but have we even gone deeper into it?

00:12:46.743 --> 00:12:55.547
And I think that we've gotten into that trend where we just hop from app to app to app to app and we really never dive in deep.

00:12:55.547 --> 00:13:22.182
And so I love what you mentioned, too, about, you know, it's okay to slow down, but see where it is that you want to take your students, and see how you may get them there in a balanced approach with, you know, even no tech or little tech, to that final product or that tech platform that's really going to help you modify and redefine and do something that was once inconceivable.

00:13:22.182 --> 00:13:25.687
You know, to do that, and I know they mention that in the SAMR model.

00:13:25.687 --> 00:13:34.827
So I think that's something that, as educators, we really need to just kind of slow down and say it's okay, like let's look at the goals first.

00:13:34.827 --> 00:13:42.447
And like they always say, you know, it's like a task before apps, so let's see what is the task, what is that final outcome, what's the process and what is the final product.

00:13:42.447 --> 00:13:47.193
And what's the best tool to get us there.

00:13:47.313 --> 00:14:05.384
So I really like that, and so I want to kind of talk a little bit and continue that conversation, because in the redefinition chapter you talk about AI enabling tasks that were once inconceivable, things that we didn't think about first, you know, as opposed to just substituting.

00:14:05.384 --> 00:14:11.010
But how else might we take that app to that next level to really enhance the learning experience?

00:14:11.010 --> 00:14:35.476
So how do you balance the excitement of doing the redefinition and doing something inconceivable, with the excitement and the possibilities of students, you know, doing something that they never thought they could, but also balance that with the over-reliance on, maybe, a particular tool for the problem-solving aspect of it?

00:14:36.779 --> 00:14:37.061
Right.

00:14:37.061 --> 00:14:51.197
So that really comes down to the question that I talk about in the very beginning of the book, which is helping students understand how to answer the question does this use of AI limit my learning?

00:14:51.197 --> 00:14:59.438
And I know it probably seems really like you know, pie in the sky, like there's no way our kids are actually going to do this.

00:14:59.438 --> 00:15:24.909
But I think these red light, green light models that we've adapted to the use of AI, where we're kind of telling the students, okay, you can use it, for you know this purpose for this assignment, but you can't use it for this purpose, for this assignment I feel like at some point we have to start helping them make critical thinking decisions about the way I'm trying to use AI right now.

00:15:25.537 --> 00:15:27.724
Is that limiting me or supporting me?

00:15:27.724 --> 00:15:32.163
Is it helping me design something that would be previously impossible?

00:15:32.163 --> 00:15:40.128
Because maybe I don't know how to code and so I'm not going to learn that for this assignment, but I'm super excited to create a website without having to learn coding.

00:15:40.128 --> 00:15:43.945
Or is the point of the assignment to learn coding?

00:15:43.945 --> 00:15:51.524
And now I'm skipping past all of that really important knowledge that I'm going to need for my career, my goals, my passions, my hobbies.

00:15:51.524 --> 00:15:57.388
So I feel like helping students understand that question does this use of AI limit?

00:15:57.388 --> 00:16:06.658
My learning, is going to set them up for success beyond the walls of a classroom, where they have a poster on the wall telling them exactly how and when they're supposed to be using it.

00:16:39.016 --> 00:17:00.155
Excellent, I love that and you know that's such an important question and I think oftentimes we don't give our students enough credit as educators and immediately you know a lot of the reactions from November 2022, and even still to this day it's like, oh, they're just going to cheat, they're just going to go ahead and do this, and so on and so forth, and you know, but that's our assumption.

00:17:00.496 --> 00:17:11.038
You know, yes, there will be some students and, just like we know, there's always, you know, that handful of students that will try and do as little as possible to just get what we need.

00:17:11.097 --> 00:17:28.424
But it's because it's like, hey, what is it that we need to get an A, you know, and they're just going to give you the minimum, maybe even just a C, because it's like they're not as engaged, they're not, you know, really just getting that attention that they need to enhance that learning experience.

00:17:28.625 --> 00:17:38.484
So I really like that you talk about that and that we can possibly now, you know, with the information that is out there, what can we do better to help our students?

00:17:38.484 --> 00:17:40.913
And I know that that starts with the teachers first.

00:17:40.913 --> 00:17:51.451
So, becky, in your experience, since I know that you do a lot of PD and you're, you know, pretty much everywhere nationwide, doing a lot of trainings and you're, you know, pretty much everywhere nationwide doing a lot of trainings for teachers.

00:17:51.451 --> 00:17:51.750
What are some?

00:17:51.750 --> 00:18:16.238
Maybe, if you can give us two tips for anybody in a leadership role in a school that provides PD: what can we do to help our teachers maybe not think just about, like, plagiarism and just the negative side, and maybe reframe their thinking to seeing more of that potential and then being able to translate that to students?

00:18:18.086 --> 00:18:19.754
That's what AI Optimism is all about.

00:18:19.754 --> 00:18:24.949
Right? It's acknowledging the challenges and then still being willing to press forward.

00:18:24.949 --> 00:18:26.790
So two tips.

00:18:26.790 --> 00:18:38.211
One tip is, for over 20 years I've been talking about incremental innovation versus disruptive innovation.

00:18:38.552 --> 00:18:44.913
It sounds great in a business model, but it's not sticky, and it drives a lot of fear.

00:18:44.913 --> 00:18:48.804
It feels like a push in rather than this.

00:18:48.804 --> 00:18:50.992
You know, organic, we're-in-this-together mentality.

00:18:50.992 --> 00:19:00.904
So I don't love when school systems come in and they're like, you know, rip the band-aid off, we're all doing this thing, we're going to provide a bunch of PD, it's going to change everything.

00:19:00.904 --> 00:19:05.273
No one likes that and it's generally not very effective.

00:19:05.273 --> 00:19:08.309
I've seen those things come and go over time.

00:19:08.349 --> 00:19:35.356
So I really, really recommend that school leaders and instructional coaches pursue that incremental innovation, where we're going to honor where people are now, because the reality is most of our educators nationwide in the United States are doing great work and they're doing the best they can, and so, you know, we don't have a lot of people out there just phoning it in. So honor that, and then talk to them about that.

00:19:35.504 --> 00:20:21.227
This is part two: solving their problems, because every educator has things they would love to fix, right? Parent communication, or student engagement, or the workload of scoring, you know, AP and IB sample tests. Like, there's something that they're looking to achieve that is going to improve their life and therefore the lives of their students, because they've now freed up some cognitive space to do more of what they love, right? And so being in that problem-solver mindset again, instead of recommending an app, like, hey, you know, we're all going to use fill-in-the-blank, I don't want to, like, throw an app under the bus, but a lot of times those implementations don't fit the needs of all of our educators and all of our classrooms.

00:20:21.227 --> 00:20:41.740
So rather, approaching it with a problem-solving mindset of what do you want to achieve and how can I support you in getting there, and then going on that journey to figure out what's the best fit, I think is absolutely a stickier approach that's going to honor the expertise in the room and deliver something that actually shows.

00:20:44.224 --> 00:20:44.404
Excellent.

00:20:44.404 --> 00:20:44.786
I really love that.

00:20:44.786 --> 00:20:56.194
I recently had a conversation with Kyler Cheatham, who is somebody else that I found on TikTok and she works with organizations, but she also works with school districts and she really just echoes everything that you just said right now.

00:20:56.194 --> 00:21:26.512
It was just, you know, she mentioned making sure that everybody that needs to be in the room is in the room, and stating that if 50% of your room is not filled with end users, then you need to kind of, you know, make that happen, to make sure that there is that buy-in, and understanding that it's not, like you mentioned, trying to put a band-aid, maybe even over a band-aid, you know, and then over another band-aid, and so on.

00:21:26.512 --> 00:22:04.779
So it's just making sure that you're finding those solutions and being very active in the problem-solving process. And I think that is something that is wonderful: at the very top, you know, superintendents, CTOs, curriculum directors, they all need to be in that room, because oftentimes those decisions are left to one person, and they're based on maybe one teacher who came back from a conference very excited and says, I need you to open this up, or I need you to purchase this. But, like you mentioned, understanding that that tool may not be for everybody, and it may not be for every teacher or maybe even every learner.

00:22:04.779 --> 00:22:12.048
So, going back to, let's look at the root of the problem.

00:22:12.048 --> 00:22:16.323
What is it that we're trying to solve and then make the best decision there for the whole body there in your district.

00:22:16.323 --> 00:22:17.686
So I really like that a lot.

00:22:17.686 --> 00:22:27.363
That's just some really great tips, and obviously the experience that you have speaking in so many places, even from K-12 to higher ed, I definitely love that.

00:22:27.383 --> 00:22:30.153
So again, guys, the book is AI Optimism.

00:22:30.153 --> 00:22:32.173
Make sure that you check that out.

00:22:32.173 --> 00:22:36.605
But, Becky, as we continue our conversation, I know recently I saw a post.

00:22:36.605 --> 00:22:39.294
I believe you were in UT Austin.

00:22:39.354 --> 00:22:43.750
I believe. Were you working... Yesterday, got in a little bit past midnight. Okay, it was yesterday.

00:22:44.225 --> 00:22:44.547
So is that?

00:22:44.547 --> 00:22:46.032
Were you working with higher ed there?

00:22:47.004 --> 00:22:48.191
Yes, we've been working.

00:22:48.191 --> 00:22:54.224
So the company I work for, i2e, we've been supporting several higher ed institutions.

00:22:54.224 --> 00:23:05.381
We've spent a lot of time at the University of Texas, both at Austin and system-wide, so Dallas, Houston and so on, supporting their AI implementation.

00:23:05.381 --> 00:23:30.292
So it's interesting because working with higher education, we're not really working with faculty, we're working with operations, business operations, most of the time right now, because we're working with people who are like this is going to save me hours a day on my job, you know those automations, and so it's not so much about instruction and that's been actually really fun, a good growing experience for me.

00:23:31.055 --> 00:23:31.355
Excellent.

00:23:31.355 --> 00:23:33.313
Well, and that's great that I asked that.

00:23:33.313 --> 00:23:49.690
I really thought it was, you know, with higher ed faculty, but now that you mentioned this, that it's more of the operations, you know, I think in K-12 and higher ed we look at things, obviously, through that education lens, but then you also see on the outside.

00:23:49.690 --> 00:23:54.298
Now you know that productivity and you're working with operations and so on.

00:23:54.298 --> 00:23:56.608
So I want to ask you you know what?

00:23:56.608 --> 00:23:59.354
What are some of the things that you get?

00:23:59.354 --> 00:24:10.067
Let's say, a little bit of fight back, or, you know, just a little bit. Uh, okay. On the education side, what do they kind of fight back on?

00:24:10.067 --> 00:24:14.475
And in the operation side, what might be some things that they fight back on?

00:24:14.475 --> 00:24:19.076
Just to kind of get a comparison of the education space and the operations space.

00:24:20.766 --> 00:24:26.458
So, educators, I feel like push back a little bit on tools that don't really add value.

00:24:26.458 --> 00:24:50.093
So, for example, I was walking through a conference you know, you and I both go to the ed tech conferences and one of the vendor booths had a nice big sign advertising that they were so excited about changing, like a YouTube video into a PowerPoint with AI so that you could share that with your students.

00:24:50.093 --> 00:24:56.378
And I kind of just wanted to be honest, like and that's your use case.

00:24:56.378 --> 00:25:20.672
So, you know, I just walked through that mentality of, wait a minute, we're taking a captioned, dynamic, personalized, right, I can pause, I can rewind, I can rewatch, instructional piece, and we're using the best tools we have on the planet to make it more didactic, more centrally focused, less accessible, less engaging.

00:25:20.751 --> 00:25:21.513
Like, what are we doing?

00:25:21.513 --> 00:25:31.746
We're going backwards and I think that educators do notice when a tool comes out and it's like this is the opposite of what we want to be doing in education.

00:25:31.746 --> 00:25:38.767
So there's that pushback of is your tool actually adding value or are you just trying to get me to use it?

00:25:38.767 --> 00:25:41.273
You know, I like that skepticism.

00:25:41.273 --> 00:25:44.267
Part of AI optimism is going here's what's possible.

00:25:45.470 --> 00:26:04.855
That's where I want to go. Like, don't limit me by, again, these easy-button options that are just stuffing me full of more content to push out to my students. And so I love that approach.

00:26:04.855 --> 00:26:10.658
So we're working, Cherie, with medical centers, legal teams, real estate.

00:26:10.658 --> 00:26:17.042
They're handling contracts, they're handling donor information, right, alumni relations.

00:26:17.042 --> 00:27:09.086
There's highly sensitive information flowing through those AI models, and so for them,

00:27:09.086 --> 00:27:12.832
honestly, we cover, out of the gate, in the first five minutes of every session:

00:27:12.832 --> 00:27:16.576
Here's why this is data secure and here's why that matters for you.

00:27:16.576 --> 00:27:32.218
And, yes, you can use HIPAA-compliant data, or this is HIPAA-compliant, those types of things, if they're using the right tools, because that's incredibly important, and nobody wants their research in a data leak.

00:27:33.906 --> 00:27:34.911
That's very interesting.

00:27:34.911 --> 00:27:46.327
And going back to what you said, you know, one of the things that shocked me at ISTE, as I was walking through the conference, is I was approached by an app vendor, and, you know, kind of like,

00:27:46.327 --> 00:27:49.316
I was like, is this for real?

00:27:49.316 --> 00:27:53.757
So they kind of talked to me and they said yeah, you know, have you heard of us?

00:27:53.757 --> 00:27:57.676
And I was like no, like, maybe like on social media.

00:27:57.676 --> 00:28:06.772
And then they're really giving me, you know, their spiel and everything, and they're saying, you know, this application is for teachers, and this is what it does.

00:28:06.772 --> 00:28:08.091
And this is what they said.

00:28:08.091 --> 00:28:17.329
They're like, you see those apps over there? Those apps help the teacher before the lesson.

00:28:17.329 --> 00:28:20.378
These apps over here help the teacher after the lesson, but we help them during the lesson.

00:28:20.378 --> 00:28:24.393
And I was like, okay, I'm intrigued, my ears perked up. Yeah, yeah, yeah.

00:28:24.413 --> 00:28:26.339
So I was like, okay, so how does that happen?

00:28:26.339 --> 00:28:33.352
He's like, well, you download an application and then the teacher puts on their.

00:28:33.352 --> 00:28:39.221
I guess their expectation was the teacher was going to hang their phone here and was going to record while they teach.

00:28:39.221 --> 00:28:42.365
And I was like, okay, interesting.

00:28:42.365 --> 00:28:44.370
And then I said, well, what about?

00:28:44.370 --> 00:28:47.478
You know, students speaking at that time?

00:28:47.478 --> 00:28:52.297
You know how is it going to protect their privacy when names are being called out or things of that sort?

00:28:52.297 --> 00:29:02.829
And the voices? And they're like, oh, well, the teacher wears it really close here, and so it wouldn't hear, like, the student voices, because they're supposed to be wearing it close here.

00:29:02.850 --> 00:29:08.994
And I said, yeah, but they're still mentioning names, you know, there's still going to be some identifiable information in there.

00:29:08.994 --> 00:29:17.257
And they kind of just stayed quiet, and then after that they're like, well, thank you, and that was it, and I walked out, and it was very scary, you know.

00:29:17.257 --> 00:29:23.697
But, like you mentioned, there are some things where I feel like, wait a minute, we've already done that.

00:29:23.697 --> 00:29:31.748
And, yes, the optimism is there, Becky, but sometimes, and I don't know if you feel that way,

00:29:31.748 --> 00:29:37.554
I guess maybe the feeling for me was, we could be doing so much more, but, like you mentioned, sometimes I feel like, are we going backwards?

00:29:38.295 --> 00:29:44.622
Right, right, like, did we forget about data privacy when we launched some of these tools?

00:29:44.622 --> 00:29:50.115
Or did we forget about, like, best practice?

00:29:50.115 --> 00:29:50.998
Yeah, I totally agree.

00:29:50.998 --> 00:29:56.404
I'm glad you spoke up and I'm glad that you had that lens of like hang on, really.

00:29:56.404 --> 00:30:18.555
Yeah, I had a similar conversation with an app vendor, I mean, when I was at ASU GSV, and they said, oh, what's the number one concern you hear about when you're working with educators? And I said data privacy. And I was talking to, like, the COO or something, and he said, gosh, you're the first person to say that.

00:30:18.555 --> 00:30:19.879
And I said who are you talking to?

00:30:22.410 --> 00:30:23.312
oh my goodness.

00:30:23.653 --> 00:30:28.149
I'm glad that there are people making that point. No, for sure.

00:30:28.209 --> 00:30:41.437
And then, for me, that's the biggest thing too, but also just the fact that, you know, you're expecting an educator to hang a phone around their neck and put it so close right here.

00:30:41.437 --> 00:30:53.457
So it's just picking up everything, and I get it, you know, it's the voice-note thing, and then afterwards, you know, it's going to transcribe and maybe offer suggestions. At least that's what it was saying, that's what I took from it.

00:30:53.457 --> 00:30:59.106
But I was like, oh my gosh, I don't know about that, you know, but we'll see. But anyway.

00:30:59.106 --> 00:31:21.474
So, becky, just as we kind of round out this conversation and maybe as we start kind of wrapping up, I do want to ask you your thoughts on this, because I know that this is something that always there was a lot of guests and a lot of authors also that came on, and even from both sides is the use of AI.

00:31:21.474 --> 00:31:30.135
If you are a school that is very well funded, well, good for you, you can provide students those same tools, because, I mean, the cost of it.

00:31:30.156 --> 00:31:52.095
You know, the talk is it's always going to create more of that divide, not only in the learning aspect, but obviously in equity of access.

00:31:52.095 --> 00:31:57.471
So can you tell us, in your experience, what might we be able to do?

00:31:57.471 --> 00:32:03.048
Well, maybe not as educators, but, I guess, as platforms.

00:32:03.048 --> 00:32:10.608
What might be some advice for platforms to say, hey, you know what, we see what you're doing, we don't want that divide.

00:32:10.608 --> 00:32:13.281
What can we do to provide equal access?

00:32:14.702 --> 00:32:28.954
I 100% agree with you that I feel AI has created a larger equity gap right now in many scenarios and it's heartbreaking because AI is the one tool recently that could be an equalizer.

00:32:30.880 --> 00:32:43.173
So my advice to edtech companies is, as much as we might hate the freemium model, I think it makes sense to offer a lot of what is student-facing for free.

00:32:43.254 --> 00:32:51.990
So like a SchoolAI or, you know, a ChatGPT, Gemini, Copilot, whatever-the-school-is-using scenario where the kids are engaging.

00:32:51.990 --> 00:33:05.267
Canva is another example where the kids can engage with AI. Offering that freely available, right, and those are safe-for-education tools that, you know, we know have good data privacy agreements.

00:33:05.267 --> 00:33:15.807
So for the schools to be able to at least have an entry point right, if one teacher or the building itself is like we want to do this, then they can get set up and get started.

00:33:15.807 --> 00:33:45.230
The premium features, I get it, the people that are running these companies have jobs, and they have operating expenses and they have AI costs. And so, yes, keep a paid model if the district wants analytics, for example, or, you know, a dashboard or these deep-dive tools that give high-level oversight into what's happening in the system. But we don't compromise privacy and we don't compromise access.

00:33:45.230 --> 00:33:49.027
I love that.

00:33:49.929 --> 00:34:13.608
No, that is wonderful. And, you know, the reason that I asked that question, and it was one that actually wasn't really planned, is this morning, again, I woke up to LinkedIn and saw a post from somebody who openly said, hey, you know, we wanted to get a quote for this specific platform, and they said it was for 150 students and 50 faculty: $38,000.

00:34:13.608 --> 00:34:15.632
I was like, $38,000?

00:34:15.632 --> 00:34:19.965
And so I'm thinking to myself, what all is involved in that?

00:34:19.965 --> 00:34:23.291
It's 200 people using this, and it's $38,000.

00:34:23.291 --> 00:34:30.885
And so my thought process, and also just going by the comments, is it based on, of course, usage?

00:34:30.885 --> 00:34:33.117
I mean, are you going to get those users that are going in there?

00:34:33.137 --> 00:34:46.226
And obviously the price of tokens, and as that goes up, you know, obviously platforms need to offset some of those costs, depending on what they are connected to, as far as APIs are concerned and things of that nature.

00:34:46.226 --> 00:35:05.940
So, I mean, that, from the very beginning, has been one of my biggest fears, Becky, honestly: you have these wonderful platforms, you get a quote and you're like, okay, this might be doable, but then there's that fear of what may happen, based on what large language models do.

00:35:05.940 --> 00:35:27.083
And as far as the pricing, all of a sudden, from one year to the next, like I mentioned to you, the recording app that I used to use jumped to double in price, or maybe even two and a half times, and now that district is out and they can no longer use it, or, you know, the company may go out of business because they can't work with that model.

00:35:27.083 --> 00:35:34.925
So it's definitely very interesting and a lot of things to think about there as far as on the leadership side and making sure that you do what's best.

00:35:34.925 --> 00:35:40.967
So thank you so much for sharing your tips too as well, because that's definitely something that's very important.

00:35:41.488 --> 00:35:55.796
So, my last question to you, Becky: again, there's always a lot of pushback, and you've got the teachers that are, you know, on one side of the aisle or the other side of the aisle, and, like I said, I always want to bring the conversation to the middle.

00:35:55.796 --> 00:35:56.760
Where can we meet?

00:35:56.760 --> 00:36:11.367
But my question to you is: I know that you advocate for AI as a partner in education, and of course it's not there to replace teachers, obviously, but what about the other side?

00:36:11.367 --> 00:36:17.532
You know, those who might say, well, you know what, this might eventually lead to teacher redundancies.

00:36:17.532 --> 00:36:21.548
You know, they kind of just feel that way like we're just giving too much to AI.

00:36:21.548 --> 00:36:30.552
Well, how might you respond to them and just say, hey, you know, this is where we're at now and this is how we can move forward?

00:36:31.780 --> 00:36:33.067
I think it's a very real fear.

00:36:33.067 --> 00:36:43.635
It's happening in certainly all sorts of industries, right, where people are realizing, oh, AI can do my job in a way that's either more efficient or more cost-effective.

00:36:43.635 --> 00:36:49.481
We see it in radiology and art and coding, there's all these fields.

00:36:49.481 --> 00:36:54.873
Right now, that's happening in all of those. Teaching is a predominantly humanistic field.

00:36:54.873 --> 00:37:10.311
But I will say, and this is, you know, maybe not the most empathetic response, but I was talking to a teacher recently, just a couple of months ago, face to face, who said to me, what is it I'm doing that AI can't replace?

00:37:10.311 --> 00:37:14.182
And I just looked at him and said, maybe you need to teach differently.

00:37:15.005 --> 00:37:37.575
Like, if that's what you see, if you see that everything you're doing in your day as a teacher could be done by an automated tool that doesn't know your students, that doesn't build a relationship, that doesn't care about them as individuals, that doesn't see what they need, then, goodness, let's do some self-reflection and think about how you can act differently.

00:37:37.635 --> 00:38:04.005
So that's not a call to action, and, you know, it's not meant to be snarky. It's meant to be more that reflection of, if I am spending most of my time as an educator asking kids to read pages out of textbooks or watch videos, or even participate in an interactive simulation, and then giving them a quiz on it that I can auto-grade, then you're right, a lot of that can be outsourced to an AI tool.

00:38:04.184 --> 00:38:20.516
And so let's kind of get back to the root of why many of us became educators to begin with, and that was because we love the joy of teaching, of being a part of kids' learning journey, of seeing that light bulb turn on, of making a difference in kids' lives.

00:38:20.516 --> 00:38:40.530
And maybe the education system, which we have no control over, you and I as individuals, has taken us to a place where we feel trapped into this model of, you know, disseminating information and collecting knowledge checks, and so it's such a bigger issue than an individual teacher.

00:38:40.530 --> 00:38:47.311
But I think it can start with individual teachers kind of reclaiming that power.

00:38:47.311 --> 00:38:53.500
Like, these are things that I bring to a classroom, uniquely human, that an AI tool can't replace.

00:38:53.500 --> 00:39:01.653
So it comes back to empowerment and building that optimism of what do you have to offer no one can replace.

00:39:02.534 --> 00:39:05.447
I love that, you know, and that comment of yours,

00:39:05.447 --> 00:39:16.014
I take it to heart. It really resonates with me, because even along my career there have been times where I was like, you know what, things might be getting a little difficult, or what's going on, what's happening?

00:39:16.014 --> 00:39:28.047
But once I turned the mirror on myself like you mentioned, that important part of self-reflection I was like oh, oh, okay, it's like I'm the one that needs to change things in order for my circumstances to change.

00:39:28.047 --> 00:39:46.170
So maybe, like you said, as educators too, just kind of reflecting on our current practice and what might be something that we can do to, like you mentioned, reclaim that joy and reclaim, you know, what we used to do and reclaim that optimism and just going forth and doing what's best for our students.

00:39:46.170 --> 00:39:53.166
And I know that that's the difficult part too many times because you've got directives that come from the top and they kind of move down.

00:39:53.166 --> 00:40:04.148
And teachers, we definitely love our autonomy, where we are the experts in the room, and as long as we cover our TEKS, it doesn't matter how we may be covering it.

00:40:05.302 --> 00:40:25.389
One of the best pieces of advice that I ever got, since I came in from business into education, not going through the traditional education route, was from one leader, one principal, who always said, look, Mr. Mendoza, I really don't care how you teach the standard, as long as you stay within this box.

00:40:25.389 --> 00:40:27.240
You know, it's almost like a sandbox.

00:40:27.240 --> 00:40:37.132
You can play around and do whatever you like and teach it however you like, as long as you cover what you need to cover. Just throw anything at them.

00:40:37.132 --> 00:40:42.512
If you want to use the Chromebooks, if you want to use Screencastify, if you want to use anything, just go for it.

00:40:42.512 --> 00:40:46.691
You know, and of course it's going to look different than what everybody else is doing.

00:40:46.691 --> 00:41:15.764
[No transcript] They enjoyed the learning, they enjoyed the projects.

00:41:15.764 --> 00:41:25.846
I myself enjoyed what I was doing, and, you know, sometimes it's just also having a great leader, somebody that understands that, and that can make a huge difference, for sure.

00:41:26.247 --> 00:41:26.927
That's a good comment.

00:41:26.927 --> 00:41:28.291
You're fortunate to have great leaders.

00:41:29.139 --> 00:41:29.842
Excellent, all right.

00:41:29.842 --> 00:41:37.769
Well, Becky, before we wrap up, this is for our new listeners, or maybe even current listeners that haven't connected with you yet.

00:41:37.769 --> 00:41:48.074
All right, can you please tell us how they might be able to connect with you on socials, and also how and where they may be able to find your book AI Optimism.

00:41:48.875 --> 00:41:54.704
Yeah, so everything comes off of BeckyKeene.com, if they want to go there.

00:41:54.704 --> 00:41:56.530
It's linked to all my socials.

00:41:56.530 --> 00:42:00.271
You can grab me on TikTok or Insta and see the TikTok walks daily.

00:42:00.271 --> 00:42:14.403
There's a book section of the website that has both my books there, including a free study guide and a couple of other downloadables, and I will also call out that book study groups at schools can book free virtual author talks with me.

00:42:14.403 --> 00:42:18.313
So that's something I offer as well and that's all through my website.

00:42:18.313 --> 00:42:20.706
So the best way to get me is BeckyKeene.com.

00:42:21.367 --> 00:42:25.242
Excellent, and we'll make sure we link all of that in the show notes as well.

00:42:25.242 --> 00:42:44.052
So, please, guys, I definitely recommend this. It's a great resource for you to bring into your district, and, like you just heard from Becky, free virtual author talks for book studies at your district. I think this could be something wonderful to bring to that team.

00:42:44.052 --> 00:42:56.146
Again, I would suggest, like Kyler said, if you can have 50% or more end users in there along with your leadership, I think that would be a great balance to get some great conversations going.

00:42:56.146 --> 00:43:06.085
But, becky, thank you so much for being on the show today, but before we wrap up, we always love to end the show with these last three questions, so hopefully you are ready for those.

00:43:06.085 --> 00:43:13.610
So, as we know, every superhero has a pain point or a weakness, and for Superman, kryptonite was his weakness.

00:43:13.610 --> 00:43:21.566
So I want to ask you in the current state of education, what would you say is your current edu kryptonite?

00:43:22.487 --> 00:43:35.045
I hope I understand the question properly, but I would say the word magic to describe AI just kills me, and there's a couple of companies who do it.

00:43:35.045 --> 00:43:38.213
And I don't mean it like at a personal level.

00:43:38.213 --> 00:43:47.074
I just really, really struggle with having educators feel like it's magic. Because, anyway, not to get on the soapbox,

00:43:47.074 --> 00:44:01.846
but I feel like it relinquishes their control over what's happening, and they have so much control through great prompting. And when we relegate it to magic, it feels mysterious, like we don't have an influence on what happens. So that bothers me.

00:44:02.447 --> 00:44:33.807
That is a great answer, and actually it really resembles an answer that I had from a previous guest, who also said, you know, that a lot of platforms just offer pre-built prompts that are already there, but if you actually show teachers the power of prompting, even if it's just ChatGPT, you may, I guess, get better results than what is there.

00:44:34.650 --> 00:44:36.606
Yeah, so those are some of the things.

00:44:36.606 --> 00:44:43.610
So yeah, that is a wonderful, wonderful response and, like I said, it really lines up with many of the guests that I've had here.

00:44:43.610 --> 00:44:49.931
All right, so question number two Becky, I want to ask you if you could have a billboard with anything on it.

00:44:49.931 --> 00:44:51.713
What would it be and why?

00:44:52.755 --> 00:44:55.863
It would say you can do anything but you can't do everything.

00:44:55.863 --> 00:45:07.264
And the why is because I have that sign hanging right here in my office, and it's a great reminder every day that just because it's possible doesn't mean we should.

00:45:07.264 --> 00:45:14.771
And being able to say no to things that don't fit my goals and my priorities.

00:45:14.771 --> 00:45:20.376
Also, understanding that if I want to go out and achieve something, I can, and I think everyone deserves that message.

00:45:21.135 --> 00:45:21.536
Excellent.

00:45:21.536 --> 00:45:22.938
That is a wonderful message.

00:45:22.938 --> 00:45:24.021
Thank you so much.

00:45:24.021 --> 00:45:27.625
Definitely very inspiring, and it couldn't have come at a better time for me as well.

00:45:27.625 --> 00:45:28.547
So, right now,

00:45:28.547 --> 00:45:32.221
I just smiled and I was like, oh, that hit and it's wonderful.

00:45:32.221 --> 00:45:33.704
Thank you so much for sharing that.

00:45:33.704 --> 00:45:44.221
And my last question for you, Becky, is if you could trade places with one person for a single day, it doesn't matter who may be who would that be and why?

00:45:45.543 --> 00:46:01.427
So, I really had to put some thought into this question, but I'm going to go with Jacinda Ardern, the New Zealand PM, because I am not into politics at all.

00:46:01.427 --> 00:46:04.530
If anyone knows me personally, I'm like I'm the most neutral.

00:46:04.530 --> 00:46:15.902
I really like to stay out of it, but I do think that there's probably so much going on that we have no idea about, right? Like, we'll blow up, angry about something that happened in the world, and

00:46:15.902 --> 00:46:28.601
we don't know 99 percent of the decisions that happened and why. And she's doing such an amazing job in a country that I've gotten to visit and love, and I have great, great colleague connections there.

00:46:28.601 --> 00:46:46.371
So, yeah, I would want to trade places and see what it's like to be in that role, to be, you know, inundated with very high-level decisions that have tons of context that nobody knows about, and to feel the pressure of that. It would just be interesting, and then I would enjoy getting out.

00:46:47.413 --> 00:46:48.414
Love it, Love it.

00:46:48.414 --> 00:46:49.545
Great answer, Becky.

00:46:49.545 --> 00:46:51.246
Well, Becky, thank you so much.

00:46:51.246 --> 00:46:53.307
It has been an honor and a pleasure to have you here.

00:46:53.307 --> 00:47:02.713
Thank you so much for sharing so many wonderful gems, great insight, and, you know, your book, your work, what you've been working on this whole time, AI Optimism.

00:47:02.713 --> 00:47:09.702
So please, to all our audience members, connect with Becky, make sure you get the book, and thank you so much,

00:47:09.744 --> 00:47:13.192
Becky. Yeah, I should probably have said that, but yeah, we'll definitely just get it.

00:47:13.231 --> 00:47:14.563
But hey, if we go to your website,

00:47:14.563 --> 00:47:15.706
We'll be able to find it.

00:47:15.706 --> 00:47:16.688
So we'll be good to go.

00:47:16.688 --> 00:47:19.626
But, Becky, thank you, I appreciate you being so gracious.

00:47:19.706 --> 00:47:30.592
It's been great to be connected with you for so many years and to see the wonderful work that you're doing, and now, you know, I get to have you here on my show and have a wonderful conversation with you.

00:47:30.592 --> 00:47:34.427
Thank you so much for sharing your knowledge with us and for all our audience members.

00:47:34.427 --> 00:47:44.706
Please make sure you visit our website at myedtechlife, where you can check out this amazing episode and the other 331 episodes now over a five year span.

00:47:44.706 --> 00:47:53.722
I promise you, guys, if you go back through our library or archives, you'll definitely find a little something just for you that you can sprinkle onto what you are already doing great.

00:47:53.722 --> 00:47:55.367
So please make sure you check that out.

00:47:55.367 --> 00:47:57.423
And I want to give a big shout out to our sponsors.

00:47:57.764 --> 00:48:10.969
Thank you so much, Book Creator, thank you so much, Eduaide, and thank you so much, Yellowdig, for believing in us and in our mission of bringing these wonderful conversations into our education space, so we can continue to grow and learn together.

00:48:10.969 --> 00:48:12.445
So thank you to you all.

00:48:12.445 --> 00:48:14.869
So make sure that you check them out as well.

00:48:14.869 --> 00:48:17.809
Their links are gonna be in the show notes as well.

00:48:17.809 --> 00:48:21.621
And, my friends, until next time, don't forget, stay techie.

00:48:21.621 --> 00:48:51.115
Thank you.
Becky Keene Profile Photo

Educator, Author, Speaker

Becky Keene is an educator, author, and speaker focused on innovative teaching and learning. She specializes in instructional coaching, game-based learning, and integrating AI into education to empower students as creators. She has developed esports programs for schools and explores immersive learning through games. Becky speaks globally on AI in education and has spent over 20 years designing professional learning experiences for teachers. A National Board Certified Teacher, ISTE Certified Educator, and Certified Instructional Coach, she spent 15 years teaching, coaching, and leading programs in public schools. She is the author of the book AI Optimism and holds an MS Ed in early literacy.