WEBVTT
00:00:30.396 --> 00:00:33.798
Hello everybody and welcome to another great episode of my EdTech Life.
00:00:33.798 --> 00:00:42.649
Thank you so much for joining us on this wonderful day and, wherever it is that you're joining us from around the world, thank you, as always, for all of your support.
00:00:42.649 --> 00:00:44.969
We appreciate all the likes, the shares, the follows.
00:00:44.969 --> 00:00:48.651
Thank you so much for engaging with our content and sharing our content.
00:00:48.651 --> 00:00:52.189
Thank you so much to all our new listeners as well.
00:00:52.189 --> 00:00:55.450
We definitely appreciate all of that love.
00:00:55.450 --> 00:00:56.865
Thank you, thank you, thank you.
00:00:56.865 --> 00:01:11.796
As you know, we do what we do for you to bring you some amazing conversations and amazing guests and, of course, as always, today is no different, because we have an amazing guest and an author that we're going to be talking about their new book release.
00:01:11.796 --> 00:01:15.868
So I'm really excited for you to meet Becky Keene.
00:01:15.868 --> 00:01:17.813
Becky, how are you doing today?
00:01:18.617 --> 00:01:20.724
I'm great, thanks. Happy Friday.
00:01:20.724 --> 00:01:21.707
We're recording on Friday.
00:01:22.230 --> 00:01:27.049
Yeah, we're recording on Friday, so hopefully you guys are having a wonderful, wonderful Friday as well.
00:01:27.049 --> 00:01:28.837
So thank you, Becky, for being here.
00:01:28.837 --> 00:01:30.646
I'm really excited to connect with you.
00:01:30.646 --> 00:01:46.034
I know I've been a longtime follower, and, scrolling through TikTok, I've just been loving the content that you're putting out with your TikTok walks, talking about AI, and just kind of everything leading up to your book, and, you know, so many great things there.
00:01:46.034 --> 00:01:54.004
So I'm really excited for all our listeners current listeners and any new listeners that are joining to get to connect with you.
00:01:54.004 --> 00:02:01.368
And before we dive into our great conversation, can you give us a little brief introduction and what your context is within the education space?
00:02:02.171 --> 00:02:02.933
Oh, absolutely.
00:02:02.933 --> 00:02:05.852
I am an educator first and foremost.
00:02:05.852 --> 00:02:10.348
I still identify as an educator, although I'm not in a classroom full-time anymore.
00:02:10.348 --> 00:02:13.362
I'm not in a district full-time, but that's my heart.
00:02:13.362 --> 00:02:19.520
I spent 15 years working for a large school system south of Seattle, Washington.
00:02:19.520 --> 00:02:35.876
I've taught everyone from little tiny eight-year-olds through adults, and I've spent the last 10 years specifically focused on adult professional learning in education and primarily ed tech, although my background is in literacy.
00:02:35.876 --> 00:02:37.811
So it's been a journey.
00:02:38.818 --> 00:02:39.060
Excellent.
00:02:39.060 --> 00:02:46.514
So now we're going from that literacy, now in the ed tech space, to, you know, AI literacy and AI optimism.
00:02:46.514 --> 00:02:51.920
Like I mentioned to our audience members, we're definitely going to be talking about your new release that we have right here.
00:02:51.920 --> 00:02:58.741
AI Optimism, a guide to... yeah, in my hand, and so I'm really excited for us to talk about this.
00:02:58.741 --> 00:03:11.955
So I want to ask you, Becky, just to kind of start off the conversation: what inspired you to write AI Optimism, and how do you personally define AI optimism within the education context?
00:03:13.300 --> 00:03:22.775
So I have been involved in the AI movement for the last few years, like many of us, and I started to notice some trends that really bothered me.
00:03:27.324 --> 00:03:33.704
I started to notice that over and over on the social media groups I was a part of, at conferences, I was seeing this trend toward one of two directions.
00:03:33.704 --> 00:03:37.733
One was oh no, the world is ending right, we're all going to die.
00:03:37.733 --> 00:03:41.004
The robots are taking over. Just a lot of fear-based decision-making.
00:03:41.004 --> 00:03:47.342
And the other was the oversimplification of this powerful tool we have in front of us.
00:03:47.342 --> 00:03:59.925
So ed tech companies, I'm not going to blame educators for this, but ed tech companies really focusing on these very, very low-level uses of AI, you know, just constantly pushing.
00:03:59.925 --> 00:04:10.634
You can make a worksheet, you can make a quiz, you can make a lesson plan and unfortunately, that is just really not where I hope this tool ends in education.
00:04:10.634 --> 00:04:25.225
So for me it was kind of more about I felt that I had something to say and I wanted to capture everything that I had, you know, kind of been swirling around in my head as I go on walks and walk my dog and do different things.
00:04:25.286 --> 00:04:34.081
I kind of had all these thoughts brewing and I finally thought, okay, I want to write something about that, and I had been familiar with the SAMR model for what?
00:04:34.081 --> 00:04:39.391
The last 20 years, and so I just started really thinking through.
00:04:39.391 --> 00:05:00.807
You know, it's okay to use AI at that S substitution level in many cases, and that's where most of us got started using AI, but let's not end the journey there, and so one little fun fact you might be interested in is I wrote the entire book before I had a title, because the title was not the point.
00:05:00.807 --> 00:05:08.802
The book was the point, and so, after everything was written, my editor and I had some conversations.
00:05:08.802 --> 00:05:15.382
I had a couple other close friend conversations and said, okay, what do I really want to call this?
00:05:15.382 --> 00:05:23.827
And that's when we came up with AI Optimism as the title, but it's the content that I wanted to talk about first.
00:05:24.430 --> 00:05:43.214
Love it, and, you know, there's a lot to unpack there in that statement, because I, along with you, obviously, you know, we have very similar circles, you know, in a lot of the chats and a lot of the social media platforms and so on.
00:05:43.235 --> 00:05:44.920
And so, you know, even as part of my dissertation study, I found exactly what you mentioned.
00:05:44.920 --> 00:05:45.944
Congratulations.
00:05:45.944 --> 00:05:48.107
By the way, we have to stop for just a minute.
00:05:48.107 --> 00:05:50.975
You just made a major announcement.
00:05:50.975 --> 00:05:52.848
I'm so incredibly happy for you.
00:05:52.848 --> 00:05:56.889
That is obviously a huge journey and congratulations.
00:05:57.350 --> 00:05:57.831
Thank you.
00:05:57.831 --> 00:05:59.524
Thank you, Becky, I really appreciate it.
00:05:59.524 --> 00:06:07.555
But it really lines up with what you said, because, as you said there, it was like a split into, like, two factions.
00:06:07.555 --> 00:06:16.579
You know the oh no, like you know let's resist this because of you know, just the unknown, the fear of the unknown, losing control.
00:06:16.579 --> 00:06:26.872
And one of the things that I've learned in doing PD, and especially when you're going to implement something, a lot of the pushback that you get it's not so much that you're adding something to the teacher.
00:06:26.872 --> 00:06:36.002
A lot of times it's really they feel like they're losing that control and so maybe for a lot of them it was that loss and fear of that control that they once had.
00:06:36.002 --> 00:06:44.230
And then, of course, you've got the other speedboat side where it's like, hey, let's do this, let's go, but, like you mentioned, at that very low level.
00:06:44.230 --> 00:06:59.968
And so I really love what you said about the SAMR model, because I know I had talked to you a little bit, asking you about this model and saying, because of the speed and how fast platforms are evolving and moving, is this a great model to still continue using?
00:06:59.968 --> 00:07:03.762
And I'm thinking, yes, it still is, because it doesn't over...
00:07:04.184 --> 00:07:26.721
I feel, and this is my opinion and that of other guests that I have interviewed that were part of the study, it was really, they're stating, the use of AI has been so low-level where it's just substitution, and I think you and I can agree that we've seen a lot of technology that has come through.
00:07:26.721 --> 00:07:27.725
And you know, okay, we've got Chromebooks now.
00:07:27.725 --> 00:07:31.220
Okay, so now instead of a physical worksheet, now I'm going to give it to you digitally but you're still annotating.
00:07:31.220 --> 00:07:44.564
So I was like, okay, so we're just substituting the physical for that digital and it just seems like we get stuck on that S and there's some teachers that are great at augmenting so they go to that next step.
00:07:44.564 --> 00:07:54.766
I recently had a guest on, Joe Christiansen, who pretty much mentioned, it's like we pretty much just stay stuck on the S and we never fully go into the R.
00:07:54.867 --> 00:08:03.189
So I love that you mentioned your ideas and thinking on how the ed tech companies are kind of selling it.
00:08:03.189 --> 00:08:05.853
Yes, we're going to help you do your lesson plans faster.
00:08:05.853 --> 00:08:11.952
We're going to help you answer or create emails faster, do these you know newsletters a lot faster.
00:08:11.952 --> 00:08:16.108
But how is that going to augment what you're doing?
00:08:16.108 --> 00:08:21.685
How is it going to modify, how is it going to, you know, reinvent or redefine education?
00:08:21.685 --> 00:08:42.404
So I want to ask you here, I know that you emphasize the SAMR model in your book, so tell us a little bit about how you see that we can take this to that really next level, or how ed tech companies can do better in getting us into the A, M and the R, the augmentation, modification and redefinition of the content.
00:08:43.847 --> 00:08:47.234
Well, that is totally what the book is about.
00:08:47.234 --> 00:09:05.147
So the book starts with the AI Optimism framework, which is three core principles, privacy, praxis and prompting, and really the whole beginning of the book talks about educators taking a step back from, you know, oh, I want to try an AI tool.
00:09:05.147 --> 00:09:09.471
That's my least favorite comment in the whole world.
00:09:09.471 --> 00:09:19.811
It's like, I want to try this tool, and as an instructional coach, someone with a decade of instructional coaching experience, I would always say, no, no, you know, we're not here to try a tool.
00:09:19.811 --> 00:09:24.072
We're here to do best practice and to think about pedagogy first.
00:09:24.072 --> 00:09:30.914
So the praxis part of that conversation is really that: think about what you want your students to achieve.
00:09:30.914 --> 00:09:35.649
What do you envision this learning experience looking like for your students?
00:09:35.649 --> 00:09:40.231
Do you envision they all sit down and fill out a worksheet, in which case we can end the conversation?
00:09:40.231 --> 00:09:51.754
But if you envision them being engaged and empowered and excited about what they're doing, then that's something that we have to set aside and then consider tools that keep student data safe.
00:09:51.754 --> 00:09:53.245
That's the privacy component.
00:09:53.245 --> 00:09:59.253
And also, how am I using the tools available to really prompt richly?
00:09:59.253 --> 00:10:01.783
And we see a lot of these.
00:10:01.783 --> 00:10:21.643
I call them easy button tools that are helpful to get started, absolutely, but they shouldn't be where we end with AI, because all they're doing is churning out predictive, pattern-based you know very didactic content in many cases when we could be using our prompting skills to do more.
00:10:22.163 --> 00:10:37.133
So it's starting with those three core principles in mind and then assessing and I have a wheel framework and there are six categories around the edge of the wheel and then assessing what do we want to do, and it's probably one of those six categories.
00:10:37.679 --> 00:10:43.273
I'm trying to design something, or create something, or support my learners, or analyze data, right.
00:10:43.312 --> 00:10:55.903
So it's thinking about the task next and then having a little self-check on how much do I want to invite AI to be my assistant for this task, for my pedagogical purpose.
00:10:55.903 --> 00:11:10.869
So AI Optimism is this decision-making process that starts with what I want to achieve, in the middle is what I'm trying to do, and at the end is the tool that's going to support me and my students the best.
00:11:10.869 --> 00:11:32.729
And so, to answer your question, with the SAMR model in mind, I then have a choice: do I want to use a tool that meets my needs, that sits at a substitution level, or do I want to use that incremental innovation process to move forward into the A, M and R, and that might change what my entire assessment looks like.
00:11:32.729 --> 00:11:50.707
That's the point of the R, right? Redefinition, things that weren't previously possible. But it might also just be some small tweaks along the way that help me build more student voice, more engagement, more autonomy, more productivity, more creativity, more students as creators.
00:11:50.707 --> 00:11:57.129
You know whatever I'm trying to achieve, so for me it's about that decision along the way.
00:11:57.831 --> 00:11:58.192
Excellent.
00:11:58.192 --> 00:12:03.062
I really love that, you know, and I think that's something that a lot of teachers do have in mind.
00:12:03.062 --> 00:12:19.135
But I think sometimes it's just the excitement of the tool, and working as a digital learning coordinator for many years, we bring a platform in and then, all of a sudden, the hype is gone.
00:12:19.135 --> 00:12:36.192
But then what I notice is they never, or we never, really go in a lot deeper than that superficial level that got them all excited, and it's like, well, okay, we already used it, and the glitz and the glamour and the shine is gone.
00:12:36.192 --> 00:12:41.692
And now it's like, hey, well, I want to use that app because it just has like one additional component to it.
00:12:41.692 --> 00:12:42.341
But wait a minute.
00:12:42.341 --> 00:12:46.302
I mean, this does the exact same thing, but have we even gone deeper into it?
00:12:46.743 --> 00:12:55.547
And I think that we've gotten into that trend where we just hop from app to app to app to app and we really never dive in deep.
00:12:55.547 --> 00:13:22.182
And so I love what you mentioned, too, about, you know, it's okay to slow down, but see where it is that you want to take your students and see how you may get them there in a balanced approach with, you know, even no tech or little tech, to that final product or that tech platform that's really going to help you modify and redefine and do something that was once inconceivable.
00:13:22.182 --> 00:13:25.687
You know, to do that, you know, and I know they mention that in the SAMR model.
00:13:25.687 --> 00:13:34.827
So I think that's something that, as educators, we really need to just kind of slow down and say it's okay, like let's look at the goals first.
00:13:34.827 --> 00:13:42.447
And like they always say, you know, it's like a task before apps, so let's see what is the task, what is that final outcome, what's the process and what is the final product.
00:13:42.447 --> 00:13:47.193
And what's the best tool to get us there?
00:13:47.313 --> 00:14:05.384
So I really like that, and so I want to kind of talk a little bit and continue that conversation, because in the redefinition chapter you talk about AI enabling tasks that were once inconceivable, things that we didn't think about first, you know, as opposed to just substituting.
00:14:05.384 --> 00:14:11.010
But how else might we take that app to that next level to really enhance the learning experience?
00:14:11.010 --> 00:14:35.476
So how do you balance the excitement of doing the redefinition and doing something inconceivable, with the excitement and the possibilities of students, you know, doing something that they never thought they could, but also balance that with over-reliance on, maybe, a particular tool for the problem-solving aspect of it?
00:14:36.779 --> 00:14:37.061
Right.
00:14:37.061 --> 00:14:51.197
So that really comes down to the question that I talk about in the very beginning of the book, which is helping students understand how to answer the question does this use of AI limit my learning?
00:14:51.197 --> 00:14:59.438
And I know it probably seems really like you know, pie in the sky, like there's no way our kids are actually going to do this.
00:14:59.438 --> 00:15:24.909
But I think these red-light, green-light models that we've adapted to the use of AI, where we're kind of telling the students, okay, you can use it for this purpose for this assignment, but you can't use it for this purpose for this assignment, I feel like at some point we have to start helping them make critical-thinking decisions about the way I'm trying to use AI right now.
00:15:25.537 --> 00:15:27.724
Is that limiting me or supporting me?
00:15:27.724 --> 00:15:32.163
Is it helping me design something that would be previously impossible?
00:15:32.163 --> 00:15:40.128
Because maybe I don't know how to code and so I'm not going to learn that for this assignment, but I'm super excited to create a website without having to learn coding.
00:15:40.128 --> 00:15:43.945
Or is the point of the assignment to learn coding?
00:15:43.945 --> 00:15:51.524
And now I'm skipping past all of that really important knowledge that I'm going to need for my career, my goals, my passions, my hobbies.
00:15:51.524 --> 00:15:57.388
So I feel like helping students understand that question, does this use of AI limit my learning,
00:15:57.388 --> 00:16:06.658
is going to set them up for success beyond the walls of a classroom, where they have a poster on the wall telling them exactly how and when they're supposed to be using it.
00:16:39.016 --> 00:17:00.155
Excellent, I love that and you know that's such an important question and I think oftentimes we don't give our students enough credit as educators and immediately you know a lot of the reactions from November 2022, and even still to this day it's like, oh, they're just going to cheat, they're just going to go ahead and do this, and so on and so forth, and you know, but that's our assumption.
00:17:00.496 --> 00:17:11.038
You know, yes, there will be some students, and, just like we know, there's always, you know, that handful of students that will try and do as little as possible to just get what they need.
00:17:11.097 --> 00:17:28.424
But it's because it's like, hey, what is it that we need to get an A, you know, and they're just going to give you the minimum, maybe even just a C, because it's like they're not as engaged, they're not, you know, really just getting that attention that they need to enhance that learning experience.
00:17:28.625 --> 00:17:38.484
So I really like that you talk about that and that we can possibly now, you know, with the information that is out there, what can we do better to help our students?
00:17:38.484 --> 00:17:40.913
And I know that that starts with the teachers first.
00:17:40.913 --> 00:17:51.451
So, Becky, in your experience, since I know that you do a lot of PD and you're, you know, pretty much everywhere nationwide, doing a lot of trainings for teachers.
00:17:51.451 --> 00:17:51.750
What are some...
00:17:51.750 --> 00:18:16.238
Maybe, if you can give us two tips, for anybody in a leadership role in a school that provides PD, what can we do to help our teachers maybe not think just of, like, plagiarism, just the negative side, and maybe reframe their thinking to seeing more of that potential, and then being able to translate that to students?
00:18:18.086 --> 00:18:19.754
That's what AI Optimism is all about.
00:18:19.754 --> 00:18:24.949
Right? It's acknowledging the challenges and then still being willing to press forward.
00:18:24.949 --> 00:18:26.790
So two tips.
00:18:26.790 --> 00:18:38.211
One tip is, for over 20 years I've been talking about incremental innovation versus disruptive innovation.
00:18:38.552 --> 00:18:44.913
It sounds great in a business model, but it's not sticky, and it drives a lot of fear.
00:18:44.913 --> 00:18:48.804
It feels like a push-in rather than this,
00:18:48.804 --> 00:18:50.992
you know, organic, all-in-this-together mentality.
00:18:50.992 --> 00:19:00.904
So I don't love when school systems come in and they're like, you know, rip the Band-Aid off, we're all doing this thing, we're going to provide a bunch of PD, it's going to change everything.
00:19:00.904 --> 00:19:05.273
No one likes that and it's generally not very effective.
00:19:05.273 --> 00:19:08.309
I've seen those things come and go over time.
00:19:08.349 --> 00:19:35.356
So I really, really recommend that school leaders and instructional coaches suggest that incremental innovation, where we're going to honor where people are now, because the reality is most of our educators nationwide in the United States are doing great work and they're doing the best they can, and so, you know, we don't have a lot of people out there just phoning it in. So honor that, and then talk to them about that.
00:19:35.504 --> 00:20:21.227
This is part two: solving their problems, because every educator has things they would love to fix, right? Parent communication, or student engagement, or the workload of scoring, you know, AP and IB sample tests. Like, there's something that they're looking to achieve that is going to improve their life and therefore the lives of their students, because they've now freed up some cognitive space to do more of what they love, right? And so being in that problem-solver mindset again, instead of recommending an app, like, hey, you know, we're all going to use fill-in-the-blank, I don't want to, like, throw an app under the bus, but a lot of times those implementations don't fit the needs of all of our educators and all of our classrooms.
00:20:21.227 --> 00:20:41.740
So, rather, approaching it with a problem-solving mindset of what do you want to achieve, and how can I support you getting there, and then going on that journey to figure out what's the best fit, I think, is absolutely a more sticky approach that's going to honor the expertise in the room and deliver something that actually shows.
00:20:44.224 --> 00:20:44.404
Excellent.
00:20:44.404 --> 00:20:44.786
I really love that.
00:20:44.786 --> 00:20:56.194
I recently had a conversation with Kyler Cheatham, who is somebody else that I found on TikTok and she works with organizations, but she also works with school districts and she really just echoes everything that you just said right now.
00:20:56.194 --> 00:21:26.512
It was just, you know, she mentioned making sure that everybody that needs to be in the room is in the room, and stating that if 50% of your room is not filled with end users, then you need to kind of, you know, make that happen, to make sure that there is that buy-in, and understanding that it's not, like you mentioned, trying to put a band-aid, maybe even over a band-aid, you know, and then over another band-aid, and so on.
00:21:26.512 --> 00:22:04.779
So it's just making sure that you're finding those solutions and being very active in the problem-solving process. And I think that is wonderful, that at the very top, you know, superintendents, CTOs, curriculum directors, they all need to be in that room, because oftentimes those decisions are left to one person, and they're based on maybe one teacher who came back from a conference, who's very excited and says, I need you to open this up, or I need you to purchase this. But, like you mentioned, understanding that that tool may not be for everybody, and it may not be for every teacher, or maybe even every learner.
00:22:04.779 --> 00:22:12.048
So, going back to, let's look at the root of the problem.
00:22:12.048 --> 00:22:16.323
What is it that we're trying to solve? And then make the best decision there for the whole body in your district.
00:22:16.323 --> 00:22:17.686
So I really like that a lot.
00:22:17.686 --> 00:22:27.363
That's just some really great tips, and obviously the experience that you have speaking in so many places, even from K-12 to higher ed. I definitely love that.
00:22:27.383 --> 00:22:30.153
So again, guys, the book is AI Optimism.
00:22:30.153 --> 00:22:32.173
Make sure that you check that out.
00:22:32.173 --> 00:22:36.605
But, becky, as we continue our conversation, I know recently I saw a post.
00:22:36.605 --> 00:22:39.294
I believe you were at UT Austin.
00:22:39.354 --> 00:22:43.750
I believe. Were you working... Yesterday. Got in a little bit after midnight, okay, it was yesterday.
00:22:44.225 --> 00:22:44.547
So is that?
00:22:44.547 --> 00:22:46.032
Were you working with higher ed there?
00:22:47.004 --> 00:22:48.191
Yes, we've been working.
00:22:48.191 --> 00:22:54.224
So the company I work for, i2e, we've been supporting several higher ed institutions.
00:22:54.224 --> 00:23:05.381
We've spent a lot of time at the University of Texas, both at Austin and the UT System, so Dallas, Houston and so on, supporting their AI implementation.
00:23:05.381 --> 00:23:30.292
So it's interesting, because working with higher education, we're not really working with faculty, we're working with operations, business operations, most of the time right now, because we're working with people who are like, this is going to save me hours a day on my job, you know, those automations. And so it's not so much about instruction, and that's been actually really fun, a good growing experience for me.
00:23:31.055 --> 00:23:31.355
Excellent.
00:23:31.355 --> 00:23:33.313
Well, and that's great that I asked that.
00:23:33.313 --> 00:23:49.690
I really thought it was, you know, with higher ed faculty, but now that you mentioned this, that it's more of the operations, you know, I think, like, K-12 and higher ed, we look at things obviously through that education lens, but then you also see on the outside.
00:23:49.690 --> 00:23:54.298
Now you know that productivity and you're working with operations and so on.
00:23:54.298 --> 00:23:56.608
So I want to ask you you know what?
00:23:56.608 --> 00:23:59.354
What are some of the things that you get?
00:23:59.354 --> 00:24:10.067
Let's say, a little bit of fight-back, or, you know, just a little bit of pushback. Okay, on the education side, what do they kind of fight back on?
00:24:10.067 --> 00:24:14.475
And in the operation side, what might be some things that they fight back on?
00:24:14.475 --> 00:24:19.076
Just to kind of get a comparison of the education space and the operations space.
00:24:20.766 --> 00:24:26.458
So, educators, I feel like push back a little bit on tools that don't really add value.
00:24:26.458 --> 00:24:50.093
So, for example, I was walking through a conference you know, you and I both go to the ed tech conferences and one of the vendor booths had a nice big sign advertising that they were so excited about changing, like a YouTube video into a PowerPoint with AI so that you could share that with your students.
00:24:50.093 --> 00:24:56.378
And I kind of just wanted to be honest, like, and that's your use case?
00:24:56.378 --> 00:25:20.672
So, you know, I just walked through that mentality of, wait a minute, we're taking a captioned, dynamic, personalized, right, I can pause, I can rewind, I can rewatch, instructional piece, and we're using the best tools we have on the planet to make it more didactic, more centrally focused, less accessible, less engaging.
00:25:20.751 --> 00:25:21.513
Like, what are we doing?
00:25:21.513 --> 00:25:31.746
We're going backwards and I think that educators do notice when a tool comes out and it's like this is the opposite of what we want to be doing in education.
00:25:31.746 --> 00:25:38.767
So there's that pushback of is your tool actually adding value or are you just trying to get me to use it?
00:25:38.767 --> 00:25:41.273
You know, I like that skepticism.
00:25:41.273 --> 00:25:44.267
Part of AI optimism is going here's what's possible.
00:25:45.470 --> 00:26:04.855
That's where I want to go. Like, don't limit me by, again, these easy-button options that are just stuffing me full of more content to push out to my students. And so I love that approach.
00:26:04.855 --> 00:26:10.658
So we're working with medical centers, legal teams, real estate.
00:26:10.658 --> 00:26:17.042
They're handling contracts, they're handling donor information, right, alumni relations.
00:26:17.042 --> 00:27:09.086
There's highly sensitive information flowing through those AI models, and so for them.
00:27:09.086 --> 00:27:12.832
Honestly, we cover it out of the gate, like in the first five minutes at every session.
00:27:12.832 --> 00:27:16.576
Here's why this is data secure and here's why that matters for you.
00:27:16.576 --> 00:27:32.218
And yes, you can use HIPAA-compliant data, or, this is HIPAA compliant, those types of things, if they're using the right tools, because that's incredibly important, and nobody wants their research in a data leak.
00:27:33.906 --> 00:27:34.911
That's very interesting.
00:27:34.911 --> 00:27:46.327
And going back to what you said, you know, one of the things that shocked me at ISTE, as I was walking through the conference, is I was approached by an app vendor, and, you know, kind of like...
00:27:46.327 --> 00:27:49.316
I was like, is this for real?
00:27:49.316 --> 00:27:53.757
So they kind of talked to me and they said yeah, you know, have you heard of us?
00:27:53.757 --> 00:27:57.676
And I was like no, like, maybe like on social media.
00:27:57.676 --> 00:28:06.772
And then, so, they're really giving me, you know, their spiel and everything, and they're saying, you know, this application is for teachers, and this is what it does.
00:28:06.772 --> 00:28:08.091
And this is what they said.
00:28:08.091 --> 00:28:17.329
They're like, you see those apps over there? Those apps help the teacher before the lesson.
00:28:17.329 --> 00:28:20.378
These apps over here help the teacher after the lesson, but we help them during the lesson.
00:28:20.378 --> 00:28:24.393
And I was like, okay, I'm intrigued, my ears perked up. Yeah, yeah, yeah.
00:28:24.413 --> 00:28:26.339
So I was like, okay, so how does that happen?
00:28:26.339 --> 00:28:33.352
He's like, well, you download an application, and then the teacher puts on their...
00:28:33.352 --> 00:28:39.221
I guess their expectation was the teacher was going to hang their phone here and was going to record while they teach.
00:28:39.221 --> 00:28:42.365
And I was like, okay, interesting.
00:28:42.365 --> 00:28:44.370
And then I said, well, what about?
00:28:44.370 --> 00:28:47.478
You know, students speaking at that time?
00:28:47.478 --> 00:28:52.297
You know how is it going to protect their privacy when names are being called out or things of that sort?
00:28:52.297 --> 00:29:02.829
And the voices? And they're like, oh, uh, well, the teacher wears it really close here, and so it wouldn't hear, like, the student voices, because they're supposed to be wearing it close here.
00:29:02.850 --> 00:29:08.994
And I said, yeah, but they're still mentioning, you know, there's still going to be some, uh, identifiable information bits there.
00:29:08.994 --> 00:29:17.257
And they're like they kind of just stayed quiet and then after that they're like, well, uh, thank you, and that was it, and I walked out and it was very scary, you know.
00:29:17.257 --> 00:29:23.697
But, like you mentioned, there are some things where I feel like wait a minute, like we've already done that and we've already done that.
00:29:23.697 --> 00:29:31.748
And, yes, the optimism is there, becky, but sometimes and I don't know if you feel that way, but I guess maybe the feeling for me was like we could be doing so much more.
00:29:31.748 --> 00:29:37.554
But, like you mentioned, sometimes I feel like are we going backwards?
00:29:38.295 --> 00:29:44.622
Right, right, like, did we forget about data privacy when we launched some of these tools?
00:29:44.622 --> 00:29:50.115
Or did we forget about, like, best practice?
00:29:50.115 --> 00:29:50.998
Yeah, I totally agree.
00:29:50.998 --> 00:29:56.404
I'm glad you spoke up and I'm glad that you had that lens of like hang on, really.
00:29:56.404 --> 00:30:18.555
Yeah, I had a similar conversation with an app when I was at ASU+GSV, and they said, oh, what's the number one concern you hear about when you're working with educators? And I said data privacy. And I was talking to, like, the COO or something, and he said, gosh, you're the first person to say that.
00:30:18.555 --> 00:30:19.879
And I said who are you talking to?
00:30:22.410 --> 00:30:23.312
oh my goodness.
00:30:23.653 --> 00:30:28.149
I'm glad that there are people making that known, for sure.
00:30:28.209 --> 00:30:41.437
And then for me that's the biggest thing too. But also just the fact that, you know, you're expecting an educator to hang a phone around them and put it so close right here.
00:30:41.437 --> 00:30:53.457
So it's just picking up, and I get it, you know, it's the voice note thing, and then afterwards, you know, it's going to transcribe and maybe offer suggestions. At least that's what it was saying, that's what I took from it.
00:30:53.457 --> 00:30:59.106
But I was like, oh my gosh, I don't know about that, you know, but we'll see, but anyway.
00:30:59.106 --> 00:31:21.474
So, becky, just as we kind of round out this conversation and maybe as we start kind of wrapping up, I do want to ask you your thoughts on this, because I know that this is something that always there was a lot of guests and a lot of authors also that came on, and even from both sides is the use of AI.
00:31:21.474 --> 00:31:30.135
If you are a school that is very well funded, well, good for you, you can provide students those same tools, because, I mean, the cost of it.
00:31:30.156 --> 00:31:52.095
You know, the talk is it's always going to create more of that divide, not only in the learning aspect, but obviously in equity of access as well.
00:31:52.095 --> 00:31:57.471
So can you tell us, in your experience, what might we be able to do?
00:31:57.471 --> 00:32:03.048
Well, maybe not as educators, but I guess maybe as platforms.
00:32:03.048 --> 00:32:10.608
What might be some advice for platforms to say, hey, you know what, we see what you're doing, we don't want that divide.
00:32:10.608 --> 00:32:13.281
What can we do to provide equal access?
00:32:14.702 --> 00:32:28.954
I 100% agree with you that I feel AI has created a larger equity gap right now in many scenarios and it's heartbreaking because AI is the one tool recently that could be an equalizer.
00:32:30.880 --> 00:32:43.173
So my advice to edtech companies is, as much as we might hate the freemium model, I think it makes sense to offer a lot of what is student-facing.
00:32:43.254 --> 00:32:51.990
So like a SchoolAI or, you know, a ChatGPT, Gemini, Copilot, whatever the school is using, a scenario where the kids are engaging.
00:32:51.990 --> 00:33:05.267
Canva is another example where the kids can engage with AI. Offering that freely available, right, and those are safe-for-education tools that, you know, we know have good data privacy agreements.
00:33:05.267 --> 00:33:15.807
So for the schools to be able to at least have an entry point right, if one teacher or the building itself is like we want to do this, then they can get set up and get started.
00:33:15.807 --> 00:33:45.230
The premium features, I get it, the people that are running these companies have jobs, and they have, you know, operating expenses and AI costs. So, yes, keep a paid model if the district wants analytics, for example, or, you know, a dashboard or these deep-dive tools that give high-level oversight into what's happening in the system. But we don't compromise privacy and we don't compromise access.
00:33:45.230 --> 00:33:49.027
I love that.
00:33:49.929 --> 00:34:13.608
No, that is wonderful. And, you know, the reason that I asked that question, and it was one that actually wasn't really planned, is this morning again I woke up to LinkedIn and saw a post that somebody just openly posted that said, hey, you know, we wanted to get a quote for this specific platform, and they said it was for 150 students and 50 faculty: $38,000.
00:34:13.608 --> 00:34:15.632
I was like, $38,000?
00:34:15.632 --> 00:34:19.965
And so I'm thinking to myself, what all is involved in that?
00:34:19.965 --> 00:34:23.291
It's 200 people using this and it's $38,000.
00:34:23.291 --> 00:34:30.885
And so my thought process, and also just going by the comments, is it based on, of course, usage?
00:34:30.885 --> 00:34:33.117
I mean, are you going to get those users that are going in there?