WEBVTT
00:00:30.155 --> 00:00:33.557
Hello everybody and welcome to another great episode of my EdTech Life.
00:00:33.557 --> 00:00:43.109
Thank you so much for joining us on this wonderful day, wherever in the world you're joining us from. As always, thank you so much for all the likes, the shares, the follows.
00:00:43.109 --> 00:00:49.609
Thank you so much for interacting with our content, for all the wonderful feedback and to all our new followers, welcome.
00:00:49.609 --> 00:00:56.151
Thank you so much for all of your support, and I definitely want to give a big shout out to our newest sponsor, Book Creator.
00:00:56.151 --> 00:01:07.281
Thank you, Book Creator, for this awesome mug, and thank you so much for believing in our mission and bringing amazing conversations here into our education landscape so we can all continue to grow.
00:01:07.281 --> 00:01:13.483
So thank you, thank you, thank you, and today I am really excited to welcome two amazing gentlemen.
00:01:13.825 --> 00:01:19.260
These two gentlemen are people that I follow on LinkedIn and put out some amazing content.
00:01:19.260 --> 00:01:40.760
If you have questions about integrating AI and AI literacy and literacy and AI and so many other things involving literacy, these two gentlemen will definitely have some answers for you in all of their posts, or you can even reach out to them or pose a question to them on their LinkedIn comments, and I promise you that they are so wonderful at answering those questions.
00:01:40.760 --> 00:01:53.385
So I would love to welcome to the show, all the way from Australia on a Saturday morning, Mr Paul Matthews, and I would love to welcome, here from the US on a Friday afternoon, Mr Jason Guglia.
00:01:53.385 --> 00:01:57.554
So, Paul, how are you doing this morning there in Australia?
00:01:58.700 --> 00:02:01.203
I'm doing so well, fonz, I'm doing so well.
00:02:01.203 --> 00:02:04.688
We were just chatting off air, but I've had a great week in the classroom this week.
00:02:04.688 --> 00:02:08.173
It's now 5:09 AM my time.
00:02:08.173 --> 00:02:09.253
It's Saturday morning.
00:02:09.253 --> 00:02:11.061
I absolutely love this time of the day.
00:02:11.061 --> 00:02:15.581
I feel like my creative powers are exactly where they need to be, so it's good to be with you both.
00:02:16.263 --> 00:02:16.745
Excellent.
00:02:16.745 --> 00:02:19.512
And Jason, how are you doing this afternoon here in the States?
00:02:20.501 --> 00:02:20.861
I'm good.
00:02:20.861 --> 00:02:24.995
It's so fun and interesting to be on like different days.
00:02:24.995 --> 00:02:31.483
If you told me that I would be doing this like five years ago and talking to people all over the world, it would like blow my mind and I wouldn't believe you.
00:02:31.483 --> 00:02:35.131
So I feel like I'm in a very different part of my day.
00:02:35.131 --> 00:02:41.467
For me it's right after two, and I just went through like three or four hours of meetings.
00:02:41.467 --> 00:02:58.070
So I just feel like it's not quite the end of the day, but it's a point where the work has sort of been done. And I'm sure you both know this, or have had this experience: in department meetings AI always comes up, and so I'm not allowed to just sit there and vegetate.
00:02:58.070 --> 00:03:00.087
I always talk.
00:03:00.087 --> 00:03:02.420
So that's where I am in my day, but I'm doing well.
00:03:02.420 --> 00:03:03.262
I'm tired.
00:03:04.102 --> 00:03:04.603
Excellent.
00:03:04.603 --> 00:03:13.149
Well, thank you both for being here today, on this wonderful day, and I'm really excited to talk about, and we're going to be focusing mainly on, your book, you know.
00:03:13.149 --> 00:03:27.117
So I'm really excited, and the book, of course, we see it here in the image where Paul has the book there: Artificial Intelligence, Real Literacy: A Practical Guide to Using AI for 10 Evidence-Based Literacy Practices in Education.
00:03:27.117 --> 00:03:36.995
So, as we know, 2022 happened and I don't want to go that far back, but that kind of changed a lot of education, a lot of practices in education.
00:03:36.995 --> 00:03:42.050
But today we're going to be focusing on that literacy aspect and component of it, which is
00:03:42.050 --> 00:03:48.486
Great because, like I mentioned, you two are two people that I love to follow, especially when it deals with this topic.
00:03:48.848 --> 00:03:52.481
So I'm going to go ahead and start with you, Jason, at this moment.
00:03:52.481 --> 00:04:00.990
We know that the book opens up with a great anecdote, you know, of using AI in the classroom to teach students about reading.
00:04:00.990 --> 00:04:07.870
So I want to start with you here, because it's very interesting that the book title is called Artificial Intelligence Real Literacy.
00:04:10.896 --> 00:04:17.293
So I want to ask you, you know, what is it that we need in education right now, and what is the purpose of this book, and how can it help teachers?
00:04:18.680 --> 00:04:32.737
Yeah, so the real idea here in many ways is that we have this mountain of evidence. So I teach college and I focus on reading and writing, and there's a lot of hesitancy there.
00:04:32.737 --> 00:04:42.475
If you wanted to move into a space that is so divided in terms of using AI, not using AI, it is the English department.
00:04:42.475 --> 00:05:05.247
So literally in my department there's myself, and I've incorporated it into many parts of my curriculum; we have several faculty members who completely ban it, and they would tell their students, if you use AI, I will find you, right, like there is that sort of a language to it; and then many people in the middle. And that is very much representative, I think, of English departments, especially at the college level.
00:05:05.247 --> 00:05:40.146
There's all of this division, all of this hesitancy, and one of the things that I'm constantly reminding people of, and this in many ways is the idea behind the book, is that we have this mountain of evidence for what works, for what actually works with teaching. And one of the things that I truly believe is that AI has changed a lot of things about society, about the way we interact, but it hasn't really changed the principles of good teaching: what makes teaching engaging, what helps us learn how to read and write and really focus on the value of literacy.
00:05:40.146 --> 00:05:42.091
Those are pretty constant.
00:05:42.132 --> 00:05:51.151
We have all this evidence that really, really we should have been sticking to a long time ago, and especially at my level, at the college level, many of us do not do that.
00:05:51.151 --> 00:05:55.399
Many of us hadn't been sticking to those evidence-based practices.
00:05:55.399 --> 00:05:59.428
There was a lot of lecturing, there was a lot of one-sided conversations.
00:05:59.428 --> 00:06:02.269
A professor tells a student what to do and they do it.
00:06:02.269 --> 00:06:16.709
That sort of transactional approach, which we've known for a really, really long time, doesn't work, and so one of the best uses of AI, in my opinion, in our opinion, is actually sticking to the evidence.
00:06:16.709 --> 00:06:27.062
So if we have certain strategies that help students read and write, now we can stick to them in a way that's easier, right or faster, actually saves us time, and we can stick to it.
00:06:27.182 --> 00:06:32.403
This is something that obviously we focus on it with literacy, but I do think it's broadly applicable.
00:06:32.403 --> 00:06:48.225
I do think that people can approach it from other fields and say well, this is what we know about teaching and learning and this is how I can use AI to augment what I'm already doing, because maybe I wasn't doing things as much as I could have.
00:06:48.225 --> 00:06:52.252
So one of the examples that I often give is making learning accessible.
00:06:52.252 --> 00:07:00.769
Many of us have been going for the last 10 years to these trainings and learning about how to create accessible materials and everything.
00:07:01.220 --> 00:07:02.564
A lot of us weren't doing it.
00:07:02.564 --> 00:07:06.754
We just weren't, there wasn't time.
00:07:06.754 --> 00:07:18.084
And so now one of the ways to use AI is to make sure that we are creating accessible learning, or we are having these activities that we know work, and that, in many ways, is the kind of impetus behind the book, and that's something that I think.
00:07:18.084 --> 00:07:26.420
Obviously we focus on it with literacy, but it applies to other fields and it applies regardless of what you teach. Excellent.
00:07:26.701 --> 00:07:31.533
Well, paul, now on to you Very similar question that I'm going to ask you.
00:07:31.533 --> 00:07:36.172
But I want to ask you because, again, it's such a very interesting title and I'm still with that.
00:07:36.172 --> 00:07:39.333
But I love the way that Jason hit on a couple of things there.
00:07:39.333 --> 00:07:56.528
As far as still sticking to that evidence-based practice and using the AI to augment: I was recently at a conference in Puerto Rico, and they were talking about the SAMR model, and that augmentation piece, and a lot of people just not really ever getting there.
00:07:56.528 --> 00:08:08.644
It's usually the substitution aspect of it and it's really like going from paper to a Chromebook or a screen but never really being able to take that next step and what that looks like.
00:08:08.644 --> 00:08:17.081
But now I want to ask you you know, what does this juxtaposition mean to you personally and how does it reflect in your own practice and in your vision of education?
00:08:18.725 --> 00:08:28.790
That's a really good question, and it sort of harkens back to one of the big philosophies that undergird my AI practice, Fonz, and it's this: I'm not a big tech guy.
00:08:28.790 --> 00:08:31.742
Unlike yourself, I don't get that excited about technology.
00:08:31.742 --> 00:08:40.347
What I get really, really excited about is my students learning, my students growing, my students being formed in the way that I want as an educator.
00:08:40.347 --> 00:08:46.587
That's why I wake up early and stay up late, that's why I work as hard as I can in this field, because I want my students to grow.
00:08:46.587 --> 00:08:58.009
And so then, when it comes to artificial intelligence, I'm really excited about it, not because of what it is, not because it's a specific kind of technology, but because of what it can do, because of the impact it can have.
00:08:58.451 --> 00:09:15.568
And the impact it can have in my classroom is that it helps me do more evidence-based practices more often for more learners, and that's actually a vision that a lot of educators can get behind, because most educators out there they don't get really excited about technology.
00:09:15.568 --> 00:09:31.533
They haven't got the smart fridges, they haven't got those vacuums that are robots and vacuum your house by themselves, but what they do have is a burning passion to see their students grow and learn and attain this sort of knowledge that we want them to have.
00:09:31.533 --> 00:09:39.323
In a lot of classrooms that's not happening, and so that's my in when I talk about artificial intelligence with educators.
00:09:39.323 --> 00:09:48.940
Not that it's a technology that you can get excited about, but it's an impact you can get excited about, and that's the philosophy that then would sit under our practice.
00:09:48.940 --> 00:09:51.548
That's the purpose underneath the practice.
00:09:51.548 --> 00:10:01.133
Hey, let's use it to do the things that we know will help our learners, because that's going to help us then have the sort of lasting impact that we all show up to work to have.
00:10:02.674 --> 00:10:03.436
That's excellent.
00:10:03.436 --> 00:10:05.448
Thank you so much for sharing that, and then.
00:10:05.448 --> 00:10:07.081
So now my next question.
00:10:07.081 --> 00:10:10.034
You know, and again, your book was well.
00:10:10.034 --> 00:10:17.052
One of the things that I must compliment you on is how easy it was to read and how practical it is to be able to learn from it.
00:10:17.052 --> 00:10:20.635
So I want to ask you, though, about talking about the central principles.
00:10:20.635 --> 00:10:37.765
This is something that really stuck out to me, using AI for options and not answers, so I want to ask you and I'll start with you, paul, but for both of you how did each of you arrive at this philosophy, and could you share some examples of your own teaching practices where you found this particularly valuable?
00:11:09.740 --> 00:11:18.100
Options, not answers, is one of the best philosophies you can embed within your AI practice, because what it does, fonz, is it recognizes that the educator is the expert.
00:11:18.100 --> 00:11:19.783
The educator is the expert.
00:11:19.783 --> 00:11:29.144
So we have knowledge about our curriculum, about our content, about our learners, that the AI will just never have, and so when I go to AI, I don't ask it to do something for me.
00:11:29.144 --> 00:11:42.075
For example, really simple example: I'm doing a multiple choice quiz, and that's for retrieval practice, to help my learners think through something we might have learned last lesson. If I need 10 questions, I don't ask for 10.
00:11:42.998 --> 00:11:43.338
Why not?
00:11:43.539 --> 00:11:52.775
Because that would be asking for the resource in its final form. What I want to do is treat myself as the person who's the expert in the room, not the artificial intelligence.
00:11:53.097 --> 00:11:56.373
If I want 10 questions, I ask for 15 and I choose the best 10.
00:11:56.373 --> 00:12:01.831
Now, that might seem like a really simple shift, and it is, but it makes a world of difference.
00:12:01.831 --> 00:12:11.225
It allows me actually to leverage my wisdom and discretion and pastoral knowledge of my students and understanding of what exactly we covered in class.
00:12:11.225 --> 00:12:21.383
It's a fantastic shift that just treats me as the expert and allows me to create not just a generic resource for a generic class, but a tailored resource for my class.
00:12:21.383 --> 00:12:24.551
So that would be my big encouragement to any educator when they go:
00:12:24.551 --> 00:12:27.740
Oh, I'm not quite sure what to do with artificial intelligence.
00:12:27.740 --> 00:12:31.480
I keep producing this really sort of generic beige output.
00:12:31.480 --> 00:12:51.760
If you ask for twice as much as you need and then use your intellect, your expertise as a teacher, to choose the best stuff, and then that's the stuff you're giving to your class, it's a great way to keep your AI work human-centered and actually just make sure that you're giving your learners the best high-quality resources that you're able to.
00:12:52.669 --> 00:12:53.030
Excellent.
00:12:53.030 --> 00:12:59.174
Now, Paul, just to clarify real quick, before I go to Jason with this question: what grade levels is it that you're currently teaching?
00:13:00.399 --> 00:13:01.000
Good question.
00:13:01.000 --> 00:13:06.783
So I teach year 9 and 10 history and I also teach psychology in year 11 and 12.
00:13:06.783 --> 00:13:07.866
Excellent.
00:13:07.986 --> 00:13:11.879
All right, and this is great because I want to give some context to our listeners too.
00:13:11.879 --> 00:13:12.260
As well.
00:13:12.260 --> 00:13:18.701
As we know, Jason works in higher ed, and in the same way, I forgot to ask you about, you know, your area of expertise.
00:13:18.701 --> 00:13:20.082
So thank you so much for sharing that.
00:13:20.082 --> 00:13:34.480
Because one of the things, though, that I do want to highlight and point out is what I love that you said is that the teacher is still in control and using their own critical thinking skills and using, of course, their expertise in the content.
00:13:35.302 --> 00:13:51.542
My biggest fear, and I've said this so many times since the very beginning, is just that oftentimes, with the new tech and that new excitement, and we're always pressed for time, that initial output is going to be gospel to teachers and they're just going to go ahead and send it out.
00:13:52.009 --> 00:14:22.023
So I really love that you mentioned that. If you do ask for a little bit more, as teachers we must do our due diligence to make sure, obviously, that that output is correct and, based on that output, like you mentioned, use our best judgment, knowing our students, knowing our audience, being able to take those outputs and, like I always say, kind of add them as a little seasoning to what you as a teacher are already doing great, and maybe expand and augment on those lessons and that learning.
00:14:22.023 --> 00:14:23.852
So I really love what you said there.
00:14:23.852 --> 00:14:31.375
Now, Jason, going on to you as well, in the higher ed space: using AI for options, not answers.
00:14:31.375 --> 00:14:37.558
I know that you post often on a lot of the projects that you work on, but tell me a little bit about your experience with that.
00:14:38.650 --> 00:14:43.442
It's so interesting because I had an experience yesterday that I'm just going to bring in.
00:14:43.442 --> 00:14:49.567
So yesterday I had the opportunity to talk to a group of high school students.
00:14:49.567 --> 00:15:04.056
So these are high school students who are looking at our college, and they had me run a 45-minute session on AI, and I did it on the AI mindset, and one of the things that really threw those students for a loop is it's a 45-minute session.
00:15:04.056 --> 00:15:12.495
We only did things with AI for about 15 minutes and we talked a lot about their approach to this technology.
00:15:12.495 --> 00:15:16.224
And Paul used the word, as he was talking, shift.
00:15:16.224 --> 00:15:29.557
And one of the things that I'll really focus on is that there are a couple of shifts that do really need to happen to make use of this technology, and one shift is a move from a kind of transactional mindset.
00:15:29.557 --> 00:15:36.538
Many of those high school students, really most of them, not all of them, had used ChatGPT or another AI program.
00:15:36.538 --> 00:15:51.628
They'd used, like, Snapchat, several of them, or used Grammarly, so they had some experience with it, and they had this very transactional approach, in that they saw AI as producing something for them, and very few of them.
00:15:51.628 --> 00:16:12.052
This is one of the first big shifts that really happened with me, and I think it is really, really helpful for thinking about options, not answers: moving towards a more conversational mindset, so that we're using AI a lot of times not just for the output but for what it generates in us, right?
00:16:12.052 --> 00:16:16.766
So for me, the idea of creating options, not answers is connected to thinking about AI as a co-thinker.
00:16:16.846 --> 00:16:25.419
One of the things that I talk to my students a lot about is that everyone can have different processes for doing this.
00:16:25.419 --> 00:16:36.394
So one of the things that I do when I want to generate options, not answers is when I use an AI program, I actually start small and then I expand.
00:16:36.394 --> 00:16:39.284
So I might say I'm thinking about something.
00:16:39.284 --> 00:16:40.350
I often do this.
00:16:40.350 --> 00:16:47.090
I'm like kicking back an idea, going back and forth on an idea about a game in class or an activity in class.
00:16:47.090 --> 00:16:48.330
I might say I'm sort of stumped.
00:16:48.330 --> 00:16:52.932
So I might go into an AI program and say just give me three options, right.
00:16:52.932 --> 00:17:02.086
I usually start small, sometimes I just do two, but often three is a good number and I might say, out of those three, one might be really good.
00:17:02.086 --> 00:17:04.312
One is usually pretty awful or not actually possible.
00:17:04.312 --> 00:17:12.246
It's not something we could actually do. And then one is sort of mediocre, and then I can say, all right, I like number two, give me 10 more like that.
00:17:12.246 --> 00:17:24.931
And that's when you sort of you take that example, you expand it and then out of those 10, I might say you know, number two is good, number four is good, number eight is really good, all right, so I want you to generate now 10 more.
00:17:25.412 --> 00:17:37.830
Use number eight as your model, and that allows me to just redirect it more and more, because the other shift that I think needs to happen is thinking about prompting.
00:17:37.830 --> 00:18:03.231
A lot of my students, when they come into the classroom, certainly if they're in my college class, and a lot of those high school students had this language too, are very interested in prompt engineering, and one of the things that that term just does so incorrectly is it gives students the impression that what you really need is a perfect prompt you can send in there. Maybe it's a couple of paragraphs, maybe it's a page, and then you get that output.
00:18:03.231 --> 00:18:21.190
And so I noticed this like a couple of years ago: no one was leaning into those follow-up messages, no one was really doing it, and the idea that really that's the most important part never really occurred to most of them, and I had to come in and say, all right, that's your first message, that's what you got.
00:18:21.190 --> 00:18:22.772
Now how do you go from there?
00:18:22.772 --> 00:18:24.175
How do you redirect it?
00:18:24.175 --> 00:18:25.665
How do you let it know?
00:18:25.665 --> 00:18:30.076
This is what's working, this is what's not working, and that's the option mindset.
00:18:30.076 --> 00:18:42.623
That's when it's just about give me options, and then I can come in and say, you know, we're going to shift this, we're going to make this change.
00:18:42.623 --> 00:18:51.808
Or maybe, and sometimes I do this too, I might look at a list of options from ChatGPT or any program and I might just end the chat and say I have ideas based off of that.
00:18:51.828 --> 00:19:14.894
So it's not actually coming from the AI but, and this is where I use it as a co-thinker, from something that pops into my mind because of the conversation. And this all comes back to a lot of the real themes that I see in this conversation, which is: one of the best things we can do is look at this technology, figure it out, play with it, and then look beyond it.
00:19:15.475 --> 00:19:26.561
Try to look back, look at whether it is helping our students out or how it can help our students out, looking at how it's going to be used to improve things with teaching and learning.
00:19:26.602 --> 00:19:44.217
And that's really what matters, because the tech is kind of glitz and glamour and nice and cool, and it's very fun, at least for me, to watch it just spout out answers, but in the end we need to look past it and actually make use of it, and that's the doubleness that I think is lost on a lot of people.
00:19:44.217 --> 00:19:49.309
I think it just takes a lot of thought and practice to kind of get to that point.
00:19:49.309 --> 00:20:04.532
But that's where I think we need to be, and that's where you start to really approach AI as a co-thinker, or as a generator of options, not answers, and that's such a different mindset to me and, I think, a lot of my students and certainly this group of high school students.
00:20:04.532 --> 00:20:06.852
Those are big shifts.
00:20:06.852 --> 00:20:13.748
That's something that we need to get beyond, because many of my students come in with a very transactional mindset when they use this technology.
00:20:14.530 --> 00:20:15.192
Very much so.
00:20:15.192 --> 00:20:37.891
And going back again, it's that transactional mindset that I think has developed. And myself, working in K through 12, but also going, you know, through higher ed and master's and doctoral programs: there were many times where there was a specific project or specific something that we needed to do, and then it was always like, just tell me what to do so I can get the A, and then that's all I'm going to do.
00:20:37.891 --> 00:20:39.792
As opposed to one year.
00:20:39.792 --> 00:20:51.805
One of our professors kind of threw, you know just kind of this surprise to us and said hey, here's a choice board, you've got six options that you can choose from, and in any of those combinations I need 22 contact hours.
00:20:51.805 --> 00:20:53.811
And I was like this is amazing.
00:20:53.811 --> 00:21:13.576
I was like, because coming from K-12, we do choice boards and I did choice boards, but the look on my, you know, colleagues' faces when they're just like, I don't get it, like I don't understand, like just tell me how to get the A, and she's like, you get to choose from, you know, these five, six areas, and they were just, what, like what's going on?
00:21:13.576 --> 00:21:20.701
So it's that transactional thinking that I do see a lot and I think I really can relate to where it is that you're coming from.
00:21:21.060 --> 00:22:19.803
But this kind of leads me to my next question, Paul, and I'll go ahead and start with you, because one of the things that you mentioned in the book, and I know we've kind of hit on it a little bit, but I want to go in a little bit deeper, is the "do the basics better" idea.
00:22:19.803 --> 00:22:21.586
And I think with this question right now.
00:22:21.586 --> 00:22:28.928
Previously you kind of answered a little bit of that, but rather than doing entirely something new, it's that perspective.
00:22:28.928 --> 00:22:39.455
How has this evolved in your current teaching and with your experiences, but also some of the pushback? Paul, tell me a little bit about that.
00:22:39.455 --> 00:22:41.298
Do the basics better?
00:22:43.125 --> 00:23:00.418
Well, that's one of the key philosophies in the book, Fonz, and it comes from this idea that artificial intelligence meets us with no clear purpose, so it doesn't come with a set of instructions, really, and so then we're left to think about, well, what should we use it for?
00:23:00.418 --> 00:23:07.579
The reason a lot of educators get frustrated about AI is because it represents a significant disruption.
00:23:07.579 --> 00:23:26.070
And then there's, unfortunately, an idea that pops into a lot of people's heads implicitly, and it's that if I'm going to use new technology, I've got to do new things with it, and so a teacher will stand back then, and I've had this conversation a hundred times, a teacher will stand back and go:
00:23:26.070 --> 00:23:35.230
Well, for the last 20 years I've been collecting pedagogical strategies and I've been collecting lessons and I've been collecting certain ways of assessing.
00:23:35.230 --> 00:23:40.766
That's a grab bag that makes me who I am as a teacher.
00:23:40.766 --> 00:23:47.848
That's part of my teacher personality, and they fear that if they're going to use new technology, they're going to have to do new things.
00:23:47.848 --> 00:24:04.193
That bag that's so important, that they've been collecting, is going to have to go in the bin, and that's a big fear, and I can understand why: no one wants to start again, especially when we've worked so hard at cultivating a set of practices and pedagogies that we think are effective and work well.
00:24:04.193 --> 00:24:10.994
So the underlying message in the book is that we don't have to use new technology to do new things.
00:24:10.994 --> 00:24:21.052
We can actually use it to do the basics of education better, and so a really simple example of that and it's one we show people how to do in the book is text differentiation.
00:24:21.755 --> 00:24:23.541
It is one of the most basic things you can do.
00:24:23.541 --> 00:24:41.005
In my grade nine class, there are people who are reading at a grade nine level, but there's a sizable minority who are reading at a grade seven level, and so if I hand that grade nine level text out to every learner, that's what I like to call the spray and pray right, I'm spraying it out there, I'm praying, they can read it.
00:24:41.005 --> 00:24:44.912
Often they can't, and the problem with that is that then they're disengaged.
00:24:44.912 --> 00:25:08.368
All the thinking that comes from that reading, all the class discussion, all the retrieval practice that we'll do over the next couple of weeks, well, it's lost on that learner, and so there's a sort of a multitude of bad effects that come from them just not being able to access the reading. One of the most basic things we can do for a learner is then adjust the length and complexity of a reading down from year nine level to year seven level.
00:25:09.028 --> 00:25:14.608
The only thing was, although that's pretty basic and relatively simple, it's time intensive.
00:25:14.608 --> 00:25:19.087
It used to take me about half an hour to adjust a 10 minute reading and I might be doing what?
00:25:19.087 --> 00:25:20.932
14 readings a week, 15.
00:25:20.932 --> 00:25:22.978
I just don't have that kind of time, sadly.
00:25:22.978 --> 00:25:25.571
And that's the beauty of artificial intelligence.
00:25:25.571 --> 00:25:27.967
I can do what used to take me half an hour.
00:25:27.967 --> 00:25:30.031
I can now do it in half a minute.
00:25:30.653 --> 00:25:38.053
And it's not the sort of amazing, sparkling, brand new, cutting edge pedagogy.
00:25:38.053 --> 00:25:42.949
It's just a thing teachers have been doing forever, done more effectively for more learners.
00:25:42.949 --> 00:25:44.574
So that's our big vision.
00:25:44.574 --> 00:25:47.248
We can use it to do the basics of education better.
00:25:47.248 --> 00:25:56.836
Now, of course, fonz, there are some people out there who are doing some of that brand new, cutting edge, metaverse, virtual reality sort of stuff, and God bless those guys.
00:25:56.836 --> 00:25:58.227
That's absolutely fine.
00:25:58.227 --> 00:25:59.330
I've got nothing against it.
00:25:59.691 --> 00:26:09.253
But just when it comes to me and my practice and the vision Jason and I are trying to share, we're saying well, we can actually use it, we can encourage teachers.
00:26:09.253 --> 00:26:11.615
All the things that you currently do in your practice.
00:26:11.615 --> 00:26:15.901
They're still relevant, they're still valuable in a world that has AI in it.
00:26:15.901 --> 00:26:29.961
In fact, you can lean into those things that you're already an expert in and do them more often for more learners, and I find that's a vision that's not only compelling but also disarming for teachers, and they go, actually, you know what? All the expertise I already have.
00:26:29.961 --> 00:26:32.369
It's valuable, I can use it.
00:26:32.369 --> 00:26:40.517
In fact, I can do more of it, and that actually moves teachers along and helps them get excited about the possibilities of AI in their practice.
00:26:41.685 --> 00:26:45.791
It's a great answer and that's something that I find very interesting.
00:26:45.791 --> 00:26:49.917
You hit on a lot of great things there, especially with teachers.
00:26:49.917 --> 00:26:55.705
You know, many times they may not adapt as easily and they see this as a threat.
00:26:55.705 --> 00:27:07.731
But now the way that you explain this is just about being able to do that basic, what they would normally do or have to do to help support their students, but just doing it a lot quicker.
00:27:07.731 --> 00:27:28.379
I mean, one of the examples could be and I've had this in my experience where you're starting a new unit, you're already well into the middle of the unit and then you get a new student that comes in, and in my area sometimes the students come in speaking Japanese or speaking Korean and now it's like, well, like I don't have that material here, and sometimes it takes long for those people that specialize in getting the materials for them.
00:27:29.087 --> 00:27:48.413
It may take them a little bit longer because there's more students, but now, quickly and effectively, I can take a reading, translate it so that, you know, they can understand and follow along with the lesson, keep them at pace with us, and then, of course, make any adjustments needed for those reading levels, and I think that's fantastic.
00:27:48.413 --> 00:27:50.599
Jason, what has been your experience?
00:27:50.599 --> 00:27:57.990
I know that you post a lot about this, too, as well, but you know what have been some of the good and then, of course, some of the pushback that you get.
00:27:59.593 --> 00:28:19.298
I am in a ton of meetings now where higher ed professionals so some of them are faculty, some of them are administrators and some have some other role in the college or university where it sort of ends in the same spot and it ends with someone saying we have to fundamentally change every single thing that we do.
00:28:19.298 --> 00:28:23.445
And when are you going to give us the time to do that Right?
00:28:23.445 --> 00:28:24.811
There's always that follow up.
00:28:24.811 --> 00:28:33.294
When are you going to say I know you're already teaching five classes, but now fundamentally change everything you do about assessment or anything you do about teaching?
00:28:33.294 --> 00:28:44.451
And I often use that as an opportunity, if there's time, for me to come in and do a little bit of course correction there, because I do think that there is this misunderstanding about what disruption means.
00:28:44.451 --> 00:28:50.541
I think there are certain things that this technology will really disrupt and force us to really revisit.
00:28:50.541 --> 00:29:09.116
But I don't think we should fall victim to the idea that this means we just throw everything out the window, because I completely agree with Paul that we have, regardless of what you teach, regardless of what level you're at, you have all of these strategies and techniques to engage students and help them along.
00:29:09.116 --> 00:29:10.464
We don't have to get rid of those.
00:29:10.464 --> 00:29:15.718
So a lot of it is trying to figure out things that maybe you can scale now that you couldn't scale five years ago.
00:29:15.718 --> 00:29:24.737
So his example of text differentiation is a big one, and at the college level, a lot of my students are still struggling with literacy.
00:29:24.737 --> 00:29:30.485
So being able to personalize something for them and create a plan from there is really really helpful.
00:29:30.485 --> 00:29:51.334
Or if it's something that you can just do more frequently, so in the classroom, role-playing I've done for years I think I've done role-playing for 10 plus years in my humanities courses and now there are ways to scale it up a little bit more, right, you can use AI to allow for more and more practice, and so things like that that really, really help the student out.
00:29:51.334 --> 00:30:13.427
And I think that there's another part of this conversation, and I'll kind of lean into one of the words that Paul used when he was talking about purpose, having a sense of purpose: this technology comes to us with no sense of purpose, no real direction, and that's how we lean into our own sense of purpose.
00:30:13.447 --> 00:30:23.580
A lot of what I do when I talk to faculty members about assessment in particular, is asking them what the purpose of something is, and this is something that we should have been talking about for a long time, right?
00:30:23.580 --> 00:30:36.394
So if you are teaching a literature class or whatever it is, and all of your assignments are just essays, you just have a bunch of essays that we're doing, right, really taking a step back and saying, all right, what was the purpose of that?
00:30:36.394 --> 00:30:41.675
Let's take that to the basics, let's break it down, what do you want to get out of that?
00:30:41.675 --> 00:30:46.211
And being very self-reflective about it.
00:30:46.211 --> 00:30:53.075
So for some of us, we might say, oh, I actually thought about it and I'm not that interested in giving an essay.
00:30:53.075 --> 00:31:00.033
Maybe there's another way to do it, because at the college level, we all give essays, every single class.
00:31:00.033 --> 00:31:01.096
I don't care what you're doing.
00:31:01.205 --> 00:31:04.375
A student will go and take an English class, write a bunch of essays.
00:31:04.375 --> 00:31:13.331
They'll go into history, write a bunch of essays, go into math, yes, and write essays.
00:31:13.331 --> 00:31:14.316
Go into chem and write essays, like built into it.
00:31:14.316 --> 00:31:15.923
And so for some of us, we think about: was that actually the best way to do that?
00:31:15.923 --> 00:31:27.925
And then for others and this is just where purpose comes in we might say, yes, I want my students to write an essay, and here is why, and that it's that why that really really matters.
00:31:28.326 --> 00:31:36.499
And I was talking to a group of faculty maybe about a week or so ago, and one of them gave a particular reason.
00:31:36.499 --> 00:31:44.408
I really want my students to write essays because I want them to see what structured thought looks like, even if it's not totally authentic.
00:31:44.408 --> 00:31:46.294
I want them to get a sense of what that looks like.
00:31:46.294 --> 00:31:54.034
And so she said but because of AI, I have to get rid of it, I have to delete it.
00:31:54.034 --> 00:31:57.005
And I came in and said no, you actually don't, you don't have to do that, right? That may not be necessary.
00:31:57.005 --> 00:32:07.220
And then we ended up talking about you know, maybe there's a way to make just the process more visible for students, like what are ways that maybe it's a matter of process over product?
00:32:07.220 --> 00:32:21.112
And then how can you then reshape it if you want to reshape it to focus on that, and so that allowed them to come in and say, oh, maybe I don't have to get rid of the essay, because they really believe in that and they have the evidence to support using that in their particular context.
00:32:21.664 --> 00:32:24.875
So sometimes it might make sense to move in another direction.
00:32:27.192 --> 00:32:34.254
Sometimes it might mean, oh, you just go back to what you actually wanted out of the assignment and you can keep the essay and maybe there is a change you can make, or maybe you don't make a change at all.
00:32:34.605 --> 00:32:50.208
And being able to make those decisions is a huge part of who we are, and one of the things that I believe is that we need to do, as educators, anything that doesn't make us feel like machines, because that's one of the defining things about us.
00:32:50.208 --> 00:33:02.753
We have these personalities, we have these senses of purpose, and so we need to actually lean into that, because moving in the other direction just makes us mechanical in a certain way.
00:33:02.753 --> 00:33:13.993
And so much of it is based off of educators being the experts, knowing the evidence, or hopefully knowing the evidence, and being able to make those decisions.
00:33:13.993 --> 00:33:20.634
So I think that in the end, it comes down to just looking at everything and making nuanced decisions, and it might not mean doing something brand new.