March 29, 2025

Episode 318: Paul Matthews & Jason Gulya


Artificial Intelligence, Real Literacy – Evidence-Based Practices in Action

 Join me on this engaging episode of My EdTech Life as I sit down with Paul Matthews and Jason Gulya to dive into their book titled Artificial Intelligence, Real Literacy: A Practical Guide to Using AI For Evidence-Based Literacy Practices in Education and explore how AI is transforming literacy instruction through evidence-based practices. We discuss everything from integrating AI in the classroom to verifying its output, and how teachers can harness technology without losing the human touch. Plus, hear personal stories, practical tips, and a rapid-fire round of fun questions!

Timestamps:
00:00: Welcome & Introduction
00:01: Meet Our Guests – Paul Matthews (Australia) & Jason Gulya (US)
00:03: Book Overview & The Need for Evidence-Based Practices
00:04: AI in Literacy – Strategies & Philosophies (Options, Not Answers)
00:10: Principles in Practice – Tailoring AI for Better Teaching
00:21: Doing the Basics Better – From Text Differentiation to Classroom Impact
00:32: Verifying AI Output – Organic Intelligence & Fact-Checking Tips
00:49: Rapid-Fire Round – Quick Advice and Fun Insights
00:56: Closing Remarks & Sponsor Shout-Outs

Sponsors & Call-to-Action:
A big thank you to our sponsors – Book Creator, Yellow Dig, and EduAide.AI for supporting our mission. Don’t forget to follow Paul and Jason on LinkedIn for more insights and join our community at My EdTech Life for additional high-quality content.

Stay Techie!

Peel Back Education exists to uncover, share, and amplify powerful, authentic stories from inside classrooms and beyond, helping educators, learners, and the wider community connect meaningfully with the people and ideas shaping education today.

Authentic engagement, inclusion, and learning across the curriculum for ALL your students. Teachers love Book Creator.

Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:30 - Welcome and Introduction to Guests

04:15 - AI and Real Literacy Book Overview

11:17 - Evidence-Based Teaching with AI

23:15 - Options Not Answers: AI Philosophy

35:46 - Doing the Basics Better with AI

44:34 - Auditing AI with Organic Intelligence

53:29 - Commission and Impact on Education

59:47 - Final Thoughts and Closing Questions

WEBVTT

00:00:30.155 --> 00:00:33.557
Hello everybody and welcome to another great episode of my EdTech Life.

00:00:33.557 --> 00:00:43.109
Thank you so much for joining us on this wonderful day and, wherever in the world that you're joining us from, as always, thank you so much for all the likes, the shares, the follows.

00:00:43.109 --> 00:00:49.609
Thank you so much for interacting with our content, for all the wonderful feedback and to all our new followers, welcome.

00:00:49.609 --> 00:00:56.151
Thank you so much for all of your support and I definitely want to give a big shout out to our newest sponsor, Book Creator.

00:00:56.151 --> 00:01:07.281
Thank you, Book Creator, for this awesome mug, and thank you so much for believing in our mission and bringing amazing conversations here into our education landscape so we can all continue to grow.

00:01:07.281 --> 00:01:13.483
So thank you, thank you, thank you, and today I am really excited to welcome two amazing gentlemen.

00:01:13.825 --> 00:01:19.260
These two gentlemen are people that I follow on LinkedIn and put out some amazing content.

00:01:19.260 --> 00:01:40.760
If you have questions about integrating AI and AI literacy and literacy and AI and so many other things involving literacy, these two gentlemen will definitely have some answers for you in all of their posts, or you can even reach out to them or pose a question to them on their LinkedIn comments, and I promise you that they are so wonderful at answering those questions.

00:01:40.760 --> 00:01:53.385
So I would love to welcome to the show all the way from Australia on a Saturday morning Mr Paul Matthews, and I would love to welcome here from the US on a Friday afternoon, Mr Jason Gulya.

00:01:53.385 --> 00:01:57.554
So, Paul, how are you doing this morning there in Australia?

00:01:58.700 --> 00:02:01.203
I'm doing so well, fonz, I'm doing so well.

00:02:01.203 --> 00:02:04.688
We were just chatting off air, but I've had a great week in the classroom this week.

00:02:04.688 --> 00:02:08.173
It's now 5:09 AM, which is my time.

00:02:08.173 --> 00:02:09.253
It's Saturday morning.

00:02:09.253 --> 00:02:11.061
I absolutely love this time of the day.

00:02:11.061 --> 00:02:15.581
I feel like my creative powers are exactly where they need to be, so it's good to be with you both.

00:02:16.263 --> 00:02:16.745
Excellent.

00:02:16.745 --> 00:02:19.512
And Jason, how are you doing this afternoon here in the States?

00:02:20.501 --> 00:02:20.861
I'm good.

00:02:20.861 --> 00:02:24.995
It's so fun and interesting to be on like different days.

00:02:24.995 --> 00:02:31.483
If you told me that I would be doing this like five years ago and talking to people all over the world, it would like blow my mind and I wouldn't believe you.

00:02:31.483 --> 00:02:35.131
So I feel like I'm in a very different part of my day.

00:02:35.131 --> 00:02:41.467
So for me it's right after two and I just went through like three or four hours of meetings.

00:02:41.467 --> 00:02:58.070
So I just feel like I'm in a very, it's not quite the end of the day, but it's a point where, like, the work has sort of been done. And I'm sure you both know this or have had this experience: in department meetings AI always comes up, and so I'm not allowed to, like, just sit there and vegetate.

00:02:58.070 --> 00:03:00.087
I always talk.

00:03:00.087 --> 00:03:02.420
So that's where I am in my day, but I'm doing well.

00:03:02.420 --> 00:03:03.262
I'm tired.

00:03:04.102 --> 00:03:04.603
Excellent.

00:03:04.603 --> 00:03:13.149
Well, thank you both for being here today, on this wonderful day, and I'm really excited to talk about, and we're going to be focusing mainly on, your book, you know.

00:03:13.149 --> 00:03:27.117
So I'm really excited, and the book, of course, we see it here in the image where Paul has the book there: Artificial Intelligence, Real Literacy: A Practical Guide to Using AI for 10 Evidence-Based Literacy Practices in Education.

00:03:27.117 --> 00:03:36.995
So, as we know, 2022 happened and I don't want to go that far back, but that kind of changed a lot of education, a lot of practices in education.

00:03:36.995 --> 00:03:42.050
But today we're going to be focusing on that literacy aspect and component of it, which is great because, like I mentioned, you two are two people that I love to follow, especially when it deals with this topic.

00:03:48.848 --> 00:03:52.481
So I'm going to go ahead and start with you, Jason, at this moment.

00:03:52.481 --> 00:04:00.990
We know that the book opens up with a great anecdote, you know, of using AI in the classroom to teach students about reading.

00:04:00.990 --> 00:04:07.870
So I want to start with you here, because it's very interesting that the book title is called Artificial Intelligence, Real Literacy.

00:04:10.896 --> 00:04:17.293
So I want to ask you, you know, what is it that we need in education right now, and the purpose of this book and how it can help teachers?

00:04:18.680 --> 00:04:32.737
Yeah, so the real idea here in many ways is that we have this mountain of evidence, I think. So I teach college and I focus on reading and writing, and there's a lot of hesitancy there.

00:04:32.737 --> 00:04:42.475
If you wanted to move into a space that is so divided in terms of using AI, not using AI, it is the English department.

00:04:42.475 --> 00:05:05.247
So literally in my department we have myself, and I've incorporated it into many parts of my curriculum; we have several faculty members who completely ban it, and they would tell their students, if you use AI, I will find you, right, like there is that sort of a language to it; and then many people in the middle, and that is very much representative, I think, of English departments, especially at the college level.

00:05:05.247 --> 00:05:40.146
There's all of this division, all of this hesitancy, and one of the things that I'm constantly reminding people of, and this in many ways is the idea behind the book, is that we have this mountain of evidence for what works, for what actually works with teaching. And one of the things that I truly believe is that AI has changed a lot of things about society, about the way we interact, but it hasn't really changed the principles of good teaching: what makes teaching engaging, what helps us learn how to read and write and really focus on the value of literacy.

00:05:40.146 --> 00:05:42.091
Those are pretty constant.

00:05:42.132 --> 00:05:51.151
We have all this evidence that really, really we should have been sticking to a long time ago, and especially at my level, at the college level, many of us do not do that.

00:05:51.151 --> 00:05:55.399
Many of us hadn't been sticking to those evidence-based practices.

00:05:55.399 --> 00:05:59.428
There was a lot of lecturing, there was a lot of one-sided conversations.

00:05:59.428 --> 00:06:02.269
A professor tells a student what to do and they do it.

00:06:02.269 --> 00:06:16.709
That sort of transactional approach, which we've known for a really, really long time doesn't work, and so one of the best uses of AI, in our opinion, is actually sticking to the evidence.

00:06:16.709 --> 00:06:27.062
So if we have certain strategies that help students read and write, now we can stick to them in a way that's easier, right or faster, actually saves us time, and we can stick to it.

00:06:27.182 --> 00:06:32.403
This is something that, obviously, we focus on with literacy, but I do think it's broadly applicable.

00:06:32.403 --> 00:06:48.225
I do think that people can approach it from other fields and say well, this is what we know about teaching and learning and this is how I can use AI to augment what I'm already doing, because maybe I wasn't doing things as much as I could have.

00:06:48.225 --> 00:06:52.252
So one of the examples that I often give is making learning accessible.

00:06:52.252 --> 00:07:00.769
Many of us have been going for the last 10 years to these trainings, learning about how to create accessible materials and everything.

00:07:01.220 --> 00:07:02.564
A lot of us weren't doing it.

00:07:02.564 --> 00:07:06.754
We just weren't, there wasn't time.

00:07:06.754 --> 00:07:18.084
And so now one of the ways to use AI is to make sure that we are creating accessible learning, or we are having these activities that we know work, and that, in many ways, is the kind of impetus behind the book, and that's something that I think.

00:07:18.084 --> 00:07:26.420
Obviously we focus on it with literacy, but it applies to other fields and it applies regardless of what you teach.
Excellent.

00:07:26.701 --> 00:07:31.533
Well, paul, now on to you Very similar question that I'm going to ask you.

00:07:31.533 --> 00:07:36.172
But I want to ask you because, again, it's such a very interesting title and I'm still with that.

00:07:36.172 --> 00:07:39.333
But I love the way that Jason hit on a couple of things there.

00:07:39.333 --> 00:07:56.528
That, as far as still sticking to that evidence-based practice and using the AI to augment. And I was recently at a conference, too, in Puerto Rico, and they were talking about the SAMR model, and then they're talking about that augmentation piece and a lot of people just not really ever getting there.

00:07:56.528 --> 00:08:08.644
It's usually the substitution aspect of it and it's really like going from paper to a Chromebook or a screen but never really being able to take that next step and what that looks like.

00:08:08.644 --> 00:08:17.081
But now I want to ask you you know, what does this juxtaposition mean to you personally and how does it reflect in your own practice and in your vision of education?

00:08:18.725 --> 00:08:28.790
That's a really good question, and it sort of harkens back to one of my big philosophies that undergirds my AI practice, Fonz, and it's this: I'm not a big tech guy.

00:08:28.790 --> 00:08:31.742
Unlike yourself, I don't get that excited about technology.

00:08:31.742 --> 00:08:40.347
What I get really, really excited about is my students learning, my students growing, my students being formed in the way that I want as an educator.

00:08:40.347 --> 00:08:46.587
That's why I wake up early and stay up late, that's why I work as hard as I can in this field, because I want my students to grow.

00:08:46.587 --> 00:08:58.009
And so then, when it comes to artificial intelligence, I'm really excited about it, not because of what it is, not because it's a specific kind of technology, but because of what it can do, because of the impact it can have.

00:08:58.451 --> 00:09:15.568
And the impact it can have in my classroom is that it helps me do more evidence-based practices more often for more learners, and that's actually a vision that a lot of educators can get behind, because most educators out there they don't get really excited about technology.

00:09:15.568 --> 00:09:31.533
They haven't got the smart fridges, they haven't got those vacuums that are robots and vacuum your house by themselves, but what they do have is a burning passion to see their students grow and learn and attain this sort of knowledge that we want them to have.

00:09:31.533 --> 00:09:39.323
In a lot of classrooms that's not happening, and so that's my "in" when I talk about artificial intelligence with educators.

00:09:39.323 --> 00:09:48.940
Not that it's a technology that you can get excited about, but it's an impact you can get excited about, and that's the philosophy that then would sit under our practice.

00:09:48.940 --> 00:09:51.548
That's the purpose underneath the practice.

00:09:51.548 --> 00:10:01.133
Hey, let's use it to do the things that we know will help our learners, because that's going to help us then have the sort of lasting impact that we all show up to work to have.

00:10:02.674 --> 00:10:03.436
That's excellent.

00:10:03.436 --> 00:10:05.448
Thank you so much for sharing that, and then.

00:10:05.448 --> 00:10:07.081
So now my next question.

00:10:07.081 --> 00:10:10.034
You know, and again, your book was well.

00:10:10.034 --> 00:10:17.052
One of the things that I must compliment you on is how easy it was to read and how practical it is to be able to learn from it.

00:10:17.052 --> 00:10:20.635
So I want to ask you, though, about talking about the central principles.

00:10:20.635 --> 00:10:37.765
This is something that really stuck out to me, using AI for options and not answers, so I want to ask you and I'll start with you, paul, but for both of you how did each of you arrive at this philosophy, and could you share some examples of your own teaching practices where you found this particularly valuable?

00:11:09.740 --> 00:11:18.100
Options, not answers, is one of the best philosophies you can embed within your AI practice, because what it does, fonz, is it recognizes that the educator is the expert.

00:11:18.100 --> 00:11:19.783
The educator is the expert.

00:11:19.783 --> 00:11:29.144
So we have knowledge about our curriculum, about our content, about our learners, that the AI will just never have, and so when I go to AI, I don't ask it to do something for me.

00:11:29.144 --> 00:11:42.075
For example, a really simple example: I'm doing a multiple choice quiz, and that's for retrieval practice, to help my learners think through something we might have learned last lesson. If I need 10 questions, I don't ask for 10.

00:11:42.998 --> 00:11:43.338
Why not?

00:11:43.539 --> 00:11:52.775
Because that would be asking for the resource in its final form. What I want to do is treat myself as the person who's the expert in the room, not the artificial intelligence.

00:11:53.097 --> 00:11:56.373
If I want 10 questions, I ask for 15 and I choose the best 10.

00:11:56.373 --> 00:12:01.831
Now, that might seem like a really simple shift, and it is, but it makes a world of difference.

00:12:01.831 --> 00:12:11.225
It allows me actually to leverage my wisdom and discretion and pastoral knowledge of my students and understanding of what exactly we covered in class.

00:12:11.225 --> 00:12:21.383
It's a fantastic shift that just treats me as the expert and allows me to create not just a generic resource for a generic class, but a tailored resource for my class.

00:12:21.383 --> 00:12:24.551
So that would be my big encouragement to any educator when they go:

00:12:24.551 --> 00:12:27.740
Oh, I'm not quite sure what to do with artificial intelligence.

00:12:27.740 --> 00:12:31.480
I keep producing this really sort of generic beige output.

00:12:31.480 --> 00:12:51.760
If you ask for twice as much as you need and then use your intellect, your expertise as a teacher, to choose the best stuff, and then that's the stuff you're giving to your class, it's a great way to keep your AI work human-centered and actually just make sure that you're giving your learners the best high-quality resources that you're able to.

00:12:52.669 --> 00:12:53.030
Excellent.

00:12:53.030 --> 00:12:59.174
Now, Paul, just to clarify real quick before I go to Jason with this question: what grade levels is it that you're currently teaching?

00:13:00.399 --> 00:13:01.000
Good question.

00:13:01.000 --> 00:13:06.783
So I teach year 9 and 10 history and I also teach psychology in year 11 and 12.

00:13:06.783 --> 00:13:07.866
Excellent.

00:13:07.986 --> 00:13:11.879
All right, and this is great because I want to give some context to our listeners too.

00:13:11.879 --> 00:13:12.260
As well.

00:13:12.260 --> 00:13:18.701
As we know, Jason works in higher ed, and then that way, I forgot to ask you where it is that you, you know, your area of expertise.

00:13:18.701 --> 00:13:20.082
So thank you so much for sharing that.

00:13:20.082 --> 00:13:34.480
Because one of the things, though, that I do want to highlight and point out is what I love that you said is that the teacher is still in control and using their own critical thinking skills and using, of course, their expertise in the content.

00:13:35.302 --> 00:13:51.542
My biggest fear and I've said this so many times since the very beginning is just that oftentimes, with the new tech and that new excitement and we're always pressed for time is that that initial output is going to be gospel to teachers and they're just going to go ahead and send it out.

00:13:52.009 --> 00:14:22.023
So I think that I really love that you mentioned that, if you do ask for a little bit more, as a teacher we must do our due diligence to make sure, obviously, that that output is correct and, based on that output, like you mentioned, use our best judgment, knowing our students, knowing our audience, being able to take those outputs and, like I always say, kind of add them as a little seasoning to what you as a teacher are already doing great, and maybe expand and augment on those lessons and that learning.

00:14:22.023 --> 00:14:23.852
So I really love what you said there.

00:14:23.852 --> 00:14:31.375
Now, jason, going on to you as well, in the higher ed space, using AI for options, not answers.

00:14:31.375 --> 00:14:37.558
I know that you post often on a lot of the projects that you work on, but tell me a little bit about your experience with that.

00:14:38.650 --> 00:14:43.442
It's so interesting because I had an experience yesterday that I'm just going to bring in.

00:14:43.442 --> 00:14:49.567
So yesterday I had the opportunity to talk to a group of high school students.

00:14:49.567 --> 00:15:04.056
So these are high school students who are looking at our college, and they had me run a 45-minute session on AI, and I did it on the AI mindset, and one of the things that really threw those students for a loop is it's a 45-minute session.

00:15:04.056 --> 00:15:12.495
We only did things with AI for about 15 minutes and we talked a lot about their approach to this technology.

00:15:12.495 --> 00:15:16.224
And Paul used the word, as he was talking, shift.

00:15:16.224 --> 00:15:29.557
And one of the things that I'll really focus on is that there are a couple of shifts that do really need to happen to make uses of this technology, and one shift is a move from a kind of transactional mindset.

00:15:29.557 --> 00:15:36.538
Many of those high school students, really most of them, not all of them, had used ChatGPT or another AI program.

00:15:36.538 --> 00:15:51.628
They'd used, like, Snapchat, several of them, or used Grammarly, so they had some experience with it, and they had this very transactional approach: they saw AI as producing something for them, and very few of them.

00:15:51.628 --> 00:16:12.052
This is one of the first big shifts that really happened with me, and I think it is really, really helpful for thinking about options, not answers: moving towards a more conversational mindset, so that we're using AI a lot of times not just for the output but for what it generates in us, right.

00:16:12.052 --> 00:16:16.766
So for me, the idea of creating options, not answers is connected to thinking about AI as a co-thinker.

00:16:16.846 --> 00:16:25.419
I'm a co thinker and one of the things that I talk to my students a lot about is that everyone can have different processes for doing this.

00:16:25.419 --> 00:16:36.394
So one of the things that I do when I want to generate options, not answers is when I use an AI program, I actually start small and then I expand.

00:16:36.394 --> 00:16:39.284
So I might say I'm thinking about something.

00:16:39.284 --> 00:16:40.350
I often do this.

00:16:40.350 --> 00:16:47.090
I'm like kicking back an idea, get back and forth an idea about a game in class or an activity in class.

00:16:47.090 --> 00:16:48.330
I might say I'm sort of stumped.

00:16:48.330 --> 00:16:52.932
So I might go into an AI program and say just give me three options, right.

00:16:52.932 --> 00:17:02.086
I usually start small, sometimes I just do two, but often three is a good number and I might say, out of those three, one might be really good.

00:17:02.086 --> 00:17:04.312
One is usually pretty awful or not actually possible.

00:17:04.312 --> 00:17:12.246
It's not something we could actually do. And then one is sort of mediocre, and then I can say, all right, I like number two, give me 10 more like that.

00:17:12.246 --> 00:17:24.931
And that's when you sort of you take that example, you expand it and then out of those 10, I might say you know, number two is good, number four is good, number eight is really good, all right, so I want you to generate now 10 more.

00:17:25.412 --> 00:17:37.830
Use number eight as your model, and that allows me to just redirect it more and more, because the other shift that I think needs to happen is thinking about prompting.

00:17:37.830 --> 00:18:03.231
A lot of my students, when they come into the classroom, certainly if they're in my college class, and a lot of those high school students had this language too, are very interested in prompt engineering, and one of the things that that term just does so incorrectly is it gives students the impression that what you really need is a perfect prompt you can send in there. Maybe it's a couple of paragraphs, maybe it's a page, and then you get that output.

00:18:03.231 --> 00:18:21.190
And so I noticed this, like a couple of years ago no one was leaning into those follow-up messages, no one was really doing it and the idea that really that's the most important part never really occurred to most of them, and I had to come in and say, all right, that's your first message, that's what you got.

00:18:21.190 --> 00:18:22.772
Now how do you go from there?

00:18:22.772 --> 00:18:24.175
How do you redirect it?

00:18:24.175 --> 00:18:25.665
How do you let it know?

00:18:25.665 --> 00:18:30.076
This is what's working, this is what's not working, and that's the option mindset.

00:18:30.076 --> 00:18:42.623
That's when it's just about give me options, and then I can come in and say, you know, we're going to shift this, we're going to make this change.

00:18:42.623 --> 00:18:51.808
Or maybe, and sometimes I do this too, I might look at a list of options from ChatGPT or any program and I might just end the chat and say I have ideas based off of that.

00:18:51.828 --> 00:19:14.894
So it's not actually coming from the AI but, and this is where I use it as a co-thinker, something that pops into my mind because of the conversation. And this all comes back to a lot of the real themes that I see in this conversation, which is one of the best things we can do is look at this technology, figure it out, play with it, and then look beyond it.

00:19:15.475 --> 00:19:26.561
Try to look back, look at whether it is helping our students out or how it can help our students out, looking at how it's going to be used to improve things with teaching and learning.

00:19:26.602 --> 00:19:44.217
And that's really what matters, because the tech is kind of glitz and glamour and nice and cool and it's very fun and at least for me to watch it just spout out answers, but in the end we need to look past it, actually make use of it, and that's the doubleness that I think is lost on a lot of people.

00:19:44.217 --> 00:19:49.309
I think it just takes a lot of thought and practice to kind of get to that point.

00:19:49.309 --> 00:20:04.532
But that's where I think we need to be and that's where you start to really approach AI as a co thinker or as a generator of options, not answers, and that's such a different mindset to me and, I think, a lot of my students and certainly this group of high school students.

00:20:04.532 --> 00:20:06.852
Those are big shifts.

00:20:06.852 --> 00:20:13.748
That's something that we need to get beyond, because many of my students come in with a very transactional mindset when they use this technology.

00:20:14.530 --> 00:20:15.192
Very much so.

00:20:15.192 --> 00:20:37.891
And going back again, it's that transactional mindset that I think has developed and myself, working in K through 12, but also going, you know, through higher ed and master's and doctoral programs, there were many times where there was a specific project or specific something that we needed to do and then it was always like just tell me what to do so I can get the A, and then that's all I'm going to do.

00:20:37.891 --> 00:20:39.792
As opposed to one year.

00:20:39.792 --> 00:20:51.805
One of our professors kind of threw, you know just kind of this surprise to us and said hey, here's a choice board, you've got six options that you can choose from, and in any of those combinations I need 22 contact hours.

00:20:51.805 --> 00:20:53.811
And I was like this is amazing.

00:20:53.811 --> 00:21:13.576
I was like that because, coming from K-12, we do choice boards and I did choice boards, but the look on my, you know, colleagues' faces when they're just like, I don't get it, like I don't understand, like just tell me how to get the A, and she's like, you get to choose from, you know, these five, six areas, and they were just, what, like what's going on?

00:21:13.576 --> 00:21:20.701
So it's that transactional thinking that I do see a lot and I think I really can relate to where it is that you're coming from.

00:21:21.060 --> 00:22:19.803
But this kind of leads me to my next question, Paul, and I'll go ahead and start with you, because one of the things that you mentioned in the book, and I know we've kind of hit on it a little bit, but I want to go in a little bit deeper, is the "do the basics better" idea.

00:22:19.803 --> 00:22:21.586
And I think with this question right now.

00:22:21.586 --> 00:22:28.928
Previously you kind of answered a little bit of that, but rather than doing entirely something new, it's that perspective.

00:22:28.928 --> 00:22:39.455
How has this evolved in your current teaching and with your experiences, but also some of the pushback, paul, tell me a little bit about that.

00:22:39.455 --> 00:22:41.298
Do the basics better?

00:22:43.125 --> 00:23:00.418
Well, that's one of the key philosophies in the book, Fonz, and it comes from this idea that artificial intelligence meets us with no clear purpose, so it doesn't come with a set of instructions, really, and so then we're left to think about, well, what should we use it for?

00:23:00.418 --> 00:23:07.579
The reason a lot of educators get frustrated about AI is because it represents a significant disruption.

00:23:07.579 --> 00:23:26.070
And then there's, unfortunately, an idea that pops into a lot of people's heads implicitly, and it's that if I'm going to use new technology, I've got to do new things with it. And so a teacher will stand back then, and I've had this conversation a hundred times, a teacher will stand back and go.

00:23:26.070 --> 00:23:35.230
Well, for the last 20 years I've been collecting pedagogical strategies and I've been collecting lessons and I've been collecting certain ways of assessing.

00:23:35.230 --> 00:23:40.766
That's a grab bag that makes me who I am as a teacher.

00:23:40.766 --> 00:23:47.848
That's part of my teacher personality, and they fear that if they're going to use new technology, they're going to have to do new things.

00:23:47.848 --> 00:24:04.193
That bag that's so important, that they've been collecting, is going to have to go in the bin, and that's a big fear, and I can understand why no one wants to start again, especially when we've worked so hard at cultivating a set of practices and pedagogies that we think are effective and work well.

00:24:04.193 --> 00:24:10.994
So the underlying message in the book is that we don't have to use new technology to do new things.

00:24:10.994 --> 00:24:21.052
We can actually use it to do the basics of education better, and so a really simple example of that and it's one we show people how to do in the book is text differentiation.

00:24:21.755 --> 00:24:23.541
It is one of the most basic things you can do.

00:24:23.541 --> 00:24:41.005
In my grade nine class, there are people who are reading at a grade nine level, but there's a sizable minority who are reading at a grade seven level, and so if I hand that grade nine level text out to every learner, that's what I like to call the spray and pray, right? I'm spraying it out there, I'm praying they can read it.

00:24:41.005 --> 00:24:44.912
Often they can't, and the problem with that is that then they're disengaged.

00:24:44.912 --> 00:25:08.368
All the thinking that comes from that reading, all the class discussion, all the retrieval practice that we'll do over the next couple of weeks, well, it's lost on that learner, and so there's a multitude of bad effects that come from them just not being able to access the reading. One of the most basic things we can do for a learner, then, is adjust the length and complexity of a reading down from year nine level to year seven level.

00:25:09.028 --> 00:25:14.608
The only thing was, although that's pretty basic and relatively simple, it's time-intensive.

00:25:14.608 --> 00:25:19.087
It used to take me about half an hour to adjust a 10-minute reading, and I might be doing what?

00:25:19.087 --> 00:25:20.932
14 readings a week, 15.

00:25:20.932 --> 00:25:22.978
I just don't have that kind of time, sadly.

00:25:22.978 --> 00:25:25.571
And that's the beauty of artificial intelligence.

00:25:25.571 --> 00:25:27.967
I can do what used to take me half an hour.

00:25:27.967 --> 00:25:30.031
I can now do it in half a minute.

00:25:30.653 --> 00:25:38.053
And it's not the sort of amazing, sparkling, brand new, cutting edge pedagogy.

00:25:38.053 --> 00:25:42.949
It's just a thing teachers have been doing forever, done more effectively for more learners.

00:25:42.949 --> 00:25:44.574
So that's our big vision.

00:25:44.574 --> 00:25:47.248
We can use it to do the basics of education better.

00:25:47.248 --> 00:25:56.836
Now, of course, Fonz, there are some people out there who are doing some of that brand new, cutting edge, metaverse, virtual reality sort of stuff, and God bless those guys.

00:25:56.836 --> 00:25:58.227
That's absolutely fine.

00:25:58.227 --> 00:25:59.330
I've got nothing against it.

00:25:59.691 --> 00:26:09.253
But just when it comes to me and my practice and the vision Jason and I are trying to share, we're saying: well, we can actually use it, and we can encourage teachers.

00:26:09.253 --> 00:26:11.615
All the things that you currently do in your practice.

00:26:11.615 --> 00:26:15.901
They're still relevant, they're still valuable in a world that has AI in it.

00:26:15.901 --> 00:26:29.961
In fact, you can lean into those things that you're already an expert in and do them more often for more learners. And I find that's a vision that's not only compelling but also disarming for teachers, and they go: actually, you know what? All the expertise I already have?

00:26:29.961 --> 00:26:32.369
It's valuable, I can use it.

00:26:32.369 --> 00:26:40.517
In fact, I can do more of it, and that actually moves teachers along and helps them get excited about the possibilities of AI in their practice.

00:26:41.685 --> 00:26:45.791
It's a great answer and that's something that I find very interesting.

00:26:45.791 --> 00:26:49.917
You hit on a lot of great things there, especially with teachers.

00:26:49.917 --> 00:26:55.705
You know, many times they may not adapt as easily and they see this as a threat.

00:26:55.705 --> 00:27:07.731
But now, the way that you explain this, it's just about being able to do that basic thing, what they would normally do or have to do to help support their students, but just doing it a lot quicker.

00:27:07.731 --> 00:27:28.379
I mean, one of the examples could be, and I've had this in my experience, where you're starting a new unit, you're already well into the middle of the unit, and then you get a new student that comes in. And in my area, sometimes the students come in speaking Japanese or speaking Korean, and now it's like, well, I don't have that material here, and sometimes it takes a while for those people that specialize in getting the materials for them.

00:27:29.087 --> 00:27:48.413
It may take them a little bit longer because there are more students, but now, quickly and effectively, I can take a reading, translate it to something they can understand, have them follow along with the lesson, and keep them at pace with us, and then, of course, make any adjustments needed for those reading levels, and I think that's fantastic.

00:27:48.413 --> 00:27:50.599
Jason, what has been your experience?

00:27:50.599 --> 00:27:57.990
I know that you post a lot about this as well, but what have been some of the good experiences and, of course, some of the pushback that you get?

00:27:59.593 --> 00:28:19.298
I am in a ton of meetings now with higher ed professionals, so some of them are faculty, some of them are administrators, and some have some other role in the college or university, where it sort of ends in the same spot, and it ends with someone saying we have to fundamentally change every single thing that we do.

00:28:19.298 --> 00:28:23.445
And when are you going to give us the time to do that, right?

00:28:23.445 --> 00:28:24.811
There's always that follow up.

00:28:24.811 --> 00:28:33.294
When are you going to say: I know you're already teaching five classes, but now fundamentally change everything you do about assessment or anything you do about teaching?

00:28:33.294 --> 00:28:44.451
And I often use that as an opportunity, if there's time, for me to come in and do a little bit of course correction there, because I do think that there is this misunderstanding about what disruption means.

00:28:44.451 --> 00:28:50.541
I think there are certain things that this technology will really disrupt and force us to really revisit.

00:28:50.541 --> 00:29:09.116
But I don't think we should fall victim to the idea that this means we just throw everything out the window, because I completely agree with Paul that, regardless of what you teach, regardless of what level you're at, you have all of these strategies and techniques to engage students and help them along.

00:29:09.116 --> 00:29:10.464
We don't have to get rid of those.

00:29:10.464 --> 00:29:15.718
So a lot of it is trying to figure out things that maybe you can scale now that you couldn't scale five years ago.

00:29:15.718 --> 00:29:24.737
So his example of text differentiation is a big one, and at the college level, a lot of my students are still struggling with literacy.

00:29:24.737 --> 00:29:30.485
So being able to personalize something for them and create a plan from there is really really helpful.

00:29:30.485 --> 00:29:51.334
Or if it's something that you can just do more frequently. In the classroom, role-playing is something I've done for years, I think 10-plus years in my humanities courses, and now there are ways to scale it up a little bit more, right? You can use AI to allow for more and more practice, and so things like that really, really help the student out.

00:29:51.334 --> 00:30:13.427
And I think that there's another part of this conversation, and I'll kind of lean into one of the words that Paul used when he was talking about purpose, having a sense of purpose: this technology comes to us with no sense of purpose, no real direction, and that's where we lean into our own sense of purpose.

00:30:13.447 --> 00:30:23.580
A lot of what I do when I talk to faculty members, about assessment in particular, is asking them what the purpose of something is, and this is something that we should have been talking about for a long time, right?

00:30:23.580 --> 00:30:36.394
So if you are teaching a literature class, or whatever it is, and all of your assignments are just essays, you just have a bunch of essays, really take a step back and say: all right, what was the purpose of that?

00:30:36.394 --> 00:30:41.675
Let's take that back to the basics, let's break it down: what do you want to get out of that?

00:30:41.675 --> 00:30:46.211
And being very self-reflective about it.

00:30:46.211 --> 00:30:53.075
So for some of us, we might say, oh, I actually thought about it and I'm not that interested in giving an essay.

00:30:53.075 --> 00:31:00.033
Maybe there's another way to do it, because at the college level, we all give essays, every single class.

00:31:00.033 --> 00:31:01.096
I don't care what you're doing.

00:31:01.205 --> 00:31:04.375
A student will go and take an English class, write a bunch of essays.

00:31:04.375 --> 00:31:13.331
They'll go into history, write a bunch of essays, go into math and write essays.

00:31:13.331 --> 00:31:14.316
Go into chem and write essays; it's built into it.

00:31:14.316 --> 00:31:15.923
And so for some of us, we think about: was that actually the best way to do that?

00:31:15.923 --> 00:31:27.925
And then for others, and this is just where purpose comes in, we might say: yes, I want my students to write an essay, and here is why. And it's that why that really, really matters.

00:31:28.326 --> 00:31:36.499
And I was talking to a group of faculty maybe about a week or so ago, and one of them gave a particular reason.

00:31:36.499 --> 00:31:44.408
I really want my students to write essays because I want them to see what structured thought looks like, even if it's not totally authentic.

00:31:44.408 --> 00:31:46.294
I want them to get a sense of what that looks like.

00:31:46.294 --> 00:31:54.034
And so she said: but because of AI, I have to get rid of it, I have to delete it.

00:31:54.034 --> 00:31:57.005
And I came in and said: no, you actually don't, you don't have to do that; that may not be necessary.

00:31:57.005 --> 00:32:07.220
And then we ended up talking about, you know, maybe there's a way to make the process more visible for students, like maybe it's a matter of process over product.

00:32:07.220 --> 00:32:21.112
And then how can you then reshape it if you want to reshape it to focus on that, and so that allowed them to come in and say, oh, maybe I don't have to get rid of the essay, because they really believe in that and they have the evidence to support using that in their particular context.

00:32:21.664 --> 00:32:24.875
So sometimes it might make sense to move in another direction.


00:32:27.192 --> 00:32:34.254
Sometimes it might mean, oh, you just go back to what you actually wanted out of the assignment and you can keep the essay and maybe there is a change you can make, or maybe you don't make a change at all.

00:32:34.605 --> 00:32:50.208
And being able to make those decisions is a huge part of who we are, and one of the things that I believe is that we need to do, as educators, anything that doesn't make us feel like machines; that's one of the defining things about us.

00:32:50.208 --> 00:33:02.753
We have these personalities, we have these senses of purpose, and so we need to actually lean into that, because moving in that direction just makes us mechanical in a certain way.

00:33:02.753 --> 00:33:13.993
And so much of it is based off of educators being the experts, knowing the evidence, or hopefully knowing the evidence, and being able to make those decisions.

00:33:13.993 --> 00:33:20.634
So I think that, in the end, it comes down to just looking at everything and making nuanced decisions, and it might not mean doing something brand new.

00:33:20.634 --> 00:33:28.075
Maybe it's making a change, maybe it's not making a change, but being able to make that decision, I think, is key for what we should be doing as educators.

00:33:28.676 --> 00:33:44.675
Excellent, and that kind of brings me to a nice segue, talking about addressing some of those concerns. And I know you hit a little bit on that, Jason, too, and Paul, you know, throughout our conversation that kind of gets sprinkled in, especially during a topic like this, dealing with AI.

00:33:44.675 --> 00:33:47.286
But I want to go and start with you, paul, you know.

00:33:47.286 --> 00:33:59.352
I know that you've talked about this in the book as well, talking about auditing AI with organic intelligence, and I thought that was fantastic, the way that was titled, and I know that's one of the sections there in the book.

00:33:59.352 --> 00:34:15.398
So I want to ask you, each of you and we'll start with you, paul, talking about organic intelligence how do you approach this verification process in your own work and what tips do you offer educators who are concerned about AI accuracy?

00:34:16.425 --> 00:34:22.284
Well, look, I think educators have every right to be wary of the accuracy of AI output.

00:34:22.284 --> 00:34:30.514
We talk about it in the book, but there are three big ways that artificial intelligence can just get things wrong, and some of them are sneakier than others.

00:34:30.514 --> 00:34:35.916
So the first one is just plain wrong, and we give an example of it in the book.

00:34:35.916 --> 00:34:42.469
There was ChatGPT saying that there was a bloke who walked across the English Channel, and he did it in eight hours, and it's the world record.

00:34:42.469 --> 00:34:44.454
And, of course, no one's walking across the English Channel.

00:34:44.454 --> 00:34:53.322
So there's just going to be stuff out there that's plain wrong, and thankfully that's pretty easy to pick up, especially for us; for our students, maybe not so much.

00:34:53.322 --> 00:35:01.961
They're probably picking up that no one's walking on water, but for some of those big, egregious errors, you still need your content knowledge and your expertise to be able to pick them out.

00:35:01.961 --> 00:35:03.460
So it can be really, really wrong.

00:35:03.460 --> 00:35:05.362
It can also be subtly wrong.

00:35:05.362 --> 00:35:08.143
We talk about the James Webb telescope.

00:35:08.143 --> 00:35:10.960
I don't know if you remember back when Google was launching its AI.

00:35:10.960 --> 00:35:16.190
Oh, let me know some interesting facts about the James Webb telescope that I can talk to my son about.

00:35:16.190 --> 00:35:24.884
And it said: oh well, it was the first telescope to take a photo of a planet in another galaxy, or something similar, and of course it did do that, but it wasn't the first.

00:35:24.884 --> 00:35:28.726
And that wiped like a couple of hundred million off the Google share price.

00:35:28.726 --> 00:35:33.907
But it was a very subtle error and it took an astrophysicist on Twitter to pick it up.

00:35:34.014 --> 00:35:42.106
So it can be really wrong, it can be a little bit wrong, or it can be what we call right but wrong. And here's the way that I've seen it be right but wrong.

00:35:42.106 --> 00:35:46.637
A friend of mine was telling a story where he made a multiple-choice quiz for his class.

00:35:46.637 --> 00:35:51.664
As he's giving that quiz, they're sort of looking around at him and going: sir, what's going on?

00:35:51.664 --> 00:35:52.666
And he's not sure.

00:35:52.666 --> 00:36:01.206
And so he looks around, goes to one of the smartest students in the class who's already completed the quiz, and every single answer is D.

00:36:01.206 --> 00:36:08.747
All of the above. And so that's right, because it's technically correct, but it's also wrong, because that's not how we make those quizzes.

00:36:09.115 --> 00:36:13.124
So AI can be really wrong, it can be a little bit wrong, or it can be right but wrong.

00:36:13.124 --> 00:36:21.684
But the lowest common denominator, Fonz, is that we just have to use our wisdom and discretion and subject matter knowledge to be able to navigate it.

00:36:21.684 --> 00:36:25.202
We have to treat it with what I would call a cheerful cynicism.

00:36:25.202 --> 00:36:29.916
So we're not angry at the tools, we're not trying to always second guess them.

00:36:29.916 --> 00:36:39.155
We're grateful that we have really powerful tools that allow us to do this work of education and do those evidence-based practices more often for more learners.

00:36:39.155 --> 00:37:00.146
But we're always just keeping half an eye out and saying: well, look, I'm remembering I'm the expert here, and so not only am I making sure that there's factual accuracy, which is great, to make sure there are no obvious mistruths, but also an alignment with where I'm coming from as a teacher.

00:37:00.146 --> 00:37:06.695
At my school, we have vision, mission, values, attributes of a learner, these sorts of things, consistent whole-school language.

00:37:06.695 --> 00:37:08.880
That I just don't expect in a million years.

00:37:09.101 --> 00:37:13.262
AI is going to get right on the first crack, and that's why we audit the AI.

00:37:13.262 --> 00:37:19.776
With the OI, with the organic intelligence. I go: well, look, what song sheet am I singing from at my school?

00:37:19.776 --> 00:37:22.463
And how can I talk about being a courageous learner?

00:37:22.463 --> 00:37:27.643
Or how can I talk about resilience, that whole-school language, in this lesson?

00:37:27.643 --> 00:37:34.887
So I'm auditing that AI output all the time and I'm not just checking for factual accuracy, although that's really important.

00:37:35.014 --> 00:37:39.606
I'm also saying is this actually helping form a cohesive body of work?

00:37:39.606 --> 00:37:56.266
So, as my student goes from English to science to maths, they're hearing about the same dispositions, they're hearing about the same vision and mission, they're hearing about those same core themes as a school, and of course, that's one of the big things AI can help us do as well, isn't it, fonz?

00:37:56.266 --> 00:38:09.675
Because weaving in those attributes or those visions or those dispositions, and every school will have five or seven of them, weaving them in is like the last thing we do in our lesson, and it's the first thing that we leave off if we run out of time.

00:38:09.675 --> 00:38:21.307
So we make sure we have our content and an engaging lesson, but then all that bigger whole-school narrative, shared language, all that sort of stuff, well, it can quite easily not be lived out.

00:38:21.307 --> 00:38:25.845
It's just laminated and sits on a wall, because we don't have the time to meaningfully integrate it.

00:38:26.287 --> 00:38:48.525
It's a great thing that we can use artificial intelligence for, but the big idea here, and we talked about this at the beginning of the podcast, is that as educators, we're the experts, right? We understand what's going on, and not just from a content and subject matter expertise angle, but also from the angle that we are a specific school. We're not a generic school teaching generic students.

00:38:48.525 --> 00:38:51.119
We are a specific school teaching these students.

00:38:51.119 --> 00:39:08.407
So how can we make sure that our AI use still respects that fact, and that we're not flooding our classroom with generic, beige resources, but we're actually able to be more tailored and more personalized, not only for our learners but from our school perspective, than ever before?

00:39:09.916 --> 00:39:10.938
Jason, how about yourself?

00:39:12.222 --> 00:39:33.307
Yeah, one of the things that constantly throws my students for a loop, and I teach AI-powered communication, I teach AI in my courses, one of the ideas that constantly throws them for a loop is that they need a plan for engaging with AI, that they actually need a process that they can use, which many of them had no idea about.

00:39:33.307 --> 00:39:39.288
So the vast majority of my students do come in and think about or approach AI as a search engine.

00:39:39.288 --> 00:39:43.655
Right, they think it's something where you can just put something in and take the output as it is.

00:39:43.655 --> 00:39:48.776
And I give my students an exercise and I gave the high school students the same exercise yesterday.

00:39:48.776 --> 00:39:54.909
I brought them into a room, I gave them a pen and a sheet of paper and I said all right, here's the situation.

00:39:54.909 --> 00:39:55.990
I gave them a scenario.

00:39:55.990 --> 00:40:01.663
I said imagine you are doing your work and you are given Carl.

00:40:01.663 --> 00:40:03.747
So your group is three to four people.

00:40:03.747 --> 00:40:04.829
You're given Carl.

00:40:05.135 --> 00:40:08.307
Now here's the interesting thing about Carl: he's a genius.

00:40:08.307 --> 00:40:09.954
No one really disputes that.

00:40:09.954 --> 00:40:12.260
Everyone knows he's a genius.

00:40:12.260 --> 00:40:15.688
He's a genius because he goes online and he just does the Internet search.

00:40:15.688 --> 00:40:18.538
He looks at Wikipedia and he has a photographic memory.

00:40:18.538 --> 00:40:22.041
He's not really reading, but he sort of just scans everything and really holds on to it.

00:40:22.041 --> 00:40:34.838
And I tell them, you know, the thing about Carl is that if you ask him a question, and you can play with the numbers depending on what you see, say 50% of the time he just regurgitates what he read online.

00:40:34.838 --> 00:40:39.735
He remembers something, grabs it from Wikipedia, gives it to you, throws it at your door. Cool.

00:40:39.735 --> 00:40:57.302
25% of the time he's brilliant, he says something that's counterintuitive and really interesting, and then 25% of the time he just gets it wrong, he messes up, he brings in his own biases, and so that 25% of the time he's sort of just a wild card.

00:40:57.302 --> 00:40:59.182
You don't know what you're going to get out of him.

00:40:59.182 --> 00:41:05.983
And so I had students, with just a sheet of paper, come up with a plan: how are you going to approach Carl in your group?

00:41:05.983 --> 00:41:14.847
And then I had them get together, do a kind of pair and share and create a plan for working Carl into their group in a meaningful way.

00:41:14.847 --> 00:41:25.865
And yesterday when I did this exercise, I had some of them say we're just going to ignore Carl, we're not going to use Carl, and I had others saying oh, we're going to give Carl a handler.

00:41:25.865 --> 00:41:31.101
Right, Dan, you're going to be in charge of Carl, this is what you're going to do. And we, like, fleshed it out.

00:41:31.101 --> 00:41:37.619
We said, all right, so what are you going to do as a handler?

00:41:37.619 --> 00:41:38.684
You can say fact-check, but what does that mean?

00:41:38.684 --> 00:41:39.407
How are you going to fact-check him?

00:41:39.427 --> 00:41:48.188
And in my college class, this is when we talk about lateral reading, which is really, for many of my students, becoming a major way of reading content.

00:41:48.188 --> 00:41:52.465
So for many of us, myself included, when I think of reading, I think vertical.

00:41:52.465 --> 00:41:59.721
I think you start at the top of the page, whether it's an actual book or a webpage, and you read to the very end.

00:41:59.721 --> 00:42:11.300
And one of the things that we're learning from literacy studies is that a lot of our students are not really reading that way much anymore; they are reading laterally.

00:42:11.300 --> 00:42:14.117
So as they go, they start at the top and they go down.

00:42:14.117 --> 00:42:17.325
As they do that, they're opening up windows on the side.

00:42:17.325 --> 00:42:18.847
This is the lateral part.

00:42:18.847 --> 00:42:25.858
So they're actually opening up windows in their browser, and this is now what I encourage my students to do.

00:42:25.858 --> 00:42:27.905
This is a way of fact-checking, right?

00:42:27.905 --> 00:42:33.465
So if you have that output from AI, look at that paragraph, and if something sticks out to you, do a search.

00:42:33.465 --> 00:42:41.266
If nothing comes up, well, that's a red flag or a yellow flag. But seeing, oh, CNN reported the same thing? Okay, cool.

00:42:41.797 --> 00:42:54.635
And so you start to develop these processes. And one of the things that I've learned with my college students is that if I tell them to fact-check something, sometimes they know how to do that.

00:42:55.338 --> 00:42:56.559
Sometimes they don't.

00:42:56.760 --> 00:43:10.182
Sometimes they see information online, and I don't blame them for this at all, in a kind of amorphous way, so it's hard to figure out what information is coming from where, and I try to teach my students processes to not have that happen.

00:43:10.182 --> 00:43:25.211
And it's a bit of a slog trying to get them to make that pretty big mindset shift, so that they're not really thinking about AI as a search engine but are really thinking about needing to use that organic intelligence.

00:43:25.211 --> 00:43:32.615
And I feel a little bad every time because I feel like I'm constantly ruining AI for them, that they think it's going to be this simple thing.

00:43:32.615 --> 00:43:40.309
And then you come in and say, well, it's actually really hard or requires all this additional work, and that's when it starts to lessen the appeal a little bit.

00:43:40.309 --> 00:43:51.709
But I also think that's where we need to be if we are going to have this kind of cheerful cynicism, and I love that phrase from Paul, to really, really help us out and get the most out of this technology and not lose ourselves in the process.

00:43:53.115 --> 00:43:53.777
Great, great.

00:43:53.938 --> 00:44:01.787
Well, as we start wrapping up, gentlemen, I have probably just two last questions, and one of the things that I really liked is the way that you close out your book.

00:44:01.827 --> 00:44:12.126
Again, very practical book, a lot of great tools, especially, you know, for teachers in the classroom, easy to understand and, like I mentioned, things that you can quickly turn around and immediately use.

00:44:12.186 --> 00:44:14.028
At least that's the way that I found it.

00:44:14.028 --> 00:44:28.526
But one of the things that I like is that your book ends with a commission rather than a conclusion, and if you don't mind, I would love to just read this little section here. It says: while most books end with a conclusion, we end our book with a commission.

00:44:28.526 --> 00:44:38.586
In this context, a commission is a hearty call to action, and our call to action is this: continue to build on the principles and practices you've learned here.

00:44:38.586 --> 00:44:52.903
And then I'm just going to skip to the last part here, where it says that the emergence of AI has not changed the principles and practices of good teaching, which is something that we've hit on throughout this whole conversation, even from the very beginning.

00:44:52.903 --> 00:45:02.930
So I want to ask you, and I'll start with Paul: what specific impact do you hope this book creates in a classroom for a teacher?

00:45:04.855 --> 00:45:31.278
Well, one of the things I hope it gives teachers is a real sense of agency, that they're not sort of under the tyranny of Father Time all the time. We so often just feel the squeeze, and we feel like only in the odd freak wave of a week, where we don't have a bunch of marking to do, or we don't have the parent-teacher interviews, or the reports aren't due, can we actually create the sort of resources that we really want to and that we know will help our learners.

00:45:31.278 --> 00:45:37.945
I really hope this book allows educators, through the principles and practices of wise AI use, to go:

00:45:37.945 --> 00:45:40.398
You know what? I can do all those things.

00:45:40.398 --> 00:45:46.680
I can create excellent resources for my learners leveraging my wisdom, and I can do it all the time.

00:45:46.680 --> 00:45:51.885
So that's one of the big things. This is a parallel subject, but I'll go there for just a moment.

00:45:52.652 --> 00:45:55.383
We're suffering a burnout crisis in education.

00:45:55.383 --> 00:45:58.092
I see that in Australia.

00:45:58.092 --> 00:45:59.838
We are three months into the school year.

00:45:59.838 --> 00:46:10.980
We have thousands of open job vacancies that are not filled, and we're having a hard time getting people to become teachers, and we're having a hard time keeping our teachers as teachers.

00:46:11.481 --> 00:46:19.807
A lot of my friends that I went through university with now work at a bank or they work for a mortgage broker or they've changed into parallel industries.

00:46:19.807 --> 00:46:25.186
The closest I've ever come to burnout was not when I was working long hours.

00:46:25.186 --> 00:46:39.088
I'm probably working longer hours these days than I've ever worked in my life, and I'm incredibly satisfied, because, for me, my flirtation with burnout wasn't about working too many hours; it was about not having an impact.

00:46:39.088 --> 00:46:50.918
Actually, the closest I ever came to burnout was when I was probably working less than ever, but I would walk into the classroom, I'd teach and I'd leave and I would genuinely feel that I had not made a dent.

00:46:50.918 --> 00:46:53.325
And that crushed me as an educator.

00:46:53.325 --> 00:46:54.166
Because that's what you want to do.

00:46:54.166 --> 00:46:59.144
You want to inspire the next generation of learners, you want to make a meaningful difference in people's lives.

00:46:59.144 --> 00:47:11.101
And when you have the experience of walking in and out of that classroom hundreds and hundreds of times a term and just feeling like you're not making a dent, I found that very, very difficult to deal with.

00:47:11.795 --> 00:47:13.157
And so what helps us make a dent?

00:47:13.157 --> 00:47:16.085
Well, for me it was actually that evidence-based practice.

00:47:16.085 --> 00:47:17.695
There's a stack of evidence.

00:47:17.695 --> 00:47:22.146
I go where the evidence leads, and in my own anecdotal experience, Fonz,

00:47:22.146 --> 00:47:24.117
I find it has a huge impact.

00:47:24.117 --> 00:47:51.046
It's no surprise that the things that the evidence shows us help students read actually do help students read. And so going into that classroom and leaving, walking back out the door, knowing I have actually helped that student take a step in the right direction to becoming a fluent reader and writer? Well, that not only helps me feel good about my practice, but I know that I'm having that impact that I so desperately want to have.

00:47:51.046 --> 00:48:11.896
So, when it comes to my hopes for the book, yes, I hope this trickles down to students and helps them read and write, they're just crucial skills that are only getting more important, but also, I hope it helps teachers feel that real, deep sense of impact that is so rewarding; it's what we got into the profession to do.

00:48:13.057 --> 00:48:14.280
So, jason, how about yourself?

00:48:15.362 --> 00:48:20.249
Yeah, I love that answer, but I won't just take Paul's answer; I'll add a little bit of my own.

00:48:20.249 --> 00:48:30.375
But one of the big things for me that came out of working on this project, and that I return to again and again as I think about it more, is just how valuable community is.

00:48:30.375 --> 00:48:37.213
One of the big ideas I think very much kind of built into this book is that this is a communal project.

00:48:37.213 --> 00:48:40.201
Right, if we're thinking about something like evidence-based teaching.

00:48:40.201 --> 00:48:44.202
Right, if I wanted to think about evidence-based teaching and really, really figure it out.

00:48:44.202 --> 00:48:47.521
It can't just be me, it can't just be me in my silo.

00:48:47.521 --> 00:48:55.401
And one of the things that has really happened to me over the last two years especially is that I've become best friends with our CCIT.

00:48:55.401 --> 00:48:56.822
So they focus on teaching and learning.

00:48:56.822 --> 00:49:08.164
They have, you know, decades more information than I do about evidence-based learning, and I use them. I show them things, I run ideas by them and say, what do you think about this?

00:49:08.164 --> 00:49:09.981
And it can be very sort of general.

00:49:09.981 --> 00:49:19.305
So one of the things I did, probably about a month or so ago: I asked them about personalized education.

00:49:19.305 --> 00:49:19.887
What should I know about it?

00:49:19.887 --> 00:49:20.650
What are the pros, what are the cons?

00:49:20.650 --> 00:49:21.673
Right, and so, like, what evidence do you have?

00:49:21.673 --> 00:49:29.657
And I think that's also empowering; that's when we get agency built into it, when it doesn't have to just be you.

00:49:29.657 --> 00:49:32.363
So that is very much a reflection of my own story too.

00:49:32.742 --> 00:49:37.450
The closest I got to just burning out was when I just was on my own.

00:49:37.450 --> 00:49:45.735
You know, quite a few years ago I was an adjunct, teaching at a bunch of institutions, and when you adjunct across institutions, you don't have any engagement.

00:49:45.735 --> 00:49:51.563
You go, you teach and you run to the next college and you teach there, and there's something very weirdly isolating about it.

00:49:58.672 --> 00:50:09.177
And so I think that really leaning into community and working together as much as we possibly can is, in my mind, the only way we can make these adaptations. It's easy to say evidence-based teaching, but we have to figure out how to use community to get there, and that's really where it becomes a communal project.

00:50:09.418 --> 00:50:12.264
And I think it's important that this is a co-authored book.

00:50:12.264 --> 00:50:18.304
So Paul and I threw ideas back and forth and I think commission is probably the moment when we threw ideas back and forth the most.

00:50:18.304 --> 00:50:20.623
I think that's one of the ones we went back and forth on.

00:50:20.623 --> 00:50:22.159
I sent you something, and you're like, what about this?

00:50:22.159 --> 00:50:23.043
And you said what about this?

00:50:23.043 --> 00:50:34.126
And we had a community that was very much built into the writing process.

00:50:34.487 --> 00:50:35.568
And the last question.

00:50:35.568 --> 00:50:37.211
We'll do a little rapid-fire question.

00:50:37.211 --> 00:50:50.579
We'll keep it at 15 seconds or less, but the question is this: if there's one piece of advice each of you could give an educator who's starting to explore AI for literacy instruction, what would it be?

00:50:50.579 --> 00:50:51.365
We'll start off with you, Jason.

00:50:54.057 --> 00:50:57.360
Be as honest as you can about why literacy matters.

00:50:57.360 --> 00:50:59.521
I think it's easy to miss that.

00:50:59.521 --> 00:51:02.784
But, like, why is it important to have literacy?

00:51:02.784 --> 00:51:07.971
Or to be literate, however you want to phrase it. And, you know, how does it actually help us out in just being in the world?

00:51:08.811 --> 00:51:09.172
Excellent.

00:51:11.021 --> 00:51:15.958
Paul, I would say you can't learn to ride a bike at a seminar.

00:51:15.958 --> 00:51:24.278
So the book is good, talking about it is good, but you just have to roll up your sleeves and have a go. And as soon as you do, you realize you're not so fragile. You're not made of glass; the AI is not going to break you.

00:51:24.278 --> 00:51:27.425
Actually, it'll be a far more intuitive process than you think.

00:51:27.425 --> 00:51:31.039
So my big encouragement would be roll up those sleeves and have a go.

00:51:31.800 --> 00:51:33.481
Some great advice, gentlemen.

00:51:33.481 --> 00:51:41.934
Well, thank you so much for enlightening us with so much, you know. I'm taking it all in, and thank you for this amazing resource.

00:51:41.934 --> 00:51:52.744
Like I said, it's practical, easy to read, and it's one of those things, like you mentioned, where if you roll up your sleeves, you can easily read a chapter or a section and immediately turn around and try it out.

00:51:52.744 --> 00:51:59.416
And that's one of the things I encourage educators to do, too: just improvise, adapt, and overcome. The technology will keep coming.

00:51:59.416 --> 00:52:03.204
Take what you already know and sprinkle it onto what you're already doing.

00:52:03.222 --> 00:52:04.526
Great, and you can do it slowly.

00:52:04.526 --> 00:52:19.365
You don't have to be a speedboat; you can be a tugboat, or even a little bit of an anchor. But eventually, just slowly keep moving with the current and keep moving forward, just like these gentlemen have said, as they continue to help many educators do the same.

00:52:19.365 --> 00:52:29.981
So, thank you both, gentlemen, for being here today and sharing your knowledge, your experience, and your expertise. This is a great book, and we'll make sure we link it in our show notes.

00:52:29.981 --> 00:52:35.340
But, Paul, I forgot to ask: where can our audience members go

00:52:35.780 --> 00:52:36.282
and find this book?

00:52:36.282 --> 00:52:39.782
Great question, so glad you asked. You can head to Amazon.

00:52:39.782 --> 00:52:45.941
It's available on Amazon as a physical copy and it's also available as an ebook.

00:52:46.844 --> 00:52:51.478
Perfect, excellent. We'll make sure we link that in the show notes. All right, but, gentlemen, we're still not done.

00:52:51.478 --> 00:53:01.847
I always love to end the show with these last three questions, just to get a little more lighthearted, even though the conversation has been great.

00:53:01.847 --> 00:53:02.048
I love it.

00:53:02.048 --> 00:53:06.460
But we'll go ahead and start with you, Paul, you being a first-time guest; Jason's already kind of familiar with the questions.

00:53:06.460 --> 00:53:09.043
But, Paul, you being a first-time guest, I'm going to start with you.

00:53:09.043 --> 00:53:17.065
As we know, every superhero has an origin story, and every superhero also has a weakness or a pain point.

00:53:17.065 --> 00:53:21.958
For Superman, kryptonite was his weakness or pain point.

00:53:21.958 --> 00:53:29.860
And I want to ask you, in the current state of education, Paul, what would you say is your current edu-kryptonite?

00:53:31.784 --> 00:53:39.965
Honestly, I always feel like I'm in a rush, I feel like I don't have enough time, and that's one of the reasons I got so excited about AI.

00:53:39.965 --> 00:53:43.619
But I feel like there's a frantic nature to teaching.

00:53:43.619 --> 00:53:56.387
That frantic nature often attends the experience of most teachers: we are under the pump. You finally get your teaching and learning sequence done, and then you lose four periods to a carnival and three periods to standardized testing, and these sorts of things.

00:53:56.387 --> 00:54:02.258
So that's what I feel, but of course, that problem can quite easily be turned into a series of possibilities.

00:54:02.258 --> 00:54:06.307
I think AI helps me solve the time-crunch crisis.

00:54:06.307 --> 00:54:08.860
So, yeah, that would be my kryptonite, Fonz.

00:54:08.860 --> 00:54:10.666
Excellent, Paul. Jason?

00:54:11.896 --> 00:54:13.260
I get stuck in my own head.

00:54:13.260 --> 00:54:15.106
I have a hard time.

00:54:15.106 --> 00:54:17.619
I just get excited about things.

00:54:17.619 --> 00:54:23.503
I want to work it out fully in my own mind and I need that additional push to talk to the other person, talk to the student.

00:54:23.503 --> 00:54:31.539
So I get stuck in my own head, and sometimes that works out well, and sometimes it just means that something that hypothetically seemed like it could work just doesn't.

00:54:31.539 --> 00:54:34.865
So I'm trying to find ways to push against that, but that's my kryptonite.

00:54:35.887 --> 00:54:36.447
Oh, excellent.

00:54:36.447 --> 00:54:37.496
Thank you so much, gentlemen.

00:54:37.496 --> 00:54:38.521
Great answers, Jason.

00:54:38.521 --> 00:54:39.990
I'm going to start with you for this next one.

00:54:39.990 --> 00:54:44.304
If you could have a billboard with anything on it, what would it be and why?

00:54:45.606 --> 00:54:52.081
Process over product, just really find ways to value the present.

00:54:52.081 --> 00:54:54.788
I think many of us want to just jump past it.

00:54:54.788 --> 00:54:58.586
We want to get to the point where we figure this all out or we produce that thing.

00:54:58.586 --> 00:55:01.644
I think one of the best things we can do is just focus on the process.

00:55:01.644 --> 00:55:03.481
Students need to do that too, right?

00:55:03.481 --> 00:55:04.800
Learning is a process.

00:55:04.800 --> 00:55:08.074
And doing something, you know, you learn by doing.

00:55:08.074 --> 00:55:08.235
Right.

00:55:08.235 --> 00:55:10.945
You can't just learn to ride a bike in a seminar.

00:55:10.945 --> 00:55:15.724
You need to go, you need to do it, and that really requires this kind of process mindset.

00:55:15.724 --> 00:55:25.242
So, for me, just being able to enjoy the present, even though we all want to focus on the future and everything like that, is just so important for actually making learning meaningful.

00:55:26.105 --> 00:55:26.847
Excellent, Paul.

00:55:26.847 --> 00:55:27.327
How about you?

00:55:27.327 --> 00:55:28.519
What would your billboard say?

00:55:29.075 --> 00:55:35.963
My billboard would say education isn't about producing work, it's about producing people, and that's a big message that I'm excited about.

00:55:35.963 --> 00:55:43.076
Right, so it's not about doing the work, it's about producing a certain kind of person, and that's a vision of education I'm really excited about.

00:55:43.717 --> 00:55:44.780
Excellent, Great.

00:55:44.780 --> 00:55:45.581
All right, Paul.

00:55:45.581 --> 00:55:47.246
Last question We'll start off with you.

00:55:47.246 --> 00:55:53.528
If you could trade places with anyone, anyone at all, for one day, who would it be and why?

00:55:54.434 --> 00:55:56.079
Honestly, I'd trade places with Jason.

00:55:56.079 --> 00:56:02.365
I've never tried out higher education before and I'm hoping it comes across my path sometime.

00:56:02.365 --> 00:56:08.043
I've never taught in a university and as I look into my future, I think, hmm, yeah, that would be pretty interesting.

00:56:08.043 --> 00:56:09.385
Maybe I'll give that a go one day.

00:56:09.914 --> 00:56:10.396
There you go.

00:56:10.396 --> 00:56:11.298
Excellent Jason.

00:56:11.298 --> 00:56:12.159
How about yourself?

00:56:12.159 --> 00:56:15.066
Who would you switch places with for a day?

00:56:15.166 --> 00:56:18.182
That is a nice answer, and I'm not going to give a nice answer.

00:56:18.182 --> 00:56:20.383
I'm going to give one that's just completely random.

00:56:20.383 --> 00:56:39.039
If I could trade places with one person, it would be Christopher Nolan, which is very weird, because he is the only person I can think of who gets these really massive projects and can kind of go wherever he wants with them, and so I'm very, very envious of him being able to do that, and envious of his artistry.

00:56:39.039 --> 00:56:45.471
It would be cool to just have this limitless amount of money where you can pursue really, really ambitious things.

00:56:45.471 --> 00:56:50.663
It also gets me away from education, but I love movies, so there's that.

00:56:50.702 --> 00:57:02.300
Well, gentlemen, again, it was a pleasure speaking with you: Paul Matthews joining us on an early Saturday morning from Australia, and Jason joining us here from the States. What an international show!

00:57:02.300 --> 00:57:03.124
This is wonderful.

00:57:03.124 --> 00:57:07.943
Thank you both for this great experience and thank you both for what you continue to do.

00:57:07.943 --> 00:57:16.762
For all our audience members: if you're not following Paul or Jason on socials, mainly on LinkedIn, they put up some great stuff there.

00:57:16.762 --> 00:57:18.146
Make sure you follow them.

00:57:18.146 --> 00:57:25.166
If you have any questions, just put them in the comments; they're very gracious about answering, and all they want to do is really help.

00:57:25.327 --> 00:57:27.416
And, of course, you've got two different viewpoints.

00:57:27.416 --> 00:57:34.427
You've got, you know, the high school perspective, in that K-12 space, and then, of course, you've got the higher ed space as well.

00:57:34.427 --> 00:57:35.990
So please make sure you reach out to them.

00:57:35.990 --> 00:57:58.084
And, of course, guys, if you want more episodes and more high-quality content and conversations like this, please make sure you visit our website at myedtechlife, where you can check out this amazing episode and the other 317 episodes, where, I promise you, you will find a little something that you can sprinkle onto what you are already doing.

00:57:58.084 --> 00:57:59.347
Great.

00:57:59.347 --> 00:58:03.383
And, as always, thank you so much to our sponsors: Book Creator, Yellow Dig, and EduAide.AI.

00:58:03.383 --> 00:58:08.222
Thank you so much for believing in our mission and thank you so much for making these shows happen.

00:58:08.222 --> 00:58:11.976
We really appreciate you from the bottom of our hearts, my friends.

00:58:11.976 --> 00:58:42.135
As always, until next time, don't forget: Stay Techie!
Jason Gulya

English Professor

I'm an English Professor at Berkeley College, where I teach anything related to writing and the humanities.

Lately, my focus has been on encouraging higher education faculty and staff to incorporate AI into their workflows and teach responsibly and selectively.

Paul Matthews

Author

Paul is an Australian teacher, consultant, and TEDx speaker who is dedicated to partnering with schools to navigate the challenges and opportunities presented by Artificial Intelligence. Paul is passionate about providing practical strategies to help teachers save time, be more creative, and have a bigger impact in the classroom.