WEBVTT
00:00:30.096 --> 00:00:33.557
Hello everybody and welcome to another great episode of my EdTech Life.
00:00:33.557 --> 00:00:39.109
Thank you so much for joining us on this wonderful day and wherever it is that you're joining us from around the world.
00:00:39.109 --> 00:00:42.189
Thank you, as always, for all of your support.
00:00:42.189 --> 00:00:44.627
As always, we appreciate all the likes, the shares, the follows.
00:00:44.627 --> 00:00:46.593
Thank you so much for interacting with our content.
00:00:49.719 --> 00:00:54.152
We really appreciate your support, and I definitely want to give a big shout-out to our newest supporter, Book Creator.
00:00:54.152 --> 00:01:05.543
Thank you so much for supporting our mission and believing in what we're doing and bringing some amazing conversations into the education space so that we may all continue to grow together.
00:01:05.543 --> 00:01:14.004
And I'm really excited about today's conversation. As always, like I said, part of having your own podcast is being able to look for guests.
00:01:14.004 --> 00:01:18.346
It's always amazing when things line up and you get to bring on a guest that
00:01:18.346 --> 00:01:34.396
you follow on a certain platform, whose views and posts you're really interested in and intrigued by, and whose conversations and voice you want to bring and amplify here at our table, the My EdTech Life conversation table. And I'm really excited to welcome today
00:01:39.980 --> 00:01:40.361
Mr Rob Nelson.
00:01:40.361 --> 00:01:41.224
Rob, how are you doing today?
00:01:41.245 --> 00:01:43.112
I'm doing great, Fonz. Thanks so much for having me on the show.
00:01:43.112 --> 00:01:45.680
I'm a big fan of what you do here.
Thank you very much, Rob, and I'm a big fan of what you do.
00:01:45.680 --> 00:01:48.888
And of your posts on your blog and, of course, on LinkedIn.
00:01:48.888 --> 00:02:00.165
And, of course, as you know, from 2022 onward it has been just so much content for us, so much conversation, sparking so many things and so many new ideas.
00:02:00.165 --> 00:02:10.171
And we're going to get into that, because I really love your take, your perspectives, and, like I said, you being a cautious advocate, kind of in the middle of things.
00:02:10.171 --> 00:02:21.770
You know, I always love to bring various viewpoints to the table when we are talking about AI.
00:02:21.770 --> 00:02:37.524
So, before we dive in, Rob, for all my audience members out there listening at this moment who may not be familiar with your work just yet (after today they will be), can you give us a brief introduction and your context within the education space?
00:02:38.286 --> 00:02:38.485
Sure.
00:02:38.485 --> 00:02:45.941
So my ed tech life was spent as an educational bureaucrat in the provost's office at the University of Pennsylvania for 18 years.
00:02:45.941 --> 00:02:47.949
So at heart I'm a teacher.
00:02:47.949 --> 00:03:14.270
I started out as a teacher, but I learned early on that teaching will burn you out, and so I made a decision when I finished my PhD at Rutgers to go into administration. I oversaw academic technology for the provost's office at Penn for 18 years, and that meant implementing enterprise technology: things like Canvas, course evaluations, grad admissions applications. So that's really where my professional experience lies.
00:03:14.270 --> 00:03:17.260
And then I also taught part-time and continue to teach part-time.
00:03:17.260 --> 00:03:20.368
And recently, as you said, I've made a transition.
00:03:20.368 --> 00:03:29.967
I left my job as a bureaucrat and now I write full-time and still teach on the side and do a little consulting and public speaking as a way to pay the bills.
00:03:30.909 --> 00:03:31.372
Excellent.
00:03:31.372 --> 00:03:37.272
Well, that is a great background which, in a very natural way, is a nice segue to my first question.
00:03:37.272 --> 00:04:01.173
Based on all the experience that you just finished sharing with us, I wanted to ask you: how have those experiences shaped your perspective on AI in education, and from your years overseeing technologies and initiatives and so on, what are some things that we may be missing in today's AI in education discussions?
00:04:02.699 --> 00:04:14.069
Yeah, I think it comes down to the adoption of new technologies. When a new technology comes out, it sort of explodes on the scene and the early adopters and enthusiasts start talking about it.
00:04:14.069 --> 00:04:22.564
It's often in the context of like, wow, this is going to change everything and it feels like it's going to be instant because it's so obvious to the early adopters that this is going to be so great.
00:04:22.564 --> 00:04:26.012
Sometimes it turns out to be great and sometimes it turns out to be kind of a dud.
00:04:26.012 --> 00:04:37.446
So, for example, I think the MOOCs, the massive open online courses, were a technology that everybody looked at and thought, wow, this is going to disrupt education.
00:04:37.446 --> 00:04:40.648
Places like Harvard and Penn are just going to disappear.
00:04:40.648 --> 00:04:46.742
That turned out not to be the case and I think you know we can talk about that specific case.
00:04:46.742 --> 00:05:00.630
But I think the lesson I've drawn from my years implementing technology on campus is that change happens at the speed of universities and colleges and schools, not at the speed that technology companies and technology enthusiasts would necessarily expect.
00:05:02.740 --> 00:05:04.889
You know, and that's something that's very interesting, like you said.
00:05:04.889 --> 00:05:16.791
You know, for those early adopters, it just seems like any new technology that comes out is going to be, you know, the solution to all our inefficiencies.
00:05:16.791 --> 00:05:26.192
And, of course, in education right now I'm just going to go ahead and throw it out there in the state of Texas, we're getting ready in the next couple of weeks to start, you know, state testing.
00:05:26.192 --> 00:05:44.250
Prior to this, a couple of weeks ago, we started seeing schools around the district purchasing specific platforms, kind of a last-minute hurrah, you know, trying to get those grades up, and pretty much it's just triage to make sure that the students do well.
00:05:46.346 --> 00:06:10.312
But you know, one of the things is that, from my years of experience, it just seems that, starting in January, everybody's trying to find that one answer. And I think the same thing happened with AI when it came out in 2022: even until now, you still see and hear, yes, this is going to change things, this is going to change things. And from '22 till now, I feel that there still hasn't been enough research.
00:06:10.312 --> 00:06:24.528
But I'm still interested to see some of that research and see if grades are going up, because, of course, like you mentioned with new technologies, I've seen the iPads roll out and everybody go one-to-one, and this was going to revolutionize education and get those test scores up.
00:06:24.990 --> 00:06:26.232
Haven't really seen that either.
00:06:26.232 --> 00:06:34.526
Chromebooks, everybody's one-to-one, and the Chromebooks are going to be the next best thing and they're going to help our students and I personally have not seen that either.
00:06:34.526 --> 00:06:52.404
So it's very interesting that you do mention that, especially with your experience in higher ed. I did have Jason Gulya on the show yesterday, so I want to get your perspective on what you have seen and experienced in higher education, that divide, you know.
00:06:52.404 --> 00:06:55.331
So I want to hear what has been your experience.
00:06:57.254 --> 00:06:57.454
Sure.
00:06:57.454 --> 00:07:01.331
So I think it comes down to some of the different perspectives I was talking about.
00:07:01.331 --> 00:07:16.040
Educational technologists, the people who develop and build new technology, are, I believe, in most cases earnestly trying to solve problems, but they see problems differently than the way that classroom teachers see them, and Jason's a good example of
00:07:16.040 --> 00:07:16.742
somebody who talks about this.
00:07:16.742 --> 00:07:21.302
Dan Meyer, who I know you've had on the show, is another great example of somebody who sees that difference.
00:07:21.302 --> 00:07:43.271
And so I think what I've seen is a lot of polarization: people who are rightly concerned about the way that Silicon Valley in particular, the big giant technology companies, are using not just their financial capital but the cultural and social power they have to impose a vision of what this technology is supposed to do.
00:07:43.271 --> 00:07:47.129
And then you've got people who are resisting that, and I'm certainly among those.
00:07:47.250 --> 00:07:52.468
I think there's a great deal of concern about the social and economic context in which this stuff is happening.
00:07:52.468 --> 00:08:02.485
But at the level of classroom instruction, I think what's happening is, for the most part, teachers are coming to grips with this technology the way they have all the technologies you've talked about.
00:08:02.485 --> 00:08:12.541
You know, everything from iPhones to iPads to the PC, Chromebooks, laptops, going back through the earlier history of technology.
00:08:12.541 --> 00:08:24.529
Even the adoption of chalk and chalkboards. Those are changes that technology has pushed on or confronted teachers with, and they've responded and turned those tools into things that are useful.
00:08:24.529 --> 00:08:26.326
I think that's the process.
00:08:26.326 --> 00:08:26.908
That's what I see.
00:08:26.908 --> 00:08:42.650
That's the positive aspect of what's happening right now: I see lots of teachers experimenting with these tools, trying to figure out what their value is as an educational tool, as opposed to what the people who built them think they should be used for, how they imagine it's going to play out.
Nice, excellent.
00:08:42.931 --> 00:08:43.172
All right.
00:08:43.172 --> 00:08:52.265
So I want to talk a little bit about your writing as well.
00:08:52.265 --> 00:09:03.544
I know that you have AI Log on Substack, which I follow, where you post and share a lot of great views. But one that has drawn my attention, and I wanted to ask you about it today, was the way that you use the rabbit and the duck metaphor.
00:09:03.544 --> 00:09:12.589
We have the rabbit of glad tidings and the duck of doom, so can you elaborate on this dual perception and how it impacts education policy and decisions?
00:09:13.431 --> 00:09:13.711
Sure.
00:09:13.711 --> 00:09:21.383
So the duck-rabbit is a famous example of what's sometimes called an ambiguous picture.
00:09:21.383 --> 00:09:24.769
When you look at the picture, you can make it look like one thing or the other.
00:09:24.769 --> 00:09:49.028
And the duck-rabbit is a famous one because the philosopher Wittgenstein used it in some of his work. And the way that teachers received this was certainly in the realm of,
00:09:49.139 --> 00:09:50.364
Oh my God, this is changing everything.
00:09:50.364 --> 00:09:51.729
It's awful, we need to stop it.
00:09:51.729 --> 00:09:55.726
We need to stop our students from using it. But then you've got people who see it as:
00:09:55.726 --> 00:09:57.025
Well, this is a new technology.
00:09:57.025 --> 00:10:02.032
It's an exciting way to understand and use knowledge.
00:10:07.157 --> 00:10:12.129
These models, these large language models, summarize information and spit it back out in ways that are interesting and potentially educational.
00:10:12.129 --> 00:10:18.110
And so, seeing those two aspects: this thing is very threatening to our jobs.
00:10:18.110 --> 00:10:19.525
You know AI is going to replace this.
00:10:19.525 --> 00:10:22.009
That message gets repeated over and over again.
00:10:22.009 --> 00:10:28.206
On the other hand, AI is going to save us all this work because it can do the boring stuff so that we can turn our attention to the important work.
00:10:28.206 --> 00:10:33.272
So that's the sort of swirl of what's been happening.
00:10:33.272 --> 00:10:38.609
I think that image of the duck-rabbit is a nice way to say it's both those things and something else, something new.
00:10:40.503 --> 00:10:55.350
Now, with your experience, and I know that you're a speaker now and you go out there and train, or people go and listen, I want to ask you: when you talk about this, one of the things that you mentioned is that duality of this being both things. One side says AI is going to take our jobs.
00:10:55.350 --> 00:11:09.908
And then there's the other side that says no, no one's going to take your job, only somebody else that uses AI effectively will.
00:11:09.908 --> 00:11:20.447
And then one side, I see, is also just playing into the fear of this: if you haven't been using this already, you're doing your students a disservice.
00:11:20.447 --> 00:11:32.787
And then there's that side. And so, like I said, what is it that you're seeing out there, and how do you bring that together for yourself to say, okay, where is it that I stand on these issues?
00:11:33.809 --> 00:11:38.142
Yeah, well, I think we still have an enormously wide range of people responding to this.
00:11:38.142 --> 00:11:45.267
There are still people who have never really used one of these tools before, and so they're getting all their information secondhand.
00:11:45.267 --> 00:11:54.750
On the other hand, you have people who have been using them from the very beginning to try to do interesting or educational work with them, and so I think that challenge remains.
00:11:54.750 --> 00:12:10.254
But what I've seen over time, I mean, it's been two and a half years, basically, since ChatGPT sort of exploded on the scene, and that's not a lot of time, but it's time enough for people to have moved away from the sort of freak-out modes that we saw.
00:12:11.277 --> 00:12:28.787
And when I've been giving talks lately, I've used the image of: maybe it's a revolution, but maybe it's going to be a boring revolution. The notion that this is going to be transformative, that we're going to see superintelligences appear and everything's going to change overnight, just isn't how we're going to experience this.
00:12:28.787 --> 00:12:32.769
Much like things like the iPhone and the PC.
00:12:32.769 --> 00:12:55.351
They feel and are transformative and they're going to change things, but that process takes place over a period of time, and there are adjustments. What I really believe is important is that we use our critical thinking skills and analysis to think about how we use these things to improve education and to improve organizations, and that's where I think the conversation is starting to move.
00:12:56.320 --> 00:13:02.364
That's excellent and so good to hear, because there's webinar after webinar that I'll sit in on as well.
00:13:02.364 --> 00:13:26.605
And I don't know if it's so much more in the K-12 space as it is in the higher ed space, because I'm not involved in webinars at that level, but in the K-12 space there just seems to be this sense of urgency like hey, if you're not doing this, you're doing your students a disservice, because this is the way that they're growing and this is the way that they're going to need these skills to get into college and for the jobs of tomorrow.
00:13:26.605 --> 00:13:32.168
And it's almost this attitude of move fast and break things, as opposed to simply,
00:13:32.168 --> 00:13:40.734
like you said, it can revolutionize, and it will, but maybe in a slower, more boring and, you know, calmer process.
00:13:40.734 --> 00:13:44.456
But it just seems like it's a go-go-go mentality.
00:13:44.557 --> 00:13:58.506
And going to conferences, you see some of the top platforms out there that are just pushing this so much, and I feel sometimes they may even be preying on some educators, you know, on the burnout aspect.
00:13:58.506 --> 00:14:00.176
Like this is going to save you that time.
00:14:00.176 --> 00:14:01.681
This is going to give you back that time.
00:14:01.681 --> 00:14:10.206
This is going to do this and this is going to do that, but you have to pay us, you know, I don't know X amount of dollars per license, per site and so on.
00:14:10.206 --> 00:14:17.912
And a lot of educators are like, oh my gosh, I really need this, because they feel overwhelmed and they feel like this is going to be that sense of relief.
00:14:17.912 --> 00:14:25.557
Now, in higher ed, is there kind of that mentality, or maybe a crowd like that as well?
00:14:26.740 --> 00:14:27.140
Absolutely.
00:14:27.140 --> 00:14:44.173
I mean, higher ed has sort of the same pressures that K-12 has; you know, it's cast in terms of business needs, right, businesses need graduates with these skills. And I think Silicon Valley, I mean, you mentioned the famous Zuckerberg, or Facebook, idea: move fast and break things.
00:14:46.519 --> 00:15:01.157
They're very much using that as a marketing term, right, as a way of marketing these projects and trying to raise revenue. They have invested huge amounts of capital in this technology and they need to get a return, and that's a big problem for them.
00:15:02.019 --> 00:15:05.409
It's not the problem that K-12 or higher ed is trying to solve.
00:15:05.409 --> 00:15:40.606
We're trying to figure out how to educate students for this new society, for the changes that are coming, for the way this is going to transform the work of people who work in knowledge, business, and education. So I think refocusing our attention on what it is we need these tools to do, and how we get the tools to do what we want them to do, really cuts against the grain of the way that Silicon Valley, the giant technology companies, and a lot of the startups in that move-fast-and-break-things movement are talking about it. And so there's that misalignment.
00:15:40.606 --> 00:15:55.167
I think, slowly but surely, teachers and institutions are coming around to establishing their domain, their ability to control or decide how these things get deployed.
00:15:56.051 --> 00:15:56.472
Excellent.
00:15:56.472 --> 00:16:08.735
Now, talking about deployment, kind of going back to your writing: for those listeners that are joining us today or watching us on the replay, please make sure that you do check out Rob's Substack.
00:16:08.735 --> 00:16:19.620
He has great writing there, and a lot of these questions just go a little bit deeper into the writings that he has available to all of us, which is a great resource.
00:16:19.620 --> 00:16:23.033
So please, I mean you know we'll definitely link it in the show notes as well.
00:16:23.033 --> 00:16:30.308
But, Rob, I wanted to ask you: you have this two-part series, What Is an LLM Doing in My Classroom?
00:16:30.308 --> 00:16:38.669
So I want to ask you how do you perceive the role of large language models in enhancing or possibly hindering the learning process?
00:16:39.793 --> 00:16:41.336
Yeah, thanks for asking about that.
00:16:41.336 --> 00:16:46.393
So that series is actually going to wrap up, maybe tomorrow, but certainly next week.
00:16:46.393 --> 00:16:52.062
I've got one more piece to write about for that, and that was really a reflection on my own practice in the classroom.
00:16:52.062 --> 00:17:27.294
I'm a history teacher and the history I teach right now is in the grad school of education at Penn, and I'm teaching mostly aspiring educational bureaucrats, people who want to go into administration, and so I'm enormously lucky to be teaching in that environment, and one of the ways I was lucky is that, unlike a lot of teachers, I didn't have to figure out how to get one of these commercial tools to use, because there's a research center on campus that was willing to work with me so that I could use this technology in my teaching.
00:17:27.734 --> 00:17:29.238
I'm somebody who believes in
00:17:29.238 --> 00:17:30.848
what's often called a flipped classroom.
00:17:30.848 --> 00:17:35.624
I call it structured activities, and so my class is activities-based.
00:17:35.624 --> 00:17:44.874
It's very much student-focused, and so I treated the use of the LLM not as something I had to decide, but as something the students and I could work out together.
00:17:44.874 --> 00:17:49.186
How is this going to influence their work?
00:17:49.186 --> 00:17:52.112
How is it going to be valuable to them as an educational tool?
00:17:52.112 --> 00:18:00.290
How is it going to, you know, frustrate those aspirations or aims, and so we worked with that tool.
00:18:00.290 --> 00:18:05.209
It's called GPTA and it's basically an LLM-based assistant.
00:18:05.209 --> 00:18:06.813
They call it an assistant.
00:18:06.813 --> 00:18:20.414
I think of it as just a tool, just like a chalkboard or a pointer or anything else that you use in a classroom, and so we used that technology together, and those essays are reflections on how that went.
Excellent.
00:18:20.694 --> 00:18:21.897
So what is it?
00:18:21.897 --> 00:18:22.358
Do you know,
00:18:22.358 --> 00:18:26.135
as far as the essays that you've written and looking into that,
00:18:26.135 --> 00:18:33.230
what do you see as far as this possibly hindering, or possibly enhancing, the potential of learning in the classroom?
00:18:34.231 --> 00:18:34.512
Yeah.
00:18:34.512 --> 00:18:41.539
Well, I think it starts with again resisting that notion of it being like a teaching assistant or a teacher.
00:18:41.539 --> 00:18:43.666
These things are not going to replace teachers.
00:18:43.666 --> 00:18:51.510
There's just no way, and Dan Meyer is one of the best at describing that difference between what a tool is and what a teacher is.
00:18:51.510 --> 00:18:57.690
But so, taking that as our base, like, okay, this is a tool, what is it and what is it going to do for us?
00:18:57.690 --> 00:18:58.755
What value does it bring?
00:18:58.755 --> 00:19:07.357
I think what I discovered through this process is that students are very much able to make decisions about their own education.
00:19:07.357 --> 00:19:15.790
I believe that, and I see evidence of it in their activities, and so they were making choices, with my guidance, about how to use it, and it wasn't to write their essays for them.
00:19:16.932 --> 00:19:24.049
I am very confident that the students were not using this tool simply to replace the educational work that they needed to do.
00:19:24.049 --> 00:19:26.596
It was instead an additional resource.
00:19:26.596 --> 00:19:34.326
So one of the ways we used the tool was to add a layer of review to a peer-review-driven process.
00:19:34.326 --> 00:19:39.931
So in my classes, we write a long research essay about an institution of higher education.
00:19:39.931 --> 00:19:44.567
So one of the ways we used the LLM was this:
00:19:44.646 --> 00:19:51.405
before they sat down with their student peer-review group, they had the LLM do a review of their essay.
00:19:51.405 --> 00:20:06.067
It was trained, post-trained, I should say, on my rubric, the sort of language that I use and that I want them to use too when they're evaluating a piece of writing, and the tool did that for them.
00:20:06.067 --> 00:20:09.435
I printed that out with the papers.
00:20:09.435 --> 00:20:14.191
And the peer review wasn't just about a student's reaction to the paper.
00:20:14.191 --> 00:20:26.917
It was about the student's reaction to the paper and then this initial machine review of the paper that they could incorporate into their own analysis of their peers' paper and the feedback they were going to give to the student.
00:20:58.015 --> 00:21:07.979
All right, and so for a lot of your students that are working with you in this class, did a lot of them already come in with experience using large language models?
00:21:07.979 --> 00:21:11.837
For some of them, was this their first time and what were some of the reactions there?
00:21:11.837 --> 00:21:17.180
If you can share that with us, because, again, I definitely want to get that perspective for our K-12
00:21:17.180 --> 00:21:25.285
educators that are saying, you know, we need to prepare them now for higher ed as well, if higher ed is starting to adopt this.
00:21:25.285 --> 00:21:29.180
So what were some of the reactions there from some of your students?
00:21:30.241 --> 00:21:43.532
Yeah, to start with, I tried my best to make it clear that I wasn't going to be surveilling what they do, I wasn't going to be looking over their shoulder, that I wanted it to be a space of genuine experimentation.
00:21:43.532 --> 00:21:47.305
So I wasn't going to put limits on their use, tell them they couldn't use it for this, couldn't use it for that.
00:21:47.305 --> 00:22:05.584
But I did want them to come to class to share what their experience was, and initially, almost all of them had had pretty significant experience with ChatGPT and they used it, like many of us do, as just a sort of replacement for Google Search, a natural language interface to the internet.
00:22:05.584 --> 00:22:06.936
And I think that's one of the ways.
00:22:06.936 --> 00:22:17.640
I mean, that's definitely one thing that these models do: they provide a natural language interface to lots of information, including the ability to search the internet.
00:22:17.640 --> 00:22:24.541
And so we started with that sort of baseline, and the question I kept asking is what value can we get out of this tool?
00:22:24.541 --> 00:22:26.686
What educational value can we get out of this tool?
00:22:27.615 --> 00:22:32.882
And it's very clear when you frame it that way that simply having it do your work for you is not going to provide educational value.
00:22:32.882 --> 00:22:42.023
There's no value there. So we turned from, okay, it can be used to cheat or to replace the output you need to give your teacher, to:
00:22:42.023 --> 00:22:43.922
How can we turn this into a process?
00:22:43.922 --> 00:22:45.718
And what do those processes look like?
00:22:45.718 --> 00:22:47.442
And I think there are lots of examples.
00:22:47.442 --> 00:22:50.635
You had Mike Kentz and Nick Potkalitsky on the show.
00:22:50.635 --> 00:23:07.317
They're examples of people working in K-12, or very familiar with the K-12 environment, who are doing this same kind of work. And so I think it's about working with your students in groups, learning as a social activity, and taking that social nature and really just experimenting with the tools.
00:23:07.317 --> 00:23:14.146
And, like I said before, I think we're still in the early days of this, and so we're still finding out a lot about the educational potential for these tools.
00:23:14.914 --> 00:23:15.135
Excellent.
00:23:15.135 --> 00:23:33.561
Well, that's so good to hear, and of course, just for them, getting that experience and seeing this as maybe a higher-level experience, as opposed, like you mentioned, to just using ChatGPT as a kind of Google Search, but now really going in deeper and seeing what can be done.
00:23:33.561 --> 00:23:34.124
So that's fantastic.
00:23:34.124 --> 00:23:35.185
But that kind of brings me to my next question.
00:23:35.185 --> 00:23:37.511
You know, being that you are an educator as well,
00:23:37.511 --> 00:23:42.086
I know that you've written extensively about anthropomorphizing AI.
00:23:46.775 --> 00:23:54.862
Now, there was a post recently that I read, where somebody posted, you know, things that I love to use AI for, and one of the reasons they put was historical figures, you know.
00:23:54.862 --> 00:24:09.744
And then, of course, there was another gentleman that I follow as well who was just opposed to it, as far as, you know, leaning into this, where now you're talking to this historical figure, and the dangers that can come about.
00:24:09.744 --> 00:24:15.461
So what are your thoughts on that as far as anthropomorphizing, and what has been your experience with that?
00:24:17.035 --> 00:24:18.320
Yeah, thank you for that question.
00:24:18.320 --> 00:24:19.498
As a historian,
00:24:19.498 --> 00:24:22.747
it's something I care deeply about and have thought a lot about.
00:24:22.747 --> 00:24:35.663
Some of my earliest writing was about my experiments with Khanmigo's tool that allows you to chat with a historical figure or a literary character.
00:24:35.663 --> 00:24:54.720
So John Warner, who I hope you have on your show sometime because he's a truly great writer on these topics, calls it digital necromancy: in other words, the sense that you can revive a historical figure using an LLM.
00:24:54.740 --> 00:24:57.045
I think that's just the wrong way to think about what these tools are.
00:24:57.045 --> 00:24:59.537
You know, the whole project of artificial intelligence
00:24:59.537 --> 00:25:09.782
has been built around this metaphor that a thinking machine is like a human mind, and it's gotten us some great new tools.
00:25:09.782 --> 00:25:17.256
But I think it's a fundamentally flawed way of thinking about this in the context of education, because, of course, you're not talking to another person.
00:25:17.256 --> 00:25:33.997
You can pretend that it is, but it's simply a machine, and I think there are just much better ways to think about how we use a cultural technology, like a large language model, than having it pretend to be a person, and so that's where I start.
00:25:33.997 --> 00:25:43.246
If all we're doing with these tools is pretending they're people, an assistant, a dead historical figure, then we're missing a lot of their potential use.
00:25:47.727 --> 00:25:56.026
Yeah, and you know, that's something that I see often in the K-12 space: there are a lot of platforms out there that will offer these chatbots, and then, of course, teachers put in information and so on.
00:25:56.026 --> 00:26:05.628
And I know one of the comments was like well, this is what we can be doing, you know, and really getting the students to know more about history and learning more about history.
00:26:05.628 --> 00:26:21.936
And to me it just seems, many times, like you mentioned, being that it is a large language model with a lot of data sets that go in there, my thinking is always: well, whose history is it sharing, and what viewpoints, and so on and so forth? And there's always the concern about bias as well.
00:26:21.936 --> 00:26:25.701
And so, you know, for me that's just one of the biggest things.
00:26:25.761 --> 00:27:19.461
But also, there might be an attachment, you know, to a certain LLM, like we see now with a lot of platforms, like Character.AI; we can't deny that that was in the news last October and so on. And when we start seeing these chatbots, and people start having parasocial relationships with them and thinking it is another human being, like, hey, I can do this at home, I can do this on my own, that can lead into other dangerous aspects of using AI as well. So thank you so much for sharing that. Which kind of brings us now, in talking about this, to two very different viewpoints.
00:27:19.461 --> 00:27:24.760
You know that could be very polarizing and I wanted to talk to you about this because I love the way that you put this in your writing.
00:27:24.760 --> 00:27:30.088
It's like the AI Fight Club, you know, and it just sounds like, wow.
00:27:30.088 --> 00:27:31.698
So I want to talk a little bit about that.
00:27:31.698 --> 00:28:26.297
You know, as far as education, and we talked a little bit about it in the pre-show, there are two sides, and sometimes it could be very rightly divided.
00:28:26.297 --> 00:28:45.427
But, like I mentioned, I would love to see myself, and I think others see me, as kind of an in-the-middle, cautious advocate, trying to bring both viewpoints to the table to share and see and learn, and to see how we can maybe bridge some gaps there and so on.
00:28:45.427 --> 00:28:50.786
But I want to get your perspective and what was the inspiration behind this term?
00:28:50.846 --> 00:28:51.889
The AI Fight Club.
00:28:54.896 --> 00:28:56.100
Yeah, so I should be clear.
00:28:56.100 --> 00:28:57.084
This is not my term.
00:28:57.084 --> 00:29:04.008
I am borrowing it from one of the best writers on the topic of large language models as a cultural technology.
00:29:04.008 --> 00:29:05.028
His name's Henry Farrell.
00:29:05.028 --> 00:29:27.919
He's a political scientist at Johns Hopkins and he has a blog on Substack called Programmable Mutter, and we'll make sure that goes in the show notes, because I want to make sure that he gets the attention for having come up with this great metaphor. He says that it's an example of the way that these things are being polarized.
00:29:27.919 --> 00:29:29.522
Right, there's this.
00:29:29.662 --> 00:30:03.383
We talked already about the dynamic of enthusiasts and resisters, and a lot of that gets caught up in the power around Silicon Valley and the power of educational technologies. And it isn't obviously just this question; there's a whole way in which these social questions and educational questions get polarized, and that happens around particular approaches to writing or to learning comprehension, like all the wars around basic pedagogical methods. I just think we need to back off of that.
00:30:03.824 --> 00:30:09.884
I think you said something earlier about there not being one best method or one best set of practices.
00:30:09.884 --> 00:30:30.143
We just need to sort of open ourselves up to pluralism and to think that it's perfectly okay for a student to come to my classroom and be given free rein with these tools to explore their educational potential, and then go to somebody else's class and be constrained and told, no, we're not going to use those tools for this educational experience.
00:30:30.143 --> 00:30:40.922
That sort of pluralism, that notion that we are trying to work towards an understanding of this that's shared, as opposed to: ah, I figured this out, I'm right.
00:30:40.922 --> 00:30:43.147
I'm going to tell you what you have to do.
00:30:45.296 --> 00:30:53.728
You know, and like I said, that's something that comes up in a lot of conversations that we have, and of course on LinkedIn, where you always have those great conversations as well.
00:30:53.728 --> 00:31:05.568
And again, I definitely see that there are those two sides, and like I mentioned earlier, there's that move-fast-and-break-things kind of fear: your kids are missing out, you're doing them harm.
00:31:08.021 --> 00:31:15.343
And then there are the others that say, okay, let's wait and see, let's make sure that there's more research out there, and so on, and it's just trying to bring those parties together.
00:31:15.343 --> 00:31:17.375
And then, just like you mentioned, understanding that there's more than one way.
00:31:17.535 --> 00:31:37.035
I know recently I was at a conference in Puerto Rico, and with the keynote speaker, it just seemed like everybody in the room was very quiet because of the amount of fear that was put into the educators: if you haven't been doing this, if you don't have this and this and so on, your students are already going to be left behind.
00:31:37.035 --> 00:31:40.786
And the teachers are like, well, just kind of taking it all in.
00:31:40.786 --> 00:31:55.566
And once I went up there, we had a panel and the same speaker was there, and just to bring some peace to the teachers, I said, listen, we're all at varying levels in this trajectory, this journey that we're all moving through together.
00:31:55.566 --> 00:32:15.036
Of course, renee Dawson I you know she's great and she says, you know, there's the speedboats, there are the tugboats, and then there are the anchors.
00:32:15.036 --> 00:32:16.629
The speedboats are going to take off, they're going to roll with it, they're going to be able to do some great things and add it to their practice immediately.
00:32:16.629 --> 00:32:19.538
Then you've got, you know, the tugboats that are like, okay, let's check this out, let's see what I can do, a wait-and-see kind of attitude, but you're still moving forward.
00:32:19.538 --> 00:32:22.249
And then, of course, you've got those that will highly resist this.
00:32:22.249 --> 00:32:29.846
And, you know, slowly, as the tugboats start kind of tugging on and kind of moving away, they kind of start at least moving towards that.
00:32:29.947 --> 00:32:36.948
But that's one of the things that I always say like everybody is in a different, you know, situation.
00:32:36.948 --> 00:32:43.488
They're in a different place as far as their learning path is concerned, but we'll all eventually get there.
00:32:43.488 --> 00:32:48.493
But I just don't like that fear that is being put into the teachers as well.
00:32:48.493 --> 00:32:56.480
So that's something that I wanted to talk to you about, so thank you so much for sharing that. And now I want to ask you about moving beyond hype and fear.
00:32:56.480 --> 00:33:01.514
You know, one of the things is that you use the word changing, rather than transforming,
00:33:01.514 --> 00:33:07.143
disrupting, or revolutionizing, when discussing AI in education.