April 4, 2025

Episode 319: Rob Nelson

Episode 319: AI, Education, and Moving at Human Speed with Rob Nelson

In this powerful episode of My EdTech Life, I sit down with educator, writer, and higher ed tech veteran Rob Nelson to explore the real story behind AI in education. Rob challenges the “move fast and break things” mentality and calls for a more human-paced, thoughtful approach to integrating AI in classrooms.

We explore what it really means to Tinker Toward Utopia, how large language models are reshaping student learning (when used intentionally), and why educators must resist pressure from hype-driven platforms. This episode is packed with clarity, caution, and hope for anyone navigating the fast-moving world of AI in education.

Links to the publications mentioned in this episode:
Henry Farrell's blog, Programmable Mutter
Tinkering Toward Utopia by David Tyack and Larry Cuban
A Voice from the South by Anna Julia Cooper

And, of course, Rob's Blog
Website: ailogblog.com

👇 Timestamps:

00:00 – Welcome & Rob’s background in higher ed
04:00 – AI adoption: Hype vs. reality
09:00 – Duck-Rabbit duality: Two ways to see AI
12:00 – Using LLMs to support—not replace—teaching
25:00 – The danger of humanizing AI too much
28:00 – The AI Fight Club: Polarization in the space
33:00 – Why rushing into AI contracts can backfire
38:00 – Tinkering toward utopia: A better path forward
44:00 – Final reflections & rapid-fire Q&A

🙏 Special thanks to our amazing sponsors who help make these conversations possible:

📚 Book Creator – Empowering student voice and creativity.
💬 EDU Aide – Simplifying communication, saving educators time.
🌐 Yellowdig – Building communities that spark meaningful learning.

Please show them some love!

🔗 Catch more episodes at: www.myedtech.life
Support the show: https://www.buzzsprout.com/2395968/support

💬 Let’s keep changing education—at the speed of people.

Stay Techie! 

Authentic engagement, inclusion, and learning across the curriculum for ALL your students. Teachers love Book Creator.

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:30 - Welcome and Introduction

01:32 - Rob Nelson's Background in EdTech

04:02 - Technology Adoption in Education

07:38 - The Two Sides of AI in Education

13:16 - Duck-Rabbit Metaphor for AI Perception

22:09 - LLMs in the Classroom Experience

29:22 - Anthropomorphizing AI and Historical Figures

34:03 - The AI Fight Club and Polarization

37:04 - Moving at Human Speed in Technology Adoption

43:06 - Resource Allocation and Institutional Partnerships

48:52 - Final Questions and Closing

WEBVTT

00:00:30.096 --> 00:00:33.557
Hello everybody, and welcome to another great episode of My EdTech Life.

00:00:33.557 --> 00:00:39.109
Thank you so much for joining us on this wonderful day and wherever it is that you're joining us from around the world.

00:00:39.109 --> 00:00:42.189
Thank you, as always, for all of your support.

00:00:42.189 --> 00:00:44.627
As always, we appreciate all the likes, the shares, the follows.

00:00:44.627 --> 00:00:48.920
Thank you so much for interacting with our content.

00:00:49.719 --> 00:00:54.152
We really appreciate your support, and I definitely want to give a big shout out to our newest supporter, Book Creator.

00:00:54.152 --> 00:01:05.543
Thank you so much for supporting our mission and believing in what we're doing and bringing some amazing conversations into the education space so that we may all continue to grow together.

00:01:05.543 --> 00:01:14.004
And I'm really excited about today's conversation, as always, like I said, being able to have your own podcast and being able to look for guests.

00:01:14.004 --> 00:01:18.346
It's always amazing when things line up and you get to bring a guest on that.

00:01:18.346 --> 00:01:34.396
You follow on a certain platform, someone whose views and posts really interest and intrigue you, and you just want to bring those conversations and amplify their voice here at our table, the My EdTech Life conversation table. And I'm really excited to welcome today

00:01:39.980 --> 00:01:40.361
Mr Rob Nelson.

00:01:40.361 --> 00:01:41.224
Rob, how are you doing today?

00:01:41.245 --> 00:01:43.112
I'm doing great, Fonz. Thanks so much for having me on the show.

00:01:43.112 --> 00:01:45.680
I'm a big fan of what you do here. Thank you very much, Rob, and I'm a big fan of what you do.

00:01:45.680 --> 00:01:48.888
And your posts on your blog and, of course, on LinkedIn.

00:01:48.888 --> 00:02:00.165
And, of course, as you know, you know, 2022, from then on, has been just so much content for us, so much conversation sparking so many things and so many new ideas.

00:02:00.165 --> 00:02:10.171
And we're going to get into that, you know, because I really love your take, your perspectives and, like I said, being a cautious advocate and, you know, kind of being in the middle of things.

00:02:10.171 --> 00:02:21.770
You know, excuse me, I always love to bring various viewpoints to the table when we are talking about AI.

00:02:21.770 --> 00:02:37.524
So, before we dive in, Rob, for all my audience members that are out there listening at this moment and may not be familiar with your work just yet, but after today they will, can you give us a little brief introduction and what your context is within the education space?

00:02:38.286 --> 00:02:38.485
Sure.

00:02:38.485 --> 00:02:45.941
So my ed tech life was as an educational bureaucrat in the provost office at the University of Pennsylvania for 18 years.

00:02:45.941 --> 00:02:47.949
So at heart I'm a teacher.

00:02:47.949 --> 00:03:14.270
I started out as a teacher, but I learned early on that teaching will burn you out, and so I made a decision when I finished my PhD at Rutgers to go into administration, and so I oversaw academic technology for the provost office at Penn for 18 years, and that meant implementing enterprise technology, things like Canvas, course evaluations, grad admissions applications, and so that's really where my professional experience lies.

00:03:14.270 --> 00:03:17.260
And then I also taught part-time and continue to teach part-time.

00:03:17.260 --> 00:03:20.368
And recently, as you said, I've made a transition.

00:03:20.368 --> 00:03:29.967
I left my job as a bureaucrat and now I write full-time and still teach on the side and do a little consulting and public speaking as a way to pay the bills.

00:03:30.909 --> 00:03:31.372
Excellent.

00:03:31.372 --> 00:03:37.272
Well, that is a great background which kind of, in a very natural way, just is a nice segue to my first question.

00:03:37.272 --> 00:04:01.173
Based on all the experience that you just finished sharing with us, I wanted to ask you: how have those experiences shaped your perspective of AI in education? And from your years overseeing technologies and initiatives and so on, you know, what are some things that we may be missing in today's AI in education discussions?

00:04:02.699 --> 00:04:14.069
Yeah, I think it comes down to adoption of new technologies. When a new technology comes out, it sort of explodes on the scene and the early adopters and enthusiasts start talking about it.

00:04:14.069 --> 00:04:22.564
It's often in the context of like, wow, this is going to change everything and it feels like it's going to be instant because it's so obvious to the early adopters that this is going to be so great.

00:04:22.564 --> 00:04:26.012
Sometimes it turns out to be great and sometimes it turns out to be kind of a dud.

00:04:26.012 --> 00:04:37.446
So, for example, I think the MOOCs, the massive open online courses, were a technology that everybody looked at and thought, wow, this is going to disrupt education.

00:04:37.446 --> 00:04:40.648
Places like Harvard and Penn are just going to disappear.

00:04:40.648 --> 00:04:46.742
That turned out not to be the case and I think you know we can talk about that specific case.

00:04:46.742 --> 00:05:00.630
But I think the lesson I've drawn from my years implementing technology on campus is that change happens at the speed of universities and colleges and schools, not at the speed that technology companies and technology enthusiasts would necessarily think.

00:05:02.740 --> 00:05:04.889
You know, and that's something that's very interesting, like you said.

00:05:04.889 --> 00:05:16.791
You know, for those early adopters, it just seems like everything like this new technology and anything new that comes out, is going to be, you know, the solution to all our inefficiencies.

00:05:16.791 --> 00:05:26.192
And, of course, in education right now I'm just going to go ahead and throw it out there in the state of Texas, we're getting ready in the next couple of weeks to start, you know, state testing.

00:05:26.192 --> 00:05:44.250
Prior to this, a couple of weeks ago, we started seeing where around the district, you know, schools were purchasing specific platforms and you know, kind of like at the last minute, hoorah, you know, trying to get those grades up, and pretty much it's just a triage to make sure that the students do well.

00:05:46.346 --> 00:06:10.312
But you know, one of the things is that, from my years of experience, it just seems that, starting January, everybody's trying to find that one answer. And I think sometimes, even with AI, when that came out in 2022, and even until now, you still see that and hear, like, yes, this is going to change things, this is going to change things. And from '22 till now, you know, I feel that there still hasn't been enough research.

00:06:10.312 --> 00:06:24.528
But I'm still interested to see you know some of that research and see if grades are going up, because, of course, like you mentioned new technologies I've seen the iPads roll out and everybody's one-to-one, and this is going to revolutionize education and this is going to get those test scores up.

00:06:24.990 --> 00:06:26.232
Haven't really seen that either.

00:06:26.232 --> 00:06:34.526
Chromebooks, everybody's one-to-one, and the Chromebooks are going to be the next best thing and they're going to help our students and I personally have not seen that either.

00:06:34.526 --> 00:06:52.404
So it's very interesting that you do mention that, especially with your experience in higher ed. I did have Jason Guglia on the show yesterday for an interview, so I want to get your perspective as far as what you have seen and experienced in higher education, that divide, you know.

00:06:52.404 --> 00:06:55.331
So I want to hear what has been your experience.

00:06:57.254 --> 00:06:57.454
Sure.

00:06:57.454 --> 00:07:01.331
So I think it comes down to some of the different perspectives I was talking about.

00:07:01.331 --> 00:07:16.040
Educational technologists, the people who develop and build new technology are, I believe, in most cases earnestly trying to solve problems, but they see problems differently than the way that classroom teachers see them, and Jason's a good example.

00:07:16.040 --> 00:07:16.742
Somebody talks about this.

00:07:16.742 --> 00:07:21.302
Dan Meyer, I know you've had on the show, is another great example of somebody who sees that difference.

00:07:21.302 --> 00:07:43.271
And so I think you know what I've seen is a lot of polarization, people who are rightly concerned about the way that Silicon Valley in particular, the sort of big giant technology companies, are using their not just their financial capital, but the cultural and social power they have to sort of impose a vision of what this technology is supposed to do.

00:07:43.271 --> 00:07:47.129
And then you've got people who are resisting that, and I'm certainly among those.

00:07:47.250 --> 00:07:52.468
I think there's a great deal of concern about the social and economic context in which this stuff is happening.

00:07:52.468 --> 00:08:02.485
But at the level of classroom instruction, I think what's happening is, for the most part, teachers are coming to grips with this technology the way they have all the technologies you've talked about.

00:08:02.485 --> 00:08:12.541
You know everything that's from iPhones to iPads, to the PC, Chromebooks, laptops, going back to the earlier history of technology.

00:08:12.541 --> 00:08:24.529
Even the adoption of chalk and chalkboards Like those are changes that technology has sort of pushed on or confronted teachers with, and they've responded and turned those tools into things that are useful.

00:08:24.529 --> 00:08:26.326
I think that's the process.

00:08:26.326 --> 00:08:26.908
That's what I see.

00:08:26.908 --> 00:08:42.650
That's the positive aspect of what's happening right now: I see lots of teachers experimenting with these tools, trying to figure out what their value is as an educational tool, as opposed to what the people who built them necessarily think they should be used for, how they imagine it's going to play out. Nice, excellent.

00:08:42.931 --> 00:08:43.172
All right.

00:08:43.172 --> 00:08:52.265
So I want to kind of talk a little bit about your writing, too as well.

00:08:52.265 --> 00:09:03.544
I know that you have the AI Log, you know, which is over on Substack, which I follow, and there you post and you share a lot of great views. But one of them that draws my, or has drawn my, attention, and I wanted to ask you about it today, was the way that you use the rabbit and the duck metaphor.

00:09:03.544 --> 00:09:12.589
We have the rabbit of glad tidings and the duck of doom, so can you elaborate on this dual perception and how it impacts education policy and decisions?

00:09:13.431 --> 00:09:13.711
Sure.

00:09:13.711 --> 00:09:21.383
So the duck rabbit is a famous example of what's sometimes called an ambiguous picture, like what you see.

00:09:21.383 --> 00:09:24.769
When you look at the picture, you can make it look like one thing or the other.

00:09:24.769 --> 00:09:49.028
And the duck rabbit is a famous one because the philosopher Wittgenstein used it in some of his work. And the sort of way that teachers received this technology was certainly in the realm of like,

00:09:49.139 --> 00:09:50.364
Oh my God, this is changing everything.

00:09:50.364 --> 00:09:51.729
It's awful, we need to stop it.

00:09:51.729 --> 00:09:55.726
We need to stop our students from using it, but then you've got people who see it.

00:09:55.726 --> 00:09:57.025
Well, this is a new technology.

00:09:57.025 --> 00:10:02.032
It's an exciting way to understand and use knowledge.

00:10:07.157 --> 00:10:12.129
These models, these large language models, summarize information and spit it back out in ways that are interesting and potentially educational.

00:10:12.129 --> 00:10:18.110
And so, seeing those two aspects: this thing is very threatening to our jobs.

00:10:18.110 --> 00:10:19.525
You know AI is going to replace this.

00:10:19.525 --> 00:10:22.009
That message gets repeated over and over again.

00:10:22.009 --> 00:10:28.206
On the other hand, AI is going to save us all this work because it can do the boring stuff so that we can turn our attention to the important work.

00:10:28.206 --> 00:10:33.272
So that's the sort of swirl of what's been happening.

00:10:33.272 --> 00:10:38.609
I think that image of the duck rabbit is a nice way to say it's both those things and something else, something new.

00:10:40.503 --> 00:10:55.350
Now, with your experience, and I know that you're a speaker now and you go out there and train, or people go and listen, I want to ask you: when you talk about this, one of the things that you mentioned is that duality of this being both things. One side says this is gonna take our jobs.

00:10:55.350 --> 00:11:09.908
And then there's the other side that says no, no one's gonna take your job, only somebody else that uses AI effectively will.

00:11:09.908 --> 00:11:20.447
And then one side I see also is just really playing into the fear of this: like, if you haven't even been using this today, you're doing your students a disservice.

00:11:20.447 --> 00:11:32.787
And then there's that side and so, like I said, what is it that you're seeing out there and how do you, you know, just kind of bring that together for yourself to say, okay, where is it that I stand on these issues?

00:11:33.809 --> 00:11:38.142
Yeah, well, I think we still have an enormously wide range of people responding to this.

00:11:38.142 --> 00:11:45.267
There are still people who have never really used one of these tools before, and so they're getting all their information secondhand.

00:11:45.267 --> 00:11:54.750
On the other hand, you have people who have been using them from the very beginning to try to do interesting or educational work with them, and so I think that challenge remains.

00:11:54.750 --> 00:12:10.254
But what I've seen over time, I mean it's been two and a half years, basically, since ChatGPT sort of exploded on the scene, and that's not a lot of time, but it's time enough for people to have moved away from the sort of freak-out mode that we saw.

00:12:11.277 --> 00:12:28.787
And when I've been giving talks lately, I've used the image of, like, maybe it's a revolution, but maybe it's going to be a boring revolution. The notion that this is going to be transformative and we're going to see these superintelligences appear and everything's going to change overnight just isn't how we're going to experience this.

00:12:28.787 --> 00:12:32.769
Much like things like the iPhone and the PC.

00:12:32.769 --> 00:12:55.351
They feel and are transformative and they're going to change things, but that process takes place over a period of time, there's adjustments, and what I really believe is important is that we use our critical thinking skills and analysis to think about how we use these things to improve education, to improve organizations, and that's where I think the conversation is starting to move.

00:12:56.320 --> 00:13:02.364
Good and that's excellent and that's so good to hear, because there's webinar after webinar that I'll sit on too as well.

00:13:02.364 --> 00:13:26.605
And I don't know if it's so much more in the K-12 space as it is in the higher ed space, because I'm not involved in webinars at that level, but in the K-12 space there just seems to be this sense of urgency like hey, if you're not doing this, you're doing your students a disservice, because this is the way that they're growing and this is the way that they're going to need these skills to get into college and for the jobs of tomorrow.

00:13:26.605 --> 00:13:32.168
And it's almost this attitude of move fast and break things as opposed to just simply.

00:13:32.168 --> 00:13:40.734
Like you said, it can revolutionize, it will, but maybe it's in a slower, more boring and more, you know, calm process.

00:13:40.734 --> 00:13:44.456
But it just seems like it's a go-go-go mentality.

00:13:44.557 --> 00:13:58.506
And going to conferences, you see some of the top platforms that are out there that are just pushing this so much, and, you know, I feel sometimes they may even be preying on some educators, you know, as far as the burnout aspect.

00:13:58.506 --> 00:14:00.176
Like this is going to save you that time.

00:14:00.176 --> 00:14:01.681
This is going to give you back that time.

00:14:01.681 --> 00:14:10.206
This is going to do this and this is going to do that, but you have to pay us, you know, I don't know X amount of dollars per license, per site and so on.

00:14:10.206 --> 00:14:17.912
And a lot of educators are like, oh my gosh, I really need this, because they feel overwhelmed and they feel like this is going to be that sense of relief.

00:14:17.912 --> 00:14:25.557
Now, in higher ed, is there kind of that mentality, or maybe is there a crowd like that too, as well?

00:14:26.740 --> 00:14:27.140
Absolutely.

00:14:27.140 --> 00:14:44.173
I mean, higher ed has sort of the same pressures that K-12 has in terms of, you know, it's cast in terms of business needs, right? Businesses need graduates with these skills. And I think Silicon Valley, I mean, you mentioned the Zuckerberg, or Facebook, idea: move fast and break things.

00:14:46.519 --> 00:15:01.157
They're very much using that as a marketing term, right, as a as a way of marketing these projects and trying to raise revenue, which they have invested huge amounts of capital in this technology and they need to get a return and that's a big problem for them.

00:15:02.019 --> 00:15:05.409
It's not the problem that K-12 or higher ed is trying to solve.

00:15:05.409 --> 00:15:40.606
We're trying to figure out how to educate students for this new society, for the changes that are coming, for the way this is going to transform the work of people who work in knowledge, business and education. And so I think refocusing our attention on, well, what is it we need these tools to do, and how do we get the tools to do what we want them to do, is really cutting against the grain of the way that Silicon Valley, the giant technology companies, and a lot of the startups in that move-fast-and-break-things movement are talking about it. And so there's that misalignment.

00:15:40.606 --> 00:15:55.167
I think that that I, I, I think, slowly but surely um teachers, institutions, um are coming around to uh to establishing their domain, their uh ability to control or decide how these things get deployed.

00:15:56.051 --> 00:15:56.472
Excellent.

00:15:56.472 --> 00:16:08.735
Now talking about deployment, you know, kind of going back to your writing, because if you haven't, for those listeners that are joining us today or watching us on the replay, please make sure that you do check out Rob's Substack.

00:16:08.735 --> 00:16:19.620
He has great writing there and, you know, a lot of these questions are just going a little bit deeper into the writings that he has available to all of us, which is a great resource.

00:16:19.620 --> 00:16:23.033
So please, I mean you know we'll definitely link it in the show notes as well.

00:16:23.033 --> 00:16:30.308
But, Rob, I wanted to ask you: you know, you have this series, it's a two-part series, What Is an LLM Doing in My Classroom?

00:16:30.308 --> 00:16:38.669
So I want to ask you how do you perceive the role of large language models in enhancing or possibly hindering the learning process?

00:16:39.793 --> 00:16:41.336
Yeah, thanks for asking about that.

00:16:41.336 --> 00:16:46.393
So that series is actually going to wrap up, maybe tomorrow, but certainly next week.

00:16:46.393 --> 00:16:52.062
I've got one more piece to write about for that, and that was really a reflection on my own practice in the classroom.

00:16:52.062 --> 00:17:27.294
I'm a history teacher and the history I teach right now is in the grad school of education at Penn, and I'm teaching mostly aspiring educational bureaucrats, people who want to go into administration, and so I'm enormously lucky to be teaching in that environment, and one of the ways I was lucky is that, unlike a lot of teachers, I didn't have to figure out how to get one of these commercial tools to use, because there's a research center on campus that was willing to work with me so that I could use this technology in my teaching.

00:17:27.734 --> 00:17:29.238
I'm somebody who believes in.

00:17:29.238 --> 00:17:30.848
It's often called a flipped classroom.

00:17:30.848 --> 00:17:35.624
I called it structured activities, and so my class is activities-based.

00:17:35.624 --> 00:17:44.874
It's very much student-focused, and so I treated the use of the LLM not as something I had to decide, but as something the students and I could work out together.

00:17:44.874 --> 00:17:49.186
How is this going to influence their work?

00:17:49.186 --> 00:17:52.112
How is it going to be valuable to them as an educational tool?

00:17:52.112 --> 00:18:00.290
How is it going to, you know, frustrate those aspirations or aims? And so we worked with that tool.

00:18:00.290 --> 00:18:05.209
It's called GPTA and it's basically an LLM-based assistant.

00:18:05.209 --> 00:18:06.813
They call it an assistant.

00:18:06.813 --> 00:18:20.414
I think of it as just a tool, just like a chalkboard or a pointer or anything else that you use in a classroom, and so we use that technology together, and those essays are reflections on how that went. Excellent.

00:18:20.694 --> 00:18:21.897
So what is it?

00:18:21.897 --> 00:18:22.358
Do you know?

00:18:22.358 --> 00:18:26.135
As far as your essays that you've written, you know, and looking into that, you know.

00:18:26.135 --> 00:18:33.230
What do you see as far as this possibly hindering, or is this something that can possibly enhance the potential of learning in the classroom?

00:18:34.231 --> 00:18:34.512
Yeah.

00:18:34.512 --> 00:18:41.539
Well, I think it starts with again resisting that notion of it being like a teaching assistant or a teacher.

00:18:41.539 --> 00:18:43.666
These things are not going to replace teachers.

00:18:43.666 --> 00:18:51.510
There's just no way, and Dan Meyer is one of the best at describing that difference between what a tool is and what a teacher is.

00:18:51.510 --> 00:18:57.690
But so, taking that as our base, like, okay, this is a tool, what is it and what is it going to do for us?

00:18:57.690 --> 00:18:58.755
What value does it bring?

00:18:58.755 --> 00:19:07.357
I think what I discovered through this process is that students are very much able to make decisions about their own education.

00:19:07.357 --> 00:19:15.790
I believe that I see evidence of it in their activities, and so they were making choices, with my guidance, about how to use it, and it wasn't to write their essays for them.

00:19:16.932 --> 00:19:24.049
I am very confident that the students were not using this tool simply to replace the educational work that they needed to do.

00:19:24.049 --> 00:19:26.596
It was instead an additional resource.

00:19:26.596 --> 00:19:34.326
So one of the ways we used the tool was to add a layer of peer review, or add a layer of review to a peer-review-driven process.

00:19:34.326 --> 00:19:39.931
So in my classes, we write a long research essay about an institution of higher education.

00:19:39.931 --> 00:19:44.567
So one of the ways we used the LLM was to.

00:19:44.646 --> 00:19:51.405
Before they sat down with their student peer review group, they had the LLM do a review of their essay.

00:19:51.405 --> 00:20:06.067
It was trained, post-trained I should say, on my rubric, my sort of language that I use, that I want them to use too when they're evaluating a piece of writing, and the tool did that for them.

00:20:06.067 --> 00:20:09.435
I printed that out with the papers.

00:20:09.435 --> 00:20:14.191
And the peer review wasn't just about a student's reaction to the paper.

00:20:14.191 --> 00:20:26.917
It was about the student's reaction to the paper and then this initial machine review of the paper that they could incorporate into their own analysis of their peers' paper and the feedback they were going to give to the student.

00:20:58.015 --> 00:21:07.979
All right, and so for a lot of your students that are working with you in this class, did a lot of them already come in with experience using large language models?

00:21:07.979 --> 00:21:11.837
For some of them, was this their first time and what were some of the reactions there?

00:21:11.837 --> 00:21:17.180
If you can share that with us, because, again, I definitely want to get that perspective for our K-12,

00:21:17.180 --> 00:21:25.285
you know, educators that are saying, you know, we need to prepare them now for, you know, higher ed too, as well, since higher ed is starting to adopt this.

00:21:25.285 --> 00:21:29.180
So what were some of the reactions there from some of your students?

00:21:30.241 --> 00:21:43.532
Yeah, to start with, I tried my best to make it clear that I wasn't going to be surveilling what they do, I wasn't going to be looking over their shoulder, that I wanted it to be a space of genuine experimentation.

00:21:43.532 --> 00:21:47.305
So I wasn't going to put limits on their use, tell them they couldn't use it for this, couldn't use it for that.

00:21:47.305 --> 00:22:05.584
But I did want them to come to class to share what their experience was, and initially, almost all of them had had pretty significant experience with ChatGPT and they used it, like many of us do, as just a sort of replacement for Google Search, a natural language interface to the internet.

00:22:05.584 --> 00:22:06.936
And I think that's one of the ways.

00:22:06.936 --> 00:22:17.640
I mean, that's definitely one thing that these models do is they provide a natural language interface to lots of information, including the ability to search the internet.

00:22:17.640 --> 00:22:24.541
And so we started with that sort of baseline, and the question I kept asking is what value can we get out of this tool?

00:22:24.541 --> 00:22:26.686
What educational value can we get out of this tool?

00:22:27.615 --> 00:22:32.882
And it's very clear when you frame it that way that simply having it do your work for you is not going to be educational value.

00:22:32.882 --> 00:22:42.023
There's no value there, and so we turned that around: okay, it can be used to cheat, or it can be used to replace the output you need to give your teacher.

00:22:42.023 --> 00:22:43.922
How can we turn this into a process?

00:22:43.922 --> 00:22:45.718
And what do those processes look like?

00:22:45.718 --> 00:22:47.442
And I think there are lots of people doing this.

00:22:47.442 --> 00:22:50.635
You had Mike Kintz and Nick Podoletsky on the show.

00:22:50.635 --> 00:23:07.317
They're examples of people working in K-12, or very familiar with the K-12 environment, who are doing this same kind of work. And so I think it's about working with your students in groups, learning as a social activity, and taking that social nature and really just experimenting with the tools.

00:23:07.317 --> 00:23:14.146
And, like I said before, I think we're still in the early days of this, and so we're still finding out a lot about the educational potential for these tools.

00:23:14.914 --> 00:23:15.135
Excellent.

00:23:15.135 --> 00:23:33.561
Well, that's so good to hear. And, of course, just for them, getting that experience and really seeing this as maybe a higher-level experience, as opposed, like you mentioned, to just, you know, kind of using ChatGPT as a Google Search, but now really going in deeper and seeing what can be done.

00:23:33.561 --> 00:23:34.124
So that's fantastic.

00:23:34.124 --> 00:23:35.185
But kind of brings me to my next question.

00:23:35.185 --> 00:23:37.511
You know, being that you are, you know, an educator too as well.

00:23:37.511 --> 00:23:42.086
I know that you've written extensively about anthropomorphizing AI.

00:23:46.775 --> 00:23:54.862
Now, there was a post recently that I read, where somebody posted, you know, things that I love to use AI for, and one of the reasons they put was historical figures, you know.

00:23:54.862 --> 00:24:09.744
And then, of course, there was another gentleman that I saw, who's, you know, someone I follow as well, who was just opposed to it, as far as, you know, kind of going into and leaning into this, where now you're talking to this historical figure, and the dangers that can come about.

00:24:09.744 --> 00:24:15.461
So what are your thoughts on that as far as anthropomorphizing, and what has been your experience with that?

00:24:17.035 --> 00:24:18.320
Yeah, thank you for that question.

00:24:18.320 --> 00:24:19.498
As a historian,

00:24:19.498 --> 00:24:22.747
it's something I care deeply about and have thought a lot about.

00:24:22.747 --> 00:24:35.663
Some of my earliest writing was about my experiments with Khanmigo's tool that allows you to chat with a historical figure or a literary character.

00:24:35.663 --> 00:24:54.720
So John Warner, who I hope you have on your show sometime because he's a truly great writer on these topics, he calls it digital necromancy, in other words, the sort of sense that you can revive a historical figure using an LLM.

00:24:54.740 --> 00:24:57.045
I think that's just the wrong way to think about what these tools are.

00:24:57.045 --> 00:24:59.537
You know, the whole project of artificial intelligence.

00:24:59.537 --> 00:25:09.782
It has been built around this metaphor that a thinking machine is like a human mind, and it's gotten us some great new tools.

00:25:09.782 --> 00:25:17.256
But I think it's a fundamentally flawed way of thinking about this in the context of education, because, of course, you're not talking to another person.

00:25:17.256 --> 00:25:33.997
You can pretend that it is, but it's simply a machine, and I think there are just much better ways to think about how we use a cultural technology, like a large language model, than having it pretend to be a person, and so that's where I start.

00:25:33.997 --> 00:25:43.246
If all we're doing with these tools is pretending they're people, an assistant, a dead historical figure, then we're missing a lot of their potential use.

00:25:45.075 --> 00:25:47.727
Yeah.

00:25:47.727 --> 00:25:56.026
And, you know, that's something that I see often in the K-12 space: there are a lot of platforms out there that will offer these chatbots, and then, of course, teachers put in information and so on.

00:25:56.026 --> 00:26:05.628
And I know one of the comments was like well, this is what we can be doing, you know, and really getting the students to know more about history and learning more about history.

00:26:05.628 --> 00:26:21.936
And to me it just seems like, you know, many times, like you mentioned, being that it is a large language model and there are a lot of data sets that go in there, my thinking is always, well, whose history is it sharing, you know, and what viewpoints, and so on and so forth.

00:26:21.936 --> 00:26:25.701
And I'm always just concerned about the bias too as well, and so, you know, for me that's just one of the biggest things.

00:26:25.761 --> 00:27:19.461
But also, when there might be an attachment, you know, to a certain LLM, and, like we see now with a lot of platforms too as well, like Character.AI, you know, we can't deny that that has been in the news, or it was in the news last October and so on. And when we start seeing these chatbots, and students starting to have parasocial relationships with them and thinking it is another human being, like, hey, I can do this at home, I can do this on my own, that can lead into other dangerous, you know, aspects of using AI as well. So thank you so much for sharing that. Which kind of brings us now, in talking about this, to two very, excuse me, two very different viewpoints.

00:27:19.461 --> 00:27:24.760
You know that could be very polarizing and I wanted to talk to you about this because I love the way that you put this in your writing.

00:27:24.760 --> 00:27:30.088
It's like the ai fight club, you know, and it just sounds like, wow, you know.

00:27:30.088 --> 00:27:31.698
So I want to talk a little bit about that.

00:27:31.698 --> 00:28:26.297
You know, as far as education, and we talked a little bit about it in the, you know, pre-show, where there are, you know, two sides, and sometimes it can be very sharply divided.

00:28:26.297 --> 00:28:45.427
But, like I mentioned, I see myself, and I think others see me, as kind of an in-the-middle, cautious advocate, trying to bring, you know, both viewpoints to the table to share and see and learn, and just kind of, you know, see how we can maybe bridge some gaps there and so on.

00:28:45.427 --> 00:28:50.786
But I want to get your perspective and what was the inspiration behind this term?

00:28:50.846 --> 00:28:51.889
The AI Fight Club.

00:28:54.896 --> 00:28:56.100
Yeah, so I should be clear.

00:28:56.100 --> 00:28:57.084
This is not my term.

00:28:57.084 --> 00:29:04.008
I am borrowing it from one of the best writers on the topic of large language models as a cultural technology.

00:29:04.008 --> 00:29:05.028
His name's Henry Farrell.

00:29:05.028 --> 00:29:27.919
He's a political scientist at Johns Hopkins and he has a blog on Substack called Programmable Mutter, and we'll make sure that goes in the show notes, because I want to make sure that he gets the attention for having come up with this great metaphor. He says it's an example of the way that these things are being polarized.

00:29:27.919 --> 00:29:29.522
Right, there's this.

00:29:29.662 --> 00:30:03.383
We talked already about the dynamic of enthusiasts and resistors, and a lot of that gets caught up in the power around Silicon Valley and the power of educational technologies. And so I think there is a way, I mean, it isn't obviously just this question, there's a whole way in which these social questions and educational questions get polarized, and that happens around particular approaches to writing or particular approaches to reading comprehension, like all the sort of wars around just basic pedagogical methods, and I just think we need to back off of that.

00:30:03.824 --> 00:30:09.884
I think you said something earlier about there not being one best method or one best set of practices.

00:30:09.884 --> 00:30:30.143
We just need to sort of open ourselves up to pluralism, and to think that it's perfectly okay for a student to come to my classroom and be given free rein with these tools to explore their educational potential, and then go to somebody else's class and be constrained and told, no, we're not going to use those tools for this educational experience.

00:30:30.143 --> 00:30:40.922
That sort of pluralism, that notion that we are trying to work towards an understanding of this that's shared, as opposed to: ah, I figured this out, I'm right.

00:30:40.922 --> 00:30:43.147
I'm going to tell you what you have to do.

00:30:45.296 --> 00:30:53.728
You know, and that's something like I said, you know, in a lot of conversations that we have and, of course, on LinkedIn, you know, you always have those great conversations too as well.

00:30:53.728 --> 00:31:05.568
And you know, again, to me it's like I do definitely see that that there's like those two sides and, like I mentioned earlier, it's just like that move fast, break things, kind of fear, like your kids are missing out, you're doing them harm.

00:31:05.568 --> 00:31:08.021
And then the others that are OK, let's wait and see.

00:31:08.021 --> 00:31:15.343
Let's make sure that there's more research out there, and so on, and just kind of trying to bring those parties together.

00:31:15.343 --> 00:31:17.375
And then, just like you mentioned, understanding that there's more than one way.

00:31:17.535 --> 00:31:37.035
I know recently I was at a conference in Puerto Rico, and during the keynote it just seemed like everybody in the room was very quiet because of the amount of fear that was put into the educators, saying if you haven't been doing this, if you don't have this and this and so on, your students are already going to be left behind.

00:31:37.035 --> 00:31:40.786
And the teachers are like, well, just kind of taking it all in.

00:31:40.786 --> 00:31:55.566
And you know, once I went up there, we had a panel and the same speaker was there, and I just, you know, told the teachers, just to kind of bring some peace to them, I said, listen, you know, we're all at varying levels in this, you know, trajectory, this journey that we're all moving through together.

00:31:55.566 --> 00:32:15.036
Of course, Renee Dawson, you know, she's great, and she says, you know, there's the speedboats, there are the tugboats, and then there are the anchors.

00:32:15.036 --> 00:32:16.629
The speedboats are going to take off, they're going to roll with it, they're going to be able to do some great things and add it to their practice immediately.

00:32:16.629 --> 00:32:19.538
Then you've got, you know, the tugboats that are like, okay, let's check this out, let's see what I can do, a let's-wait-and-see kind of attitude, but you're still moving forward.

00:32:19.538 --> 00:32:22.249
And then, of course, you've got those that will highly resist this.

00:32:22.249 --> 00:32:29.846
And, you know, slowly, as the tugboats start kind of tugging on and kind of moving away, they kind of start at least moving towards that.

00:32:29.947 --> 00:32:36.948
But that's one of the things that I always say like everybody is in a different, you know, situation.

00:32:36.948 --> 00:32:43.488
They're on a different, you know, learning path, but we'll all eventually get there.

00:32:43.488 --> 00:32:48.493
But I just don't like that fear that is being put into the teachers as well.

00:32:48.493 --> 00:32:56.480
So that's something that I wanted to talk to you about, so thank you so much for sharing that. And so I want to ask you now, talking about beyond hype and fear.

00:32:56.480 --> 00:33:01.514
You know, you use the word changing rather than transforming.

00:33:01.514 --> 00:33:07.143
Or disrupting or revolutionizing, when discussing AI in education.

00:33:07.143 --> 00:33:10.388
So how does this perspective help teachers?

00:33:18.275 --> 00:33:19.158
Yeah, well, I think again, this is.

00:33:19.158 --> 00:33:23.109
I'm a historian and so I often go to a habit of mine that says well, look, this isn't completely new, things like this have happened before.

00:33:23.109 --> 00:33:35.810
And so the notion that what we're experiencing is some kind of transformation, that's a word that describes basically the modern experience.

00:33:35.810 --> 00:33:47.282
Like you know, the speed, the incessantness of change, both technological change and social change, is just a fact of modern life.

00:33:47.282 --> 00:33:58.575
And so what we're experiencing now, in this moment, feels not unlike what it was like to experience electricity during the decades it was being implemented, or the steam engine, or the printing press, to go back even further.

00:33:58.575 --> 00:34:08.581
And so these changes feel very exciting and new, because they are; those are genuinely new technologies.

00:34:09.282 --> 00:34:26.076
The process of social change doesn't necessarily, you know, mean it's instant or transformative. Going back to those examples, these things really did change our lives, and they solved some problems and they created some new ones, and we're just moving through the process of figuring those things out.

00:34:26.076 --> 00:34:30.882
Same thing's going to happen with AI, and I think, for a teacher, what that means is just, you know, it's okay.

00:34:30.882 --> 00:34:32.545
Like, you've got a job to do.

00:34:32.545 --> 00:34:34.088
Tools are meant to help you.

00:34:34.088 --> 00:34:37.987
Don't let them, like, prevent you from doing your job.

00:34:37.987 --> 00:34:47.722
Don't let the pressures of learning something new, adopting a new technology, interrupt your role, which is, of course, to teach your students and to reach your students.

00:34:48.585 --> 00:34:48.947
Excellent.

00:34:48.947 --> 00:34:52.561
Well, before we kind of start wrapping up, I just have about two more questions.

00:34:52.561 --> 00:35:05.445
But I want to ask you specifically, you know, about resource allocation for higher education and, as we know, and as you've talked about in your blog and in your writing, the California State University system's agreement with OpenAI.

00:35:05.445 --> 00:35:08.639
So what are some of the concerns that may come about?

00:35:08.639 --> 00:35:17.346
If there are other institutions that are open to this, how should they kind of proceed when moving into something like this?

00:35:18.106 --> 00:35:18.288
Yeah.

00:35:18.288 --> 00:35:21.855
So I've got two things on this that I think are really important.

00:35:21.855 --> 00:35:42.682
The first one is that these giant technology companies, especially the startups (and OpenAI really is a startup, and it acts like a startup), bring some of that move fast and break things mentality.

00:35:42.682 --> 00:35:57.887
That's a risky proposition because there's a good chance that they're going to move fast and the thing they're going to break is you and your institution or your students, and so I think there's a great deal of care that goes into evaluating what kind of agreement to have with a company like that.

00:35:57.887 --> 00:36:12.989
I think my concern about what CSU did is they moved all their institutions to sign on to an enterprise agreement to bring ChatGPT to campus, and I just think that's an extraordinarily risky thing to do.

00:36:12.989 --> 00:36:16.523
So that's one issue.

00:36:16.523 --> 00:36:26.965
The other issue is, I think, that as institutions, whether school systems or higher ed, we don't have to sign these enterprise agreements right now.

00:36:27.454 --> 00:36:36.838
There's still plenty of time to develop a greater knowledge about what these tools are and how they can be used, and you can do that in small projects and you can do it incrementally.

00:36:37.239 --> 00:36:40.425
You don't have to have the latest, largest model.

00:36:40.425 --> 00:36:51.586
You can go get a small open-source model, get a team of people, IT people in your system or at your institution, and work with teachers to explore what value they might get.

00:36:51.586 --> 00:36:55.226
I mean again, I'm talking about my own experience here, because I had that chance to do that.

00:36:55.226 --> 00:37:08.822
I just wish that was a model that more institutions and more school systems were thinking about, as opposed to these large enterprise agreements where you're signing on for a license and you don't know what it's going to cost in two years.

00:37:08.822 --> 00:37:11.342
You don't know if the company's going to be around in two years.

00:37:12.969 --> 00:37:16.780
All that uncertainty, I think, goes back to the message of we can move slowly here.

00:37:16.780 --> 00:37:19.807
Our speed of change doesn't have to be Silicon Valley's.

00:37:21.356 --> 00:37:56.503
And that is a great point, Rob, because that's something that I see in the K-12 space. Ever since 2022, all of a sudden, all of these K-12 education platforms popped up, and my biggest fear was that, because a lot of them do tie into OpenAI, you know, through their APIs and so on, as OpenAI makes changes and their prices go up, the way that I see it is, well, those platforms' prices are going to go up, or they're going to go up enough to where that platform may not be there. And then some schools have signed on to a three-year agreement, and then that platform's only there for one year, and then what can you do?

00:37:56.503 --> 00:37:58.110
You lose out on that money.

00:37:58.110 --> 00:38:24.846
And so those are some of the things that I am very cautious about and that I really look into and really just have conversations with a lot of people in leadership as far as CTOs are concerned, and even explaining it to teachers, because oftentimes, you know, we go to a conference, we see a new tool and it's always the next big shiny thing where it may have just one additional little button, but that button is going to make a difference, at least in the mind of educators.

00:38:24.846 --> 00:38:26.675
They may say like, oh, that makes a huge difference.

00:38:26.675 --> 00:38:34.329
But then the price point is expensive and again, going into the move fast and break things, you don't know if they're going to be there.

00:38:34.329 --> 00:38:38.780
And then from one year to the next, you know, prices go up seven to 11 percent.

00:38:39.320 --> 00:38:51.079
Small school districts like mine, where I'm located, we may get priced out of those opportunities, and we may not have the opportunities that a neighboring district has.

00:38:51.079 --> 00:38:53.144
So those are some of the things there too that I see in the K-12 space.

00:38:53.144 --> 00:39:03.384
They can kind of relate to what it is that you're talking about, and it's always just that big fear of the money, and the fear of whether that company is even going to be there the next year or maybe within the next two years.

00:39:03.384 --> 00:39:04.686
So excellent point.

00:39:04.686 --> 00:39:06.017
Thank you so much for sharing that.

00:39:06.017 --> 00:39:09.662
And then just to kind of wrap up here, I just wanted to talk to you.

00:39:09.662 --> 00:39:17.918
You described your approach to generative AI in one of your writings as an opportunity to tinker towards utopia.

00:39:17.918 --> 00:39:22.355
Can you tell us a little bit more about that and where your thought process is on that?

00:39:23.777 --> 00:39:23.956
Sure.

00:39:23.956 --> 00:39:39.800
So that's the title of a book probably my favorite book of educational history by David Tyack and Larry Cuban, and their argument is essentially that that should be our model, that we don't need a big transformation, we don't need a revolution.

00:39:39.800 --> 00:40:06.746
What's happened over time in the US, because this is really a book about US educational history, is that we've tinkered our way toward a better social system. And I think that kind of tinkering, and this is the context of what's happening in Washington today, especially in the attack on higher education, is really the exact opposite of the notion of moving fast and breaking things.

00:40:06.746 --> 00:40:14.901
We can actually think about incremental change and building social community around the question:

00:40:14.901 --> 00:40:17.106
What kind of change benefits humans?

00:40:17.106 --> 00:40:22.085
And that includes especially the humans that are in our classrooms, the teachers and their students.

00:40:22.085 --> 00:40:34.708
And so for me, that notion of tinkering towards utopia echoes a lot of the themes we've been talking about here, which is that the pace of change should happen at human speed, not necessarily the speed of our computing technologies.

00:40:35.755 --> 00:40:36.056
Excellent.

00:40:36.056 --> 00:40:37.680
Well, rob, thank you so much.

00:40:37.680 --> 00:40:40.487
This has been a very insightful conversation.

00:40:40.487 --> 00:40:46.258
Thank you so much for sharing your experience with this, obviously through your writing too, as well, for all our audience members.

00:40:46.258 --> 00:40:47.702
Please make sure that you check out the blog.

00:40:47.702 --> 00:41:00.896
We will make sure we link the blog and all the resources that were mentioned in today's talk, so you can go ahead and, you yourself, you know, dig deep into those resources and, of course, sprinkle them onto what you are already doing.

00:41:00.896 --> 00:41:04.005
Great, so, thank you, rob, for spending this morning with us.

00:41:04.005 --> 00:41:08.724
But before we wrap up, I always love to end the show with the following three questions.

00:41:12.278 --> 00:41:13.240
So hopefully, rob, you are ready to go.

00:41:13.240 --> 00:41:14.383
All right, here we go, rob.

00:41:14.724 --> 00:41:15.546
Question number one.

00:41:16.146 --> 00:41:21.338
Perfect. Question number one: every superhero has a weakness.

00:41:21.338 --> 00:41:27.559
For example, Superman had kryptonite, which weakened him, or was a pain point for Superman as well.

00:41:27.559 --> 00:41:35.425
So I want to ask you, in the current state of education, what would you say is your current edu-kryptonite?

00:41:37.796 --> 00:41:40.401
For me it's a kind of intellectual arrogance.

00:41:40.401 --> 00:41:46.519
So it's the notion that, well, I write to explore, right, that's why I write.

00:41:46.519 --> 00:42:07.276
But I can lose sight of that sometimes when I get hold of an idea and I think I've figured something out. And so AI Log, one reason I call it a log, is that it's a log of my thoughts over time, and sometimes I can go back to something I wrote when I first started the blog and go, wow, that is so wrong.

00:42:07.276 --> 00:42:14.961
So that notion of a kryptonite is that, like, I think I've figured something out because I've been able to write it down, but it's not.

00:42:14.961 --> 00:42:17.262
That's not how things work, that's not how truth works.

00:42:17.262 --> 00:42:17.883
Really.

00:42:17.883 --> 00:42:33.197
Truth is always open to revision, and that's especially true in an environment that's changing like it is. And so for me, that's my kryptonite, that arrogance, and AI Log is a way to sort of manage that risk, the risk that I get hold of some of that kryptonite.

00:42:33.938 --> 00:42:34.300
Excellent.

00:42:34.300 --> 00:42:35.543
Thank you so much, rob.

00:42:35.543 --> 00:42:36.626
That was very insightful.

00:42:36.626 --> 00:42:37.757
Thank you so much for sharing that.

00:42:37.757 --> 00:42:43.856
Question number two if you could have a billboard with anything on it, what would it be and why?

00:42:45.880 --> 00:42:50.487
This is going to speak directly to my kryptonite, it's arrogance.

00:42:50.487 --> 00:42:54.032
But it is AIlogblog.

00:42:54.032 --> 00:43:14.327
It is my blog, because, for me, what I'm trying to do through my writing is exploring these ideas in ways that are social. And so if there's one thing I want to put out in the world, it's the thing I'm doing with my writing online, which is very much reflective of what you're doing with My EdTech Life and what I do when I'm speaking or teaching.

00:43:14.327 --> 00:43:30.901
It's taking up these social experiences and trying to make some kind of understanding or truth out of them, and the nexus, the focal point of where I do that, is on AIlogblog, and so that's what I would put on the billboard.

00:43:31.623 --> 00:43:33.943
Love it. And don't forget to put that QR code on there.

00:43:33.943 --> 00:43:36.461
That way, as people drive by, they can definitely scan it for sure.

00:43:37.123 --> 00:43:37.845
There you go, all right.

00:43:38.467 --> 00:43:39.797
All right, and the last question, Rob.

00:43:39.797 --> 00:43:46.541
If there is one person that you can switch places for a day, who would it be and why?

00:43:48.356 --> 00:43:57.865
So I'm a historian, so I'm going to take advantage of the open-ended nature of your question and say I'm going to go back in time.

00:43:57.865 --> 00:44:20.298
So it's not digital necromancy; you've given me this sort of superpower to transfer myself back in time. And it would be my absolute favorite historical figure that I write about, a woman named Anna Julia Cooper, who was a school teacher, a high school principal, and eventually the president of Frelinghuysen University, and she was also a writer.

00:44:20.298 --> 00:44:36.211
But because of her responsibilities as a teacher and administrator and because she was a Black woman living in a period where there were no university posts or, you know, the ability to get a writing gig somewhere, she really wrote for a very small audience.

00:44:36.211 --> 00:44:37.077
She did publish one book.

00:44:37.077 --> 00:44:48.028
It's a great book called A Voice from the South, but a lot of her later essays were written and self-published, and they are amazing pieces of work.

00:44:48.536 --> 00:44:50.342
And so what I would want to do?

00:44:50.342 --> 00:44:59.994
There's not a lot that's known about her because she wasn't all that famous in her life, and so there isn't an archive like there is for other great historical figures of that time, and so there's a lot that's unknown.

00:44:59.994 --> 00:45:07.238
So I would love to go back in time and spend time in her life trying to understand something about what her experience was like.

00:45:08.702 --> 00:45:09.764
Thank you so much, rob.

00:45:09.764 --> 00:45:11.208
I really appreciate your shares.

00:45:11.208 --> 00:45:13.583
Thank you so much for this wonderful conversation.

00:45:13.583 --> 00:45:20.642
Also, and again, we'll make sure we post all the information so you can go ahead and contact Rob or follow him on all socials.

00:45:20.642 --> 00:45:33.445
Please make sure that you also follow him on Substack; his blog is fantastic, and you'll definitely get some great insights and, like you mentioned, some great experiences as he continues to blog, continues to grow, and continues to write.

00:45:33.786 --> 00:45:50.025
So definitely great resources, especially for this time and the time to come, because, as you know, this is going to keep changing, as fast as it may seem. But I love the way that Rob says it, you know, changing at the speed of people, and it's all on us, so let's just continue to move forward.

00:45:50.025 --> 00:46:06.717
And so, rob, thank you so much for joining us today and for all our audience members, please make sure you visit our website at myedtechlife, where you can check out this amazing episode and the other 318 wonderful episodes where, I promise you, you will find something that you can sprinkle onto what you are already doing.

00:46:06.717 --> 00:46:06.998
Great.

00:46:06.998 --> 00:46:16.576
And again, I definitely want to give a big shout out to our new sponsor, Book Creator.

00:46:16.576 --> 00:46:17.619
Thank you so much for your support.

00:46:17.619 --> 00:46:22.095
Thank you so much, Eduaide and Yellowdig, also, for believing in our mission so we can continue to bring some amazing conversations week in and week out.

00:46:22.095 --> 00:46:58.268
So thank you as always, and from the bottom of my heart, my friends, don't forget: stay techie.

Writer and educator

Since 1999, I have been teaching courses in cultural and educational history that explore how gender roles, slavery, technology, and social justice movements have shaped institutions and individuals in North America. Lately, I have been writing and giving talks exploring the educational value of generative AI, how it is changing education, and what we should do about it.

For nearly two decades, my day job was at the University of Pennsylvania where I led university-wide projects to implement academic information systems for course evaluation, curriculum management, graduate admissions, learning management, and student records. Being a bureaucrat shaped how I think about technology in ways that are important to my writing and teaching.

Before Penn, I worked at Rutgers University as an academic advisor, taught first-year writing courses, served a year as a visiting assistant professor of American Studies, directed a program for Japanese students studying in the US, and pursued—and received, although it was a close call—a PhD in American History.

Before that, I lived in Athens, GA, where I thought about writing novels or maybe for the movies. While there, I attended classes at UGA long enough that I earned a BA in comparative literature. On several occasions, I drank coffee sitting near Michael Stipe but never talked to him. I did talk to Bill Berry and Mike Mills once, though, hanging out in the back alley of the Georgia Bar.

I grew up in Augusta, GA, known as the birthplace of James Brown and the home of the Masters Golf Tournament, where …