WEBVTT
00:00:30.115 --> 00:00:33.518
Hello everybody and welcome to another great episode of My EdTech Life.
00:00:33.518 --> 00:00:42.539
Thank you so much for joining us on this wonderful day and, wherever it is that you're joining us from around the world, thank you, as always, for all of your support.
00:00:42.539 --> 00:00:45.088
We appreciate all the likes, the shares, the follows.
00:00:45.088 --> 00:00:49.970
Thank you so much for engaging with our content, for giving us some wonderful feedback.
00:00:49.970 --> 00:00:51.343
Thank you so much.
00:00:51.343 --> 00:01:03.143
As you know, we do what we do for you to bring you some amazing conversations with amazing guests, to continue to nurture our education landscape and just to continue to see different perspectives and different viewpoints.
00:01:03.143 --> 00:01:12.052
And today I'm really excited to welcome back a second time guest, and I'm really excited to welcome him back because he does some amazing work.
00:01:12.052 --> 00:01:17.248
He is a great mentor also as well, and I always love catching up with him.
00:01:17.248 --> 00:01:20.305
Micah Shippee, thank you so much for coming back to the show.
00:01:20.305 --> 00:01:20.826
How are you doing today?
00:01:20.826 --> 00:01:21.349
I'm doing well, thank you.
00:01:21.349 --> 00:01:22.370
Thanks for having me. Love coming back to the show.
00:01:26.302 --> 00:01:45.129
Absolutely, and I love having you back, my friend, and it's always great to catch up with you at conferences, just kind of talk to you, get to hang out, hear the work that you're doing, and today we're definitely going to be talking about some great work that you have just released right here, 2059: The Future of Education.
00:01:45.129 --> 00:01:48.647
So we'll definitely make sure we link that in the show notes.
00:01:48.647 --> 00:02:03.091
But before we dive in, for those that may not be familiar with your work just yet, for all our new audience members, all our new listeners, can you give us a little brief introduction and what your context is within the education space?
00:02:03.840 --> 00:02:04.200
For sure.
00:02:04.200 --> 00:02:19.390
Number one, my background: 22 years of public education, teaching middle school as a classroom teacher, and for all 22 years serving as an ed tech consultant, helping teachers globally on the adoption-of-innovation journey.
00:02:19.390 --> 00:02:21.387
I definitely view it as a journey.
00:02:21.387 --> 00:02:23.086
My first book was WanderlustEDU.
00:02:23.086 --> 00:02:49.009
I served as an ed tech coach for the Google Innovator Academy, worked a little bit with the Google Earth team and as a consultant with some big tech organizations before joining Samsung three years ago as the director of education solutions, where I built out a team of world-class, best-in-the-business education coaches that serve schools in the adoption process of Samsung technology.
00:02:49.951 --> 00:02:50.393
Excellent.
00:02:50.393 --> 00:02:56.639
Well, definitely a great background and, like I said, I know when you were here before we talked about WanderlustEDU a little bit.
00:02:56.639 --> 00:03:02.223
We definitely talked a little bit about AI, as that was coming out, and you're definitely a big advocate on change in education.
00:03:04.566 --> 00:03:09.051
I know the last time you were here, one of the most memorable quotes also was, you know, we're still.
00:03:09.051 --> 00:03:18.324
You know, classrooms still look the same, in rows, industrial-revolution-type education, bell to bell, you know, and all that good stuff.
00:03:18.344 --> 00:03:23.520
But today I'm really excited because, like I said, I just shared here, you just released a book, 2059: The Future of Education.
00:03:23.520 --> 00:03:27.931
So I want to ask you specifically about the book title 2059.
00:03:27.931 --> 00:03:29.747
Now, that's quite a specific time frame.
00:03:29.747 --> 00:03:39.067
It's about 35 years into the future of education, of course, and so I want to ask you about that: why 2059?
00:03:39.067 --> 00:03:54.644
What was going through your mind and your thought process, as, of course, we know, with generative AI coming out in 2022, and now, and what you've seen in your experience? Tell us a little bit about the story behind that. Sure, yeah.
00:03:54.743 --> 00:04:07.671
Well, I've always been fascinated by the work of futurists, often in sci-fi, for entertainment or escapism, but I really thought it was neat to look ahead to the future with a hyper-focus on education, because I hadn't seen that done.
00:04:07.671 --> 00:04:15.128
Predicting the future, if you will, is a challenge, and either brave or insane.
00:04:15.128 --> 00:04:16.471
I haven't decided which yet.
00:04:16.471 --> 00:04:24.495
I picked the year 2059 because when I started the book, I gave myself the same amount of time that George Orwell gave himself in 1984.
00:04:25.278 --> 00:04:32.553
And I thought, ok, if George Orwell is going to predict this dystopian future 35 years ahead, why don't I do the same?
00:04:32.553 --> 00:04:40.875
And so I intended to release the book in 2024, but missed my mark by three months and, frankly, fell in love with the title.
00:04:40.875 --> 00:04:41.516
So I kept it.
00:04:42.398 --> 00:04:43.019
There you go.
00:04:43.180 --> 00:04:43.379
Excellent.
00:04:43.661 --> 00:05:18.845
Well, let's talk a little bit about that, because I know you talked about Orwell here, and you're talking about the future and some of these predictions, like you said, things that you see that might be difficult to defend later on. And I know that's something that you wrote, because, like you said, statistically you should be about 82 years old at that time, and so, you know, hopefully, God willing, we're going to be talking about this and some of the things that you might need to defend. What would you say would be some of your most controversial, I guess, takes that you feel would need some defending in 2059?
00:05:19.747 --> 00:05:42.394
I think the most controversial, honestly, would be things outside of education. Like, when I talk about the year 2059 in the book, I break out four major decades, and 2059 is the last, of course, the farthest away, and I propose that there's great potential that we're headed towards a post-scarcity world where everyone has access to what they need.
00:05:42.394 --> 00:05:44.810
Imagine that: everyone on the planet has access to what they need.
00:05:46.821 --> 00:05:58.300
And the reason I think that will be the hardest to defend is because there's many, many variables that far out that are outside of education and would therefore impact the education outcome.
00:05:59.002 --> 00:05:59.463
Excellent.
00:05:59.463 --> 00:06:04.021
Well, that's definitely great and we're definitely going to get into that a little bit more.
00:06:04.021 --> 00:06:24.432
But one of the things also, you know, as I'm going through this, and just letting you know how really nicely laid out this is, and it's just some great bits, and it's really hard to put this down, honestly, you know, because the more you read, the more you get into it, and then, of course, you kind of bring it into your own relationship and into what you're seeing.
00:06:24.432 --> 00:06:28.088
You know, in the education space, personally too, and it resonates with that.
00:06:28.088 --> 00:06:35.196
But I want to ask you about the fusion model for organizational adoption of innovation.
00:06:35.196 --> 00:06:38.043
So can you tell us a little bit about that and how
00:06:38.083 --> 00:06:43.434
that model was developed and why it's crucial for educational transformation?
00:06:44.459 --> 00:06:55.454
Yeah, well, looking ahead at the future is very difficult, and understanding change, well, change management, is perhaps one of the most difficult things for organizations, not just education.
00:06:55.454 --> 00:07:05.555
So I developed the fusion model by looking at the work of Everett Rogers, who has an organizational adoption model, and Engeström, who developed activity theory.
00:07:05.555 --> 00:07:18.529
And activity theory says that at each stage, at each moment, we can look at the intersection of innovation, people, society, our rules, our work, our goals.
00:07:18.529 --> 00:07:21.560
And whenever one of those changes, it impacts the others.
00:07:21.560 --> 00:07:34.807
And so I use that as a critical lens to say, you know, where are we in the journey, where are we going next, and how do we get to the place that I call the pencil moment, and that's the place of routinizing.
00:07:34.807 --> 00:07:41.269
And so the pencil moment is the moment where we stop talking about a technology as a thing and we focus on the practice.
00:07:41.269 --> 00:07:52.242
So if you imagine a math classroom saying today we're going to learn math with a pencil, that doesn't really happen, but I would argue when pencils first came out, that was the language.
00:07:52.242 --> 00:07:57.021
Right, you can use your historical imagination of our teacher ancestors thinking that way.
00:07:57.021 --> 00:07:59.988
Well, we've gotten to the point where we fully adopted.
00:07:59.988 --> 00:08:07.033
It was part of our routine, the pencil. So I make the argument that we'll get to that point with things like AI as well.
00:08:08.279 --> 00:08:15.411
Now, once I finished the book, I instantly thought of another analogy that I did not put in the book, and that is what I call the keyboard dilemma.
00:08:15.411 --> 00:08:18.689
The keyboard is not the best layout.
00:08:18.689 --> 00:08:22.571
It was designed so that the typebars on a typewriter don't get jammed up.
00:08:22.571 --> 00:08:24.607
That's where we got the QWERTY keyboard.
00:08:24.607 --> 00:08:26.245
That's why the letters are so odd.
00:08:26.245 --> 00:08:42.102
There's a much better keyboard layout called Dvorak, and many others, that we don't use because we're so stuck on the old model. And so, rather than adopting the new and better, like our pencil moment, we're still stuck in the keyboard dilemma on things like the QWERTY keyboard.
00:08:43.804 --> 00:08:48.912
All right, and so that kind of is a nice segue, kind of talking about the pencil moment.
00:08:48.912 --> 00:08:51.336
But I want to talk to you about specifically.
00:08:51.336 --> 00:08:57.470
You also make a distinction between integration and adoption of technology and education.
00:08:57.470 --> 00:09:22.200
So please, can you elaborate on why you really focused on this distinction and how it can change our approach, maybe reframing it, when we as educators, or maybe somebody in a position that gets to choose, decide what kind of technology is going to be brought into the classroom? Yeah, integration is, in the fusion model, the initial phase of organizational adoption.
00:09:25.083 --> 00:09:25.985
It's when we are seeing if something fits.
00:09:25.985 --> 00:09:32.215
Any innovation: a technology, a new practice, a new strategy. Does this thing help us?
00:09:32.215 --> 00:09:44.149
And so we integrate it into existing practice, and we watch and we listen and we see, is this going to work, before we make that decision to adopt or not.
00:09:44.149 --> 00:09:49.148
And if we decide to adopt in the fusion model, I refer to that as implementation.
00:09:49.148 --> 00:09:54.009
That's where we start to go down the journey of adopting towards that pencil moment.
00:09:54.009 --> 00:09:57.568
And adoption is ownership is how I think of it.
00:09:57.568 --> 00:10:01.229
It's much more personal than integrating.
00:10:02.150 --> 00:10:21.761
Nice and that actually makes a lot of sense, like you said, and especially that last part that you said just making it very personal, and I think oftentimes, you know, we confuse the two and just say, oh, this is what we're going to be adopting, rather than thinking, okay, this is actually the integration process and working its way until you kind of make it your own.
00:10:21.761 --> 00:10:35.397
And right now, I mean, I want to ask you, you know, what can people in a certain role or position, for example, a coordinator for learning or a CTO, do with so many AI applications out there?
00:10:35.397 --> 00:10:37.279
I mean, they're overwhelmed.
00:10:37.279 --> 00:10:45.254
What steps would you recommend for them to take in order to choose or make the best decision for their district?
00:10:46.160 --> 00:10:56.412
Well, I highly recommend forming a committee of advisors, not just people who agree with you, but people who, in the field, in your community, are respected for their voice.
00:10:56.412 --> 00:11:15.567
Not just the geeks or innovators like you and me, but also people who are a little slower to adopt, a little bit cautious, that want to think about something, and then, in those groups, start to propose questions like: let's talk about your personal struggle with these innovations. How do you feel about AI?
00:11:15.567 --> 00:11:17.306
How are you using it now?
00:11:17.306 --> 00:11:19.888
Are you using it more than you realize?
00:11:19.888 --> 00:11:29.195
And then take that personal-use approach and backpedal into: do you think it's going to continue to have an impact on our lives?
00:11:29.195 --> 00:11:35.393
And, if so, it's our responsibility to start preparing our students for that impact.
00:11:35.393 --> 00:11:48.764
And so I think there's levels of trusted advisors, trusted groups, self-reflection, group reflection that can inform a better practice which will have a positive impact on our students.
00:11:49.345 --> 00:11:51.188
Excellent, that is a great suggestion.
00:11:51.289 --> 00:12:25.389
And I know I've had Dr Anika McGee here as well, and while she was working in a local school district that was a couple of miles away from me, when these generative AI tools were coming out, one of the things that she did was gather some teachers and work with them, and, of course, they were just kind of going through everything, like what are the positives, what are the negatives of certain platforms, to be able to advise and say, OK, these would be the best suited for now, at this given time, that fit these parameters: obviously, data security, privacy and all of those things.
00:12:25.470 --> 00:12:36.792
But I want to ask you, because I know you get to travel a lot, you know, due to your work with Samsung, and I know that you have your ear to the ground, and a lot of educators come, you know, visit with you.
00:12:36.792 --> 00:12:39.865
You get to talk to them, you get to learn a little bit about what's going on with them.
00:12:39.865 --> 00:12:47.035
So I want to ask you, like, right now, I know that we are already you know well in from 2022 to 2025.
00:12:47.035 --> 00:12:49.347
You know there's a lot of changes in generative AI.
00:12:49.347 --> 00:12:51.645
What do you still hear?
00:12:51.645 --> 00:12:55.139
You know having your ear to the ground at these conferences or listening to the great speakers.
00:12:55.139 --> 00:13:07.855
What are still some of the barriers that teachers, or any educator and professional, are facing dealing with AI in education?
00:13:09.320 --> 00:13:14.447
Well, I think the number one barrier, from what I've been hearing and what I see, is understanding AI as ChatGPT, and that's it.
00:13:14.447 --> 00:13:20.061
So I'm on ChatGPT and everything I do there is AI.
00:13:20.061 --> 00:13:24.568
Therefore, anything my students do under the idea of AI is what ChatGPT does.
00:13:24.568 --> 00:13:30.960
And that's incredibly inaccurate.
00:13:30.960 --> 00:13:39.620
AI does so many things and can amplify so many great practices in the classroom that transcend one singular application.
00:13:39.620 --> 00:13:53.840
I think AI is being used in many tech industries and in practice as a prefix, like we used to use a lowercase e for e-paper or a lowercase i in front of a name to identify us or attach us to an innovation.
00:13:53.840 --> 00:14:09.113
I feel like AI is the new prefix that people are just throwing around and, as a seasoned practitioner, I'm more interested in what you do with it and how it amplifies good practice and fits into our existing schema, our existing background of what good teaching looks like.
00:14:10.139 --> 00:14:10.923
Oh, excellent.
00:14:10.923 --> 00:14:16.686
I really like that, and especially what you said about putting the small i in front of something.
00:14:16.686 --> 00:14:22.620
There's a lot of things like iPads, things of that sort. Sure, sure.
00:14:23.322 --> 00:14:25.643
So, yeah, that's very well said and very well put.
00:14:25.643 --> 00:14:30.202
So thank you so much, because that definitely, you know, resonates and just really brings that to light.
00:14:30.202 --> 00:14:34.380
So I really like what you mentioned there, you know, mentioning those barriers, like you said.
00:14:34.380 --> 00:14:34.701
You know.
00:14:34.701 --> 00:14:42.649
Therefore, you know, if I'm using ChatGPT, then this is really all that they're doing. But, like you said, you mentioned, you know, how it augments.
00:14:42.649 --> 00:15:07.186
So, going back to that and your classroom experience, and obviously now, you know, through Samsung, working on a lot of innovative projects and so on, what are some of the things that you might suggest, or maybe even share through your book, as far as being able to take what we're doing now and augmenting it in the classroom for our students?
00:15:07.186 --> 00:15:10.357
What might be some suggestions there that you might be able to share with our teachers or what to look out for?
00:15:10.357 --> 00:15:13.607
You know, instead of just seeing it as ChatGPT?
00:15:14.591 --> 00:15:27.182
Yeah, I think in chapter one I actually outline, related to what we were speaking about a few minutes ago, the fusion model, and I show the story of AI adoption in the school and talk about initiation and implementation.
00:15:27.182 --> 00:15:29.327
So there is a clear example of what that looks like.
00:15:29.327 --> 00:15:35.452
To help amplify or unpack some of the academic speak I sometimes fall into, I'm doing my best.
00:15:35.452 --> 00:15:53.313
When it comes to looking forward to the future, I think there's a profound value in looking back at our past, you know, looking at the things that we have been unable to do in education, and how can innovation, how can new technologies support it.
00:15:53.313 --> 00:16:01.460
So, for example, I would make the argument that the number one way to teach is one-on-one, with an expert and an apprentice.
00:16:01.460 --> 00:16:03.845
It's the best way to learn something.
00:16:03.845 --> 00:16:07.442
They can watch you, they can give you guidance, they can talk to you about it.
00:16:07.501 --> 00:16:09.535
So your social learning is still part of that story.
00:16:10.157 --> 00:16:19.224
It's missing one component, and that's another learner that you can collaborate with and commiserate with, because you make meaning together with a peer.
00:16:19.224 --> 00:16:22.558
That's missing from that model, but it's still, I think, the best model.
00:16:22.558 --> 00:16:32.128
So if you add that other learner and you add that expert or that mentor, you start to get farther and farther away from the mentor's one-on-one support.
00:16:32.128 --> 00:16:41.215
And so 150 years ago, we decided let's put 25 to 35 people in a room with one expert, and this will be perfect.
00:16:41.215 --> 00:16:45.803
It'll be just like our factories; it will use bells to get people to go in between.
00:16:45.803 --> 00:16:47.701
So we're pretty hung up on that model.
00:16:47.701 --> 00:17:04.163
So what I would say for the future is we start to look at AI, as an example, as a way to provide one-on-one support that also gives us access to that expert, who now humanizes the learning, which is a critical point I hope we come back to.
00:17:04.163 --> 00:17:12.036
But lets me also have my peers in the room to commiserate, collaborate and make meaning of the learning.
00:17:12.036 --> 00:17:18.348
And so now that trifecta is being perfected in a way that's never been possible.
00:17:19.974 --> 00:17:52.555
And see, that's great that you mentioned that in the book, because I know, you know, that's in chapter one. Now in chapter two, kind of like that growing aspect, you outline these scenarios here in education, because you talked a little bit about the 2059 timeline, which is the hyper-connected classroom in 2029, the bio-integrated learner in 2039, the community learning hub in 2049 and, of course, the post-scarcity scholar in 2059.
00:17:52.555 --> 00:18:06.131
So I want to ask you which of these scenarios excites you the most, and which might cause you just a bit of concern, from all of these four that we've described?
00:18:06.151 --> 00:18:13.077
I'm most excited about the community learning hub.
00:18:13.077 --> 00:18:21.881
I love environment and space and how it impacts our learning and our thinking and how, when we work together to solve local problems or generate local solutions, we're having a clear, tangible impact.
00:18:21.881 --> 00:18:38.569
I love how that has potential, if we frame our mind correctly, to inform global impact as well, something like looking at the United Nations Sustainable Development Goals, access to clean water locally, coming up with a solution locally, can inform another locale, another community.
00:18:38.569 --> 00:18:50.755
That gets me the most excited. The one that scares me the most is bio-integrated, because I view bio-integrated as problematic for us, and I'd like to unpack that a little bit so people understand what I'm saying.
00:18:51.296 --> 00:19:11.118
If you think about today, there are examples of people who have a medical condition, like quadriplegia or paralysis, that prevents them from using parts of their body, and there have been many highlights in the media of implants that allow them to play video games and to communicate and do things.
00:19:11.660 --> 00:19:12.583
It's super cool.
00:19:12.583 --> 00:19:13.685
It gets me excited.
00:19:13.685 --> 00:19:33.009
What doesn't get me excited is what comes next, because while something serves as a medical aid, the more that medical aid becomes used, the more likely it will become part of other people's lives and that will turn into an augmentation of what we can do every day.
00:19:33.009 --> 00:19:47.211
So now, you know, I have a chip in the back of my head and I load up the history textbook, and I no longer need to talk about it and develop skills around it, because they have the quote-unquote knowledge. So we have to be hyper-judicious about these technologies in the classroom.
00:19:47.211 --> 00:19:51.342
I mean, right now we're worried about whether or not students have a phone or a smartwatch.
00:19:51.342 --> 00:19:58.605
Imagine not knowing how they know what they know, or where it came from, or if the source is a good source.
00:19:58.605 --> 00:20:00.127
That scares me a bit.
00:20:01.296 --> 00:20:18.567
Yeah, no, and that can definitely be something for sure, and especially, like you mentioned, because for me, one of my things has always been, you know, with the use of these tools and you know, as the tools continue to grow and they continue to, you know, be better each and every day, you know, and each and every week there's something else.
00:20:18.567 --> 00:20:34.210
But my concern has always been, you know, sort of like what you mentioned: even now, when teachers are using this kind of technology, the generative AI aspect, you know, teachers just see that first output as being gospel and just say, yep, here we go, this is what it gave me.
00:20:34.631 --> 00:20:34.971
Here we go.
00:20:34.990 --> 00:20:37.738
This is what I'm going to do and really, you know, losing out.
00:20:37.738 --> 00:20:40.001
Like you mentioned, we still need to know that knowledge.
00:20:40.001 --> 00:20:55.132
We still need to be that subject matter expert to be able to dissect, to decipher, to make sure that this is content that is good enough, not only for your students, but for what we're putting out there for them in the education space.
00:20:55.132 --> 00:21:00.207
And, like you mentioned, with something like that, it's like well, you know the history book, well, whose history is it?
00:21:00.207 --> 00:21:04.643
You know where is this coming from, and so, yeah, those are the main concerns there, especially.
00:21:04.643 --> 00:21:36.176
So I definitely agree with you on that, but, you know, we'll see how that plays out. And hopefully, you know, like I said, what I love, though, coming back to the learning hub: for me, for myself, being part of, you know, the Google Innovators group and, of course, just getting to network with so many people, is just being able to bring ideas together in that human aspect and, like you said, being able to solve something that can later lead to something else, that can later lead to even a greater project, and then, all of a sudden, that continues to grow.
00:21:36.176 --> 00:21:55.116
But it's a hub of knowledge that is everybody put together, you know, and just being connected, and it makes a huge difference, and I think that that's something that would be very beneficial, you know, as educators, to being able to find that hub, or being able to find those people that you can connect with, to continue to grow and grow in practice as well.
00:21:55.498 --> 00:21:57.863
Which brings me to that next piece.
00:21:57.863 --> 00:22:09.988
You know, that human element piece, which I do want to talk to you about because, you know, here you also mention talking to teachers, letting them know, as it says, you cannot be replaced, you know.
00:22:09.988 --> 00:22:16.429
So I want to ask you here you know what prompted you to really go in deep with them?
00:22:16.429 --> 00:22:21.407
Because I know you mentioned an ecological concept with this that is tied together.
00:22:21.407 --> 00:22:36.244
So tell me a little bit more about that, because I know that there are a lot of people out there that always say, well, no, teachers will never be replaced, but there's other people that'll say, yeah, you know AI, I mean eventually that's what it's going to come to you know it's going to replace the teacher.
00:22:36.244 --> 00:22:37.813
Why would we need classrooms?
00:22:37.813 --> 00:22:41.661
So tell me a little bit about that and that importance of that human element.
00:22:43.104 --> 00:22:48.221
Yeah, and in fact just last week Bill Gates said that doctors and teachers will be replaced.
00:22:48.221 --> 00:23:11.076
That scares me because, as a classroom teacher, I understand that human empathy and unpacking the human experience, as messy as that unpacking is, is critical to student learning, because it's a form of modeling and agency that they can't otherwise get from a non-human interface, or shouldn't, I should say, get from a non-human interface.
00:23:11.076 --> 00:23:20.250
You know, I talked about trophic cascades in the book, and the story of the wolves in Yellowstone and how rivers changed direction as the wolves disappeared.
00:23:20.250 --> 00:23:26.548
And as the wolves were brought back in, the river's courses got more steadfast.
00:23:26.548 --> 00:23:30.242
And it's because, you know, without the wolves you have more deer.
00:23:30.242 --> 00:23:40.143
When you have more deer, they're eating more shrubbery on the side of rivers, which is causing the banks to get looser, which is causing the rivers to change everything.
00:23:40.143 --> 00:23:44.740
And when you start to reintroduce the wolves, the ecosystem is in a state of better balance.
00:23:44.740 --> 00:23:58.284
And when we start to pull away teachers (I hate to think of teachers as wolves, that's not the point), when we start to pull away teachers and have fewer teachers and perhaps more students, our riverbanks are going to fall apart.
00:23:58.284 --> 00:24:04.330
And if our riverbanks are falling apart, we become somewhat directionless. And that metaphor,
00:24:04.893 --> 00:24:05.776
I think a lot about it.
00:24:05.776 --> 00:24:07.741
I haven't quite worked my brain through it.
00:24:07.741 --> 00:24:18.906
I get into it in the book to a degree, but it's something I continue to chew on, to meditate on, because I think it's really powerful, and it's something also reinforced in activity theory, that model and triangle you'll find in the book.
00:24:18.906 --> 00:24:24.444
When you adjust one thing, it impacts many more than just your goal.
00:24:24.444 --> 00:24:29.500
It impacts the innovation, it impacts your society, it impacts the rules by which you operate.
00:24:29.500 --> 00:24:30.565
It impacts the people.
00:24:30.565 --> 00:24:32.682
We just have to be slow and cautious.
00:24:32.682 --> 00:24:46.625
One of the good things about education is that we tend to be slow to change, and in some cases that's a good thing, because it does help us to make sure that we're being very cautious about how what we do impacts children.
00:24:47.955 --> 00:24:53.555
Before I get to my next question, because that was a nice segue into you know, talking about access to agency.
00:24:53.555 --> 00:25:18.740
But right now that you just mentioned that, as far as being kind of slow and cautious, I just want to ask your thoughts, because I know that you're out there at conferences and you see a myriad of speakers that are out there on stage or presenting, and one of the things, too, is that there's always one side that will always be like: oh, if you're not doing this right now, you are hurting your kids.
00:25:18.740 --> 00:25:23.701
If you're not doing this right now, forget it, they're done in the future If you're not using this.
00:25:23.701 --> 00:25:37.098
And it's almost just this kind of fear that they're putting in that now a teacher's like well, I better use it and I better hop on, even though I'm not sure what it's going to do or how it's going to work, but I don't want to ruin my kids' futures in that sense.
00:25:37.098 --> 00:25:39.982
So I want to ask what your thoughts are on that.
00:25:40.163 --> 00:25:43.267
You know, and personally, how you feel.
00:25:43.267 --> 00:25:46.451
Are you more like, just kind of, like you said, slow and steady?
00:25:46.451 --> 00:25:50.057
You know, very cautious, a cautious advocate, I would say.
00:25:50.057 --> 00:25:51.800
You know, that's something that I call myself.
00:25:51.800 --> 00:25:55.419
I try not to be too fast because sometimes I can get overly excited.
00:25:55.419 --> 00:26:01.128
But what are your thoughts on that mindset of hey, if you're not using it right now with your fifth graders, forget it, they're done.
00:26:04.159 --> 00:26:20.086
Yeah, one thing I learned joining the corporate world is that, unfortunately, fear, uncertainty and doubt sells, and one thing that we have to be cautious about as speakers is that we're not selling fear, uncertainty and doubt.
00:26:20.086 --> 00:26:30.604
In a classroom, we want to have stability and access and make sure things are equitable for our students, and being a little slower is important.
00:26:30.604 --> 00:26:39.980
I will say in all frankness, as a young teacher, if I saw something cool and I thought my kids would get a kick out of it, I would dive right in with two feet head first.
00:26:39.980 --> 00:26:42.527
It probably wasn't the best strategy.
00:26:42.527 --> 00:26:51.516
In practice, I found myself, as I got older, taking a step back and looking at them, because it's not about me, including my classroom design, my environment.
00:26:51.516 --> 00:26:52.699
It's about them.
00:26:52.699 --> 00:26:54.383
How are they using it?
00:26:54.383 --> 00:26:58.500
How is it benefiting them in the short term and how is it benefiting them in the long term?
00:26:58.500 --> 00:27:01.636
How could I, as an educator, be more transparent?
00:27:01.636 --> 00:27:04.083
And that's what I call access to agency.
00:27:04.083 --> 00:27:05.866
How can I let them see me struggle?
00:27:06.674 --> 00:27:09.343
You know the TV, the projector goes down, the bulb's blown.
00:27:09.343 --> 00:27:10.385
What do I do?
00:27:10.385 --> 00:27:11.779
Put your heads down while I fix this.
00:27:11.779 --> 00:27:12.844
No, let them watch.
00:27:12.844 --> 00:27:15.201
You're starting a lesson.
00:27:15.201 --> 00:27:16.680
We're going to try something new.
00:27:16.680 --> 00:27:17.954
Not sure how it's going to go.
00:27:17.954 --> 00:27:18.976
Let's try it together.
00:27:18.976 --> 00:27:26.818
I do think one thing that's very prevalent in ed tech specifically is it's very responsive.
00:27:26.818 --> 00:27:28.558
This just came out.
00:27:28.558 --> 00:27:29.099
Let's try it.
00:27:29.099 --> 00:27:31.579
Guilty. This just came out.
00:27:31.579 --> 00:27:32.080
You got to use it.
00:27:32.080 --> 00:27:32.721
It's a new thing.
00:27:32.721 --> 00:27:33.320
Got to use this.
00:27:33.320 --> 00:27:35.201
Got to use that.
00:27:35.201 --> 00:27:55.198
With 2059, I hope to challenge people to be less responsive and more prepared in thinking about where this could all go, how this could all shake out, so that we start to prepare ourselves with better policy and better infrastructure to provide more access for students.
00:27:56.625 --> 00:27:57.007
Excellent.
00:27:57.146 --> 00:28:16.470
I definitely agree with that, and I hope even right now, in 2025, we can get into that too, because, like you said, you see the hype and the buildup, and I think of a lot of teachers. I always quote a great guest that I had, Renee Dawson: you've got your speedboats, you've got your tugboats and you've got your anchors.
00:28:17.071 --> 00:28:49.179
One thing that I love that you mentioned, Micah, is the way that you kind of stand back and observe. Before, the tech was about me: what I wanted to do, what I wanted to share. Whereas, like you mentioned, later on in my career it was: no, no, this is about you. Let the students lead with the tech, and let's see how they use it, see how it benefits them and how it helps them enhance, augment or redefine assignments, their submissions and their learning in that aspect.
00:28:49.224 --> 00:29:04.788
So that's something that I really love that you mentioned there: sometimes, as a teacher, it's okay to step back, and obviously it's okay to not know everything and sometimes feel vulnerable, and to allow students to see that as well, because that also helps them.
00:29:04.788 --> 00:29:08.196
Which kind of brings me to that concept of access to agency.
00:29:08.196 --> 00:29:11.030
Like you mentioned that it's something that's critical to learners.
00:29:11.030 --> 00:29:20.459
Can you dive a little deeper into how this can help educators and how we can foster agency for our students?
00:29:21.464 --> 00:29:36.446
Yeah, in the book I talk a little bit about my dad and how my dad and I would work on cars together, largely out of necessity growing up. But I wouldn't say that, as a result, I learned how to fix a car, because I was the guy handing all the tools to my dad.
00:29:36.446 --> 00:29:42.346
What I learned is that it can be done, and so his agency to repair a vehicle