WEBVTT
00:00:30.157 --> 00:00:33.417
Hello everybody and welcome to another great episode of my EdTech Life.
00:00:33.417 --> 00:00:41.448
Thank you so much for joining us on this wonderful day and, wherever it is that you're joining us from around the world, thank you, as always, for all of your support.
00:00:41.448 --> 00:00:44.779
We appreciate all the likes, the shares, the follows, thank you, thank you.
00:00:44.779 --> 00:00:50.292
Thank you so much for sharing our content and interacting with our content, and we love your comments.
00:00:50.292 --> 00:00:53.146
I love to just get some feedback from you.
00:00:53.146 --> 00:00:59.688
So thank you so much for all of your support, and I definitely want to give a big shout out to our show sponsors.
00:00:59.688 --> 00:01:01.512
Thank you so much, Book Creator.
00:01:01.512 --> 00:01:03.023
Thank you so much, Eduaide.
00:01:03.023 --> 00:01:04.766
Thank you so much, Yellowdig.
00:01:04.766 --> 00:01:17.344
It is because of you that we continue to bring some amazing guests and have some amazing conversations, and for anybody out there who might be interested in possibly being a sponsor, please make sure you hit us up.
00:01:17.344 --> 00:01:20.745
We would love to collaborate and work with you and get your name out there.
00:01:20.745 --> 00:01:22.859
So thank you as always.
00:01:25.981 --> 00:01:50.106
And today, ladies and gentlemen, I have a wonderful show and, of course, I say that all the time, but the guests that I have today are people that I go way back with. If you are looking right now at the video, you see Adam Sparks, who has been on the show before and has also been on an AI panel show that I've done, and now we're joined by his wife, Alexa Sparks, and I am really excited to talk about the amazing work that they are doing.
00:01:50.106 --> 00:02:08.044
If you're not familiar with their work yet, I promise you that after today you will enjoy what they are doing through their amazing platform called Short Answer. So today we're going to have a great conversation about pedagogy, questions and AI.
00:02:08.044 --> 00:02:09.812
We're going to talk about Short Answer.
00:02:09.812 --> 00:02:20.674
We're going to talk about a great partnership with Short Answer, and I won't spill it, but Short Answer has partnered up with some amazing people, and you know I'm just excited for today's chat.
00:02:20.674 --> 00:02:24.187
So, Adam, Alexa, thank you so much for joining me today.
00:02:24.187 --> 00:02:25.430
Adam, how are you doing?
00:02:26.080 --> 00:02:26.501
Doing well.
00:02:26.501 --> 00:02:30.211
Thanks for having us, Fonz. It's good to be back, and congratulations again.
00:02:30.211 --> 00:02:37.472
I know we were talking before this started, but now that we're officially on the record, congratulations to you: since we last spoke, you finished your doctorate, which is incredible.
00:02:37.979 --> 00:02:39.081
Yes, thank you.
00:02:39.081 --> 00:02:39.562
Thank you.
00:02:39.562 --> 00:02:50.616
I really appreciate it. And again, you know, because of this work, having you on the show and people like you on the show, it was just great to be able to document all of that and put that into my research and my study.
00:02:50.616 --> 00:02:57.506
I mean, I never thought that the podcast would turn into a research study, so that has been phenomenal.
00:02:57.506 --> 00:03:00.193
And, Alexa, thank you so much for joining us for the first time.
00:03:00.193 --> 00:03:04.623
I know that Adam has spoken so highly of you and the work that you were doing.
00:03:04.623 --> 00:03:13.752
And, you know, I know you were studying at Stanford, so maybe you didn't have a lot of time to be on, but I am thankful that I got to meet you at ISTE.
00:03:13.752 --> 00:03:15.054
We got to talk.
00:03:15.054 --> 00:03:21.524
You connected me with some wonderful people as well through LinkedIn. And how are you doing today?
00:03:21.544 --> 00:03:21.704
Doing
00:03:21.704 --> 00:03:23.650
great. Thanks for having me on this podcast.
00:03:24.580 --> 00:03:24.801
Excellent.
00:03:24.822 --> 00:03:27.623
Definitely nice to have some time to do this.
00:03:27.644 --> 00:03:38.222
This is actually my first podcast, so I'm nervous, but I'm excited. All right, no, you're going to be great, and I think actually you guys are my second couple that has been on here.
00:03:38.222 --> 00:03:54.808
Big shout out to Maury Beasley, who was on the show with her husband too. So yeah, you are definitely the second couple on, and it's great just to see a wonderful EdTech couple doing some great work and, you know, it's great to just catch up.
00:03:54.808 --> 00:04:05.989
But before we get started and dive in, I definitely want to get a brief introduction and your context within the education space, and so we'll go ahead and start with you, Adam.
00:04:05.989 --> 00:04:10.191
Tell us a little bit about yourself and the work that you're doing.
00:04:11.040 --> 00:04:12.948
Yeah, so my name is Adam Sparks.
00:04:12.948 --> 00:04:14.627
I was a classroom teacher for seven years.
00:04:14.627 --> 00:04:30.440
I taught middle and high school social studies and English and loved it, and I fell in love with ed tech during my time in the classroom and wanted to build it, and so I applied for master's programs and went and got my master's in learning design and technology, where I built what became Short Answer.
00:04:30.440 --> 00:04:37.225
So it was just supposed to be my master's project but we got some grant funding and launched it about two years ago and I've been building it since.
00:04:37.225 --> 00:04:43.266
It's a K-12 writing platform that just embeds short form gamified writing practice across the curriculum.
00:04:43.266 --> 00:04:47.848
So I'm happy to share more about it, and I'm honored to be building it with my wife, Alexa.
00:04:47.848 --> 00:04:52.649
So, alexa, if you want to introduce yourself and your context, yeah, sure.
00:04:52.670 --> 00:04:53.692
Hey everyone, I'm Alexa.
00:04:53.692 --> 00:04:57.610
My background with education is a little bit winding.
00:04:57.610 --> 00:05:13.512
Adam and I taught English in China for a year, and while I felt that was a really hard year, and it taught me that, you know, teaching is very hard, classroom management especially is very difficult, Adam told me that that was like the easiest job he'd had, compared to teaching in the US.
00:05:13.512 --> 00:05:15.889
So I gained a lot of empathy that year.
00:05:15.889 --> 00:05:23.934
I gained a lot of interest in just like ed tech and how much it was helping Adam during his teaching year in the classroom.
00:05:23.934 --> 00:05:27.288
But my background really is more in the software engineering side.
00:05:27.288 --> 00:05:37.041
I've worked at different kind of ed tech startups, in the Midwest mainly, but then I also just recently went to Stanford to study education data science.
00:05:37.041 --> 00:05:50.055
So that was where I really took my technical expertise and combined it with the data science piece, and learned more about education research, diving into the learning sciences and things like that.
00:05:50.821 --> 00:05:51.242
Excellent.
00:05:51.242 --> 00:06:05.942
Well, that's great, and what I love about this is, just like you said, you're coming at it from different perspectives: you've got the education experience, and then Adam working through Short Answer, and now both of you working on this and getting to do this together.
00:06:05.942 --> 00:06:08.026
I think that's something that's fantastic.
00:06:08.026 --> 00:06:17.089
Number one it's just great that you get to work with each other as a couple, but also just the amazing work that you're doing, and I know that the last time that we caught up was at.
00:06:17.250 --> 00:06:28.274
ISTE, and that was definitely a couple of weeks back, but I mean, just to hear and see the amazing work that you all are doing and where you've been, I mean, nonstop go, go, go.
00:06:28.274 --> 00:06:48.625
It has been wonderful to see the growth since the first time that Adam was on the show sharing Short Answer here on the podcast, and now, you know, the miles that you've traveled, the work that you've put in and the countless teachers and students that you have affected in such a positive way because of your platform.
00:06:48.625 --> 00:07:11.802
So we'll definitely get into that, but before we do, I want to touch on something, a little bit of serious talk here about the ed tech space and so on. And I know one of the things that I do appreciate about Adam is that he has always been up for discourse and up for discussion, always able to just go back and forth and share and learn, and it's just been something phenomenal.
00:07:11.802 --> 00:07:49.386
And I know when I ran into you at ISTE, Adam, I said, you know what, I would love to do a podcast and talk a little bit about this Substack article that you posted way back when, on June 27th. I was like, whoa, this is something that's great, and I think it's something that a lot of people need to hear. The title of the Substack, and we'll pop it in the show notes too, is EdTech Needs More Dialectical Thinking. And so, Adam, I want to start off with you as far as dialectical thinking: can you tell us, number one, what inspired this post?
00:07:49.446 --> 00:07:53.247
Because there's so much in here that I definitely want to unpack.
00:07:53.247 --> 00:08:04.735
So tell me a little bit about, you know, after Short Answer comes out, AI is go, go, go, move fast, break things, or just kind of move fast and go everywhere.
00:08:04.735 --> 00:08:06.255
What happened?
00:08:06.255 --> 00:08:09.456
What were your thoughts when you wrote this Substack?
00:08:09.978 --> 00:08:17.283
Yeah, well, first, for context.
00:08:17.302 --> 00:08:21.053
Dialectical thinking is maybe just a pretentious way of saying engaging with viewpoints that are different from your own.
00:08:21.053 --> 00:08:21.319
So just intentionally engaging with people that have different worldviews from you.
00:08:24.189 --> 00:08:28.670
And what inspired the post is kind of funny.
00:08:28.670 --> 00:08:35.788
It was unintentionally interacting with people that had very different viewpoints from me and getting into a situation that I didn't expect.
00:08:36.461 --> 00:08:54.226
Alexa and I co-presented at a ResearchEd conference in Delaware this past fall because, as we have talked about on this podcast in the past, we take efficacy and growing out of our research base very seriously with Short Answer, and we think it's something that, generally speaking, the K-12 ed tech market could do a better job of.
00:08:54.226 --> 00:08:55.250
That's a whole other conversation.
00:08:55.250 --> 00:09:04.631
So, anyway, as a part of that work, we wanted to present at a research conference, and so we were really the only ed tech folks at the conference.
00:09:04.631 --> 00:09:07.440
It was our first time attending a ResearchEd conference.
00:09:07.440 --> 00:09:10.104
It's an organization that I've long been aware of.
00:09:10.104 --> 00:09:24.152
It's an organization built to bridge research-based best practice with daily classroom pedagogy, making research applicable to classroom contexts, so I've long relied on it for guidance on best practices in teaching and learning.
00:09:24.152 --> 00:09:49.434
But this was our first time presenting at it, and we arrived there and quickly found out that K-12 ed tech is viewed in a very negative light, I would say. That was expressed in different ways over the course of the conference, but most clearly in the closing keynote, which was recorded as a podcast and shared out, in which they essentially bashed on ed tech for an hour, and I thought that it was unproductive in a lot of different ways.
00:09:49.596 --> 00:10:17.712
But it was also interesting in that it was maybe the first time I'd been to a conference to share our work on Short Answer and actually walked away with my beliefs being challenged, thinking critically about things in a completely different light. It kind of burst the bubble I was living in and helped me realize how much of a bubble ed tech is, and so I wrote that article in the lead-up to ISTE, because ISTE might be the biggest bubble of them all.
00:10:17.712 --> 00:10:20.427
You know it's like the biggest celebration of EdTech in the world.
00:10:20.427 --> 00:10:25.460
To my knowledge, I think it's the biggest K-12 ed tech oriented conference in the world.
00:10:25.460 --> 00:10:41.802
And it's a celebration of ed tech, as it should be, and I was excited to go down and celebrate ed tech with other people that, like me, believe that ed tech has tremendous potential to improve learning outcomes for kids, improve teachers' lives, improve schools and blah, blah, blah.
00:10:41.802 --> 00:10:45.128
So don't get me wrong, I'm not an ed tech hater by any means.
00:10:45.168 --> 00:11:22.091
I just think I learned so much from that conference and from engaging with folks that had different beliefs than me. And I think, generally speaking, especially at ed tech conferences, but just in ed tech in general, we could do a much better job, and we'd be much better off, if we were more deliberate about engaging with people that have different beliefs from us, especially as we head into the age of AI, where there's a lot of skepticism and worry, I think rightfully so, around what AI means for teaching, but especially what it means for learning. And so, yeah, I wrote the blog post hopefully as a way of kicking off conversation, and I was actually pleasantly surprised with what I experienced at ISTE.
00:11:22.130 --> 00:11:28.902
It wasn't, you know, entirely the bubble. Don't get me wrong.
00:11:28.902 --> 00:11:31.312
It is definitely a celebration of ed tech and there's a lot of, I think, blinders on for a lot of folks there.
00:11:31.312 --> 00:11:40.544
But at the same time, for example, I was really impressed that they had a research track that was just for researchers to get up and present, which Alexa did an excellent job of.
00:11:40.544 --> 00:11:52.485
Alexa presented with May Tan, one of her research collaborators at Stanford, on some of the research that they did on AI and its ability to provide feedback on student writing. I wasn't even aware that ISTE did something like that.
00:11:52.485 --> 00:12:04.327
I thought that was a good step in the right direction, but I think, just in general, the K-12 ed tech market as a whole has a long way to go in terms of more deliberately engaging with folks that have different worldviews, that don't necessarily believe in ed tech.
00:12:04.868 --> 00:12:08.712
Yeah. Alexa, I want to ask you: what has been your experience?
00:12:08.712 --> 00:12:16.687
I know Adam's kind of been at the forefront and kind of the face of Short Answer, and I know that you were involved in your studies and so on.
00:12:16.687 --> 00:12:32.761
But now that you're at these conferences and you're meeting people, talking to them, and, of course, as Adam shared his experience, like, whoa, all of a sudden, what you might have thought was that you were doing or going in the right direction.
00:12:32.761 --> 00:12:37.206
All of a sudden, there's somebody that says, well, wait a minute, no. What were your thoughts?
00:12:37.206 --> 00:12:38.509
What was going on through your mind?
00:13:10.490 --> 00:13:13.995
Yeah, I think that's a good question.
00:13:13.995 --> 00:13:23.210
I guess, for starters, it is very palpable, the difference between going to an ed-tech-focused conference, especially working at the vendor booth and having conversations.
00:13:23.210 --> 00:13:33.326
People are generally more excited to hear about what you're doing and open to what the product does, and they've accepted that ed tech is a good thing.
00:13:33.326 --> 00:13:38.998
So I think ResearchEd was just very eye-opening, because it was like we're starting from:
00:13:38.998 --> 00:13:55.968
ed tech isn't default good; we actually, here, think ed tech is kind of default bad; prove to us how your product isn't just more of the same thing that we're criticizing in the ed tech space.
00:13:55.968 --> 00:13:58.200
But I was proud coming out of it, because, you know, not to toot our own horns,
00:13:58.259 --> 00:14:06.621
but Short Answer did grow out of a research base, and we are being very intentional about how we think it should be incorporated in the classroom.
00:14:06.621 --> 00:14:19.265
For example, with gamification: we use gamified tactics, but we're not just playing games in the classroom to entice kids.
00:14:19.265 --> 00:14:25.419
But just to answer your question about leaving ResearchEd:
00:14:25.419 --> 00:14:27.864
It was jarring.
00:14:27.864 --> 00:14:57.472
I was a little bit shocked at the ending keynote, but I think overall it was good, because I like to come from a centrist view on things in general, and I think ResearchEd just kind of reminded us that we have to be in the middle; it's not all good, it's not all bad, there's both. You know, exactly, and that's something that's great, and now that you mentioned that, it's something that, through the podcast, relatively early, I was like, oh my gosh, November 2022.
00:14:57.552 --> 00:15:09.037
And I was like, oh my gosh, let's go all in until March 2023 when I was doing some research for one of my doctoral studies and they're like, oh yeah, you know, write about something that's going on in ed tech right now.
00:15:09.037 --> 00:15:10.198
And I was like, okay, AI.
00:15:10.198 --> 00:15:16.822
And then I got into, obviously, the privacy, the terms of service, the data and all of that.
00:15:16.822 --> 00:15:24.134
And that was what caused me to really pump the brakes on, you know, just moving forward, and to do a lot more research.
00:15:24.134 --> 00:15:32.501
And I always give credit to Dr. Nika McGee, who is the first person that I ever heard say "cautious advocate," and I just kind of adopted that.
00:15:32.569 --> 00:15:54.677
You know, I find myself in the middle too, and that's why I love having these conversations, and especially bringing people on like you, who are at the forefront of using a platform, developing a platform and bringing that into the classroom, and just to hear you say, wow, this kind of caused us to take a step back.
00:15:54.677 --> 00:16:02.322
So I kind of want to ask you, as far as hearing that message, what were some of the shifts?
00:16:02.322 --> 00:16:19.817
And I'll start with you, Alexa, if you don't mind sharing, and then Adam can add to it: what were some of the shifts that you've made or decided to make after that conference that y'all went to, as far as the platform or AI in general?
00:16:22.750 --> 00:16:23.615
That's a good question.
00:16:23.615 --> 00:16:40.124
I feel like Adam would actually be good to tackle this, just because usually the way our workflow works is that he'll start with design and ideation, and then, after he gets that initial idea out on Figma, it comes to me, and it's like, let's talk about technical feasibility.
00:16:40.124 --> 00:16:49.741
So I think he'd be better suited to answer this question just because he was the person doing the initial designs.
00:16:49.741 --> 00:16:51.655
Sounds good, Adam.
00:16:52.711 --> 00:16:54.057
No, it's a straightforward answer.
00:16:54.057 --> 00:16:55.301
It's just lead with the research.
00:16:55.301 --> 00:17:10.476
And when I say lead with the research, I don't just mean that your users are going to give you feedback on what they want in your product and what you should design; it means balancing what they want with what research says is best practice, which is actually sometimes harder than it may seem on the surface.
00:17:10.476 --> 00:17:24.552
And so, for example, we've read a lot of the research, and Alexa has done original research, which is really cool and which she should talk about, on AI and its ability to provide feedback on students' writing, and there has been some research that came out recently.
00:17:24.552 --> 00:17:26.676
Maybe we can link in the show notes on that.
00:17:26.756 --> 00:17:39.221
AI is effective at providing low-stakes, daily formative feedback on kids' writing, and it's pretty good at scoring writing.
00:17:39.221 --> 00:17:43.864
In terms of how humans score it with a rubric relative to how AI scores it with a rubric, the results are not statistically significantly different from one another.
00:17:43.864 --> 00:17:56.810
So that was what initially gave us the confidence to build out our first AI feedback activity in Short Answer, which is called Quick Write, where kids get proficiency scores based on different success criteria set by the teacher.
00:17:56.810 --> 00:17:59.599
They're working together as a group to try to hit a class score goal together.
00:17:59.599 --> 00:18:06.095
So that's the game: it's a cooperative game where they're getting AI feedback and then reflecting on it together.
00:18:06.095 --> 00:18:17.294
That grew directly out of research that we read, partly as a result of going to a conference like ResearchEd, where they're, of course, advocating for leading with research-based best practice in teaching and learning.
00:18:17.294 --> 00:18:52.903
And then I also think, just because of that worldview of wanting to encourage more skepticism towards AI and not just take it at face value, not just take the outputs as true, because, as we know, AI often hallucinates, we also designed a new activity in Short Answer called Pen Pals. In Pen Pals, we've created these spicy-personality AI characters that the class can pick, and those characters essentially have arguments about the kids' writing. Then the class discusses those arguments and votes for the AI pen pal that they agree with more.
00:18:53.871 --> 00:19:08.863
So, of course, the primary goal there is to improve learning outcomes by getting the kids to think about the different success criteria, but the secondary goal is to encourage kids to challenge AI's outputs, not just accept them at face value, and to debate them; it's almost like using AI as a foil in the classroom.
00:19:08.863 --> 00:19:12.778
It's like we have the teacher, we have the students, then we have this AI thing that we can argue with and debate with.
00:19:12.778 --> 00:19:17.538
But again, don't just take it at face value, like the AI feedback is the end-all, be-all.
00:19:17.538 --> 00:19:21.853
Let's question it and challenge it, discuss it and make it a way to make the class more social.
00:19:21.853 --> 00:19:29.725
And I would say, in some ways, all of that grew directly out of our experience at ResearchEd and being encouraged to lead with research first.
00:19:29.725 --> 00:19:41.215
So as challenging as that conference was for me, and as frankly frustrating as it was, I do think we got better as a result, and I think our product got better as a result.
00:19:41.215 --> 00:19:45.295
And that's again going back to like why I was promoting dialectical thinking in that post.
00:19:46.357 --> 00:19:48.482
Yeah, no, I do think oh sorry, go ahead.
00:19:48.482 --> 00:19:49.615
Oh, no, no, no, go ahead, go ahead.
00:19:50.349 --> 00:20:13.173
I was just thinking, now that he said that, going back to leaving ResearchEd, that I do think we did double down, and we still continue to double down, on letting students connect with each other, because one of the criticisms at ResearchEd was very much, like you know, that technology kind of individualizes the learning process, and that's not necessarily what we want.
00:20:13.212 --> 00:20:33.269
So I think we did leave with some reinforced core principles, like focusing on the peer-to-peer element. And for me, with a lot of my research at Stanford, I think I left Stanford more skeptical of AI than when I entered my two-year master's degree.
00:20:33.269 --> 00:21:35.862
So, definitely, for me personally, it's very important, because of the hallucination, because of, yeah, the very real flaws of AI, to not have AI be this prescriptive entity within our app; students have the final opportunity to say whether or not they agree with the feedback, or to have a discussion with the class about why the output isn't good or is good. Yeah, so just kind of piggybacking off what Adam said, that idea of pushing it: all of a sudden, now, even within our circle, because for the most part it's the same kind of group of people, I've seen many that have also started pumping the brakes on this.
00:21:36.750 --> 00:21:59.864
But one of the things that I love that you mentioned is, although your platform does have that AI component, you have the student, the teacher and that AI component, but you're really allowing the students to have that discourse, that discussion, that going back and forth, and still being able to make the classroom interactive.
00:21:59.864 --> 00:22:12.778
Although the tech is there, from what I've seen, it's not like the platform is the center of the activity.
00:22:12.778 --> 00:22:34.734
You've got a lot of things going on there too, and that's one of the things that I do enjoy, because through the years and the months that have gone by, as we talk about personalized learning, to me it just seems like you're putting the kids on this computer, but they're not getting to do a lot of the discourse or having that talk.
00:22:34.734 --> 00:22:45.561
It's almost like, I forget who the guest was that said, you know, I just get on the chatbot and just put IDK, IDK, IDK, and it's just going to give me the answer, and there really isn't anything that I'm going to learn from it.
00:22:45.561 --> 00:22:55.483
It has no context, it doesn't know anything about me, but yet we want to put them on these devices or, you know, on the platforms.
00:22:55.891 --> 00:22:59.019
But a platform like yours, like you said, it's for writing.
00:22:59.019 --> 00:23:03.334
You've got feedback, you've got discussion, you've got the teacher.
00:23:03.334 --> 00:23:05.359
You know, within the activity too as well.
00:23:05.359 --> 00:23:23.915
I think that's something that is great and really like good practice, as opposed to just letting the AI take care of everything, and so I think that's something that I really do enjoy, and you know the work that you are doing, and I know that I've seen a lot of great feedback on your platform from a lot of people on all social media.
00:23:23.915 --> 00:23:36.158
So I mean, I think that the fact that you're taking the time to continually listen and learn and kind of take things back a little bit and say, hey, maybe let's do it this way or do it that way.
00:23:36.158 --> 00:23:42.759
I think that's something that's very important to be able to improvise, adapt and overcome based on the situation and what's happening.
00:23:42.759 --> 00:23:49.922
So I think that you guys are doing a fantastic job with that, and it really shows with the growth, and you guys are everywhere too.
00:23:49.922 --> 00:23:57.396
I mean, there was a point in time where you were at every conference, week by week, and it has been great just to see that growth as well.
00:23:58.137 --> 00:24:10.164
But going back a little bit into that conference mode, I know one of the things that we often talk about, and you mentioned it, is the AI bubble; we're kind of in a little bubble.
00:24:10.164 --> 00:25:15.756
We're kind of within our own little echo chambers, where maybe we see a lot of focus on the tool, just really tool-heavy, and not, you know, responsible or responsive pedagogy
00:25:15.756 --> 00:25:17.580
That is solid and good.
00:25:17.580 --> 00:25:27.599
So I just want to get your take, Alexa, starting with you, now that you've gotten to present at a conference and see what is out there: what are your thoughts on that?
00:25:27.599 --> 00:25:42.815
You know, as far as seeing maybe a lot of conferences just be AI first and then, you know, pedagogy second? Yeah, I see it as a practical reality, like something born out of practicality.
00:25:42.894 --> 00:25:50.255
Like, I think, you know, up until two years ago, or has it been three years now, when ChatGPT became very popular in the mainstream,
00:25:50.255 --> 00:25:59.116
there weren't tools that made lesson planning extremely quick or that could actually save teachers a meaningful amount of time.
00:25:59.116 --> 00:26:06.461
So I guess, when I think about why there are so many tool-focused presentations at conferences,
00:26:06.461 --> 00:26:08.132
I just think it was like born of necessity.
00:26:08.132 --> 00:26:12.441
It was like you know, you don't have a lot of time, we want to give you the most bang for your buck right here.
00:26:12.441 --> 00:26:14.324
Ok, here's a list of tools.
00:26:14.324 --> 00:26:16.153
I've kind of pre-vetted these for you.
00:26:16.214 --> 00:26:22.253
Whoever I am, and, you know, usually it's teachers telling other teachers, so they're like, this is what's worked for me.
00:26:22.253 --> 00:26:37.111
I want to share that with my colleagues or other teachers across the nation, and so I'm just going to put this presentation together with tools that worked for me.
00:26:37.111 --> 00:26:50.096
So, you know, that's unfortunate, because you don't get into the meaty pedagogy, but I do see it as something that was really born out of the practical; it just seems like a natural response to a profession that had a lot of time-related burdens.
00:26:50.096 --> 00:26:52.340
How about you, Adam?
00:26:53.442 --> 00:26:54.865
Yeah, no, I think that's right.
00:26:54.865 --> 00:26:55.671
I think you're right, Alexa.
00:26:55.671 --> 00:27:02.278
It's like, at a conference, you're given a 45-minute session, and in that short amount of time it's hard to cover everything.
00:27:02.278 --> 00:27:04.388
And I feel this a lot when I present.
00:27:04.388 --> 00:27:06.214
It's like because we always want to lead with pedagogy.
00:27:06.214 --> 00:27:13.461
You certainly don't want to make it so theoretical that the teachers walk away feeling like you just talked at a 10,000 foot level about theory and they didn't get anything practical.
00:27:13.461 --> 00:27:51.978
So because you have such a time-constrained environment, it kind of lends itself to like, yeah, let me just quickly hit you with a fire hose of tools, tools, tools and ways that you can use them. And frankly, a lot of teachers like that, because they're like, I'm here to kind of put tools in my tool belt. Um, and unfortunately, taken to the extreme, though, it does just result in presentations that are just sort of vendor sessions, where we're just hawking different products without really talking about the problems that they're solving, and we're kind of putting the cart before the horse in that sense.
00:27:51.978 --> 00:28:02.837
And um, I think, yeah, I think it's been unproductive, um, but I also think, moving forward, I do see this getting slowly better.
00:28:03.379 --> 00:28:09.621
Um, I was encouraged when ASCD and ISTE, two very large, influential organizations, merged.
00:28:09.621 --> 00:28:42.870
Um, ASCD is the kind of big curriculum and instruction professional development organization in the United States, ISTE, of course, the big ed tech one, and the merging of those two things I think is a good thing and sort of symbolic of something that I'm seeing in school districts across the country, which is that there used to be silos, and there still are silos, of like the ed tech coaches are over here and then the teaching and learning, curriculum and instruction folks are over here, and there's certainly a lot of overlap, but they're separate spheres. And I'm increasingly seeing that we're not doing separate spheres anymore.
00:28:42.870 --> 00:28:52.192
It's like you're just an instructional coach, technology is like an assumed part of what you do, and so we're going to lead with best practice in teaching and learning and we'll talk about tech.
00:28:52.192 --> 00:28:58.438
But we're going to talk about tech as a means to actualize best practices in teaching and learning, not as this separate thing.
00:28:58.438 --> 00:29:00.142
That is an end in and of itself.
00:29:00.850 --> 00:29:13.103
And I think if I have one major criticism of K-12 ed tech conferences, it would be that oftentimes you go into presentations and you walk away feeling like ed tech is an end in and of itself.
00:29:13.103 --> 00:29:13.954
It's not.
00:29:13.954 --> 00:29:16.250
It's a tool, it's a means to the end of learning.
00:29:16.250 --> 00:29:19.801
It's like, no, no, no, just use the ed tech and then we will have been successful.
00:29:19.801 --> 00:29:30.845
And it's like well, I think we've lost the thread here a little bit, but you know that's a criticism, but I also don't want to be too negative because I do think it's getting better for all the reasons I just said.
00:29:31.185 --> 00:29:47.906
Yeah, no, and you know, one thing that I do agree with you on is, you know, just the fact that ASCD and ISTE came together, and I had the opportunity to be there and actually moderate a panel and being able to see district leaders there.
00:29:47.906 --> 00:30:01.100
You know, before, you'd go to a presentation and it's only the ed tech people, but now you get a nice mix of people from, you know, those curriculum and instruction departments coming in.
00:30:01.100 --> 00:30:02.281
That's where it's going.
00:30:02.281 --> 00:30:32.566
We're moving more towards where our amazing, you know, content coordinators, before, they used to always, you know, just focus on the how, you know, how to teach things, that pedagogy.
00:30:32.566 --> 00:30:38.947
Well, now, you know, we're slowly like rounding them out, and now they're going to be doing some of the ed tech as well.
00:30:38.947 --> 00:30:51.206
But, like you said, we're going to show you the how to teach it, the pedagogy, and then the tech that will supplement, you know, that learning, and I think that's something that's great and fantastic that I'm really looking forward to.
00:30:51.716 --> 00:30:53.423
And, like you said, you know, things change.
00:30:53.423 --> 00:31:02.187
You know, we are seeing a lot of those conferences where it's tech first or tool driven, and now I think we will start seeing that shift.
00:31:02.187 --> 00:31:16.123
So it's really exciting just to have been able to see everything that's been happening since, you know, November 2022 till now, and how it's still just continually evolving and moving forward.
00:31:16.123 --> 00:31:29.027
So I want to ask you now you know, because I did see and for our audience members maybe that don't follow you on social media, please make sure that you follow Short Answer on social media as well and make sure you follow Adam and Alexa.
00:31:29.027 --> 00:31:33.767
You just, you know, go to their website, get on their socials, because they share so many great things.
00:31:33.767 --> 00:31:42.210
But I know one of the big things that I wanted to talk to you about is your new partnership, so can you tell us a little bit about that and how that came about?
00:31:43.095 --> 00:31:43.576
Yeah, yeah.
00:31:43.576 --> 00:31:55.909
We just formed a partnership with EduProtocols, which is this amazing kind of thing, I don't want to describe them incorrectly, it's multifaceted.
00:31:55.909 --> 00:32:18.140
But it's essentially an organization that promotes high-quality pedagogy across K-12 through what are essentially like lesson frames, and I could go into more detail on all of them, but the first lesson frame that we have embedded in Short Answer as a part of this partnership is called Random Emoji Power Paragraph, which is one of their more popular EduProtocols.
00:32:18.140 --> 00:32:27.442
And so in Random Emoji Power Paragraph, there's a random emoji generator, as it sounds like from the title, and so we've embedded that in Short Answer.
00:32:28.154 --> 00:32:31.741
And the challenge for the kids becomes, no matter what, you can cover any content with this.
00:32:31.741 --> 00:32:34.816
So we do it in history, science, math, electives, whatever.
00:32:34.816 --> 00:32:48.247
But whatever content you're teaching, you're going to be challenged to write one paragraph, five sentences, and for each sentence we're going to generate a random emoji, and you have to connect your learning in some way to that random emoji.
00:32:48.247 --> 00:32:50.149
So one for each.
00:32:50.149 --> 00:32:51.690
So there's five emojis.
00:32:51.690 --> 00:32:52.691
You've got five sentences.
00:32:52.691 --> 00:32:59.682
You're going to connect each sentence to each emoji and as you're doing that, you're having to find how to fit that together into a coherent paragraph.
00:32:59.682 --> 00:33:12.230
That doesn't sound like total gibberish, which is fun for the kids because it's really hard, but it's also challenging kids to think analogically, making analogies to different things, and challenging them with the writing structure in the process.
00:33:12.230 --> 00:33:22.421
So it's perfect for us, given our focus on writing and their focus and, uh, their approach and how fun it is, right, because we want to be a sort of gamified, fun writing platform.
00:33:24.425 --> 00:33:31.815
So, uh, yeah, so we just built REP, sorry, Random Emoji Power Paragraph, REP for short, uh, into Short Answer.
00:33:31.815 --> 00:33:37.316
Um, as our first of hopefully several more EduProtocols that will be embedded right into Short Answer.
00:33:37.316 --> 00:33:39.801
This all came about, um, through, well, actually, I connected with a teacher in California.
00:33:39.801 --> 00:33:43.729
Actually I connected with a teacher in California.
00:33:43.729 --> 00:33:44.918
His name's Robert Mayfield.
00:33:44.918 --> 00:33:46.201
He teaches at Ripon High School.
00:33:46.201 --> 00:33:47.846
He's an incredible social studies teacher.
00:33:47.846 --> 00:33:55.779
He was using Short Answer quite a bit, and he also does a lot with EduProtocols, and he was presenting at Spring CUE.
00:33:55.779 --> 00:34:08.226
CUE is California's big ed tech conference in Palm Springs every spring, and he asked me to co-present with him on how he uses Short Answer, and I was going to help him with that, but then also on how they use Short Answer to actualize EduProtocols.
00:34:08.226 --> 00:34:10.981
And that was kind of my first foray into this world.
00:34:11.021 --> 00:34:12.326
I'd always heard about EduProtocols.
00:34:12.326 --> 00:34:18.068
My cousin, Michael Krambeck, who teaches social studies here in Nebraska, was always a huge advocate and was always trying to get me to use them when I was teaching.
00:34:18.068 --> 00:34:22.592
But I just never did, and the more I learned about it, the more I believed in it, both
00:34:22.592 --> 00:34:28.686
the pedagogy it's actualizing and how much teachers raved about it and how much teachers love EduProtocols.
00:34:28.686 --> 00:34:30.096
That was the biggest thing for me.
00:34:30.096 --> 00:34:37.525
It was just like being in those rooms at CUE and talking to all these teachers that use EduProtocols every day, and how much time it saves them and how effective it is for their kids.
00:34:38.027 --> 00:34:46.000
And I actually wrote about EduProtocols in the, um, in the blog post we referenced earlier, the dialectical thinking one, because something I love about EduProtocols is they always lead with results.
00:34:46.000 --> 00:34:48.005
It's not just research they're leading with.
00:34:48.005 --> 00:34:56.045
Like, here are my kids prior to using these, and here's where my kids are at after using them, and, like, you can see the growth.
00:34:56.045 --> 00:35:04.641
So I loved everything about it, and so we were super excited to form that partnership with Jon Corippo, who's the head of EduProtocols, and Marlena Hebern, who helps Jon out quite a bit.
00:35:04.641 --> 00:35:10.240
And yeah, I'm happy to get into any more detail with that if you want, but super excited to.
00:35:10.240 --> 00:35:13.925
We announced that a couple of weeks ago to kick off the new school year.
00:35:14.485 --> 00:35:15.820
No, and that's something that's fantastic.
00:35:15.820 --> 00:35:17.255
The new school year?