May 23, 2025

Episode 324: Charlie Meyer


Episode 324 – Charlie Meyer: The AI Hype, EdTech Snake Oil & What Teachers Actually Want

 In this episode of My EdTech Life, I sit down with Charlie Meyer, small business owner and creator of Pickcode, to unpack the growing wave of AI solutionism in education. From being shouted at over not using AI to asking hard questions like, “Would you trust a vibe-coded site with your bank info?”, Charlie brings the heat, the honesty, and the humor.

If you’re an educator, policymaker, EdTech builder, or investor—this is a conversation you don’t want to miss.

Timestamps:
00:00 - Intro & shoutout to sponsors
02:00 - Who is Charlie Meyer? From CS major to classroom teacher to Pickcode
05:00 - The AI hype: Charlie gets yelled at for not using AI
08:00 - Small biz vs. VC startups: Who’s actually listening to teachers?
10:30 - The silver bullet myth: Why AI promises fall flat in classrooms
15:00 - Where do we draw the line on AI delegation?
17:30 - Real talk on student-teacher relationships & AI disruption
21:00 - AI feedback loops: Are students and teachers both being sidelined?
22:30 - Spotting snake oil: How to vet EdTech products built on buzz
24:00 - If AI tools were honest: “I don’t know you. I don’t care. I’m a matrix in a data center.”
26:00 - Pedagogy in a silo: Personalized ≠ human
28:00 - What Charlie would tell an AI founder who’s never taught
31:00 - Why most teachers aren’t asking for AI—and that’s OK
33:00 - The accountability gap: Who’s vetting these tools?
36:00 - “Move fast and break things” ≠ Classroom values
40:00 - AI tools vs. real classroom pain points
42:00 - Why Pickcode solves real problems without AI
45:00 - Vibe coding exposed: Would you bank on AI-written code?
54:00 - Final thoughts: The bet on GPT-6 & the future of AI in EdTech
57:00 - Charlie’s lightning round: kryptonite, billboards, and his dog’s perfect life
01:01:00 - Final reflections & stay techie!

🔗 Explore More
🌐 Visit Pickcode: https://www.pickcode.io
💬 Connect with Charlie Meyer
🎧 Catch every episode: https://www.myedtech.life

🙌 Special thanks to our sponsors:
Book Creator | Eduaide.ai | Yellowdig | Pocketalk

📣 Don’t forget to like, comment, and share if this episode challenged your thinking or made you laugh. Let’s build smarter, safer EdTech—together.


Authentic engagement, inclusion, and learning across the curriculum for ALL your students. Teachers love Book Creator.

Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:30 - Welcome and Introduction

02:30 - Charlie Meyer's Background in Education

05:08 - AI Solutionism and Conference Confrontation

11:57 - Educators Being Sold To vs Being Heard

16:00 - Silver Bullets and Overpromising Tech

21:09 - Student-Teacher Relationships vs AI

28:45 - Signs of EdTech Snake Oil

33:14 - Advice for AI Founders Entering Education

46:31 - Pickcode: A Purposeful EdTech Solution

54:49 - Vibe Coding Dangers for Students

01:01:19 - Final Questions and Closing Thoughts

WEBVTT

00:00:30.115 --> 00:00:33.497
Hello everybody and welcome to another great episode of my EdTech Life.

00:00:33.497 --> 00:00:42.229
Thank you so much for joining us on this wonderful day and, wherever it is that you're joining us from around the world, thank you, as always, for all of your support.

00:00:42.229 --> 00:00:44.487
We appreciate all the likes, the shares, the follows.

00:00:44.487 --> 00:00:50.874
Thank you so much for just interacting with our content, giving us some of your wonderful feedback.

00:00:50.874 --> 00:00:59.640
We definitely take that to heart so we can always improve and bring you some amazing conversations so we can continue to grow in our education space together.

00:00:59.640 --> 00:01:05.887
I would definitely love to give a big shout out to our sponsors right now at this moment Book Creator, thank you so much.

00:01:05.887 --> 00:01:09.290
Eduaide and Yellowdig and Pocketalk.

00:01:09.290 --> 00:01:16.947
We really appreciate your support and believing in our mission of connecting educators, one conversation at a time.

00:01:16.947 --> 00:01:23.971
So thank you so much for your support and today I am excited to welcome Charlie Meyer to the show.

00:01:23.971 --> 00:01:25.784
Charlie, how are you doing today?

00:01:26.647 --> 00:01:27.009
I'm good.

00:01:27.009 --> 00:01:28.525
You know you said it's a beautiful day.

00:01:28.525 --> 00:01:31.927
It's not here in Boston it's pouring rain, but you know it's all good.

00:01:31.927 --> 00:01:34.909
It's a nice day to be inside, kind of do some stuff on Hangout online.

00:01:34.909 --> 00:01:39.090
So I really appreciate the invite to the show and so excited to get into it.

00:01:39.780 --> 00:01:40.341
Excellent.

00:01:40.341 --> 00:02:07.545
Well, Charlie, I am excited to get to talk to you because I know that you and I are both on LinkedIn and I ran into one of your posts two weeks ago and I was just like wow, it kind of falls in line with how I may be feeling and how many may be feeling, and looking at some of the interactions and comments on your post, I was like you know, this is a great conversation piece just to bring up, and so I just want to thank you for joining me here today.

00:02:07.545 --> 00:02:14.048
But before we get started, I definitely would love to give you an opportunity to introduce yourself to our guests.

00:02:14.048 --> 00:02:18.611
You being a first time guest, I would love for my audience to get to know who Charlie is.

00:02:18.611 --> 00:02:24.207
So, Charlie, give us a little brief background and what your context is within the education space. Sure.

00:02:24.449 --> 00:02:30.626
Yeah, so I did an undergrad degree in computer science and math and that's where I got my first actual teaching experience.

00:02:30.626 --> 00:02:40.901
So I was like a teaching assistant, and that was like the favorite job I've ever had. Probably still better than my current job, which I do also like.

00:02:40.901 --> 00:02:43.866
So I was helping students with their coding projects and stuff undergrads and that was awesome.

00:02:47.393 --> 00:02:50.018
So I would stay up till like midnight in the computer labs like helping people with their code or whatever.

00:02:50.018 --> 00:02:51.526
So that was a lot of fun.

00:02:51.780 --> 00:02:57.292
Spent a few years software engineering, covid hit and then I kind of was just like, oh, I'm going to.

00:02:57.292 --> 00:03:04.526
So I was living in New York, moved to Boston, where I'm from, and then I got into teaching.

00:03:04.526 --> 00:03:09.346
So I saw like I guess it was a Google ad for some teaching master's program and I was like you know what?

00:03:09.346 --> 00:03:10.669
I always have wanted to do this.

00:03:10.669 --> 00:03:18.033
So I talked to the folks from there, signed up, got in, started with that and then spring of 2021.

00:03:18.360 --> 00:03:22.146
So probably the worst time in human history to become a teacher.

00:03:22.146 --> 00:03:29.264
I became a long-term sub in a math classroom locally here and it was a real eye-opener.

00:03:29.264 --> 00:03:35.867
You know I always wanted to get into teaching and teaching real students in a real classroom, especially in the chaos.

00:03:35.867 --> 00:03:37.612
That was like the end of COVID there.

00:03:37.612 --> 00:03:44.251
I mean it was a total shock to my system, but that prepared me well.

00:03:44.251 --> 00:03:59.252
I ended up full-time as, you know, the teacher of record for a couple of years teaching computer science here outside of Boston, and then as a side project I started working on Pickcode, which is my current full-time job, and that's a coding platform for teachers to use to run their computer science classrooms.

00:03:59.252 --> 00:04:02.890
Now I'm posting on LinkedIn and trying to get out there and market and do all that kind of stuff.

00:04:07.780 --> 00:04:08.040
Excellent.

00:04:08.040 --> 00:04:18.947
So, small business owner, you know, like we said, we're going to talk a little bit about that and that you know just, I guess, putting yourself out there as a small business owner and not quite like a founder or something like that yet, but you know it's a very interesting take.

00:04:18.947 --> 00:04:22.045
I know we talked a little bit about that in the pre-show, but we'll get into that.

00:04:22.045 --> 00:04:33.105
But definitely want to talk about this post and we'll definitely put the link in the show notes so people can go ahead and also see it, interact with it and make it visible for them to see.

00:04:33.105 --> 00:04:35.730
But we talked a little bit about AI.

00:04:35.750 --> 00:04:42.031
Solutionism is one thing that you talked about in this post and about grading being hard.

00:04:42.031 --> 00:04:43.242
So let's go ahead and use an AI agent.

00:04:43.242 --> 00:04:43.887
Student teacher ratios are bad.

00:04:43.887 --> 00:04:44.593
Let's go ahead and use an AI agent.

00:04:47.747 --> 00:04:49.305
Lesson planning takes time.

00:04:49.305 --> 00:04:51.146
Let's go ahead and use an AI agent.

00:04:51.146 --> 00:04:57.608
So that's just to give some people a little bit of context, and that's very small context as far as what the post is.

00:04:57.608 --> 00:05:07.362
But, charlie, tell us a little bit more about where this post came from, what inspired it and what your thoughts are on AI and education right now.

00:05:08.163 --> 00:05:13.120
Yeah, so this post actually came kind of in response to me getting yelled at at a conference.

00:05:13.120 --> 00:05:15.166
So I was there exhibiting for Pickcode.

00:05:15.166 --> 00:05:25.531
I just had my booth there with my laptop and my monitor and I was showing off the product, and this guy walks up and he's like oh, have you heard of this new AI thing of the week or whatever?

00:05:25.531 --> 00:05:26.704
And I was like no, I haven't heard of it.

00:05:26.704 --> 00:05:32.447
And this guy goes into like a tirade about how AI is the savior and it's going to solve all of our problems.

00:05:32.447 --> 00:05:45.860
And on my booth, you know, my take, and we'll get into this a little bit more, is, at the very least, what teachers need in terms of the computer science space is just a nice quiet place for students to write their code and submit it and get that off to the teacher.

00:05:45.860 --> 00:05:50.771
And one of the things that we do is, you know, we avoid using AI there.

00:05:50.771 --> 00:05:56.971
So there was a popular tool that had AI autocomplete for the student code, which obviously doesn't work well.

00:05:56.971 --> 00:06:01.822
So yeah, so the post is actually in response to me getting yelled at at a conference, which is kind of crazy.

00:06:01.822 --> 00:06:03.026
It's never happened to me before.

00:06:03.026 --> 00:06:07.766
I was exhibiting, I had my booth, I had my monitor, I had my banner and everything.

00:06:07.766 --> 00:06:16.860
And one of the things it says on my banner is that we don't provide AI autocomplete in our coding platform to students, which is normal.

00:06:16.860 --> 00:06:18.264
That's what teachers have asked for.

00:06:18.264 --> 00:06:25.029
There was a platform that kind of has pivoted towards being for professionals and AI first.

00:06:25.029 --> 00:06:26.584
So shout out to Replit.

00:06:26.584 --> 00:06:30.786
It's like, yeah, cool company, they're doing cool stuff, but you know they pivoted out of education.

00:06:30.786 --> 00:06:35.370
That's fine, but one of their pivots was into like kind of AI vibe coding.

00:06:35.370 --> 00:06:43.526
And AI vibe coding doesn't really work for beginner students who are trying to learn, you know, their first 10 lines of Python. It just doesn't make a lot of sense.

00:06:43.526 --> 00:06:50.625
So that's one of the things it says on our banner.

00:06:50.644 --> 00:06:53.033
And this person starts accusing me of gatekeeping this magical technology that's going to transform education.

00:06:53.033 --> 00:06:58.029
And he's accusing me of coming from this place of privilege where I'm trying to stop people from using AI.

00:06:58.029 --> 00:07:07.004
And it's like you don't know me at all and I understand that you're excited about AI, but like, let's just take a deep breath, and that's what I said in the conversation.

00:07:07.004 --> 00:07:08.730
You know this guy's kind of like shouting and so on.

00:07:08.730 --> 00:07:10.382
Let's just take a deep breath.

00:07:10.382 --> 00:07:16.048
Like I'm not trying to offend anybody, but you know, this is what teachers have requested is a tool where there is no AI.

00:07:16.048 --> 00:07:25.430
For this, you know one use case and that's what I built and that's what the company is and that's one of the things that we do in our marketing and, like, I'm not trying to, like, ruin anybody's life here.

00:07:25.711 --> 00:07:30.060
It was a very strange situation, but that's kind of where it came from.

00:07:30.060 --> 00:07:35.449
But then you know, you're also at these conferences and just every other booth is a new AI thing and that's frustrating.

00:07:35.449 --> 00:07:36.913
I just I don't really.

00:07:36.913 --> 00:07:42.411
You know, we'll get into more about that kind of stuff.

00:07:42.411 --> 00:07:44.336
I think there's a lot of places where we can just build technology, just regular technology.

00:07:44.336 --> 00:07:44.999
It doesn't have to be AI.

00:07:44.999 --> 00:07:49.110
I think there are plenty of gaps in what teachers need and we can figure those things out.

00:07:49.110 --> 00:07:53.845
And so, yeah, what I said in the post is right: if AI is your hammer, everything starts to look like a nail.

00:07:53.845 --> 00:07:54.767
Or something to that effect.

00:07:55.470 --> 00:08:00.987
Oh, yes, and I think it was just very powerful because it really falls in line with what I see and I don't know.

00:08:00.987 --> 00:08:17.314
Sometimes, honestly, Charlie, I feel, like you, many times kind of like an outlier, and I'm just kind of standing back looking at everything and I'm just seeing everything, the hype and, like you mentioned, you know, seeing reactions from people in this way as well, the way that that gentleman or that person reacted towards you.

00:08:17.314 --> 00:08:31.225
So, you know, calling out solutionism, you know, is pretty bold and, like you said, hey, this is what is needed or this is what teachers asked for.

00:08:31.225 --> 00:08:32.028
So I want to ask you the follow-up here.

00:08:32.028 --> 00:08:35.881
It's like right now, you know, and I know how I feel about it, but maybe you see it too: right now, are educators being sold to or are they being heard?

00:08:36.865 --> 00:08:48.898
Well, I mean, yeah, you know, I'm going to answer that question, which is like, yeah, I mean, it's a sales pitch. And so part of what I get into in that post, and you know this kind of ties into my whole ethos, is like I talked to you pre-show and it's like, oh, I'm not the founder of Pickcode.

00:08:48.898 --> 00:09:01.042
I like to say I'm a small business owner and a small business owner is someone who, you know, talks to the community and figures out what they need and provides a service that makes sense and, you know, is responding to feedback.

00:09:01.042 --> 00:09:04.090
And that's how I see myself and founder.

00:09:04.250 --> 00:09:07.144
You know, no offense to founders, but a startup is defined by growth.

00:09:07.144 --> 00:09:13.534
So if you read anything about startups, it's are you compounding month over month, are you growing by 50%?

00:09:13.534 --> 00:09:15.442
And that's how you get your next round of funding.

00:09:15.442 --> 00:09:21.033
And growth at all costs is different than meeting the needs of teachers.

00:09:21.033 --> 00:09:26.490
So, like, VC stuff in education is tough stuff to see.

00:09:26.490 --> 00:09:33.801
I mean I understand there are some, you know, very impactful companies out there that are VC backed in their startups and they do the right thing.

00:09:33.801 --> 00:09:43.708
So I'm not like calling out the whole industry, but when there's this much hype, this much energy and this much money going into one flavor of one thing.

00:09:43.708 --> 00:09:45.754
I think it's right to be skeptical.

00:09:46.740 --> 00:10:14.086
Yes, absolutely no, and I agree with you and, like you said, there are plenty of platforms and right now, like I said, being in the education space, going to conferences, you do see some of the up and coming or they're trying to start something, but then you also see already who's at the top and who's really staying there, and a lot of it is just, you know, they're backed by investors in that sense, so which is great and great for them and what they're doing and just continuing to grow and grow and grow.

00:10:14.086 --> 00:10:20.090
But I want to ask you, you know, now being a small business owner, you know, starting up Pickcode.

00:10:20.090 --> 00:10:38.033
I want to ask you, as far as what you're seeing in your industry, because you're out there, you're creating, you're out there, too, as well why do you think that so many AI tools are framed as magic bullets for classroom problems and what would you think is the danger of this in the future, possibly for them?

00:11:09.769 --> 00:11:17.621
Yeah, I mean, I think you know like a silver bullet sells, I mean, if your job becomes 70% easier, that's awesome.

00:11:17.621 --> 00:11:19.235
And of course, I want to buy that, right?

00:11:19.235 --> 00:11:27.710
If I could work two hours a day instead of eight hours a day, I mean sure I'll buy that every day of the week, but it's just not true, right?

00:11:27.710 --> 00:11:38.975
And I think part of what's happening with AI is there's this promise of these scaling laws and GPT-5 is going to be better than GPT-4 and GPT-6 is going to be smarter than Einstein.

00:11:38.975 --> 00:11:52.317
And I follow a lot of tech news and all of the reviewers in all of those videos always say buy the thing based on what it does today, not what the company promises it'll do in a couple of years.

00:11:52.317 --> 00:11:56.825
So if there is some great advancement in AI, I think AI is cool.

00:11:57.451 --> 00:12:00.817
I use it for, you know, getting stuff done in my own life.

00:12:00.817 --> 00:12:07.201
I put my LinkedIn posts through AI to see if I had any typos and, you know, sometimes I find some typos and that's useful.

00:12:07.201 --> 00:12:08.711
But that's kind of what it does, right?

00:12:08.711 --> 00:12:18.014
It doesn't, you know, it doesn't magically solve every situation and every problem, and I think when you're trying to, you know, fit a square peg into a round hole with everything.

00:12:18.014 --> 00:12:20.039
It doesn't make a lot of sense.

00:12:20.039 --> 00:12:27.932
At least, and I think there's just kind of this when you talk about the hype and you talk about the over promising under delivering, I think that's what happens, is it's like you know?

00:12:27.932 --> 00:12:34.520
Hey, you know GPT-6, once we plug GPT-6 into this thing, like, let me tell you, this is going to be awesome, but GPT-6 isn't out.

00:12:34.520 --> 00:12:36.982
So why do I need to buy this thing today?

00:12:36.982 --> 00:12:38.804
Because it doesn't actually do.

00:12:38.804 --> 00:12:40.692
It doesn't do what it, you know, says on the label.

00:12:42.174 --> 00:12:45.001
Excellent, yes, and that's that's what I see a lot.

00:12:45.001 --> 00:12:46.130
You know that's happening.

00:12:46.130 --> 00:12:53.461
And, of course, you're people, and especially in education, and I guess you know this came out and I had Jennifer Manley on the show.

00:12:53.461 --> 00:12:55.625
I just released that episode yesterday.

00:12:55.625 --> 00:13:11.163
Actually, we talked a little bit about that and her being in the classroom and her actually being a computer science teacher and teaching about LLMs and, you know, artificial intelligence back in 2017, 16 and on, you know, doing that.

00:13:11.163 --> 00:13:18.317
She's just said like this is the first time that I have ever seen like this mass adoption in education.

00:13:18.317 --> 00:14:12.583
Immediately, she says. Normally, you know, she used to work with writing curriculum and things of that sort, and really you kind of, like, pilot that very slowly and then you kind of see how it works. But now, you know, November 2022 came out and it was like boom, like everybody's using it. And it's, I guess, like you mentioned, that silver bullet effect of like, oh, now I can create my worksheets a lot faster, now I can create 30 questions a lot faster than I used to, without asking: is this accurate, and does this align to the standards? And understanding, like Dr. Emily Bender says, it's a synthetic text extruding machine, which is really, you know, again, going back to a mathematical equation, probabilities, this is what it wants. And so sometimes I feel like, you know, that hype was there of that instant, like, solution.

00:14:12.583 --> 00:14:30.033
And of course, you know getting into the education system and you know seeing right now the way that teachers are really battling and, of course, with funding and not having enough teachers and enough coverage and classrooms being, you know, as big as they are, you know they're looking for a solution.

00:14:30.033 --> 00:14:49.562
It's almost like we're grasping at whatever we can to help us but at the end of the day, is it really helping us really educate the students or is it just helping us create more content that we're just giving to them to kind of just either keep busy and just have something going all the time rather than diving in deep?

00:14:49.562 --> 00:14:56.720
So those are some of the things there that we definitely talk about and kind of like it's a nice segue for me to ask this next question.

00:14:56.779 --> 00:15:06.134
As far as, you know, AI replacement or reinforcement. And, you know, one of the things that you mentioned here on your post was, for example, talking about the classroom.

00:15:06.134 --> 00:15:09.798
If grading is hard, great, let's just go get an AI agent.

00:15:09.798 --> 00:15:12.457
Now I can give immediate feedback.

00:15:12.457 --> 00:15:16.501
Student turns in an assignment and I just let the AI grade it and give that feedback.

00:15:16.501 --> 00:15:20.378
Student-teacher ratios, like you said great, let's get an AI agent.

00:15:20.378 --> 00:15:23.379
And lesson planning of course, let's get an AI agent.

00:15:23.379 --> 00:15:25.697
So I want to ask you about that.

00:15:25.697 --> 00:15:34.105
So where do you draw the line between what is helpful automation and what is harmful delegation?

00:15:39.393 --> 00:15:42.841
Yeah, I mean, I think that, you know, like, AI isn't the first technology that's come into the classroom.

00:15:42.841 --> 00:15:44.325
So, you know, take Scantron.

00:15:44.325 --> 00:15:49.718
There's no arguing that Scantron is a very, very helpful tool.

00:15:49.718 --> 00:16:04.957
When I'm, you know, when I was a teacher, I used a lot of spreadsheets to you know, analyze, you know differences between the grades, between the different classes, and hey, did I properly teach this topic in this class versus this class, like doing some quick math on a spreadsheet.

00:16:04.957 --> 00:16:08.620
That's great, like I, I need Google sheets to do my job and it's helpful.

00:16:08.620 --> 00:16:11.955
And you know now I don't use Scantron, but it was Google Forms or whatever.

00:16:11.955 --> 00:16:15.321
And hey, it's automatically shuffling the question over so students can't cheat.

00:16:15.321 --> 00:16:17.966
This kind of stuff is excellent.

00:16:17.966 --> 00:16:25.240
So can we think about these places where there shouldn't have to be an ethical question around it?

00:16:25.240 --> 00:16:33.024
That's already a red flag if we're asking the question at all and we're doing this based on kind of untested research.

00:16:33.024 --> 00:16:40.170
Right, we haven't had 10 years of this stuff um being around where we can do real research and figure out if this is the right thing to do.

00:16:40.170 --> 00:16:49.615
Um, again, as you say, you know, we're kind of just mass adopting, uh, this stuff. And, uh, you know, mass adopting Scantron?

00:16:49.615 --> 00:16:54.663
That's fine because it's just ABCD; there's no student data privacy issues there.

00:16:54.663 --> 00:16:57.694
It used to take me 20 minutes to grade this multiple choice test.

00:16:57.694 --> 00:17:01.673
Now I just feed it into the Scantron machine and that's great.

00:17:01.952 --> 00:17:17.947
But fundamentally altering student-teacher relationships by trying to replace the relationship aspect with some sort of bot just because it's well-spoken: that's where I see the line.

00:17:17.947 --> 00:17:25.659
If the tool is in any way touching the student-teacher relationship, we should be touching that thing with a 30-foot pole.

00:17:25.659 --> 00:17:29.173
That's very scary and that needs research and that needs to be tested.

00:17:29.173 --> 00:17:34.056
Stuff like, hey, you know, I generated a worksheet about addition and subtraction and, you know, that's cool, no worries.

00:17:34.056 --> 00:17:37.721
Like that's not fundamentally changing the relationship between students and teachers.

00:17:37.721 --> 00:17:40.452
So the teacher facing stuff I'm less worried about.

00:17:40.452 --> 00:17:43.539
Um, it's really the student facing stuff, um.

00:17:43.539 --> 00:17:45.653
So that's what I'm talking about in terms of AI.

00:17:45.653 --> 00:17:54.177
AI tutors scare me, AI assessment of student work scares me, but any of these things that are student-facing, altering that relationship, because that relationship is,

00:17:54.177 --> 00:18:01.211
in my long two-year teaching career, what I found to be the single most important thing: that social aspect of things.

00:18:01.773 --> 00:18:03.417
Yeah, and I think that's very important.

00:18:03.417 --> 00:18:12.695
It's that communication, having that discussion and, obviously, as a teacher too, when you do know your students, then you're able to immediately improvise, adapt and overcome.

00:18:12.695 --> 00:18:24.004
So you have your lesson and, as we all know in teaching and I've had that experience where you create this amazing lesson and you're ready for that next day and all of a sudden, it just does not quite go according to plan.

00:18:24.004 --> 00:18:35.519
So it's just a matter of improvising, adapting and overcoming when you do know your students, and then you can go ahead and pivot and just say okay, we're going to go ahead and change the lesson a little bit, or also in the way that we provide instruction.

00:18:35.519 --> 00:18:52.895
And that's one of the things that now and I want to go back to, like the AI agent and the automation component I have been having a lot of conversations with a lot of colleagues of mine in this space, a lot of peers and so on, talking about them and the fear for me, and I guess I never saw that consequence until now.

00:18:52.895 --> 00:19:04.643
But I mean, we're talking a lot about automation and it just started very slowly and it just kind of creeps in, and even before AI, where it's like hey, we want this platform to connect to our Google Classroom.

00:19:04.643 --> 00:19:09.077
All right, just for the raw stream aspect, ok, it'll go ahead and automate that for you.

00:19:09.077 --> 00:19:13.194
That's one less task for a teacher to do on their own manually.

00:19:13.194 --> 00:19:18.040
Then all these platforms started coming out and say hey, we integrate with Google Classroom.

00:19:18.040 --> 00:19:29.291
Fantastic, so now teachers can go ahead and log in, they import their rosters and it's just seamless.

00:19:29.311 --> 00:19:51.817
One of the things that I have a problem with, or have been seeing, or not a problem, it's really just that concern that I have is because of just the world of automation making jobs a lot easier for teachers is the fact that now the platforms they assign an assignment, like you said, and say, okay, student submits, they turn it in and of course, that goes into Google Classroom or whatever LMS you may be using.

00:19:51.817 --> 00:19:52.961
It goes into that gradebook.

00:19:52.961 --> 00:19:57.556
Well, that grade book syncs to the student information system grade book.

00:19:57.556 --> 00:20:03.273
And now it just seems like, well, that's what you got, that's your grade.

00:20:03.273 --> 00:20:14.454
And to me it's just like, are teachers going back and looking at those grades? And what I'm hearing and seeing is, hey, whatever you got and whatever it says in Google Classroom, that's your grade.

00:20:21.210 --> 00:20:22.535
And I remember when I first started I had to input stuff by hand.

00:20:22.535 --> 00:20:29.939
So what that did is I was grading tests and then I had my stack of students that maybe didn't do well, and I said, okay, this small stack, I need to talk to these students and see how I can remediate.

00:20:29.939 --> 00:20:39.225
Either reteach, or it's a conversation where we have to fix a misconception. And now it just seems like that disconnect that you're talking about in that relationship.

00:20:39.225 --> 00:20:44.431
It's like, well, I assigned it, that's the grade that you got, that's what's going in Google Classroom and that's it.

00:20:44.431 --> 00:20:46.012
Like, you can't.

00:20:46.012 --> 00:20:48.956
Teachers are not asking like how can I help you?

00:20:48.956 --> 00:20:50.239
Let's remediate.

00:20:50.239 --> 00:20:55.746
And students are just like what's going on here, like I didn't even get a chance, you know at all whatsoever.

00:20:56.227 --> 00:21:12.583
As you talk about hurting relationships, I think that that is very dangerous and I think that's what we're seeing and I'm observing and I'm very cautious about, because I'm seeing a lot of my friends who have kids that are saying that the teacher doesn't even bother, you know giving a remediation.

00:21:12.583 --> 00:21:20.201
Or when I talk to them it's like, hey, well, that's what the system gave them, that's what they got, and I'm just like what's going on here?

00:21:20.201 --> 00:21:24.320
So, yeah, you're absolutely right, you know things like that, even just that basic.

00:21:24.320 --> 00:21:28.721
We definitely need to really look into that and say, hey, how can we better our practice?

00:21:28.721 --> 00:21:39.920
And now with AI, it's like, well, you turned in an essay to me that maybe, to some extent, could have been written with some AI help, and now you're getting some AI feedback in return.

00:21:40.681 --> 00:21:43.655
So, in reality, like, who's learning?

00:21:43.655 --> 00:21:44.920
Are we all learning?

00:21:44.920 --> 00:21:47.147
Are we giving positive feedback?

00:21:47.147 --> 00:21:54.521
So you're just grading what the AI says, and we're losing out on a lot of that personal time where we can help our students.

00:21:54.521 --> 00:21:57.554
And, of course, using this tool, it makes it easy.

00:21:57.554 --> 00:22:01.782
It's like, hey, all right, I'm done, saving me time, but at what cost?

00:22:01.782 --> 00:22:09.359
And that's really what concerns me, and I know I might've gotten off track a little bit on that, but I do want to talk about it because that's one of the things that you talked about.

00:22:09.359 --> 00:22:12.941
As far as the snake oil, too, I want to talk a little bit about that.

00:22:12.941 --> 00:22:21.410
You mentioned, you know, the telltale signs of an edtech product that looks innovative but is built on shallow pedagogy and empty promises.

00:22:21.410 --> 00:22:23.355
So tell me a little bit about that.

00:22:24.356 --> 00:22:32.817
I think that getting real teacher experience in the product development cycle is really what helps here.

00:22:32.817 --> 00:22:41.391
Because, like, just because you went to school and had a good time, you know, and it seems like, hey, oh, I remember my teacher complaining about grading.

00:22:41.391 --> 00:22:42.913
Okay, well, now we're going to automate grading.

00:22:42.913 --> 00:22:53.262
Um, you know, that seems good, but if you haven't gone through and actually been in a classroom, I don't really trust

00:22:53.262 --> 00:23:20.805
what your marketing website says. And so it's very easy to market this stuff, and I think what really would help us determine whether one of these tools was good or not, or was being applied properly or not, is this: the tools don't know anything about the students, they don't know anything about the classrooms, they're matrix multiplication in some data center, and they're very well spoken. But what if they had to be truthful?

00:23:20.805 --> 00:23:27.112
And at the beginning, every time it goes do-do-do-do, you know, I know you had Dan Meyer on the show and he made the do-do-do-do.

00:23:27.112 --> 00:23:36.583
If every time it went do-do-do-do, at the beginning it says: I know nothing about you, I am a set of matrices in a data center a thousand miles away.

00:23:36.583 --> 00:23:37.894
I don't care about you.

00:23:37.894 --> 00:23:38.698
I don't know your name.

00:23:38.698 --> 00:23:43.359
For data privacy reasons, I'm not allowed to know your name and I don't care.

00:23:44.171 --> 00:23:47.294
And then it gave the response and it was truthful.

00:23:47.294 --> 00:23:50.355
In that way, I think that there would be a little bit of a difference.

00:23:50.355 --> 00:23:51.533
It's like oh, wait a second.

00:23:51.533 --> 00:23:53.160
You know, am I gonna?

00:23:53.160 --> 00:24:04.250
Couldn't care less whether I give you a D or an F or an A or an A-plus.

00:24:04.250 --> 00:24:08.544
Would you put that in front of a student?

00:24:08.544 --> 00:24:09.990
Because that's what's going on.

00:24:09.990 --> 00:24:11.035
It's just not being truthful.

00:24:11.035 --> 00:24:13.802
It's skipping that little part of the answer.

00:24:14.424 --> 00:24:19.583
So if we forced it to give that as the beginning of the answer, would we adopt that tool?

00:24:19.583 --> 00:24:30.968
I don't know, but if, when you fed the Scantron into the thing, it said, I don't care whether it's A, B, C, or D, it's like, yeah, okay, Scantron machine, that's fine.

00:24:30.968 --> 00:24:34.542
So I would respect the Scantron machine for saying that.

00:24:34.542 --> 00:24:36.488
I wouldn't respect an AI tutor for saying that.

00:24:36.488 --> 00:24:39.060
So that's kind of I don't know.

00:24:39.060 --> 00:24:46.790
I need to write more, kind of expand on that at some point, but I think that's really where we should draw the line: you know, does it care?

00:24:46.790 --> 00:24:49.634
If it doesn't care at all, which it doesn't.

00:24:49.634 --> 00:24:52.884
We need to make sure that we remember that it doesn't care at all.

00:24:53.425 --> 00:24:57.321
Yeah, no, and I agree with you and I think you know a lot of these platforms.

00:24:57.321 --> 00:25:00.890
It's you know, again talking about personalization.

00:25:00.890 --> 00:25:10.829
So really, it's taking all those problems in pedagogy or pedagogy practice and trying to put them into this AI platform.

00:25:10.829 --> 00:25:27.254
Sometimes, what I always say is, it may not always work, because with that personalization, really what I feel is you're just putting the student in a silo, because now it's just them by themselves on a Chromebook or whatever device, speaking to this.

00:25:27.254 --> 00:25:46.144
You know, matrix, like you mentioned, that really has no context of who the student is, where the student is coming from, what their background is, or just any context at all whatsoever, and it's just going to go ahead and give them an answer, and you don't know whose answer that is.

00:25:46.144 --> 00:25:47.391
Whose history is it that they're repeating?

00:25:47.391 --> 00:25:59.480
Or who are they really talking to when they say like, hey, I'm going to talk to Harriet Tubman and I'm going to ask her about this, this or that which I know, and I'm going to bring up episode 319, where Rob Nelson mentioned digital necromancy and you know talking about.

00:25:59.701 --> 00:26:04.730
You know, that's very dangerous to me too as well, because you really don't know.

00:26:04.730 --> 00:26:11.200
Is this really that history, or whose history is this? Is it this side or that side, and so on, and whose version?

00:26:11.200 --> 00:26:35.921
And so that's a danger too as well, because teachers aren't vetting, or maybe not vetting as well as they should. As an educator, I need to know what I am putting in front of the student so that it falls in line, obviously, with the standards that I need to teach for whatever state I may be in, because at the end of the day, they're going to get tested on that at the end of the year.

00:26:35.921 --> 00:26:56.844
So sometimes I feel that with the tools that we have or that are out there right now, sometimes we miss out on what we really need to be teaching and we go into something that is not even going to be taught or tested at the end of the year or just even for a formative or summative assessment, and so those are some of the things, too, that we need to consider.

00:26:57.826 --> 00:27:21.951
But again, going back to it, I know a lot of people always mention that AI can also enhance bad pedagogy, because if you may be struggling with pedagogy and now you're adding this AI tool, that could definitely just heighten it and make things a bit worse, not better, or only very slightly better.

00:27:21.951 --> 00:27:24.067
So lots of things to think about there.

00:27:24.067 --> 00:27:27.809
But now I want to ask you as far as founders are concerned.

00:27:27.809 --> 00:27:43.682
Now, I know that you're a small business owner working with Pickcode, but, of course, going into the space now where you're going to conferences, you're presenting your product, bringing it out there and, of course, you get to see everything around you and all the apps that are there.

00:27:43.682 --> 00:27:55.717
But I want to ask you, in your position and with your experience, what would you say to an AI founder who's never taught but is pitching tools to fix the classroom?

00:27:55.717 --> 00:28:00.612
And I mean, we can get a little spicy, but obviously constructive here as well.

00:28:00.612 --> 00:28:02.605
But what are some of your hot takes there?

00:28:04.391 --> 00:28:11.684
Stop, just go sell to someone else, I think is what I would say.

00:28:11.684 --> 00:28:14.731
I mean, like, obviously they're like I don't want it.

00:28:14.731 --> 00:28:16.302
You know, that's the spicy version of it.

00:28:16.302 --> 00:28:19.369
But pause, how about pause?

00:28:19.369 --> 00:28:23.724
Pause sounds a little kinder. Pause for a long time.

00:28:23.724 --> 00:28:29.606
Go talk to a hundred or a thousand teachers and really understand their pain.

00:28:29.606 --> 00:28:31.656
You don't have to go and teach, that's fine.

00:28:31.676 --> 00:28:35.086
Teaching is a really tough job, as we may all know.

00:28:35.086 --> 00:28:38.121
If we're listening to this podcast, we all know that. It was a little bit too hard for me to do.

00:28:38.121 --> 00:28:39.903
Really exhausting.

00:28:39.903 --> 00:28:41.605
I mean, it's just a crazy job.

00:28:41.605 --> 00:28:48.455
But if you want to start some AI company, first of all it's very difficult to sell in schools.

00:28:48.455 --> 00:28:57.350
So, just if you're trying to make the most money, you might just want to go sell to accountants, because you don't have to go through a purchase order and approval and a data privacy plan.

00:28:57.350 --> 00:29:00.186
Just go sell software to accountants, that's going to be OK.

00:29:00.186 --> 00:29:01.550
That's the first thing I'll say.

00:29:01.550 --> 00:29:07.963
But also you have to understand what's going on in the classroom for real.

00:29:07.963 --> 00:29:09.248
And I think one thing that's a little tough.

00:29:09.288 --> 00:29:10.701
I want to just go back to this a little bit.

00:29:10.701 --> 00:29:22.201
We kind of are talking about this as like what happens in the doomsday scenario when the snake oil gets sold and the snake oil gets adopted, and I think the numbers would bear out.

00:29:22.201 --> 00:29:29.290
I'm not sure to what extent this is happening in a ton of classrooms, because the teachers that I talk to, they aren't asking for it.

00:29:29.290 --> 00:29:35.807
I mean, part of the reason why we haven't done any AI anything on our platform is that I've never heard from a single teacher that they want a single AI anything.

00:29:35.807 --> 00:29:39.528
So I mean I have never.

00:29:39.528 --> 00:29:52.054
I answer almost every single customer support email, you know, or actually we just hired someone to do a little bit more support for us, but I answer almost every single feature request, and I've never heard a request for any AI anything.

00:29:52.054 --> 00:30:00.000
So I think there's a disconnect between the funding and the investment and the founders and what's going on for real in the classroom.

00:30:00.000 --> 00:30:10.059
So one thing that I think is like a positive note is we talk in terms of what happens if these tools that we disagree with get adopted, but hopefully they're just not getting adopted.

00:30:10.421 --> 00:30:21.301
That would be my hope, and so for that brand new person who wants to go to Y Combinator and start their startup in education: pause, pause deeply, take your three months at Y Combinator and just spend the entire time talking to teachers.

00:30:21.301 --> 00:30:33.695
And if you talk to teachers for three months and they have some concrete pain points that can be solved that don't affect the student teacher relationship and are really there, I don't know every possible business idea out there.

00:30:33.695 --> 00:30:37.607
I was only a teacher for a couple of years, only ever taught math and computer science.

00:30:37.607 --> 00:30:40.785
So if you can find some pain points in a real market there, then go for it.

00:30:40.785 --> 00:30:43.592
I'm a little bearish, as market people would say.

00:30:43.592 --> 00:30:55.096
I'm a little pessimistic on that many areas existing where you really need AI in particular, and so I'm going on a little bit of a tangent here.

00:30:55.115 --> 00:30:59.185
But on the AI point, I saw this on Twitter and I don't think it was a joke.

00:30:59.185 --> 00:31:08.300
The person said when will there be an AI agent for reminding me when someone's birthday in my calendar is coming up?

00:31:08.300 --> 00:31:13.594
And it's like that does not require artificial intelligence.

00:31:13.594 --> 00:31:19.405
We have had the technology for doing calendar reminders for like 50 years.

00:31:19.405 --> 00:31:23.232
It doesn't require artificial anything.

00:31:23.232 --> 00:31:24.782
It just requires that.

00:31:24.782 --> 00:31:30.929
You know, not all of us are programmers, but if it's seven days until the birthday, send the email about the birthday.
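[Editor's note: Charlie's point is that this is plain calendar arithmetic, not artificial intelligence. A minimal sketch of the rule he describes, where the birthday list, the seven-day window, and printing instead of actually sending an email are all illustrative assumptions:]

```python
# Plain calendar logic: no AI needed to flag an upcoming birthday.
from datetime import date, timedelta

# Illustrative data; a real tool would read these from a calendar.
birthdays = {
    "Ada": date(1990, 12, 10),   # the year is ignored below
    "Grace": date(1985, 6, 1),
}

def upcoming(birthdays, today, days_ahead=7):
    """Names whose birthday falls exactly `days_ahead` days after `today`."""
    target = today + timedelta(days=days_ahead)
    return [name for name, bday in birthdays.items()
            if (bday.month, bday.day) == (target.month, target.day)]

def send_reminders(birthdays, today=None):
    for name in upcoming(birthdays, today or date.today()):
        # Stand-in for an SMTP call or calendar notification.
        print(f"Reminder: {name}'s birthday is in 7 days")

send_reminders(birthdays, today=date(2025, 12, 3))  # flags Ada (Dec 10)
```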

00:31:32.051 --> 00:31:37.206
So you know you could also just decide to get into ed tech, talk to the thousand teachers and do something that's completely unrelated to AI.

00:31:37.206 --> 00:31:41.063
That's probably a decent way to go as well, because there's plenty of problems still to be solved.

00:31:41.063 --> 00:31:48.734
Not everything has to be AI, and if you talk to any of these, you know, kind of startup accelerators or whatever, they would say:

00:31:48.734 --> 00:31:51.525
Don't start with the technology and apply it to the problem.

00:31:51.525 --> 00:31:53.932
Start with the problem and find some technology.

00:31:53.932 --> 00:31:57.789
So people are doing it in reverse and doing this completely uncritically.

00:31:57.789 --> 00:32:03.692
So don't call yourself an AI founder and then try and find some way to slam this into education.

00:32:03.692 --> 00:32:10.641
Find some real problem out there, talk to a thousand teachers, figure out what they need and then go build a solution to that.

00:32:10.641 --> 00:32:11.904
You know those common pain points.

00:32:11.904 --> 00:32:13.980
That's what I would tell someone who wants to start a business.

00:32:14.481 --> 00:32:29.569
I love it, and I think you kind of already answered my next two questions, which is great, because I think that you did it very well. I was going to ask you, you know, what are some of the things that AI startups are missing or willfully ignoring?

00:32:29.569 --> 00:32:32.344
And, like you mentioned here too, it's just really getting that teacher feedback.

00:32:32.344 --> 00:32:42.464
A lot of startups or founders are not talking to teachers and really seeing what they need. And I feel, oftentimes, you've had those major players that came out, you know, maybe in 2023.

00:32:42.464 --> 00:32:48.183
And then all of a sudden, the ones that did well have that backing and are continuing to grow.

00:32:48.183 --> 00:33:03.853
What I'm seeing is there's others that are coming in and really trying to do the exact same thing that they are, but the only difference is that those established ones have the backing and just continue to grow, while the others are kind of off to the side, which, you know, is very interesting.

00:33:03.893 --> 00:33:12.807
Then I've seen where the smaller ones are picking up more traction because, you know, sometimes they say that they were built for teachers, by teachers.

00:33:13.228 --> 00:33:19.861
There's one company that's out there that they're still teaching and they're still doing this, you know, in a similar way to the way that you are doing things.

00:33:19.861 --> 00:33:48.050
So they're getting that first-person experience, not only in the classroom but using it, as opposed to being far removed and then, like you said, trying to put the tech to a problem instead of finding the problem and then the tech to fix it, and giving us a plethora of tools that we may not really need but that just look good, because it's like, you've got a hundred tools, but do I really need them, you know? And so those are some of the things that I see there.

00:33:48.050 --> 00:33:59.290
So, again, being very honest and cautious about, you know, who you're building for, how you're building for them, and then really looking at those blind spots. So what would be

00:33:59.290 --> 00:34:11.436
one way, Charlie, that you would think, in the experience that you've had, that we can hold these builders or these founders accountable for what it is that they are creating?

00:34:12.800 --> 00:34:24.994
I think, to some extent, buyer beware is maybe what can happen. If we just don't buy the products, then the companies won't exist, or they'll pivot and do something better.

00:34:24.994 --> 00:34:28.704
So we need to be.

00:34:28.704 --> 00:34:30.969
I mean, I have never.

00:34:30.969 --> 00:34:35.304
So just to be clear, I have never purchased a piece of ed tech software in my life.

00:34:35.304 --> 00:34:37.849
In my classroom the mice were broken.

00:34:37.849 --> 00:34:43.071
I went and used my credit card and bought a bunch of mice on Amazon, the cheapest ones I could find.

00:34:43.071 --> 00:34:52.008
So I've never gone through and tried to get a requisition or a purchase order for anything and I've never been on the buying end of any ed tech product.

00:34:52.008 --> 00:34:57.226
And I know it's tough for the integration folks; they don't want to get left behind, and that's all well and good.

00:34:57.226 --> 00:35:02.750
And I don't want to make anyone mad, of course, because I am selling to these exact same people.

00:35:02.750 --> 00:35:08.106
So, with all due respect to everyone involved, let's pause and think.

00:35:08.106 --> 00:35:13.486
Can we get the teachers involved in the buying process and can we say let's pause, let's take this slow.

00:35:13.486 --> 00:35:15.390
Is this, you know?

00:35:15.390 --> 00:35:16.392
Can we pilot this?

00:35:16.392 --> 00:35:17.341
Is there any way that we can?

00:35:17.341 --> 00:35:18.484
You know, without paying?

00:35:18.484 --> 00:35:22.245
Can we pilot this and see if it actually moves the needle on any outcomes?

00:35:22.245 --> 00:35:31.793
Because if it doesn't, then we don't need to buy, we don't need to put in the purchase order and we don't need to give a misguided company revenue.

00:35:31.793 --> 00:35:34.445
That's going to keep them going and attract more investor interest.

00:35:35.107 --> 00:35:42.307
I think part of it is that also, just there's so much money around for the AI type companies.

00:35:42.307 --> 00:35:46.174
I mean, I run this business, and I have tried to get investment.

00:35:46.174 --> 00:35:47.583
I have tried to get grants.

00:35:47.583 --> 00:35:50.391
I've entered competitions and I have lost all of them.

00:35:50.391 --> 00:35:51.442
I have not.

00:35:51.442 --> 00:35:55.303
You know I'm not like you know we're not dying for investors, but you know I've tried.

00:35:55.303 --> 00:35:58.552
And when you see, hey, you know our next accelerator batch.

00:35:58.552 --> 00:36:00.564
What is every single company doing?

00:36:00.564 --> 00:36:03.028
They're doing AI agent for this, AI agent for that.

00:36:03.028 --> 00:36:04.893
You enter some ed tech tool.

00:36:04.893 --> 00:36:10.883
You know some sort of competition.

00:36:10.883 --> 00:36:11.385
What's winning the prize?

00:36:11.385 --> 00:36:11.968
It's AI this, AI that.

00:36:11.968 --> 00:36:22.051
So when the investment is all coming in and it's all pointed toward AI, it's kind of no surprise that that's what we're getting as, you know, the new solutions on offer, and that's why every conference booth is filled up with an AI company.

00:36:22.592 --> 00:36:24.875
No, oh man, I agree with you so much.

00:36:24.875 --> 00:36:27.948
It just seems like those initials really it's just marketing.

00:36:27.948 --> 00:36:29.501
Artificial intelligence is just really.

00:36:29.501 --> 00:36:36.926
It's just that to me and I learned that real quick and I know sometimes people really get upset, you know, but it really is.

00:36:36.926 --> 00:36:46.454
It's like, I remember even just seeing, oh, you've got an AI refrigerator, you know, where now it'll just change the temperature depending on whatever it is.

00:36:46.454 --> 00:36:50.487
Or you've got your AI laundry, your washer.

00:36:50.487 --> 00:36:55.503
Now, based on how much weight it has, it'll go ahead and go as long as it needs.

00:36:55.503 --> 00:36:58.590
I was like I could just put it in, put in the cycle.

00:36:58.590 --> 00:37:00.341
I'm sure my clothes are going to be clean.

00:37:00.341 --> 00:37:02.346
I haven't had dirty clothes in a long time.

00:37:02.346 --> 00:37:11.303
And it's not AI, but they just pop that label on, and it's just like, great, we need to buy it because everybody else has it, and there's just so much hype on that.

00:37:11.303 --> 00:37:18.007
So I think sometimes we need to definitely slow down a bit, like you mentioned, take that pause and say, okay, in the long term.

00:37:18.408 --> 00:37:31.013
Number one, is this company going to be around? And my biggest concern is this, and I've always had this concern, because a lot of the companies plug in to OpenAI or other APIs that are out there.

00:37:31.013 --> 00:37:40.366
Eventually that cost needs to go up, because they're producing so much and people are prompting so much or putting so much stuff in.

00:37:40.366 --> 00:37:43.233
As that goes up, well, now we need to adjust prices.

00:37:43.233 --> 00:37:52.746
And even in my space that I work in, every year it seems like platforms are going up anywhere from 7% to 11% from one year to the next.

00:37:52.746 --> 00:38:05.295
And we get that, because it's more storage that needs to be used and, of course, they have to work on scaling up their servers and all that stuff that goes on in the back end and, obviously, earnings.

00:38:05.295 --> 00:38:06.876
You got to pay your people and so on.

00:38:06.876 --> 00:38:09.138
But it just really gets scary.

00:38:09.218 --> 00:38:19.871
But my thing is I always want to reflect and say am I taking a risk in getting into a three-year deal with this company that may not even be around next year?

00:38:19.871 --> 00:38:23.944
And then I'm out that money or something may happen with data.

00:38:23.944 --> 00:38:27.182
I mean, how, how can I make sure that my data is safe?

00:38:27.182 --> 00:38:30.129
Even more so student data, you know.

00:38:30.129 --> 00:38:42.453
So slowing down is a great thing, which kind of brings me to that next segment here that I want to talk about is one of the things that you put in your post, which is move fast and break things has no place in the classroom.

00:38:42.453 --> 00:38:48.164
Move fast and break things has no place in the classroom.

00:38:48.164 --> 00:38:48.424
So what does

00:38:48.444 --> 00:38:49.206
move with care look like for you?

00:38:49.206 --> 00:38:50.728
Yeah, I mean, I think that's a great point.

00:38:50.728 --> 00:38:54.642
I mean I move slowly just because I get tired.

00:38:54.642 --> 00:39:05.032
I can't work a 16-hour day, so part of it is just being a little bit more human about it, I guess. It's just that this hyper-growth mindset is the problem.

00:39:05.032 --> 00:39:15.809
I think, and again I don't know to what extent this is working, it's working on the investment side: garnering investment interest and getting a lot of companies out there with fancy logos and fancy marketing.

00:39:16.461 --> 00:39:22.866
I would hope, and I think you're kind of speaking to this, as in: I've got this three-year contract with this company. Is it going to exist in a couple of years?

00:39:22.866 --> 00:39:32.520
And I would be a little bit scared about whether that company's going to be around, because I think there are a lot of false promises when you have that hyper-growth mindset and all that investment coming in.

00:39:32.520 --> 00:39:41.598
And again I say this, I'm not an economist, but I'm on the ground trying to sell stuff and what sells is stuff that actually solves problems for teachers.

00:39:41.598 --> 00:39:57.841
We're going through renewals right now, and teachers are happy with our software because it actually solved the pain points that they had, and it didn't add weird complexity for them with a bunch of AI nonsense that's shipping off their student data to, you know, God knows what data center.

00:39:57.841 --> 00:40:15.111
So, yeah, I mean, coming back to all of this, slowing down is really about rejecting the hyper-growth mindset and saying, what is the problem that teachers face?

00:40:15.111 --> 00:40:16.672
What are the problems that students face?

00:40:16.672 --> 00:40:18.173
What are the problems that families face?

00:40:18.173 --> 00:40:20.815
Can we build those relationships?

00:40:20.815 --> 00:40:28.765
And, from an ed tech provider side, can we build lasting relationships with schools that matter and can we actually?

00:40:28.765 --> 00:40:33.422
Are we selling to them or are we working with them to solve their problems?

00:40:33.422 --> 00:40:56.432
Those are two different things, and I think if I was an investor trying to actually make my money back, I would bet on the company that is going into things with that mindset, whether it be through having educators on staff, or through carefully working with and partnering with districts, doing pilots, being slow.

00:40:56.432 --> 00:41:01.816
Give the software away for free for a little bit to test it out and see what happens.

00:41:01.916 --> 00:41:03.597
Actually measure outcomes right.

00:41:03.597 --> 00:41:05.078
Measure real outcomes.

00:41:05.078 --> 00:41:10.691
Something that really drives me crazy is pointing to a single anecdote.

00:41:10.691 --> 00:41:15.747
Here's one classroom, here's one lesson that was enhanced by AI.

00:41:15.747 --> 00:41:22.208
We can talk to Harriet Tubman and Ben Franklin, who are dead people.

00:41:22.208 --> 00:41:26.770
Oh, that's so cool, and now we have to buy this thing.

00:41:26.770 --> 00:41:32.402
That is an interesting lesson plan.

00:41:32.402 --> 00:41:36.554
Don't get me wrong, and I'm not a history teacher, so I can't speak to whether that would be an effective lesson plan or not, but it sounds cool.

00:41:36.554 --> 00:41:48.323
And so you look at the marketing website, and it says, you know, here's a video of 10 minutes of amazing engagement with students talking to ghosts, and so therefore you need to go and buy this stuff.

00:41:48.323 --> 00:41:50.267
That shouldn't be how it goes.

00:41:50.267 --> 00:41:56.006
We should be sitting down and quietly and carefully measuring outcomes.

00:41:57.509 --> 00:42:00.703
If you need to do something brand new and untested, you need to measure outcomes.

00:42:00.703 --> 00:42:04.072
Or you can do something obvious, like a Scantron.

00:42:04.072 --> 00:42:05.911
You don't need a study to tell.

00:42:05.911 --> 00:42:08.503
You don't need a study to tell you that a Scantron is useful.

00:42:08.503 --> 00:42:12.593
It's just obvious on face value, it's just logical.

00:42:12.593 --> 00:42:26.389
I think if you spent time doing a study on that, you'd be wasting your time, because it's like, yes, in fact we've measured, and it takes longer to grade the multiple-choice tests by hand than it does with the Scantron. So that one you don't need to study.

00:42:26.600 --> 00:42:30.947
But on things where you do need a study, you need a study, and so go and do your study.

00:42:30.947 --> 00:42:36.689
So, you know, the tools, the problems that we're solving at Pickcode.

00:42:36.689 --> 00:42:41.921
We're just making it so you don't have to email zip files of code around, you know.

00:42:41.921 --> 00:42:46.626
We just automatically have it so that the student's code is uploaded and runs in the cloud and you don't have to install anything.

00:42:46.626 --> 00:42:57.387
And I feel very confident deploying that software into classrooms without a study because it's like, yeah, uploading and downloading zip files is terrible and we don't want to.

00:42:57.387 --> 00:43:10.976
So, like, you know, I'm not going to do a study, I'm just going to go on my gut instinct as a person who spent some time in the classroom: that uploading and downloading zip files isn't fun, is bad, wastes time, and leads to worse outcomes.

00:43:10.976 --> 00:43:13.625
That's my gut instinct and I'm not going to do a study.

00:43:13.625 --> 00:43:21.687
But I imagine if a study was done we could measure with a stopwatch and having the automatic upload happen would save some time.

00:43:22.680 --> 00:44:22.757
One of my last questions here, before we get into the last three questions that I always end with, and actually we'll talk a little bit more about Pickcode too as well, because I've got a question, especially with vibe coding being so popular right now, and I get it, you know, but we'll talk about that.

00:44:22.797 --> 00:44:40.949
But my last question to you is, and this could be, maybe, that moment, and I know you've had some really great points, but maybe there's this one thing that you've always wanted to share or just say, or that's always top of mind: if you had a mic at an AI and EdTech Investor Summit, what's the hard truth

00:44:40.949 --> 00:44:47.452
You'd want every founder and funder to hear about building their next tool.

00:44:49.400 --> 00:44:55.606
Yeah, I mean I've kind of already got into it, but it's like I don't know.

00:44:55.606 --> 00:44:58.070
I mean I don't want to say something that's wrong.

00:44:58.070 --> 00:45:04.215
So I'm speaking and I'm reviewing what's going on based on the capabilities of the tools today.

00:45:04.215 --> 00:45:14.335
So, based on the capabilities of the tools today and the way the world works and the way that society functions, I predict that these tools have little to no value.

00:45:14.335 --> 00:45:15.787
That's the spiciest version of it.

00:45:15.787 --> 00:45:19.971
So, therefore, you are wasting your time, you're wasting your money, you're going to lose your money.

00:45:19.971 --> 00:45:22.326
That's the spicy version of it.

00:45:22.800 --> 00:45:34.920
Now, I think the bet is that the technology gets a whole lot better, and so GPT-6 is now smarter than Einstein, right? Ah, okay, well then maybe I was wrong.

00:45:34.920 --> 00:45:37.606
So I don't know.

00:45:37.606 --> 00:45:40.079
You know, maybe someday Pickcode does a little bit better.

00:45:40.079 --> 00:45:45.664
And you know, people will want to go back and clip this podcast when GPT-6 comes out and say, Charlie, you were really, really wrong.

00:45:45.664 --> 00:45:48.519
Um, because GPT-6 was smarter than Einstein.

00:45:48.519 --> 00:45:50.625
And look at this guy, who was, um, wrong.

00:45:50.625 --> 00:45:56.981
You know, there's so many examples in history of folks saying that technology isn't going to work for this or that reason, and they've been proven wrong.

00:45:56.981 --> 00:46:00.126
But here, I just don't see it.

00:46:00.126 --> 00:46:04.151
It doesn't add up for me, and this is mostly based on gut.

00:46:04.151 --> 00:46:24.422
I don't have a lot of studies to go off of. So, based on my gut right now, based on the capabilities of the tools as they exist, with regards to education, to how society works and what our expectations are for teachers, and to the way that classrooms work with respect to teacher and student relationships, I just think these tools are doomed to fail.

00:46:25.235 --> 00:46:27.860
And that's the spiciest version of it and that's what I'd say.

00:46:27.860 --> 00:46:34.568
I'd get up there and I'd say that and I would look really, really stupid if GPT-6 comes out and it's smarter than Einstein.

00:46:34.568 --> 00:46:36.581
But my bet is that it won't.

00:46:36.581 --> 00:46:57.005
And I'm ready to be proven wrong, because if GPT-6 is smarter than Einstein, then that's an awesome world to live in. GPT-6 will solve global warming and it'll cure cancer and it'll do all of these things. Honestly, I'm fine being wrong. I'll be proven wrong and I'll have some egg on my face, and that's okay, but that's the bet.

00:46:57.284 --> 00:47:03.643
So in that room, that's what I'd say with the mic: you are betting on, you know, OpenAI to come out with a thing that's smarter than Einstein.

00:47:03.643 --> 00:47:07.842
If you bet correctly, congratulations.

00:47:07.842 --> 00:47:10.086
You probably made a billion to a trillion dollars.

00:47:10.086 --> 00:47:14.505
That's awesome, and if that's not correct, then you've wasted everyone's time.

00:47:14.505 --> 00:47:19.780
So you know it's a gamble and that's, I mean, that's what investing is and that's what starting a company is.

00:47:19.780 --> 00:47:21.615
To some extent, it's a little bit of a gamble.

00:47:21.615 --> 00:47:24.322
So you know, more power to them, honestly.

00:47:24.322 --> 00:47:28.077
But more power to them as long as they're being careful, which some people aren't.

00:47:28.077 --> 00:47:32.963
That's where you have to kind of speak up: hey, more power to you, except, wait a second.

00:47:32.963 --> 00:47:36.688
You're messing with student data, you're messing with student privacy, you're messing with student-teacher relationships.

00:47:36.688 --> 00:47:38.110
So that's where it's not okay.

00:47:38.494 --> 00:47:46.526
I'm with you on that, and especially with the data component, which is really scary, you know, and consent; that's a whole other thing that we didn't get into.

00:47:46.606 --> 00:47:52.104
But of course, our focus was really just on your post and just getting your thoughts.

00:47:52.104 --> 00:47:55.550
You know, on what it is that you're seeing and it's just very interesting.

00:47:55.550 --> 00:47:56.773
You know the experience that you had.

00:47:56.773 --> 00:48:12.157
And then, just going back to the experience that you had at this conference. And again, this is what I love about this show: we always bring in both sides, and we definitely want to have a balanced conversation and make sure that we all learn from each other by putting everything out there.

00:48:12.157 --> 00:48:14.141
And so thank you so much, Charlie.

00:48:14.141 --> 00:48:25.409
And one last thing that I do want to talk about is Pickcode. As far as this project that you're working on, this business, tell us a little bit more about it.

00:48:25.409 --> 00:48:35.221
And just for my audience of computer science teachers that are out there: what is Pickcode, what can it do, and what are some of the benefits?

00:48:36.244 --> 00:48:36.505
For sure.

00:48:36.505 --> 00:48:41.943
Yeah, well, I appreciate you letting me have a little time to chat about it, because obviously that's my main focus.

00:48:41.943 --> 00:48:47.751
So yeah, so what we provide is an online platform for running your computer science classroom.

00:48:47.751 --> 00:48:58.876
So if you imagine it's a Tuesday afternoon and we need to run a coding lesson in Python about converting Celsius to Fahrenheit, there's a question of where the software is running.

00:48:58.876 --> 00:49:00.601
So they're typing in Python code.

00:49:00.601 --> 00:49:03.146
Where are the students typing in the Python code?

00:49:03.146 --> 00:49:04.760
How is the Python code running?

00:49:04.760 --> 00:49:09.960
When the student is finished writing their code or as they're writing their code, can I see what they're working on?

00:49:09.960 --> 00:49:28.380
And so we provide the solution there: as the teacher, you've created the lesson, you distribute that to the students, you just click assign in the system and it goes out to all your students after you've rostered them. And then, as they're typing in real time, you've got a dashboard where you can see everything they're working on and you can collaborate with them.

00:49:28.380 --> 00:49:37.929
You can set up collaborative work environments for the students where they can work together on the exact same code at the exact same time, and so we just want to simplify that process.
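As an aside for readers, the Celsius-to-Fahrenheit lesson mentioned above is about the smallest Python exercise there is. A minimal sketch of what a student might type (the function name and sample values are my own illustration, not from Pickcode's curriculum):

```python
# The classic first lesson: convert Celsius to Fahrenheit (F = C * 9/5 + 32).
def celsius_to_fahrenheit(celsius: float) -> float:
    """Return the Fahrenheit equivalent of a Celsius temperature."""
    return celsius * 9 / 5 + 32

# Spot checks a student can verify by hand.
print(celsius_to_fahrenheit(0))    # freezing point of water: 32.0
print(celsius_to_fahrenheit(100))  # boiling point of water: 212.0
```

The point of a platform like Pickcode is that code this small still raises the logistics Charlie lists: where it runs, where students type it, and whether the teacher can watch it being written.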

00:49:38.375 --> 00:49:43.860
So, say you're a school district that's got an old computer lab you're trying to get rid of, or you're switching to Chromebooks.

00:49:43.860 --> 00:49:47.938
You know, there you go. There's your solution: you can run it all in the cloud with us.

00:49:48.720 --> 00:50:04.724
And if you're a teacher who's tired of, you know, taking your students' code and hoping that they email you the right zip file or upload the right thing to Google Classroom, all we're trying to do is save you five to ten minutes a class.

00:50:04.724 --> 00:50:06.230
We're not solving world hunger and we're not going to transform education.

00:50:06.230 --> 00:50:07.737
That's why I like to think of ourselves as a small business.

00:50:07.737 --> 00:50:09.503
We're not some hyper growth startup.

00:50:09.503 --> 00:50:11.673
We're trying to save five to ten minutes.

00:50:11.673 --> 00:50:13.798
We're trying to make it a little bit less frustrating.

00:50:13.798 --> 00:50:28.726
Coding is difficult enough to learn and difficult enough to teach, and we want to simplify where we can, be confident that we're doing so in a way that respects teachers and what they want, and, you know, do that with their input.

00:50:28.726 --> 00:50:41.403
So, a lot of our focus now: we kind of built the first version of the tool based on my classroom experience, and now we work mainly off of what our teacher partners and partner schools ask us to do.

00:50:41.403 --> 00:50:48.461
So it's a lot of getting feedback, collecting that all together and coming up with how we adapt the product from there.

00:50:48.481 --> 00:50:51.008
I love it. Simple yet effective.

00:50:51.008 --> 00:51:07.867
You know, in that sense. And that's one of the things: even when I go do sessions with teachers, I always say, look, if I can just save you five to ten minutes so that, at the end of the day, you can turn off your lights and just decompress, I think that I've done a good job at helping you, because I don't want to add more to your plate.

00:51:07.867 --> 00:52:02.467
But one of the things that I do love about Pickcode is the collaborative aspect. One of the things that I have seen slowly is the loss of discussion and collaboration in classrooms, where it has become more of the teacher being the subject matter expert and simply delivering the content, not allowing students to create content. Now students are able to collaborate and have that communication with one another in solving code, writing code together, programming together. And the fact that this will work even if you're on a Chromebook, because everything's in the cloud, would really save teachers time, and not only that, but save districts time in having to upgrade their labs and spend thousands of dollars on hardware, where those thousands of dollars can be allocated for bigger projects, classroom maintenance, things of that sort, and infrastructure.

00:52:02.467 --> 00:52:17.101
But I think that this is fantastic, Charlie, and I'm just so thankful, because, you know, for me less is more, and I've always been a follower of the KISS philosophy, which for me is a little bit different than how most people hear it.

00:52:17.101 --> 00:52:23.637
For me it's just keep it simple and streamlined, and I think that Pickcode does the streamlining part very well.

00:52:23.637 --> 00:52:26.624
So thank you so much for sharing a little bit about Pickcode.

00:52:26.684 --> 00:52:31.865
And my last question: I know vibe coding is a big thing right now, and I know you talked a little bit about it.

00:52:31.865 --> 00:52:46.204
And I know, for vibe coding, it's the ability to have access, for somebody like me who does not have a computer science background, to be able to say, hey, I want to create this dashboard that might have this, and so on and so forth.

00:52:46.204 --> 00:52:58.126
So I know that it gives me that access to do something like that, and though it might not be as robust as actual hand-written code, it kind of gives me an idea there.

00:52:58.126 --> 00:53:01.735
But I want your take on it as far as students.

00:53:01.735 --> 00:53:08.449
I know you talked a little bit about it in the beginning, especially when they're learning Python, that this might not be good for them.

00:53:08.449 --> 00:53:10.500
Tell me a little bit more about that.

00:53:12.184 --> 00:53:16.125
Oh gosh, so do we have like 45 minutes to two hours left to talk?

00:53:16.125 --> 00:53:18.784
Because this is really what maybe drives me up the wall.

00:53:18.784 --> 00:53:19.936
I mean, I just don't.

00:53:19.936 --> 00:53:30.543
I feel bad for people who are falling into this trap that there's this shortcut.

00:53:30.543 --> 00:53:32.086
I don't know, I might be proven wrong.

00:53:32.086 --> 00:53:42.898
So I want to caveat everything with, like, I might just be wrong because the technology gets so great. But we have to review stuff and think about stuff based on how it exists today, not what it's going to do in the future.

00:53:42.898 --> 00:53:46.467
So, as of today, May 10th, 2025, it doesn't work.

00:53:46.467 --> 00:53:48.717
It's a false promise.

00:53:48.717 --> 00:53:49.360
It just doesn't.

00:53:50.282 --> 00:53:52.367
You can make an exciting demo, which is cool.

00:53:52.367 --> 00:53:55.706
You can create interesting designs; that's cool.

00:53:55.706 --> 00:54:00.764
You can create a somewhat functional clone of some other thing that exists, that's all well and good.

00:54:00.764 --> 00:54:13.105
So this was actually what set off the person at the conference. I asked: would you put your banking information into a website that was vibe-coded? Would you?

00:54:13.105 --> 00:54:16.115
That is the question, and I wouldn't. Okay.

00:54:16.115 --> 00:54:18.606
Well then, does this make any sense at all?

00:54:18.606 --> 00:54:28.469
It doesn't, right? If you want to make a fun little game or you want to make a different version of Wordle or whatever, neat, that is cool.

00:54:28.469 --> 00:54:30.021
I'm not going to deny that that's cool.

00:54:30.021 --> 00:54:37.039
But for serious software, if it was written entirely by an AI, I'm going to be very wary.

00:54:37.039 --> 00:54:45.545
And I don't think that a company advertises on their marketing site, oh, this was vibe coded, by the way. But there should be a little banner on the top.

00:54:45.545 --> 00:54:53.130
If people were telling the truth, there'd be a little banner on the top saying this website was completely generated by you know a probability matrix in some data center.

00:54:53.130 --> 00:54:55.412
Would you like to now enter your bank information?

00:54:55.412 --> 00:54:58.177
And it's like well, wait a second.

00:54:58.177 --> 00:55:00.724
Now, when we think about it like that, maybe this isn't such a good idea.

00:55:00.724 --> 00:55:05.383
So for games and toys, it's awesome, it really is. I actually think it's sweet.

00:55:05.383 --> 00:55:16.182
And so you know, when it comes time for us to demo some stuff with pick code in terms of its capabilities, sometimes I'll ask GPT for 30 lines of Python code to plug in to do something interesting.

00:55:16.182 --> 00:55:20.501
And in those kinds of toy examples, it's neat, no problems.

00:55:20.501 --> 00:55:22.547
But I don't think it's the right way to build systems.

00:55:22.914 --> 00:55:29.675
And then, from the student perspective, I mean, just simple questions, I think, can solve a lot of our problems.

00:55:29.675 --> 00:55:33.885
Would you teach a third grader how to multiply with a calculator?

00:55:33.885 --> 00:55:40.338
Or do you teach them how to multiply on paper, because it gives you the fundamentals and understanding?

00:55:40.338 --> 00:55:40.701
You know.

00:55:40.701 --> 00:55:57.865
You understand what multiplication actually is and what the process is and why you might want to multiply, and you do some word problems and you do them carefully, and then eventually you get to graduate to using the calculator. And so it's the same exact thing, but somehow the AI is just very well-spoken and it's confused us.

00:55:57.865 --> 00:56:02.885
And then there is again the caveat that if it's smarter than Einstein, then I'm totally wrong.

00:56:02.885 --> 00:56:12.184
And so that's where it's scary and where the confusion really comes in: we've been promised by OpenAI and others that Einstein's coming for us.

00:56:12.184 --> 00:56:17.847
If that's true, then yeah, you know, maybe if Einstein wrote the banking app, then yeah, I'll put in my password.

00:56:17.847 --> 00:56:22.407
But you know, if it's not Einstein, then I'm a little worried.

00:56:22.954 --> 00:56:39.903
Well, thank you so much for that take, because that's something that I've been seeing a lot of, and some interesting takes, like on TikTok, and videos that are just hilarious. There was this one gentleman who was doing some vibe coding at a company and he was like, oh well, I'm done, you know, he's done for the day.

00:56:39.903 --> 00:56:45.809
He's like, what do you mean? It's like two o'clock. He goes, yeah, but I ran out of GPT queries that I can ask.

00:56:46.090 --> 00:56:47.152
And he just walks away.

00:56:47.152 --> 00:57:00.842
So he's done for the day, because, you know, that's vibe coding. You'd think you'd plan to be able to work all day or something.

00:57:00.842 --> 00:57:06.431
But anyway, before we wrap up, and I know we've had a great conversation, I always love, Charlie, to end our conversations with the last three questions, and I always provide them for you on the invite.

00:57:06.431 --> 00:57:21.628
So hopefully you got to see a little bit about that. And so, for the first one, I think you already spoke a lot about possible edu-kryptonites that you're seeing right now, but maybe you have another one in mind, or maybe just expound on something that you've already said.

00:57:21.628 --> 00:57:29.034
But, as you know, every superhero has a weakness or a pain point, and for Superman, kryptonite was his weakness or pain point.

00:57:29.034 --> 00:57:41.565
So I want to ask you: in the current state of education, or we can change it to the current state of AI in education, what would you say is your current edu-kryptonite?

00:57:43.267 --> 00:57:49.121
Yeah, so it's interesting, because I've sounded like a skeptic this whole time.

00:57:49.181 --> 00:57:52.617
But I'm very easily excited, which maybe hasn't come across.

00:57:52.617 --> 00:58:05.952
But, like, when I come up with some new idea for some new product enhancement to Pickcode, or I talk to a teacher and we have one good conversation, I want to throw away the plan for the month and just start working on that one new thing.

00:58:05.952 --> 00:58:12.934
And when we have responsibilities to schools and stuff like that, I have to go and curb that instinct.

00:58:12.934 --> 00:58:15.423
So I have one good conversation with the teacher.

00:58:15.423 --> 00:58:21.206
I have to say, hey, wait a second, we have to slot this into our product roadmap and like, hey, we might be able to do that in the fall.

00:58:21.206 --> 00:58:24.327
But, you know, right now in the spring, we're working on X and Y.

00:58:24.327 --> 00:58:28.759
So curbing that excitement is actually something that I always need to work on.

00:58:28.759 --> 00:58:39.717
And so, you know, it's always a challenge for me, for sure, to kind of stay on the plan and stay focused, even when there are distractions or exciting stuff going on.

00:58:40.459 --> 00:58:40.880
Excellent.

00:58:40.880 --> 00:58:46.918
Number two is if you could have a billboard with anything on it, what would it be and why?

00:58:49.262 --> 00:58:56.543
So for this one, I would say probably just: don't take yourself so seriously. And this applies to myself.

00:58:56.543 --> 00:59:03.427
And so, you know, if you go and follow me afterwards, you'll see a very serious LinkedIn.

00:59:03.427 --> 00:59:09.719
You'll see, oh, there's very serious stuff going on in education, and here I have a very serious, cool new update for you on this product.

00:59:09.719 --> 00:59:16.579
But I think it's important to take a deep breath and say you know, at the end of the day, what am I doing?

00:59:16.619 --> 00:59:26.836
I'm helping kids do their homework and if the website goes down, you know we need to get it back up, but you know kids are going to be able to take a half hour break from class and we'll get the website back up and working.

00:59:26.836 --> 00:59:29.780
And if we don't make that next sale, it's okay.

00:59:29.780 --> 00:59:33.244
And if you know there's some error, it's all right.

00:59:33.244 --> 00:59:37.648
And if I slip and fall on a banana peel, I'm not an idiot.

00:59:37.648 --> 00:59:46.219
I just got unlucky that day. It's just kind of overall chilling out and not taking yourself so seriously.

00:59:46.219 --> 00:59:49.547
So if you happen to come across me on LinkedIn, you will not get that impression.

00:59:49.547 --> 00:59:56.981
But that is something that I try to work on and focus on: take everything with a grain of salt and chill out a little bit.

00:59:57.521 --> 00:59:58.565
There you go, love it.

00:59:58.565 --> 00:59:59.266
I like that.

00:59:59.266 --> 01:00:03.586
All right, Charlie.

01:00:03.586 --> 01:00:10.427
Last question: if you could trade places with one person for a day, it could be anybody, who would it be and why?

01:00:10.427 --> 01:00:11.030
Can I have two answers?

01:00:11.030 --> 01:00:11.594
Yes, of course.

01:00:11.594 --> 01:00:37.525
Okay, so, for intellectual curiosity reasons, I would like to trade places with, like, Sam Altman or someone, to figure out if they actually do have the Einstein coming, because that's obviously very interesting and I would like a sneak peek into that. So, intellectually, I would like to just be in the room there, where I can see, six months, twelve months out, is something actually going to happen or is this all a bubble?

01:00:37.525 --> 01:00:40.324
So I'd like the spoiler alert to that.

01:00:40.324 --> 01:00:47.595
But probably what I would practically rather do is just trade places with my dog, and this is along the lines of don't take yourself so seriously.

01:00:47.595 --> 01:00:56.235
Like, you'll have some editing to do where he made a noise in the middle of this podcast, and he just gets to chill, walk around and, you know, get some treats.

01:00:58.867 --> 01:01:06.378
And, you know, me and my wife like to hang out with the dog and he's well-loved, and he just gets to hang out on the couch, and to some extent I'm kind of jealous of that.

01:01:06.378 --> 01:01:11.148
So that's my other answer there.

01:01:12.195 --> 01:01:14.278
All right, well, I'll definitely have to do some editing there.

01:01:14.278 --> 01:01:18.447
As you noticed, my camera just kind of slid off here real quick.

01:01:18.467 --> 01:01:19.088
Yeah, no worries.

01:01:19.489 --> 01:01:19.889
It's all good.

01:01:19.889 --> 01:01:34.079
Like you said, we don't take ourselves seriously. But anyway, Charlie, it's been a pleasure, you know, connecting, and I know I just started following you based on that post that popped up, but I'm just thankful that we did get to connect.

01:01:34.079 --> 01:01:40.706
I'm thankful that you accepted the invite here to be on the podcast and just really share your experience, your takes.

01:01:40.706 --> 01:01:51.409
What you're seeing. And it just seems like everything that you discussed today is something that I am in line with, obviously with the concerns, but yet there's still that hopeful excitement about things.

01:01:51.409 --> 01:02:18.297
But it's just that, right now, I'm one of those that really likes to take things very slow to make sure that we cover all the details. Because, again, even in my experience where I work as a coordinator, looking at apps, so many times I'm trying to poke holes in things just to make sure that we are getting the best possible product and that they're putting the best possible version of their product out to really help.

01:02:18.297 --> 01:02:22.776
So, you know, that's why I consider myself a cautious advocate in that sense.

01:02:22.776 --> 01:02:27.340
But it's been a pleasure and it's been an honor to have you here on the show and just to hear your take.

01:02:27.340 --> 01:02:28.525
So thank you so much.

01:02:28.525 --> 01:02:29.570
I really appreciate it.

01:02:29.610 --> 01:02:34.541
And for our audience members that are checking this show out, please make sure that you follow Charlie on LinkedIn.

01:02:34.541 --> 01:02:39.309
Please make sure that you go check out Pickcode so you can learn a little bit more about it.

01:02:39.309 --> 01:02:54.423
And make sure that you go to our website, myedtechlife, where you can check out this amazing episode and the other 323 wonderful episodes, where I promise you will find a little something just for you that you can sprinkle onto what you are already doing great.

01:02:54.423 --> 01:03:00.916
So thank you, as always, and thank you to all our amazing sponsors Book Creator, yellowdig, edu8, pocket Talk.

01:03:00.916 --> 01:03:07.298
We really appreciate your help and your support and, my friends, until next time, don't forget, stay techie.

Charlie Meyer

Founder

Charlie did an undergraduate degree in computer science and spent four years as a software engineer. He then got a degree in teaching and spent two years as a high school CS teacher. During his time teaching, he started Pickcode, an online platform for teaching CS. Today, Pickcode is used by dozens of schools across the US and Europe.