Nov. 16, 2025

How AiDigiCards Help Kids Learn AI Safely ft. Amber Ivey | My EdTech Life Ep.343

In Episode 343 of My EdTech Life, I sit down with Amber Ivey to explore AiDigiCards, a new screen-free way to introduce kids to AI literacy, curiosity, and critical thinking. Amber breaks down the journey from her AI-for-adults work, to her children’s book AI Meets AI, to her new hands-on learning system for kids ages 4–8.

We talk about parent concerns, privacy, LLM overreliance, ethical design, and why AI learning should start at the kitchen table, not on a tablet. If you're an educator, parent, or anyone thinking about the future of AI readiness for kids, this conversation is packed with insight and practical takeaways.

Timestamps

00:00 – Introduction
02:00 – Why Amber focuses on AI for kids
05:30 – From books to workshops
09:15 – Introducing AiDigiCards
10:30 – Why screen-free AI learning matters
12:00 – Parent concerns about AI
14:00 – Protecting curiosity and creativity
17:00 – The four C’s of AI literacy
21:00 – Kids spotting real vs fake
26:00 – “Are kids too young for AI?”
27:00 – Teach kids AI… or AI teaches them
29:00 – The problem with unsafe AI tools
35:00 – Amber’s personal journey
37:30 – What keeps her up (in a good way)
40:00 – AiDigiCards Kickstarter
46:00 – Thank you to our sponsors!

🔗 Learn more about AiDigiCards: https://aidigicards.com
🔗 Follow Amber Ivey: LinkedIn
🎧 Listen to Episode 317 (Amber’s first appearance)

A massive thank-you to our sponsor Book Creator: use code MYEDTECHLIFE for 3 months of premium access.

Thank you EduAide, Yellowdig, and Peel Back Education for supporting these conversations.

Peel Back Education exists to uncover, share, and amplify powerful, authentic stories from inside classrooms and beyond, helping educators, learners, and the wider community connect meaningfully with the people and ideas shaping education today.

Authentic engagement, inclusion, and learning across the curriculum for ALL your students. Teachers love Book Creator.

Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

00:10 - Welcome And Sponsor Acknowledgements

01:16 - Introducing Repeat Guest Amber Ivey

02:45 - Amber’s Background In AI And Government

06:07 - From Adult AI Work To Kids’ AI Literacy

08:43 - The Book And AiDigiTales Journey

12:22 - Why Screen-Free AI Learning Matters

16:57 - Inside The ABCs Of AI Card Deck

21:22 - Curiosity, Confidence, And Critical Thinking

24:21 - Parents’ Fears, Access, And Safety

28:30 - Teach Kids AI Before AI Teaches Them

32:14 - Ethics, Wrappers, And Kid-Safe Tools

35:49 - Personal Reflections And Motivation

37:41 - Hopes, Risks, And Equity In Access

40:59 - Kickstarter Details And How To Back

42:43 - Lightning Round And Closing

Transcript

00:00:10.400 --> 00:00:14.080
Hello, everybody, and welcome to another great episode of My Ed Tech Life.

00:00:14.080 --> 00:00:16.719
Thank you so much for joining us on this wonderful day.

00:00:16.719 --> 00:00:22.160
And wherever it is that you're joining us from around the world, thank you as always for your continued support.

00:00:22.160 --> 00:00:24.559
We appreciate all the likes, the shares, the follows.

00:00:24.559 --> 00:00:28.640
Thank you so much for engaging with our content and sharing it with others.

00:00:28.640 --> 00:00:31.760
And I am so excited about today's episode.

00:00:31.760 --> 00:00:35.439
Obviously, our episodes are made possible by our wonderful sponsors.

00:00:35.439 --> 00:00:43.200
And I want to give a big shout out to Book Creator, EduAide, Yellowdig, and our newest sponsor, Peel Back Education.

00:00:43.200 --> 00:00:53.679
Thank you so much for believing in our mission to bring these conversations to life to help our educators continue to grow professionally and personally as well.

00:00:53.679 --> 00:01:00.320
And if you're interested in being a sponsor, please make sure you reach out to me at myedtechlife at gmail.com.

00:01:00.320 --> 00:01:03.280
And we can definitely get you on the show too as well.

00:01:03.280 --> 00:01:04.879
But again, enough about that.

00:01:04.879 --> 00:01:06.879
I am so excited about today.

00:01:06.879 --> 00:01:09.280
We have a repeat guest.

00:01:09.280 --> 00:01:17.519
And sometimes you may say, well, Fonz, we've already seen that guest before on your episodes, but you don't know what happens from episode to episode.

00:01:17.519 --> 00:01:29.599
And this is why I am so excited to welcome to the show Amber Ivey, who is joining us, and she has an amazing project that she has brought to life.

00:01:29.599 --> 00:01:36.159
And it's something that is going to be a game changer for our young kids here in the age of AI.

00:01:36.159 --> 00:01:43.840
So Amber was originally on our show on March 17th of 2025, for episode 317.

00:01:43.840 --> 00:01:49.280
Look it up, we'll link it because you definitely need to check that one out too as well as we dive in deep.

00:01:49.280 --> 00:01:57.040
But this one is more for me like a celebration, and I'm celebrating the work that Amber is doing because it is truly exciting.

00:01:57.040 --> 00:01:59.599
So, Amber, welcome to the show again.

00:01:59.599 --> 00:02:02.400
How are you doing on this wonderful evening?

00:02:02.799 --> 00:02:04.239
I am so excited to be here.

00:02:04.239 --> 00:02:05.840
One, thank you for having me back.

00:02:05.840 --> 00:02:12.400
I'm super excited to just update you on what's been going on because a lot has happened since March.

00:02:12.400 --> 00:02:18.960
And I cannot believe we're already in November and almost at the holidays, but we are here and I'm super excited to chat with you today.

00:02:19.280 --> 00:02:19.840
Exactly.

00:02:19.840 --> 00:02:21.120
And I'm excited too.

00:02:21.120 --> 00:02:28.479
But before we dive in, Amber, because I am really excited, and I know you and I are already warmed up and ready to go for this chat.

00:02:28.479 --> 00:02:36.240
But uh before we dive into that, there may be some new listeners that may not be familiar with your work just yet.

00:02:36.240 --> 00:02:38.560
They haven't checked out episode 317.

00:02:38.560 --> 00:02:45.120
So if you can give us a little introduction and what your context is within the education AI space.

00:02:45.439 --> 00:02:46.960
So I have a weird background.

00:02:46.960 --> 00:02:54.960
My background is in data and performance management and AI, but not for edutainment or education or ed tech, more so for government.

00:02:54.960 --> 00:03:02.000
So I started my work in that space and I've been helping government use those tools to make decisions for the people they serve.

00:03:02.000 --> 00:03:11.520
Then insert ChatGPT, um, the year, the month, November, actually what, three years ago now, that changed all of our lives.

00:03:11.520 --> 00:03:20.000
Where um I immediately saw as I was talking about these things for adults in my day job, and even I had a podcast focused on talking about these topics.

00:03:20.000 --> 00:03:24.159
I saw immediately parents and adults had a weird reaction, right?

00:03:24.159 --> 00:03:27.520
As we get older, things get a little bit more scary.

00:03:27.520 --> 00:03:38.960
But I do know my niece, she says, hey, Alexa, all the time, and was talking to um AI and uh smart speakers way before she had access to a laptop or any other smart device.

00:03:38.960 --> 00:03:42.159
So I realized I probably needed to start a little bit earlier.

00:03:42.159 --> 00:03:47.759
So I now focus on AI for adults through my um day job, and then at night, AI for kids.

00:03:48.000 --> 00:03:48.639
I love that.

00:03:48.639 --> 00:03:58.800
And so that is very interesting because the two worlds that you are working in have really helped bring this to light and just bring everything together for you.

00:03:58.800 --> 00:04:19.040
And I love the fact that as you're working with adults, you're taking a lot of those topics, a lot of those skills, and the way that you're able to translate them for a four-year-old, for an eight-year-old, for a 12-year-old, for some parents, and make it uh easy to understand for them.

00:04:19.040 --> 00:04:20.879
I think that that is something fantastic.

00:04:20.879 --> 00:04:23.920
And you mentioned your podcast, and your podcast is great.

00:04:23.920 --> 00:04:25.040
It is AI for kids.

00:04:25.040 --> 00:04:27.279
If you haven't checked it out, make sure you check it out.

00:04:27.279 --> 00:04:28.639
We'll link it in the show notes.

00:04:28.639 --> 00:04:38.959
Make sure you subscribe, or you can also visit Amber's website at AiDigiTales, where you will find all things AI uh for kids and the work that Amber is doing.

00:04:38.959 --> 00:04:51.439
But Amber, going back to this, and I know you, you know, got from the previous show, we talked a little bit more about obviously the importance of teaching our students the ethics of AI and using it properly.

00:04:51.439 --> 00:04:54.480
And we've seen so much happen in that landscape.

00:04:54.480 --> 00:05:08.959
But now you have come up with a wonderful, uh, let's say, screen-free way for students or young kids and parents to be able to learn these concepts.

00:05:08.959 --> 00:05:12.959
So tell us about this project and what brought this to fruition.

00:05:13.439 --> 00:05:17.920
So it all started for me back to when I told you November happened.

00:05:17.920 --> 00:05:26.319
So before all that, I don't know if I talked about this last time we were on the podcast, but I had an AI for adult podcast around AI.

00:05:26.319 --> 00:05:29.439
My voice is also trained in an AI avatar.

00:05:29.439 --> 00:05:32.480
Like, this is back in like 2021 before it was cool.

00:05:32.480 --> 00:05:45.439
And um, her name is Clara, and she travels the world and she was interacting with kids to help kids like get more access to AI because the theory is it's a real human's voice, but it sounds a little bit more robotic.

00:05:45.439 --> 00:05:46.800
So they asked me to be a boy.

00:05:46.800 --> 00:05:49.040
So she travels the world from here to Dubai.

00:05:49.040 --> 00:05:51.600
She was on Telemundo, like she lives a life.

00:05:51.600 --> 00:05:55.360
So I've been in this space, led projects in this space, for some time.

00:05:55.360 --> 00:05:57.839
So November 2022 happened.

00:05:57.839 --> 00:06:00.160
Everyone was like, This is happening.

00:06:00.160 --> 00:06:03.040
Let's ban it in schools or let's um access it.

00:06:03.040 --> 00:06:07.279
Like the pendulum for where people were was all over the place.

00:06:07.279 --> 00:06:25.600
And in that moment, I realized back to like my niece using things like Alexa or playing in some of these tools like Roblox that have AI integration, or thinking about Netflix or YouTube that literally, sorry, there's like algorithms and AI components in there that are actually like saying to our kids what they should watch or shouldn't watch.

00:06:25.600 --> 00:06:27.439
So I was like, I need to meet kids where they are.

00:06:27.439 --> 00:06:36.319
And it's starting with the book, AI Meets AI, about a little girl named um Addy Iris who meets a robot named Jazz, who was lost from JHU.

00:06:36.319 --> 00:06:41.759
I'm right near JHU, and JHU has some amazing AI um labs there and robotics labs.

00:06:41.759 --> 00:06:50.160
And I really wanted kids to see themselves as not just users and hey, Alexa or okay, Google of these technologies, but could see themselves as creators.

00:06:50.160 --> 00:06:51.519
The book did well.

00:06:51.519 --> 00:06:56.160
I self-published, and it did well, shockingly, without a publishing company.

00:06:56.160 --> 00:06:59.279
So yes, still selling, still doing well in this category.

00:06:59.279 --> 00:07:07.600
And then I was like, wait, something else needs to happen here, which is where AiDigiTales came from, which was like, I want to talk about AI through stories.

00:07:07.600 --> 00:07:11.759
Started that, started doing workshops for kids, switching from adults to kids.

00:07:11.759 --> 00:07:20.319
And then also, same time, I still do workshops in the day for AI for adults as well, but also started doing it for kids in schools and things like that.

00:07:20.319 --> 00:07:22.879
And then I was like, all right, the book is not enough.

00:07:22.879 --> 00:07:23.839
What is next?

00:07:23.839 --> 00:07:35.600
The workshops happened, then I launched the podcast, AI for Kids, which really focuses on interviewing both adults and kids about these technologies and getting into like the nitty-gritty of like, why is it good?

00:07:35.600 --> 00:07:36.399
Why is it bad?

00:07:36.399 --> 00:07:37.279
How can you do it?

00:07:37.279 --> 00:07:38.240
How can you use it?

00:07:38.240 --> 00:07:44.800
And allowing kids another screen-free way, because they're just audio, screen-free way to hear and understand what it is.

00:07:44.800 --> 00:07:50.560
And fast forward after that, I was like, okay, I can't be in all places with these workshops, just not possible.

00:07:50.560 --> 00:07:55.279
People may or may not find the book as well, but what is something else I can get into the hands of folks?

00:07:55.279 --> 00:08:02.399
And as I was interviewing people, and we also have a series on our podcast that we wrap next Tuesday called The ABCs of AI.

00:08:02.399 --> 00:08:03.680
I was like, there's something there.

00:08:03.680 --> 00:08:10.560
And I had heard really good feedback from those um different uh podcasts, how people play them in their classroom, how parents are learning.

00:08:10.560 --> 00:08:14.160
Someone sent me a video of their four-year-old talking about an algorithm.

00:08:14.160 --> 00:08:16.639
First of all, I could not say algorithm at four.

00:08:16.639 --> 00:08:17.920
This four-year-old could.

00:08:17.920 --> 00:08:19.920
And I was like, all right, there's a there there.

00:08:19.920 --> 00:08:22.560
So, like drum roll, please.

00:08:22.560 --> 00:08:26.720
That's when I came up with the idea of something called AiDigiCards.

00:08:26.720 --> 00:08:30.639
Now I know you're wondering what's going on with all this AI, Digi, Digi, or whatever.

00:08:30.639 --> 00:08:32.639
AiDigiTales, AiDigiCards.

00:08:32.639 --> 00:08:46.240
But what AiDigiCards are, they're a screen-free way for kids to be introduced to AI through um like storytelling, activities, like, but taking it away from the screen.

00:08:46.240 --> 00:08:49.519
We don't have to throw kids on a screen to learn about um AI.

00:08:49.519 --> 00:08:53.840
Think about all the things we learned in school through in theory classes where it was just reading a book.

00:08:53.840 --> 00:08:57.840
Think about all the songs you learn about, like, think about Schoolhouse Rock.

00:08:57.840 --> 00:08:59.120
How did you learn about a bill?

00:08:59.120 --> 00:09:04.720
The first time you learned about a bill was through that song, "I'm Just a Bill," or whatever it was, or the Barney cleanup song.

00:09:04.720 --> 00:09:08.879
A lot of things happen through song and activity that stick in our mind.

00:09:08.879 --> 00:09:20.000
Whereas something that we may have like learned in the book, we can't say on page 365 of this biology book, I remember this thing, but you do remember what it means to like be in the lab or be interactive.

00:09:20.000 --> 00:09:22.399
So I'm like, we can do that with AI.

00:09:22.399 --> 00:09:23.840
So let's start with the deck.

00:09:23.840 --> 00:09:38.799
Our first deck is the ABCs of AI, mostly because we had already been talking about the topic and we're offering kids the ability to interact with AI, 130 different cards, five cards per letter in five different modalities because kids learn differently.

00:09:38.799 --> 00:09:43.919
And we're trying to keep the education and entertainment together so that kids can remember this.

00:09:43.919 --> 00:09:45.840
And we're like the beginning of this.

00:09:45.840 --> 00:09:51.840
So as the kids get the concepts later, like when we sang the bill song, we didn't know what the heck a bill was or what was going on.

00:09:51.840 --> 00:09:56.320
But then when we got to like later on in life, like, oh, I know what this is and I can connect it.

00:09:56.320 --> 00:09:59.039
So we're like the very beginning and basics of that.

00:09:59.039 --> 00:10:07.600
First deck is ABCs of AI for kids ages four to eight, and we hope to expand to nine to twelve in our next deck that's focused on the ABCs as well.

00:10:07.840 --> 00:10:08.480
I love it.

00:10:08.480 --> 00:10:13.519
And you know, one of the things that I do want to highlight that you did mention is the screen time.

00:10:13.519 --> 00:10:33.840
And you and I had talked about it a little bit warming up for the chat, how parents are limiting that screen time and are now opting out of the fancy, wonderful supercomputer phones and going back to what would be a flip phone or a phone that is really restricted as far as what the student can access.

00:10:33.840 --> 00:10:39.519
They may be able to access, you know, just the important apps, you know, the phone numbers and so on.

00:10:39.519 --> 00:10:51.919
But I think that you doing interviews, knowing parents, hearing their thoughts, and seeing that many parents have that, uh, I guess that fear because of what is happening.

00:10:51.919 --> 00:10:57.360
And we hear a lot of stories through, of course, character AI and a lot of uh AI chatbots.

00:10:57.360 --> 00:11:03.279
So being able to provide them with a way to learn the concepts.

00:11:03.279 --> 00:11:31.039
And like you said, taking those hard concepts that you would normally work on with adults and really, I guess, molding them in such a way, not necessarily dumbing them down because that's not what we're doing, but putting them and packaging them in a way that a four to eight-year-old and a parent can understand, and they can have conversations together and activities in multiple modalities, I think is something that is fantastic.

00:11:31.039 --> 00:11:50.320
And I'd love to hear a little bit more as far as for four through eight-year-olds and the parents that would be using these cards and interacting with them at home, what would you like to be their biggest takeaway after, you know, playing with the cards and learning the concepts there?

00:11:50.559 --> 00:12:02.000
I think um, to your point of around the screen time, I think a lot of people assume that the only way for a kid to learn about AI, which I see a lot of adverse reactions to the idea of AI for kids.

00:12:02.000 --> 00:12:07.120
So normally people are like, AI for kids, no, keep the AI away from our kids, which I understand 100%.

00:12:07.120 --> 00:12:10.320
But for me, it's like the way you introduce it.

00:12:10.320 --> 00:12:13.519
Should we hand a four-year-old an app and a chatbot?

00:12:13.519 --> 00:12:15.360
No, absolutely not.

00:12:15.360 --> 00:12:31.919
Can we sit at the table and play a game with the four-year-old that talks about an algorithm or talks about ethics or teaches them some of the basics of like what is an AI robot or what have you, and help them think about like things that could be fake online or whatever it is to help them get early understanding?

00:12:31.919 --> 00:12:33.039
We totally should.

00:12:33.039 --> 00:12:37.200
And I understand parents, and I hear this all the time, and there are studies that show this as well.

00:12:37.200 --> 00:12:39.840
Parents are banning or reducing screen time.

00:12:39.840 --> 00:12:43.919
Like 70% of parents in a recent survey are reducing screen time.

00:12:43.919 --> 00:12:46.080
There's also these screen-free movements.

00:12:46.080 --> 00:12:55.519
Um, there are parents, to your point, either in affluent communities, or who, like even Gen Z and Gen Alpha, are also saying, I want my life back.

00:12:55.519 --> 00:12:59.919
Let's just have a regular flip phone, because in case of emergencies, you should have that.

00:12:59.919 --> 00:13:04.240
We do not need to have smartphones in um in classrooms.

00:13:04.240 --> 00:13:08.879
I hear teachers complain all the time about like having a smartphone, they have to fight for the attention of that.

00:13:08.879 --> 00:13:20.879
The dopamine that's uh that's being released and all these things that we know are happening with screen time addiction, with the loneliness epidemic, with these different um apps that are like building relationships with our kids.

00:13:20.879 --> 00:13:21.919
Now they're in toys.

00:13:21.919 --> 00:13:24.080
Like Mattel is about to put AI into toys.

00:13:24.080 --> 00:13:28.159
There's a bunch of little um AI tools out there that exist, right?

00:13:28.159 --> 00:13:32.080
So if we know these things are coming, we want to make sure kids understand it.

00:13:32.080 --> 00:13:38.480
I want to put this back into the parents' hands and the kids' hands for when they're ready and they can still learn about these concepts.

00:13:38.480 --> 00:13:42.960
So they don't feel like they're behind, but they allow them to do it in a way that they're comfortable with.

00:13:42.960 --> 00:13:50.159
Like the thing I say is AI learning or AI literacy should start at the kitchen table, not on a tablet.

00:13:50.159 --> 00:13:53.679
And right now, a lot of boot camps will hand your kid a tablet.

00:13:53.679 --> 00:13:57.759
We don't want our kids in those ecosystems, the data being trained on them.

00:13:57.759 --> 00:14:01.360
Our cards don't train on your data, our cards don't even know who you are.

00:14:01.360 --> 00:14:05.840
Like, unless you write your name on it and you know whatever it is, we have no idea, right?

00:14:05.840 --> 00:14:09.600
And we will not have that data on your family, but your family will have a good experience.

00:14:09.600 --> 00:14:13.679
And this is just deck one of many that are gonna come.

00:14:44.009 --> 00:14:44.809
That is wonderful.

00:14:44.809 --> 00:14:46.330
And I absolutely love that.

00:14:46.330 --> 00:14:51.289
That again, you're providing a wonderful resource for parents.

00:14:51.289 --> 00:15:06.409
Uh, when I worked with parents and we would do our technology Tuesdays, and we would talk about digital safety, digital citizenship, all of those things, with the parents, there is a need there, but sometimes they feel overwhelmed because they don't know where to start.

00:15:06.409 --> 00:15:08.409
There is so much information.

00:15:08.409 --> 00:15:21.769
But when you're able to condense it and be able to provide them with whether literature or some resources that they can go ahead and take in and have those conversations with their child, I think that's something that's fantastic.

00:15:21.769 --> 00:15:33.529
And here they have something that is ready-made where it is not something that is very ominous to them and very scary to them that they will feel uncomfortable with.

00:15:33.529 --> 00:15:35.929
It's something that they can all do together.

00:15:35.929 --> 00:15:41.049
And like you said, sitting at the kitchen table together as a family, going through the cards.

00:15:41.049 --> 00:15:46.250
A is for algorithm, you know, Q is for quantum, you know, and things of that sort.

00:15:46.250 --> 00:15:58.409
And talking about that where later on, as the students continue to grow and parents are listening to the news or listening to whatever's out there, they start making those connections too, as well, like you did mention before.

00:15:58.409 --> 00:16:09.769
And so I love that the cards will also, you know, pique uh parent and child's curiosity, their creativity, but will also help them in their confidence as well.

00:16:09.769 --> 00:16:11.049
So I want to, yeah.

00:16:11.049 --> 00:16:17.370
So I want to ask you, you know, was that the idea to link these three things uh into AI literacy?

00:16:17.370 --> 00:16:28.889
Because right now, at least in the world that I'm in in the education space, and I know you hear it too, the term AI literacy, but there's so many definitions, and sometimes you're like, what does this even mean?

00:16:28.889 --> 00:16:34.409
It's like you say a word so many times that it loses its meaning and you don't know what it is.

00:16:34.409 --> 00:16:46.409
But I want to ask you, you know, because I think the curiosity, creativity, and confidence components really build nicely into this wonderful exercise in AI literacy.

00:16:46.649 --> 00:16:47.129
100%.

00:16:47.129 --> 00:16:58.970
One of the things I hear the most from parents or teachers, and this is also for teachers, like having it in the classroom, and I know teachers are really excited about it as well, but I hear they're afraid of, I call them the four C's.

00:16:58.970 --> 00:17:04.970
They're afraid of AI taking the curiosity, AI taking away confidence, AI taking um critical thinking.

00:17:04.970 --> 00:17:06.410
I forgot my other C.

00:17:06.410 --> 00:17:07.450
There's another C there.

00:17:08.250 --> 00:17:09.610
Collaboration, curiosity.

00:17:10.330 --> 00:17:11.769
Yes, curiosity is the fourth C.

00:17:11.769 --> 00:17:13.930
They're afraid of AI taking those things.

00:17:13.930 --> 00:17:18.410
And in reality, like we're seeing early studies that show like critical thinking does go away.

00:17:18.410 --> 00:17:22.569
These other things do go away when we use AI and rely on it too much.

00:17:22.569 --> 00:17:32.490
And AI, in this sense, I'm saying talking about LLMs, there are a lot of different types of LL, excuse me, AI, but LLMs is like the main one that we're all talking about right now.

00:17:32.490 --> 00:17:45.690
So, how can we make sure that kids using AI or learning about AI and LLMs or these other um foundational components to the AI that is publicly open to us, how do we make sure to be flipping on its head?

00:17:45.690 --> 00:17:53.529
So for us, instead of taking away curiosity, critical thinking, confidence, um, and I forgot my fourth C.

00:17:53.529 --> 00:17:54.730
I will get it later.

00:17:54.730 --> 00:18:00.569
Rather than taking away these C's, we flip it on its head to make sure that we give that back.

00:18:00.569 --> 00:18:04.490
So think about if you have a definition card, there's a definition up there.

00:18:04.490 --> 00:18:13.289
But the thing that's really cool about it is beyond the definition that's easy to understand for a kid or a parent, the parent doesn't have to feel like anxiety, like helping a kid.

00:18:13.289 --> 00:18:18.490
I remember like thinking about helping a kid with homework and seeing that things have changed since we went to school.

00:18:18.490 --> 00:18:19.609
And then like math is new.

00:18:19.609 --> 00:18:23.049
Like if I hear about new math one more time, I'm gonna, in my head, it's gonna explode.

00:18:23.049 --> 00:18:25.129
But like how kids are doing things are different.

00:18:25.129 --> 00:18:26.890
With our cards, that doesn't matter.

00:18:26.890 --> 00:18:29.529
You can be a parent who's never understood an AI concept.

00:18:29.529 --> 00:18:46.569
You can talk about the definition, and under every single activity, definition, or whatever the main modality is for the card, there's a question or something that you can do that further enhances that, that allows the conversation to continue so that every time you interact, it could be a different discussion and whatever.

00:18:46.569 --> 00:18:49.930
And that allows the card deck to continue to grow with the kid.

00:18:49.930 --> 00:18:56.410
The other piece I didn't say um is that at some point, parents do want their kids to get online, right?

00:18:56.410 --> 00:19:00.490
Or um we also target this for kids who do not have access to screens.

00:19:00.490 --> 00:19:05.769
So this is for individuals who don't want their kids to be on screens or who do not have access, right?

00:19:05.769 --> 00:19:14.009
But when we get to the point where you think your kids are ready or your access increases, we have digital expansion packs that are coming on board as well.

00:19:14.009 --> 00:19:30.490
So then you can go online, and we're working with partners who we trust, and different um tools and things that we trust, for you to try things out with your kid in a very secure and safe environment when you're ready.

00:19:30.490 --> 00:19:31.769
But that's totally optional.

00:19:31.769 --> 00:19:35.129
You can still learn about all the AI concepts without ever going online.

00:19:35.129 --> 00:19:36.890
So that's what I'm excited about.

00:19:36.890 --> 00:19:43.849
And um, I just can't wait to get it in the hands of human beings and have kids out there playing around and testing them.

00:19:44.089 --> 00:19:49.129
And that's something that's very exciting because I think that this is something so refreshing.

00:19:49.129 --> 00:19:55.609
Because again, going back and talking about screen time, and I myself, you know, working as uh an assessment coordinator.

00:19:55.609 --> 00:19:57.129
I mean, talk about screen time.

00:19:57.129 --> 00:20:02.730
The students have to practice continually on computers, a lot of the curriculum, a lot of the books.

00:20:02.730 --> 00:20:08.650
Uh like pretty much it's like you're you're on a Chromebook all day long for the most part.

00:20:08.650 --> 00:20:26.970
And as teachers, like you mentioned before, having to fight for that attention and being able to, I mean, it's almost like you're creating TikTok videos right then on the spot, because you have three seconds to get them hooked into what you're doing and probably about five minutes at most of attention span.

00:20:26.970 --> 00:20:28.569
And then, all right, let's go.

00:20:28.569 --> 00:20:49.210
But uh, I mean, having something like this, and again, now with this younger generation and these four through eight-year-olds that may get their hands on these cards, and the parents, I think that this is something fantastic now because then a lot of those students are already going to learn the concepts, they're learning actively.

00:20:49.210 --> 00:21:03.690
And again, as they continue to grow and move through school, and then of course, the uh availability of those expansion packs where now that card can come to life, as in, for example, where I'm learning about an algorithm here.

00:21:03.690 --> 00:21:09.930
What does that look like on a computer, on a computer program, and things of that sort, and understanding those things.

00:21:09.930 --> 00:21:18.089
And then you really start to build that literacy, not only just the knowledge of the definition, but also being able to see it in action.

00:21:18.089 --> 00:21:20.250
And I think that that is something that is fantastic.

00:21:20.250 --> 00:21:37.129
That as they continue to grow and mature, you know, the parents might feel still a little bit better that their child at an early age knows the difference or can tell, maybe just even outputs, understanding like what is a real output, what is not a real output.

00:21:37.369 --> 00:21:38.569
That is one of our works.

00:21:38.809 --> 00:21:39.129
Yeah.

00:21:39.129 --> 00:21:55.210
And especially now, Amber, with the way that video is getting, for many young adults and young kids that have been living on YouTube for a very long time, even for them, it's very hard to tell what is real and what is not.

00:21:55.210 --> 00:22:03.369
And so I'm glad that there is a wonderful resource, a tool, that parents can use and work through together with their kids.

00:22:03.369 --> 00:22:13.769
So I want to ask you too, just based on your experience, because I know there was a lot of research that had to have gone into producing these cards, talking to a lot of parents and so on.

00:22:13.769 --> 00:22:27.690
So, within that scope, what were some of the more surprising ethical or privacy concerns that you observed or heard from parents when you were talking about the cards and what they can do?

00:22:27.690 --> 00:22:33.930
So tell us a little bit about that just so we can, as educators, also hear a little bit about what parents are thinking.

00:22:34.250 --> 00:22:38.809
So let me answer the surprise first, and then I'll talk about some of the other things I heard.

00:22:38.809 --> 00:22:43.210
One of the things that's interesting is like this hype around AI.

00:22:43.210 --> 00:22:53.129
And also around the hype, parents are a little bit concerned because their kids know more than them, which I don't think is a bad thing, and what that means.

00:22:53.129 --> 00:23:01.129
So, because of the hype, and because AI is integrated into a lot of their devices, parents are literally saying, like, hey, my kid is using this thing.

00:23:01.129 --> 00:23:02.569
I don't know what they're doing.

00:23:02.569 --> 00:23:03.529
They know it better than me.

00:23:03.529 --> 00:23:08.009
I've also seen this like on the ground, um, doing like live workshops with kids.

00:23:08.009 --> 00:23:15.049
I can show them an image or a video, and they're more likely to identify a fake than an adult is, which is good.

00:23:15.049 --> 00:23:16.490
And it's actually amazing.

00:23:16.490 --> 00:23:19.450
I think we don't give kids as much credit as we should.

00:23:19.450 --> 00:23:31.609
And I'm not talking about middle schoolers; this is even in my elementary classes, where, when I do the same type of test with adults and kids, the kids get it right more often.

00:23:31.609 --> 00:23:34.009
So I think we don't give kids as much credit.

00:23:34.009 --> 00:23:36.650
They're growing up in this world, a little bit more skeptical.

00:23:36.650 --> 00:23:37.450
They kind of know.

00:23:37.450 --> 00:23:40.410
And then I ask them, I'm like, how do you know this isn't AI?

00:23:40.410 --> 00:23:42.970
They're able to tell me whether this is AI or real.

00:23:42.970 --> 00:23:45.129
They'll say: the lighting is wrong.

00:23:45.129 --> 00:23:46.490
I'm like, the lighting is wrong.

00:23:46.490 --> 00:23:47.849
Second grade.

00:23:47.849 --> 00:23:52.569
Or, if you look at the whiskers of the cat; or, a cat would never let you get that close.

00:23:52.569 --> 00:23:53.769
Like the different images I show.

00:23:53.769 --> 00:23:55.210
You could never take a picture of a cat.

00:23:55.210 --> 00:24:02.329
And I'm sitting here like, okay, we think that they don't get it, but that's the only world they know.

00:24:02.329 --> 00:24:07.930
And I think we forget that when you grow up in the world that you're presented, you adjust to it.

00:24:07.930 --> 00:24:10.970
And I think the kids have done an amazing job of adjusting to it.

00:24:10.970 --> 00:24:20.329
However, along the same lines, I am learning that kids aren't treating it like they should in certain ways.

00:24:20.329 --> 00:24:28.970
And I've talked to people about this, but my rule is: take the old-school rules and apply them to the technology.

00:24:28.970 --> 00:24:38.410
What I mean by that is: you learn very early on, offline, don't talk to strangers, don't give out your address, don't send anyone your image, don't do any of those things.

00:24:38.410 --> 00:24:40.650
Same things apply to an AI tool.

00:24:40.650 --> 00:24:46.490
So don't talk to strangers in AI means don't go download an app that you don't know anything about and no one has vetted.

00:24:46.490 --> 00:24:48.809
That's a stranger, you don't know who that is.

00:24:48.809 --> 00:24:53.369
Don't take a picture of your face to put into a different tool.

00:24:53.369 --> 00:24:55.529
Don't share your address, don't do any of those things.

00:24:55.529 --> 00:25:02.650
So parents are also trying to figure out how to take those skills they already shared offline and bring them into this space.

00:25:02.650 --> 00:25:05.369
And the other piece is that kids have been doing this forever.

00:25:05.369 --> 00:25:10.970
When there's a bunch of rules around something, kids are gonna figure out how to get around it and they're gonna share how to get around it.

00:25:10.970 --> 00:26:07.490
And parents are spending a lot of time on different threads and different community groups, trying to share with each other the tips that the kids have already figured out.

00:26:07.490 --> 00:26:17.410
It's actually better if we just talk to the kid and like make it an open space for them to share what's happening and for that dialogue to happen, and you'll go much further.

00:26:17.410 --> 00:26:26.850
The parents who I've seen give access, with protection around it, have seen better outcomes with the different things we're dealing with in this generation.

00:26:26.850 --> 00:26:32.610
But the main one that's most surprising to me is they get it way better than we do, even at elementary school level.

00:26:32.850 --> 00:26:33.570
That is great.

00:26:33.570 --> 00:26:35.170
And that's so good to hear, you know.

00:26:35.170 --> 00:26:37.170
And again, because you're absolutely right.

00:26:37.170 --> 00:26:43.410
Sometimes we don't give our students that benefit, that credit, that they do know or can tell.

00:26:43.410 --> 00:26:54.289
I mean, right now, when you mentioned lighting, I was like, I would have never thought of even looking at the lighting, you know, from my era, from what I know, and I may miss it.

00:26:54.289 --> 00:26:59.970
And they don't miss it, because they know video games, the way the lighting works, whatever it is.

00:26:59.970 --> 00:27:00.450
Oh, sure.

00:27:00.450 --> 00:27:01.970
And so, yeah, very true.

00:27:01.970 --> 00:27:11.250
The other thing that I love is the way you talked about creating an open space, being able to have a discussion, that safe space where we can talk about these questions.

00:27:11.250 --> 00:27:17.250
And again, going back to the cards, being able to open that up and have those discussions.

00:27:17.250 --> 00:27:31.009
But I want to ask you too, in your experience, what are some of the things that you would like to dispel or correct for educators and parents when they say, AI for kids? Wait, wait, no way.

00:27:31.009 --> 00:27:33.650
Like they're too young, they'll learn it later.

00:27:33.650 --> 00:27:38.050
What are some things you wish you could share with a lot of parents or educators?

00:27:38.050 --> 00:27:40.450
And right now is the time to do it on our podcast.

00:27:40.690 --> 00:27:46.610
So I will say this: at the federal level, the government is really pushing for AI literacy.

00:27:46.610 --> 00:27:48.450
There are a bunch of different curricula out there.

00:27:48.450 --> 00:27:51.650
We also use some of this curriculum in our actual cards as well.

00:27:51.650 --> 00:27:54.370
We don't consider ourselves an education tool.

00:27:54.370 --> 00:27:55.810
We're an edutainment tool.

00:27:55.810 --> 00:28:00.210
So both you're introduced to it, but also you're entertained and interacting in that way.

00:28:00.210 --> 00:28:10.610
So curricula like Day of AI and AI4K12 are out there for those who are a little bit nervous about that.

00:28:10.610 --> 00:28:17.250
The thing I would say to your direct question is: you can either teach them AI, or AI will teach them about AI.

00:28:17.250 --> 00:28:38.930
So we can choose either one, because the reality is, we all know that if you and I talk about any product right now, as soon as we open our phones, if you have your mic on for certain apps, there will be an ad for that product scrolling and telling us about it.

00:28:38.930 --> 00:28:55.090
Now imagine when private sector or big tech companies decide they want your kids and their attention, because the reality is, and this is something important that I learned even before this: when you get a kid at a young age, you have them as a customer for life.

00:28:55.090 --> 00:28:56.370
What do I mean by that?

00:28:56.370 --> 00:29:05.250
How many of us, as millennials, Gen X, Gen Y, boomers, whatever, are going back to find the things we had in our childhood?

00:29:05.250 --> 00:29:12.930
Or how many people are now going to show their kids the Aladdin live action, because that's what they watched growing up?

00:29:12.930 --> 00:29:19.490
And how many people have spent so much money trying to relive their childhood? Because they realize: if I get you as a kid, I have you for life.

00:29:19.490 --> 00:29:23.410
Stuffed animals from childhood right now are going on eBay for hundreds of dollars.

00:29:23.410 --> 00:29:24.529
I wish I would have saved.

00:29:24.529 --> 00:29:25.890
Saved my stuffed animals.

00:29:25.890 --> 00:29:41.570
So you have Google, you have ChatGPT, you have all these different AIs, even Grok, which should not be in the kid business. Not dissing Grok, but they shouldn't be in the kid business.

00:29:41.570 --> 00:29:43.090
They have other stuff they got going on.

00:29:43.090 --> 00:29:49.650
But all these different tools are going to kids because if they catch a kid at this age, they have a customer for life.

00:29:49.650 --> 00:29:54.610
And at the end of the day, it started as research, but it is really about money now.

00:29:54.610 --> 00:30:05.410
So if we know that, we need to make sure that we are helping those kids who are now going to be targeted with ads that know you better than your mother, that know you better than yourself.

00:30:05.410 --> 00:30:07.570
We need to make sure they know what is happening.

00:30:07.570 --> 00:30:12.370
Because even now, as adults, we're getting trapped into it.

00:30:12.370 --> 00:30:15.490
And we're watching certain shows because Netflix told us what to watch.

00:30:15.490 --> 00:30:24.289
And now 99% of what we watch is what Netflix or YouTube recommends, or we've bought thousands of dollars' worth of stuff because of ads.

00:30:24.289 --> 00:30:26.450
Like now put that in the hands of a kid.

00:30:26.450 --> 00:30:34.210
So we want to make sure kids know about AI before AI knows them, and before AI starts to tell them about itself.

00:30:34.210 --> 00:30:35.650
We need to make sure they know.

00:30:35.650 --> 00:30:37.170
And I take that stance.

00:30:37.170 --> 00:30:39.410
And I don't think it has to happen with a tablet.

00:30:39.410 --> 00:30:41.009
I think there's ways, there's books.

00:30:41.009 --> 00:30:44.930
There are people who are also getting into the screen free movement to try to help kids.

00:30:44.930 --> 00:30:48.130
And I hope more people get into this movement for AI in particular.

00:30:48.130 --> 00:30:56.289
For example, ChatGPT is going to come out with, if they haven't already, a device, which, whether I agree or disagree, we're not going to talk about that.

00:30:56.289 --> 00:31:02.130
But it's a version of ChatGPT that's just like a box you can talk into.

00:31:02.130 --> 00:31:04.289
Some other AI companies have tried it as well.

00:31:04.289 --> 00:31:09.009
I think one was called Rabbit in the past, but they're trying to come out with these things that are screen-free.

00:31:09.009 --> 00:31:11.650
There's also a necklace called Friend that kind of follows you.

00:31:11.650 --> 00:31:15.090
I don't think you should call AI a friend, um, but there are certain things I don't agree with.

00:31:15.090 --> 00:31:20.850
But there are these tools coming that are screen-free, yet they still collect a bunch of data.

00:31:20.850 --> 00:31:24.850
How can you make sure you give kids access to this without the data?

00:31:24.850 --> 00:31:28.370
And the reality is everyone is selling an AI app now.

00:31:28.370 --> 00:31:37.810
Everyone has overlaid some exciting user interface or user experience on top of mostly ChatGPT or these other LLMs.

00:31:37.810 --> 00:31:39.330
It's still ChatGPT.

00:31:39.330 --> 00:31:40.769
It wasn't built for a kid.

00:31:40.769 --> 00:31:42.450
And that's an area of a problem.

00:31:42.450 --> 00:31:55.890
If you're going to introduce kids in your classroom to AI, or if your school district is going to do it, it needs to be an AI that was built for kids, not a wrapper, meaning a tool that's just using ChatGPT with a different skin on the outside.

00:31:55.890 --> 00:31:57.009
We should never do that.

00:31:57.009 --> 00:32:00.930
Check with those companies to make sure it was built with kids in mind.

00:32:00.930 --> 00:32:04.210
Make sure it has all the certifications it needs, and use that.

00:32:04.210 --> 00:32:11.890
But ChatGPT with an education or edtech overlay on it for kids is the wrong approach.

00:32:11.890 --> 00:32:16.610
So teachers, I get it, but make sure these companies are doing the right things as they build these tools out for your kids.

00:32:16.850 --> 00:32:17.250
Absolutely.

00:32:17.250 --> 00:32:19.810
And I think that's something that you brought up a great point there.

00:32:19.810 --> 00:32:31.650
And that is something that, from the very beginning, as you know, November 30, 2022, when this started coming out, I really got into looking at terms of service and things of that sort.

00:32:31.650 --> 00:32:40.930
And as we know, a lot of the platforms do connect to ChatGPT, they connect to Claude, they connect to other large language models.

00:32:40.930 --> 00:32:57.090
And again, like you mentioned, it's just something overlaid over those models, and although those models will tell you this is not for anybody younger than 18, the wrapper itself will say, oh, it's safe for students from 11 to 18, or whatever the case is.

00:32:57.090 --> 00:33:01.090
But you're absolutely right, it is still overlaid upon ChatGPT.

00:33:01.090 --> 00:33:14.850
However, you know, I don't know how many checks really go into a new app when they can simply write, hey, I've got COPPA, FERPA, I've got all of that.

00:33:14.850 --> 00:33:19.009
And then all of a sudden schools just see those initials and they're like, oh, okay, here we go.

00:33:19.009 --> 00:33:19.970
We're good to go.

00:33:19.970 --> 00:33:23.970
And maybe the data storage isn't even in the US and it's overseas.

00:33:23.970 --> 00:33:26.130
So now where is your data going?

00:33:26.130 --> 00:33:28.930
So there's definitely a lot of things to think about there.

00:33:28.930 --> 00:33:36.130
But I love the fact that you are doing something great for that future that is here.

00:33:36.130 --> 00:33:44.610
The students that are coming in are going to be exposed to this at a lot younger age than we were.

00:33:44.610 --> 00:33:47.009
Now they're gonna be better prepared.

00:33:47.009 --> 00:33:49.250
Their parents are gonna be better prepared.

00:33:49.250 --> 00:33:59.890
And I think it is great that we are preparing them, as opposed to before, when a lot of people would just say: ban it, ban it altogether, keep it away.

00:33:59.890 --> 00:34:09.329
It's not gonna go away, but what can we do to help our students just really understand the risks and how it works and so on?

00:34:09.329 --> 00:34:14.449
So that's why I've always said I'm a cautious advocate, just kind of in the middle.

00:34:14.449 --> 00:34:23.730
And sometimes I feel like I'm leaning toward, yes, this is it; then all of a sudden you see something in the news, and I'm like, all right, I'm right back over here, you know.

00:34:23.730 --> 00:34:25.009
But but it's okay.

00:34:25.009 --> 00:34:27.090
And that's what these conversations are about.

00:34:27.090 --> 00:34:33.489
And this is what is wonderful about what you are doing, Amber, and the work that you're doing.

00:34:33.489 --> 00:34:47.170
I love the fact that, through the cards, the students are now gonna have exposure to the terminology, the language, the vocabulary, as it continues to grow and progress.

00:34:47.170 --> 00:34:50.690
And of course, the technology continues to grow and progress.

00:34:50.690 --> 00:34:52.690
They have a better understanding.

00:34:52.690 --> 00:34:58.849
And like you mentioned earlier, at that young age, now they're gonna be able to connect those dots.

00:34:58.849 --> 00:35:01.489
And it's gonna be something that is fantastic.

00:35:01.489 --> 00:35:02.610
So I love it.

00:35:02.610 --> 00:35:03.809
I love the work that you're doing.

00:35:03.809 --> 00:35:06.769
But Amber, I just kind of want to flip it on you a little bit.

00:35:06.769 --> 00:35:25.010
We're gonna change this, because I just want to ask you, as a personal reflection, and maybe I might catch you off guard with this question. The reason I ask is because from the first time you were on the show, back in March, episode 317, till now, you know, we're getting close to December.

00:35:25.010 --> 00:35:36.690
You've done a lot, and you've brought a lot of great content for young kids, bringing in adults to speak to young kids through your podcast and your book.

00:35:36.690 --> 00:35:48.930
But did you ever imagine, as a kid in the whole pre-AI era, when I was still playing with Transformers and Hot Wheels and so on...

00:35:48.930 --> 00:36:03.490
Did you ever imagine that you would be working in this space, at this intersection of data analytics, AI, and now AI literacy for kids?

00:36:03.490 --> 00:36:06.130
Did you ever imagine yourself doing that?

00:36:06.130 --> 00:36:09.970
And if not, like how do you feel about it now?

00:36:10.289 --> 00:36:18.289
So the funny thing is, I never imagined it, but when I look at it, all the skill sets I have make sense.

00:36:18.289 --> 00:36:30.690
For example, I told you that during my day job I do this work for government, but in Baltimore City in particular, which is where I live, folks.

00:36:30.690 --> 00:36:36.130
Um, in Baltimore City in particular, I've been volunteering with the National Urban League since 2012.

00:36:36.130 --> 00:36:47.730
And in that, we have this program called the Saturday Leadership Program, as well as a few other programs that focus on helping kids see all the beautiful universities within Baltimore.

00:36:47.730 --> 00:36:49.090
We have so many universities.

00:36:49.090 --> 00:36:56.610
So we take kids, over the school year, to different universities to help them see themselves there.

00:36:56.610 --> 00:37:02.369
So I've always had an interest in kids and the next generation um and always been focused on kids in that way.

00:37:02.369 --> 00:37:04.050
And I've always had a creative brain.

00:37:04.050 --> 00:37:09.809
So data is logical, but it also allows me to be creative when I'm doing things like visualizations.

00:37:09.809 --> 00:37:15.809
But I finally feel like all my skill sets can come together now that I have the AI for kids component.

00:37:15.809 --> 00:37:17.170
Did I see myself here?

00:37:17.170 --> 00:37:17.490
No.

00:37:17.490 --> 00:37:21.170
Were all the components and all the breadcrumbs there? 100%.

00:37:21.730 --> 00:37:22.369
I love it.

00:37:22.369 --> 00:37:23.490
That is fantastic.

00:37:23.490 --> 00:37:28.050
And then just one more question before we wrap up as a just a kind of personal reflection.

00:37:28.050 --> 00:37:37.250
And I've asked this question to previous guests before, but with the work that you've done, I would definitely love to hear your answer on this.

00:37:37.250 --> 00:37:46.769
So I want to ask you what keeps you up at night in a good way about AI, kids, and education, and what excites you the most?

00:37:46.769 --> 00:37:49.329
I thought you were gonna ask me a bad way.

00:37:49.409 --> 00:37:55.970
I was like, oh. So, what keeps me up at night in a good way... it's not gonna be entirely a good way.

00:37:55.970 --> 00:37:57.650
It's gonna be a good way, kinda.

00:37:57.650 --> 00:37:59.809
Um, let me let me say that.

00:37:59.809 --> 00:38:10.369
So the thing that keeps me up at night, if I'm being real about the power of that question without the good way, is that kids have different levels of access to technology.

00:38:10.369 --> 00:38:15.090
Um, my background as a kid, and both my parents were military, moved around a lot.

00:38:15.090 --> 00:38:22.210
And I saw very early on, every time we moved to a different zip code, there were very different outcomes for kids.

00:38:22.210 --> 00:38:25.490
Um, very different technology that was available in our schools.

00:38:25.490 --> 00:38:27.570
And technology can mean even a book, right?

00:38:27.570 --> 00:38:30.289
That's a form of a technology if you look at it that way.

00:38:30.289 --> 00:38:41.970
And when AI came out, what really led to that first book is that I want to make sure kids are ready for this, because of how AI can be used, which is the good part of what keeps me up at night.

00:38:41.970 --> 00:38:51.170
AI can be used to close so many gaps, to help kids explore, to help people who are already creating companies and sustaining their families using these tools.

00:38:51.170 --> 00:38:52.769
Like, there's so much good there.

00:38:52.769 --> 00:38:57.170
But the flip side is if we don't get it right, we're gonna leave a whole lot of people behind.

00:38:57.170 --> 00:38:59.809
And this could be a great um bridge.

00:38:59.809 --> 00:39:07.170
So that's what keeps me up at night: this thing can solve a lot of the problems we've had in society, and we have to do it right.

00:39:07.170 --> 00:39:09.809
And I'm really excited about the potential of that happening.

00:39:09.809 --> 00:39:11.970
And then um thank you.

00:39:11.970 --> 00:39:32.930
And then just in general, and this is one of my other taglines: I really want to make sure that all kids, no matter their zip codes, have the ability to get introduced to AI in a way that's screen-free, that protects their curiosity, their creativity, their critical thinking, and their confidence, and allows them to be the best little humans they can be.

00:39:32.930 --> 00:39:36.289
Um, I also say this because we don't want robots, we want kids.

00:39:36.289 --> 00:39:38.050
So let kids be kids, not robots.

00:39:38.050 --> 00:39:39.570
And that's so important to me.

00:39:39.570 --> 00:39:48.130
And I think that with AI, and with what I'm creating with AiDigiCards, we have a chance to do that, as well as others who are thinking about screen-free options for kids.

00:39:48.450 --> 00:39:49.090
Excellent.

00:39:49.090 --> 00:39:52.530
Well, Amber, this has been a fantastic conversation.

00:39:52.530 --> 00:39:54.690
I am so excited and so thrilled.

00:39:54.690 --> 00:40:02.769
Just again, the work that you've done up until this point, the work that you continue to do, and this mission that you are on is something that is wonderful.

00:40:02.769 --> 00:40:10.690
And I really appreciate that we have fine educators like yourself that are doing these things because it's something that is so important.

00:40:10.690 --> 00:40:22.210
And, you know, just trying to get out in front of all of this and being able to help out those parents and those young kids through your book, through your podcast, and now through AiDigiCards.

00:40:22.210 --> 00:40:25.809
So I want to ask, though, because I know that this was on Kickstarter.

00:40:25.809 --> 00:40:28.690
Uh, we will link this in the show notes as well.

00:40:28.690 --> 00:40:31.970
But I'm not sure how Kickstarter works.

00:40:31.970 --> 00:40:40.050
So do we still have time for any listeners to get in and purchase, or back your project?

00:40:40.050 --> 00:40:41.650
Tell us a little bit about that.

00:40:41.889 --> 00:40:42.210
Yes.

00:40:42.210 --> 00:40:43.730
So we're on Kickstarter.

00:40:43.730 --> 00:40:45.970
We got funded within 13 hours.

00:40:45.970 --> 00:40:48.289
So that means we're going to print.

00:40:48.289 --> 00:40:49.490
So I'm really excited about that.

00:40:49.490 --> 00:40:52.130
So we have 19 more days left in our Kickstarter.

00:40:52.130 --> 00:40:54.690
It actually ends on my birthday, December 3rd.

00:40:54.690 --> 00:40:56.530
Happy early birthday to me.

00:40:56.530 --> 00:40:58.369
Um, so make sure you check it out.

00:40:58.369 --> 00:41:03.570
But the point of the Kickstarter is to allow us to fund that first set of cards.

00:41:03.570 --> 00:41:10.530
We're now working on our stretch goal since we hit the first goal, which is to get to those um digital expansion packs, which I'm really excited about.

00:41:10.530 --> 00:41:18.369
But if you go on Kickstarter right now and you choose one of the um different levels that has a card deck in it, you'll get the card deck.

00:41:18.369 --> 00:41:23.329
There's also the ability for people who may say, Hey, I don't have kids in my life, but I want to gift it to a school.

00:41:23.329 --> 00:41:28.130
There's the ability to back a classroom, back a school, or just get a pack to send to someone.

00:41:28.130 --> 00:41:33.090
It also comes with a poster that has the ABCs of AI, which can be printed out.

00:41:33.090 --> 00:41:37.329
It comes with a cool little luggage or book bag tag.

00:41:37.329 --> 00:41:39.170
Um, each one comes with a different thing.

00:41:39.170 --> 00:41:47.250
And for everyone who backs us, you will have your name included, either via a QR code or, depending on the level, actually on a card within every single deck.

00:41:47.250 --> 00:41:50.769
So people will be able to see that you actually helped back this and made it come to life.

00:41:50.769 --> 00:41:52.210
And I'm really excited about that.

00:41:52.210 --> 00:42:02.369
So, yes, as long as it's before December 3rd when this comes out, please go to our Kickstarter and choose the level at which you want to back us.

00:42:02.369 --> 00:42:04.210
And these cards are going to print.

00:42:04.210 --> 00:42:08.210
Our hope is that we'll have them to you all by early next year.

00:42:08.450 --> 00:42:08.930
Excellent.

00:42:08.930 --> 00:42:09.889
Well, I'm excited.

00:42:09.889 --> 00:42:11.090
I know I saw the project.

00:42:11.090 --> 00:42:12.369
I was like, all right, I'm in.

00:42:12.369 --> 00:42:16.769
I definitely want to support just a wonderful, wonderful mission that you are doing.

00:42:16.769 --> 00:42:18.210
So I'm really excited about that.

00:42:18.210 --> 00:42:21.250
And I'm excited that it's gonna be autographed too as well.

00:42:21.250 --> 00:42:23.409
So I'm not gonna lie, I'm not gonna lie.

00:42:23.409 --> 00:42:25.090
I'm really excited about that too.

00:42:25.090 --> 00:42:26.210
But that's fantastic.

00:42:26.210 --> 00:42:29.329
Amber, again, thank you so much for being on the show.

00:42:29.329 --> 00:42:32.930
And again, before we wrap up, here are three lightning round questions.

00:42:32.930 --> 00:42:33.730
So here we go.

00:42:33.730 --> 00:42:38.610
Question number one: as we know, every superhero has a pain point or weakness.

00:42:38.610 --> 00:42:43.329
So for Superman, kryptonite was his weakness and I guess his pain point.

00:42:43.329 --> 00:42:52.530
So I want to ask you, Amber, in the current state of, we'll say, AI in education, what would you say is your current kryptonite?

00:42:52.849 --> 00:42:53.889
That is hard.

00:42:53.889 --> 00:43:02.050
I would say it's balancing the benefit of the technology with the downsides.

00:43:02.050 --> 00:43:07.170
And for me, it's like this whole idea of deep fakes, as I told you all earlier, my voice is on an AI.

00:43:07.170 --> 00:43:12.289
So my kryptonite right now is people using other people's voices without getting permission.

00:43:12.289 --> 00:43:17.490
I gave permission for mine, my voice is licensed, but now people are being able to take it and use it in different ways.

00:43:17.490 --> 00:43:24.130
And I think that actually takes away from it and doesn't make the AI movement as strong when things like that are happening.

00:43:24.369 --> 00:43:25.409
All right, good answer.

00:43:25.409 --> 00:43:25.730
Thank you.

00:43:25.730 --> 00:43:36.530
Question number two: if you could have a billboard, and we'll say it's right smack in the middle of Baltimore, and your billboard could have anything on it, what would it be and why?

00:43:36.769 --> 00:43:43.409
It would probably say one of the quotes I said earlier: all kids deserve access, no matter their zip code.

00:43:43.409 --> 00:43:50.450
And that's access to AI, access to math, access to cool learning labs, access to all things.

00:43:50.450 --> 00:43:53.730
Like we live in a generation where we have all this technology.

00:43:53.730 --> 00:44:00.450
If we have it, let's make sure we give kids access to do what they need to do, or at least the information they need to get it for themselves.

00:44:00.690 --> 00:44:01.170
Excellent.

00:44:01.170 --> 00:44:01.970
Great answer.

00:44:01.970 --> 00:44:09.090
And the last question, Amber, if you could trade places with a single person for a day, who would that be and why?

00:44:09.409 --> 00:44:22.050
I would say MacKenzie Scott, mainly because she is giving out so much money to so many different organizations, and they're not even having to ask for it.

00:44:22.050 --> 00:44:27.650
I just want to be in her shoes to just see like what is her thought process, how is she finding these organizations?

00:44:27.650 --> 00:44:38.769
For folks who don't know, her ex-husband is a tech exec, and she got a lot of money through their divorce.

00:44:38.769 --> 00:44:45.809
And she has been doing so much for HBCUs, women-owned businesses, minority-owned businesses.

00:44:45.809 --> 00:44:48.930
She's just cutting checks that they've never seen before.

00:44:48.930 --> 00:44:53.090
Put me in her shoes for a day just to see how she does it and how she gets there.

00:44:53.090 --> 00:45:04.450
I think it's so cool to have that type of impact, giving money out at a level that's changing lives, changing students' lives, and really helping a lot of universities as well, especially in the current funding environment.

00:45:04.690 --> 00:45:05.090
Excellent.

00:45:05.090 --> 00:45:06.369
Well, that is a great answer.

00:45:06.369 --> 00:45:10.289
So, Amber, thank you so much again for being a guest on our show.

00:45:10.289 --> 00:45:13.010
And again, it's an honor to have you here.

00:45:13.010 --> 00:45:14.769
Again, I am a big fan of your work.

00:45:14.769 --> 00:45:22.210
And to all our listeners, if you do not follow Amber yet, please make sure you check out the link in the show notes.

00:45:22.210 --> 00:45:28.450
Make sure you follow her on LinkedIn, follow her on Instagram, and subscribe to the pod, because it is fantastic.

00:45:28.450 --> 00:45:40.769
So if you are an educator or education professional listening to this and you have kids, definitely put this on during road trips and let your kids listen, because you will find some wonderful content for them.

00:45:40.769 --> 00:45:45.809
Make sure you check out the book as well, and be on the lookout for those AiDigiCards.

00:45:45.809 --> 00:45:59.730
So again, Amber is doing some wonderful things. Like she mentioned earlier, she cannot be in all the places, but she can definitely bring that knowledge, passion, and vision to all the places through all these multiple means for your students.

00:45:59.730 --> 00:46:01.650
So again, Amber, thank you so much.

00:46:01.650 --> 00:46:11.409
And to all our audience members, please make sure you visit our website, myedtech.life, where you can check out this amazing episode and the other 342 episodes.

00:46:11.409 --> 00:46:19.250
I promise you, you will find a little something there, a little knowledge, that you can sprinkle onto what you are already doing great.

00:46:19.250 --> 00:46:21.650
And again, a big shout out to our sponsors.

00:46:21.650 --> 00:46:25.809
Thank you so much, Book Creator, EduAide, Yellowdig, and Peel Back Education.

00:46:25.809 --> 00:46:31.570
It really means the world to me that you believe in our mission to bring these great conversations into our space.

00:46:31.570 --> 00:46:36.610
And again, my friends, until next time, don't forget, stay techy.

Creator of AiDigiTales and the host of the AI for Kids podcast

Amber Ivey “AI” is currently a Vice President at a non-profit, where she leads a team that helps governments drive impact. In her prior role as Senior Director for the Bloomberg Philanthropies Center for Government Excellence at Johns Hopkins University, she led a team that assisted governments in using data and performance management for decision-making. She also played a key role in the design and launch of the Bloomberg Philanthropies City Data Alliance, a program that aims to train 100 mayors and their senior leaders throughout the Americas on using data to achieve better outcomes. Formerly, she worked at The Pew Charitable Trusts, a nonprofit focused on solving today's challenges through data-driven, nonpartisan analysis. There, Amber led the data collection and organization efforts of a first-of-its-kind research study on how all 50 states and the District of Columbia use data to solve complex problems, improve the delivery of government services, manage resources, and evaluate effectiveness. Most recently, Amber led a team that provided technical and strategic assistance to states and counties that were working to streamline their business processes and launch technology, such as legal assistance websites and online courts, to modernize and improve access to the legal system.

Before joining Pew, Amber served at Maryland StateStat, a performance-measurement and management office established by former Governor Martin O'Malley (D). Following the change in administration, she helped facilitate the transition by demonstr…