March 23, 2025

Episode 317: Why Ethics Must Lead Our AI Conversations with Amber Ivey

In this powerful episode of My EdTech Life, I sit down with Amber Ivey—data expert and host of the AI for Kids podcast—to unpack one of the most urgent topics in the world of AI: ethics.

Amber shares how her journey in government data and policy led her to become a passionate advocate for AI literacy among kids. From her edutainment platform AiDigiTales to her bestselling children's book AI Meets AI, Amber is on a mission to simplify AI and empower the next generation to become creators, not just consumers.

We get real about bias in AI, how large language models fall short for diverse populations, and why representation—and ethical design—matters now more than ever. Amber doesn’t just talk tech—she breaks it down in a way that’s simple, accessible, and human-centered.

🎧 Whether you're a parent, teacher, tech leader, or cautious advocate like me, this episode will challenge and inspire you to think more critically about the tools we’re putting in front of our kids—and who’s building them.

👇 Timestamps
00:00 – Intro and shoutout to our sponsor, Book Creator
01:45 – Welcoming Amber Ivey to the show
03:00 – Amber’s journey: From government data to AI for kids
06:24 – Helping kids become AI-literate in a tech-first world
09:00 – How data work shaped Amber’s approach to simplifying AI
12:00 – Using the KISS philosophy: Keep it simple and human
15:35 – The fear-based messaging around AI adoption
17:00 – Why kids need to understand AI through their passions
20:00 – How AiDigiTales was born and the power of edutainment
23:30 – Access, equity, and designing AI literacy for all zip codes
25:50 – Breaking down LLMs and algorithms for kids
28:00 – Building podcast episodes for kids and teachers
31:30 – Keeping learners engaged with sound and storytelling
33:00 – Addressing the ethics of AI—early and often
35:00 – Real talk about bias, language, and who AI is leaving out
38:20 – Empowering kids to be at the table
41:00 – Amber’s AI kryptonite and why it’s okay to not know every tool
42:20 – The billboard message she’d share with the world
43:14 – Who Amber would trade places with for a day (you’ll love it)
45:00 – Final reflections and gratitude

💡 Key Topics Covered

  • AI literacy for kids
  • Ethics in AI development
  • Data-driven storytelling
  • Simplifying complex tech
  • EdTech equity
  • The importance of diverse creators
  • Responsible integration of AI in education

📌 Don't forget to like, subscribe, and share this episode!
Help us reach 1,000 subscribers by hitting that 🔔 and spreading the word.

🌐 For more amazing conversations, visit: https://www.myedtech.life
📲 Follow us everywhere: @myedtechlife
🎉 Stay Techie!

🙌 A Huge Thank You to Our Sponsors

Authentic engagement, inclusion, and learning across the curriculum for ALL your students. Teachers love Book Creator.

Yellowdig is transforming higher education by building online communities that drive engagement and collaboration. My EdTech Life is proud to partner with Yellowdig to amplify its mission.

See how Yellowdig can revolutionize your campus—visit Yellowdig.co today!

Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

Chapters

00:30 - Welcome and Introduction

06:00 - Amber's Journey into AI

18:15 - From Data to AI for Kids

26:41 - AI DigiTales Origin Story

38:19 - Creating the AI for Kids Podcast

41:39 - Ethics and Responsibility in AI

Transcript
WEBVTT

00:00:30.115 --> 00:00:33.777
Hello everybody and welcome to another great episode of my EdTech life.

00:00:33.777 --> 00:00:38.953
Thank you so much for joining us on this wonderful day, wherever in the world that you're joining us from.

00:00:38.953 --> 00:00:41.768
Thank you, as always, for all of your support.

00:00:41.768 --> 00:00:44.424
We appreciate all the likes, the shares, the follows.

00:00:44.424 --> 00:00:47.651
Thank you so much for interacting with our content.

00:00:47.970 --> 00:00:53.030
And a big shout out to our wonderful sponsor, Book Creator.

00:00:53.030 --> 00:00:58.886
You know I got to have my caffeine and I want to thank Book Creator for this wonderful mug and the work that they're doing.

00:00:58.886 --> 00:01:09.487
So don't forget that it is Be an Author Month, so make sure that you go sign up for Book Creator, use code My EdTech Life and you get three months of premium service.

00:01:09.487 --> 00:01:11.492
So please make sure that you check that out.

00:01:11.492 --> 00:01:13.847
And, of course, all of that will be in the show notes.

00:01:14.250 --> 00:01:16.920
But, guys, it's been a while I've been out.

00:01:16.920 --> 00:01:21.471
I went to Puerto Rico and it was amazing doing a conference there with Tech My School.

00:01:21.471 --> 00:01:47.765
But it is always awesome to be back behind the mic, loving what I'm doing as far as podcasting but, more than anything, connecting with some amazing and wonderful educators and professionals, you know, people that are out there in the space who are advocating for AI, and especially AI for kids.

00:01:47.765 --> 00:01:48.987
So I would love to welcome to the show today Amber Ivey.

00:01:48.987 --> 00:01:50.251
Amber, how are you doing today?

00:01:50.873 --> 00:01:51.594
I'm doing great.

00:01:51.594 --> 00:01:52.216
How are you?

00:01:52.879 --> 00:01:56.448
I am doing wonderful, Amber, like we were talking a little bit in the pre-show.

00:01:56.448 --> 00:02:32.641
Again, I just recently found your podcast too, and I just enjoy the format, like I was telling you, the questioning, the guests that you have had, and I just love the ABCs of AI that you've been doing, and so I find it to be a wonderful resource for parents, educators, really anybody in our space that is interested in AI, because you do such a great job at really going in deep but doing it in a fun way that is engaging, and I definitely am engaged on my drive home from work or even on my drive to work.

00:02:32.641 --> 00:02:35.110
So thank you so much for the work that you're doing there as well.

00:02:36.060 --> 00:02:37.162
So glad it resonates.

00:02:37.162 --> 00:02:42.112
I know the title is AI for Kids, but the secret is it is AI for everyone.

00:02:42.112 --> 00:02:50.980
We break it down in a simple way and we really want to make sure parents and teachers have the resources they need to help kids understand AI in this outpacing AI world.

00:02:51.782 --> 00:02:53.045
Absolutely.

00:02:53.045 --> 00:03:11.770
Well, thank you so much for being here today. For our audience members, Amber, who are not familiar with your work yet, after the show today, I hope, and I'm telling everybody and all my followers, make sure that you go over to our webpage and make sure that you go over to the podcast and subscribe, because I promise you that you will definitely find some wonderful resources that you can sprinkle onto it.

00:03:11.770 --> 00:03:12.801
You're already doing great.

00:03:12.801 --> 00:03:20.686
So, Amber, for all of those audience members who are not familiar with your work yet, can you tell us a little bit of your journey?

00:03:20.686 --> 00:03:24.372
You know, your context now within the AI space.

00:03:24.372 --> 00:03:25.534
How did that come about?

00:03:26.841 --> 00:03:29.587
So my journey did not start in the AI space.

00:03:29.587 --> 00:03:30.610
Well, technically it did.

00:03:30.610 --> 00:03:37.420
My initials are AI, so technically I was born into this world of AI, right, but no, it actually started in the data space.

00:03:37.420 --> 00:03:39.566
So my background is in data.

00:03:39.566 --> 00:04:04.575
I say I do government data, or help governments with data, by day, and I help kids understand AI by night. But my career has been in the data space for some time, so I've been helping out governments, for the most part, learn how to use data to make decisions. And now, because AI is just an extension of data, like, data is the thing that powers AI, over the past six to eight years or so I've been doing research in the space.

00:04:04.575 --> 00:04:09.718
And then also getting more active in it by lending my voice to an AI avatar.

00:04:09.718 --> 00:04:12.169
My voice is actually used by an AI avatar.

00:04:12.169 --> 00:04:14.358
That's by this organization called Create Lab.

00:04:14.358 --> 00:04:15.301
She travels the world.

00:04:15.301 --> 00:04:20.418
She speaks multiple languages, including Spanish, Russian, Japanese.

00:04:20.418 --> 00:04:25.430
I just tested these languages out with some kids, so I learned that she does speak a bunch of stuff

00:04:25.430 --> 00:04:56.521
I didn't know she speaks. But I've been active in this space mostly because AI is an extension of data, and data work is very important to me, because I believe data helps people make better decisions, particularly government, when they're helping people get access to services or improve the lives of citizens, because that is their main job. And because of that, I was able to do research in AI, led some projects that had AI within them as well, and then decided to really look at how I wanted to think about AI, and I wanted to focus on kids, because adults were a little bit scared of things.

00:04:56.521 --> 00:05:16.870
And when ChatGPT came out in 2022, I was already the voice of the avatar I told you about, so I had already interacted with ChatGPT, or GPT, through that, because the company had access to OpenAI's research tools at the time.

00:05:16.870 --> 00:05:17.812
We were testing it out.

00:05:18.240 --> 00:05:20.963
So when it came out, everyone was like oh, ban it, do this.

00:05:20.963 --> 00:05:22.326
So what is this thing going on?

00:05:22.326 --> 00:05:26.572
I was like, wait, I've been playing with this thing, and the voice of this thing, for some time now.

00:05:26.572 --> 00:05:28.555
It's not as scary as it seems.

00:05:28.555 --> 00:05:31.187
Yes, we should be cautious, but we should think about what that looks like.

00:05:31.288 --> 00:05:37.661
And then that's when I decided to switch over to kids, because I was podcasting about AI for adults and helping them understand it.

00:05:37.661 --> 00:05:40.204
But as we get older, we get a little bit nervous.

00:05:40.204 --> 00:05:44.670
We're not interested in the new things that are coming up often, and me I get it.

00:05:44.670 --> 00:05:47.375
I don't like change unless it's a change I want to have happen.

00:05:47.375 --> 00:06:02.185
So what would it mean then to focus on the group that's already growing up in the age of AI? Someone said this to me, and I think it's a good point: kids that are born today, their first interaction with the internet is AI, through Alexa, Google and all these different tools.

00:06:02.185 --> 00:06:16.692
And that was the moment where my mind really opened up and said: if their first interaction is with AI and they're the AI-first generation, how then do we make sure they're prepared in 10, 15 years, when they're going to be in the workforce and all these tools are going to be integrated?

00:06:16.692 --> 00:06:23.387
So that's a little bit of, like, how I got here, starting with, excuse me, data, and then moving over to AI.

00:06:24.269 --> 00:06:30.021
That is wonderful and you know, actually it kind of seems like just really a natural progression, like you said.

00:06:30.021 --> 00:06:37.048
You know, the type of work that you were doing and then, of course, moving into the AI space. And, you know, I agree with what you said.

00:06:37.048 --> 00:07:06.723
It's just also that fear of change, especially in adults. In November 2022, when this whole thing came out, from then on, with my podcast, I've been doing interviews and it's just heavy conversations on AI. And it's just great, because right now I'm researching those conversations, doing, like, first quarter, second quarter, third quarter, fourth quarter, and just kind of seeing how the themes have changed, how it went from panic, like, no, we don't want this, to slowly, that slow acceptance, and just kind of seeing it.

00:07:06.723 --> 00:07:21.050
And then, of course, now, at least in the education space, it's really finding those main players within the education space saying, okay, now how can we really, ideally, integrate this technology into the classroom?

00:07:21.050 --> 00:07:27.264
Because, as you said, you know, many times walking around shopping and things of that sort,

00:07:27.264 --> 00:07:37.322
you have students that have a device, students that you can actually hear just control the device by saying, hey, Alexa, or, you know, hey, Siri, and things of that sort.

00:07:37.322 --> 00:07:40.050
You know this is the world that they are growing up in.

00:07:40.250 --> 00:08:19.869
But, at the same time, what I do love that you said is just, you know, moving your energy into the space for students and for kids, to be able to explain this technology to them. And I think that's fantastic: how the technology works, how you as a human can guide it, as far as, you know, if there's something that you're looking for, specifically the context and things of that sort, and to be able to, you know, even look at that output, and make sure that we educate them so that they're looking for outputs that are accurate as well.

00:08:19.968 --> 00:08:43.054
And so the one thing that I do love about your podcast is just those little gems, and the way you speak with the amazing guests that you've had. And one that I'll mention, like I always say, I just gave a shout out pre-recording and now I'll give a shout out here in the recording, is Dr. Nika McGee, because she is fantastic, and she and I have had the opportunity to present, and she has an awesome session on AI for Littles.

00:08:43.275 --> 00:08:49.299
That is, like, unplugged, and it is fantastic even for adults to be able to understand those things.

00:08:49.299 --> 00:09:01.006
So I love the work that you both are doing, kind of in the same space and adjacent to one another, and you through the podcast, and I know we'll talk a little bit about AI DigiTales too.

00:09:01.006 --> 00:09:22.844
But I want to ask you, you know, just to deep dive into it: you mentioned, you know, working with data to drive efficiency and decision-making and all of those things. And I know you talked a little bit more about that, but really targeting kids, how have those experiences influenced your approach to integrating AI in education?

00:09:23.164 --> 00:09:27.500
Oh yeah, so my first job out of my master's program...

00:09:27.500 --> 00:09:34.046
So originally I was in the private sector doing logistics, which is, like, heavy data, heavily focused on efficiencies.

00:09:34.046 --> 00:09:38.745
And I went to school to get my master's with the goal of helping government be more efficient and work better.

00:09:38.745 --> 00:09:57.269
Didn't know how, didn't know the way I was going to do it, but I took a performance management class and one of the professors told me about this program called StateStat at the time, which was a data program in the state of Maryland that was helping the governor at the time, former Governor O'Malley, use data to drive outcomes and get towards his goals.

00:09:57.269 --> 00:10:14.355
He had goals around decreasing infant mortality, maternal mortality, decreasing the carbon emissions that were going into the atmosphere, helping to reduce violence that was happening in the state.

00:10:14.355 --> 00:10:25.371
He had a lot of different goals he was focused on and those goals all had data attached to them, so we were able to go in and help analyze those data to help the leaders of the state understand what was happening.

00:10:25.840 --> 00:10:30.351
What I learned there applies to a lot of leaders in the state, or anywhere I work with government.

00:10:30.351 --> 00:10:34.630
People come into government because they want to do a good job and make the world a better place.

00:10:34.630 --> 00:10:46.072
They may not have a data analyst background or be able to look at numbers in a different way, so it was our job to make sure we put in front of them what the numbers said in an easy way so they can make a decision.

00:10:46.072 --> 00:11:00.027
Why I bring that up is often, I think, across any career field we're in, we get very technical or we make it so hard to understand and we try to make things complicated when the reality is it's best to keep things simple.

00:11:00.027 --> 00:11:13.070
Very, very early on in my government career, I learned how to take complex concepts, data, images or whatever it was that was happening, and explain it to someone, no matter their background, in a way that they got immediately and could make a decision off of.

00:11:13.070 --> 00:11:15.424
How did that translate to AI for kids?

00:11:15.445 --> 00:11:15.907
Go ahead.

00:11:15.907 --> 00:11:16.750
Oh no, no, no.

00:11:16.750 --> 00:11:18.325
I am so sorry to interrupt there.

00:11:18.325 --> 00:11:19.470
Continue, continue.

00:11:20.139 --> 00:11:29.940
And then, as it relates to AI for kids: because I've had the background in doing that, and, not to mention, my dad also was in the Army, and they have a slogan called keep it simple, with a bad word at the end.

00:11:29.940 --> 00:11:38.715
But keep it simple is the part I keep, because when you're like deploying missions or doing work, you have to keep it simple so that you don't get someone injured or you're able to do what you need to do.

00:11:38.715 --> 00:11:40.827
So that's my model for life in general.

00:11:40.827 --> 00:11:49.315
Like how can I explain things in a way that's super simple for someone to understand so then they can take action, because I'm not here for like just sharing, and then we're like doing nothing with it.

00:11:49.315 --> 00:11:53.741
Like, how can you actually help someone do something with the information they receive?

00:11:54.102 --> 00:11:57.091
So with AI for Kids, it was an easier transition.

00:11:57.091 --> 00:12:00.585
I was doing the same thing for AI for Adults and breaking down tough topics.

00:12:00.585 --> 00:12:10.402
But it was also an easy transition for me to say to kids, parents and teachers this is this weird thing called neural networks or this is this thing called large language models.

00:12:10.402 --> 00:12:26.070
Let's break it down in the simplest way so anyone can understand, and then we take away the intimidation, we take away the fear around it, and we allow someone to make a decision, because they actually have the same information that you and I have, who may be in the space and living and breathing it every day.

00:12:27.172 --> 00:12:27.692
I love it.

00:12:27.692 --> 00:12:46.462
You know, a lot of the things that you said really resonate with me and, believe it or not, the way that you're speaking, it's like you're speaking my language, and it really fills my heart. Because it's very interesting: even I myself came from the private sector and then I came into education, and it's very weird because I kind of see things in a different way, through a different lens.

00:12:46.462 --> 00:13:00.210
But I'm also the very same way, the way that you explained it: I love to get information and dissect that information and present it to my audience, whether it's teachers, whether it's professionals and so on, in a specific way that works for them.

00:13:00.210 --> 00:13:12.571
And I also follow the KISS philosophy, except that in my KISS philosophy, I say it's keep it simple and streamlined, and so I've done a lot of presentations like that.

00:13:12.571 --> 00:13:18.023
But I love that hook where it's like, ah, guys, come on, we just got to do the KISS philosophy, and everybody's like, oh my gosh.

00:13:18.023 --> 00:13:34.261
And I said, no, keep it simple and streamlined, which is really what you're doing, in that sense of taking all of this, I mean, with so much that has come out since 2022, and being able to really share that information in a wonderful way, like I mentioned.

00:13:34.422 --> 00:13:51.096
You know, listening to your podcast, the way that you make it so easy to understand, and you're talking to some very high-level guests as well, that have wonderful experiences and wonderful backgrounds in coding, in large language models, in presenting.

00:13:51.096 --> 00:14:02.129
But the way that you question and draw out those answers, you do it in such a way where I'm like, oh, okay, so that's what that means and oh, I can see that connection and so on.

00:14:02.129 --> 00:14:21.190
So I absolutely love that, and your heart for that is in the right place, because we definitely need to make sure that we simplify all of this information to really take away a lot of those barriers to entry, like you mentioned, obviously number one being the fear.

00:14:21.190 --> 00:14:39.965
This is why Dr. Nika McGee and I really get along: we just consider ourselves cautious advocates, where we're kind of right in the middle. Like, I'm not too far into, you know, yes, let's go all in, and I'm not too far into, like, no, no, no. Kind of just there in the middle, just trying to bring together and reconcile both worlds.

00:14:39.965 --> 00:14:56.011
And that's why I love having these conversations with amazing guests like you, to be able to see both sides and just have people kind of take, you know, the good that is on this side, the good that is on this side, and just kind of make it their own and see where they are in their adoption phase.

00:14:56.179 --> 00:15:00.772
Because I always say there's, you know, those speedboats, those tugboats and then those anchors.

00:15:00.772 --> 00:15:07.261
So where is it that we fall in, and how can we just continue to move forward at any pace?

00:15:07.261 --> 00:15:23.386
You know, I don't know if you find this in your space, but sometimes there's those people that really put in that fear, saying if you're not doing this, you're ruining education, and if you're not doing this, you're ruining the kids' future, like you should be on this, you know, all the time.

00:15:23.386 --> 00:15:29.335
And they just really, you know, rile people up, just working on them with a little bit of that fear.

00:15:29.335 --> 00:15:30.602
What are your thoughts on that?

00:15:30.602 --> 00:15:33.770
Like, what has been your experience with that?

00:15:35.174 --> 00:15:38.162
I feel like, with every movement that happens,

00:15:38.162 --> 00:15:56.831
we have that type of voice coming through the void, saying you're going to miss this. And I used to believe it for a very long time, and then I realized if I miss this, there's another thing coming the next day, and then there's another thing coming the next year, another thing coming the next 10 years.

00:15:56.831 --> 00:16:00.789
Regardless of what it is. Like, there are skill sets that I do want kids to learn.

00:16:00.789 --> 00:16:03.663
I think that's important for all of us to be AI literate.

00:16:03.663 --> 00:16:09.452
But do I believe that, if a kid doesn't fully understand AI, they will not make it in society?

00:16:09.452 --> 00:16:28.174
I don't believe that, because I also know that people who are investing in AI are also investing in outdoor sports and outdoor activities that have nothing to do with a computer, because they don't know what's going to happen with AI and they don't know if people are going to be so tired of being integrated into computers that they're going to go then want to do more things that are away from a computer.

00:16:28.174 --> 00:16:31.038
So we all don't know 100%.

00:16:31.038 --> 00:16:49.754
Either way, I tell kids: think about how AI, STEM, whatever the topic du jour is, can integrate into things you want to do and enhance the thing you want to do. Like, I was just speaking to some kids last Friday for Pi Day, and I always ask: do you have a career that STEM or AI will not touch?

00:16:49.754 --> 00:16:52.048
And of course, I have kids raising their hand left and right.

00:16:52.779 --> 00:16:57.438
Some folks are like I want to be a cook, I want to be a soccer player, I want to be an artist.

00:16:57.438 --> 00:17:08.188
And then I have the other kids tell them why AI and STEM are still going to be a part of that career, and what you find is that I don't have to tell them.

00:17:08.188 --> 00:17:19.347
The kids around them, even the ones who raised their hands and said their career would not be impacted, are able to tell the other young person like hey, no, actually it's used in the shoes that the soccer player is wearing.

00:17:19.347 --> 00:17:23.643
And if you want to be a really good soccer player, you're going to want to think about math and angles and kicks.

00:17:23.643 --> 00:17:31.192
Or if you're a great cook, you want to think about measuring the perfect recipe or et cetera, or using YouTube or whatever to share your cooking videos.

00:17:31.794 --> 00:17:36.227
So I say that because kids and the generation that are currently born are going to figure it out.

00:17:36.227 --> 00:17:42.435
Most of the careers that are here today, we couldn't have told kids about.

00:17:42.435 --> 00:18:02.165
There's no way we could have told kids that YouTube influencers would be a career that kids are talking about, or being on TikTok, or dancing and choreographed dance, and all these things that are now making people a livable wage, right? Not in the million-dollar version, but the people who are using it to make an income on the side, who had never known that was a job, the Ubers of the world, all these things.

00:18:02.165 --> 00:18:50.010
So I caution against people saying if you don't do this, because I know that humans are innovative and creative and, when pressured, they're going to figure it out and figure out a way. [No transcript available for part of this segment.] ...are going to revolutionize education and test scores were going to go up, and I still haven't seen a lot of that yet.

00:18:50.432 --> 00:18:59.986
Then Chromebooks came about, and, you know, that new technology, everybody needs to be one-to-one and things of that sort, but we still haven't seen a lot of those results yet as far as education.

00:18:59.986 --> 00:19:09.050
And so the next thing will come and, like you mentioned, you know, Dr. McGee's already talking about quantum computing, and I know you've recently had those conversations on the show too.

00:19:09.050 --> 00:19:12.015
So then, like you mentioned, it's like you think that this is it.

00:19:12.015 --> 00:19:17.383
I mean, just wait till what's coming out, either later this year or next year and so on, and so.

00:19:17.803 --> 00:19:36.839
But I do love the fact that you address the skill set that is important for a lot of young men and women to have at such an early age, to really grow into just the critical thinking skills, the collaboration, the communication, the problem solving, troubleshooting, all of those things.

00:19:36.839 --> 00:19:44.056
I think, and I always thought, this is what I've already been doing in my classroom when doing my lessons and doing my projects and so on.

00:19:44.056 --> 00:19:50.758
So oftentimes I just feel they take it to that whole other level of, no, you're really missing out.

00:19:50.758 --> 00:19:53.372
But thank you so much for that reply.

00:19:53.372 --> 00:19:54.555
I really loved it.

00:19:54.555 --> 00:19:57.903
Now I want to ask you about AI DigiTales.

00:19:57.903 --> 00:20:11.530
So I want to ask you: what inspired you to create AI DigiTales, and how did you come up with that idea of engaging young learners and educators to understand and interact with AI in that manner?

00:20:12.471 --> 00:20:19.163
So when I decided to switch to AI for kids, it was just one little idea.

00:20:19.163 --> 00:20:30.142
So I was playing with Midjourney one day and I was doing a prompt that basically said: have a little girl in Baltimore meeting a robot in Baltimore City that's lost, and played around with it.

00:20:30.142 --> 00:20:31.733
Had a whole bunch of images in one image.

00:20:31.733 --> 00:20:32.517
I was just looking at it.

00:20:32.517 --> 00:20:36.098
I'm like this could be a book, like let's actually make this into a book.

00:20:36.098 --> 00:20:38.459
So that became the AI Meets AI book.

00:20:38.459 --> 00:20:43.519
It became a bestseller on Amazon and then I was like wait, people actually want to hear about this.

00:20:43.519 --> 00:20:45.435
What else is this thing?

00:20:45.435 --> 00:20:54.061
And then that's where AI DigiTales came from, and the reason why it's called AI DigiTales is it's like a play on digital.

00:20:55.807 --> 00:21:05.319
And also understanding that, because it started with a book, my background and the underlying pieces of it are storytelling and like edutainment is what the company is focused on.

00:21:05.319 --> 00:21:19.907
Because I know, for me, Sesame Street, Schoolhouse Rock, Barney for my younger siblings that I had to listen to because they were in the room, all those songs I still remember to this day, and all those tips about how a bill is developed, through Schoolhouse Rock or whatever it is.

00:21:19.907 --> 00:21:26.071
I remember because there was like an edutainment factor, whether it was song, storytelling or helping me think about that.

00:21:26.071 --> 00:21:42.795
So my thing is that AI DigiTales is the first part of a kid's journey, and we're planting a seed, or even getting the ground ready for a seed, and my hope is that a kid thinks about what they've learned later. Like, I had a four-year-old's mom reach out and say their kid was talking about algorithms.

00:21:42.795 --> 00:21:49.238
One, algorithms is not a four-year-old word, but we taught them that in the A is for Algorithms episode, and now they're using that word.

00:21:49.238 --> 00:21:57.820
She may not even know what that means, even after listening to the podcast.

00:21:57.820 --> 00:22:09.073
But what she will know at 10, 15, and 20 is: oh, I learned the basics of that, and then she'll understand how to apply that in life, and that's how life works.

00:22:09.093 --> 00:22:11.057
Someone just talked to me about this recently, Dipti Bidet at Little Lit AI.

00:22:11.057 --> 00:22:15.115
She was talking about how learning about AI or learning in general is a whole framework for the child.

00:22:15.115 --> 00:22:21.102
So it's not just in the classroom, it's not just at home, it's not just walking through your neighborhood.

00:22:21.102 --> 00:22:23.457
You're learning across so many different mechanisms.

00:22:23.858 --> 00:22:38.877
So my goal is for AI DigiTales to be an AI-for-kids entertainment company that gives them a piece that then adds on to their greater learning, and it encourages teachers, I know some teachers play it in their classrooms, and it allows teachers to use it who may not have access to AI technology.

00:22:39.420 --> 00:22:43.699
And the other piece is that we want to make sure that kids, no matter their zip code, get access to this.

00:22:43.699 --> 00:22:51.653
So most of the stuff we talk about, even if you listen to, like the AI for Kids ABC series, a lot of the things you do, you're not doing with a computer.

00:22:51.653 --> 00:23:06.863
You're doing it with pieces of paper, with your friends nearby; you're using what you have to understand the concept, because it's not missed on me that a lot of kids don't have access to technology, and, coming from a data and government background, I know communities

00:23:06.863 --> 00:23:14.758
that are still on dial-up, and people are shocked every time I say that. Dial-up, not broadband. And some do not have access to the internet and don't have access to cell service.

00:23:14.758 --> 00:23:23.558
So for kids who are able to hear this stuff at school or other places, they can then go home and do an activity where they may not have a computer or access to high-speed internet.

00:23:23.558 --> 00:23:29.241
I want to make sure kids are able to do it, no matter what, and I'm just a piece of that puzzle, a piece of that story.

00:23:29.970 --> 00:23:39.734
I love it. And, if you don't mind, what I'm going to do is go ahead and share my screen, so bring this on, just for our audience members that are going to be checking out the video.

00:23:39.734 --> 00:23:44.314
So here it is, AI DigiTales, and I love this, the layout.

00:23:44.314 --> 00:23:47.083
Amber has videos.

00:23:47.083 --> 00:23:53.661
There are so many great things: of course, all the resources here, the books, but this is the one thing that I love right here, the AI DigiTods.

00:23:55.766 --> 00:24:27.835
This is wonderful, and so I think, Amber, you really thought about this, just the way that you have these AI DigiTods characters here and, of course, the representation that's there too. And so this is something that is fantastic, that is engaging, and the one thing that I love is that, like you mentioned, you've got resources here that can be played either at school or at home, and, of course, the student is going to be learning the ABCs of AI.

00:24:27.835 --> 00:24:30.711
And, of course, guys, this will all be linked in the show notes.

00:24:30.711 --> 00:24:35.461
You can definitely check out the website, and I do want to talk about the podcast now.

00:24:38.970 --> 00:24:46.589
So, as I mentioned to you, as a listener, and a fairly new listener, I've listened to about six episodes now, and I'm going to continue listening to them.

00:24:46.630 --> 00:24:52.611
But one thing that I loved about that is, you know, just the way that you take all of this,

00:24:52.611 --> 00:24:58.343
you know, I guess, information about AI and break it down.

00:24:58.343 --> 00:25:02.619
Like we talked a little bit about this, simplifying it for a child.

00:25:02.619 --> 00:25:08.599
Can you, Amber, walk me through that process of how you take, you know, large language models,

00:25:08.599 --> 00:25:15.842
or, you know, algorithms, and what's your process in breaking it down to explain it to a child?

00:25:21.529 --> 00:25:23.417
So because of my background, I've had to break topics down my whole entire life.

00:25:23.417 --> 00:25:25.405
So it's helped, because of the way I've grown up.

00:25:25.405 --> 00:25:36.714
And also, another thing I do want to bring up about the AI DigiTods, which matters here, is that all those AI DigiTods are representative of kids I've met throughout my history and my childhood.

00:25:36.714 --> 00:25:42.680
My dad was in the Army and my mom was in the Navy when they met.

00:25:42.680 --> 00:25:50.279
My dad stayed on in the Army and my mom got out of the Navy, so we traveled around a lot, so I was able to meet a bunch of different kids from different backgrounds.

00:25:50.279 --> 00:25:56.583
So for me, my world was always full of kids of different backgrounds and I didn't know anything but that.

00:25:56.583 --> 00:26:11.586
And because of that, and moving around a lot, I had to learn how to adjust to where I was and to be able to fit into whatever cultural context I was in, whatever was happening within my environment, the different types of people I was interacting with.

00:26:11.586 --> 00:26:16.278
I became a master adjuster, so I'll say that that has helped me out a lot.

00:26:16.278 --> 00:26:25.564
So now, when it comes to AI for kids, one thing I try to make sure of is, like, the age level.

00:26:25.564 --> 00:26:37.981
So originally I wasn't sure what age level; I knew I wanted to do, like, elementary and middle school, because a lot of resources do focus on high school, particularly around coding, and there's nothing for, like, elementary kids, then middle school.

00:26:37.981 --> 00:26:41.314
So I said I wanted my target audience to be four to 12.

00:26:41.314 --> 00:26:55.259
I talked to a bunch of kids about this to try to figure out what makes sense for them, and I also get to test this stuff out in person or virtually when I'm doing AI for kids workshops, which has been helpful in that they're like my case study.

00:26:55.279 --> 00:27:03.875
For that. And I also use AI to help clean up my scripts.

00:27:03.894 --> 00:27:05.098
If I'm saying things in a way that's too complicated,

00:27:05.098 --> 00:27:12.942
I'm like, make this more simple, or help me say this in a way that a kid can understand, because I don't want to assume that I have done it right for that four-year-old.

00:27:12.942 --> 00:27:24.626
For the elementary school episodes and for the middle schoolers. Middle schoolers I'm a little bit better with, because I interact with them a little bit more and they're a little bit higher as it relates to where they are with understanding concepts.

00:27:24.626 --> 00:27:37.663
But elementary is hard, so I will ask AI to help me think through how I say this in a way that's helpful, and it does help me create something that allows me to explain it. And AI also is not perfect, right?

00:27:37.663 --> 00:27:40.619
So I start with my script, use AI, and then I'm like, I don't like that.

00:27:40.619 --> 00:27:50.654
I would explain it like this, and then I make the edits to that, and I'm able to drive it through, using AI to help me explain it, particularly for my elementary school kids.
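
(A quick aside for readers who want to try the workflow Amber describes: below is a minimal sketch of that "make this simpler" step, assuming the OpenAI Python SDK. The model name, prompt wording, and helper function are illustrative, not her actual setup, and any chat-capable model would work the same way.)

    # Sketch of the "simplify my script" step described above.
    # Assumes the OpenAI Python SDK (pip install openai) and an
    # OPENAI_API_KEY in the environment; model and prompt are illustrative.
    from openai import OpenAI

    client = OpenAI()

    def simplify_for_kids(script: str, age: int = 6) -> str:
        """Ask the model to rewrite `script` for a child of the given age."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system",
                 "content": f"Rewrite the user's text so a {age}-year-old can "
                            "understand it. Keep it short, friendly, and accurate."},
                {"role": "user", "content": script},
            ],
        )
        return response.choices[0].message.content

    draft = "A large language model predicts the next word using patterns in data."
    print(simplify_for_kids(draft, age=6))
    # The output is only a first draft: as Amber says, you still edit it yourself.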

00:27:51.156 --> 00:27:51.637
Love it.

00:27:51.637 --> 00:28:23.349
[No transcript available for this segment.]

00:28:23.349 --> 00:28:25.480
How did that inspiration come about?

00:28:25.480 --> 00:28:26.644
When did you say you know what?

00:28:26.644 --> 00:28:34.811
I'm going to go ahead and start a podcast and, you know, just continue the work through that? So I started originally with my podcast.

00:28:35.093 --> 00:28:46.420
The name was horrible, but I'll say it: it was called AI Decodes the System, and my whole thing was, like, Amber breaking down the system. Which wasn't simple, as I tell everyone to be simple, the KISS method.

00:28:46.420 --> 00:28:52.742
It was not, but it was me breaking down different things about AI and data technology in the space that I live in.

00:28:52.742 --> 00:28:56.035
So I was doing that for some time and then fast forward.

00:28:56.035 --> 00:29:00.340
Then things happened, with ChatGPT coming out back in 2022.

00:29:00.340 --> 00:29:03.853
And I still kept podcasting.

00:29:03.853 --> 00:29:08.278
But then I also realized that there was that disconnect that I was saying I felt like I needed to focus on kids.

00:29:08.278 --> 00:29:11.691
So last May I was like I just need to transition fully to kids.

00:29:11.691 --> 00:29:14.196
And that was after the book came out.

00:29:14.196 --> 00:29:22.060
I had a few books after that, and I decided that it was time for me to move on to kids. Because it's kid-focused,

00:29:22.060 --> 00:29:37.160
I was like, all right, I got to put on a different hat: again talking to young people to ask what matters to you, listening to other kid-focused podcasts, and also knowing, at the same time, that while I wanted to be focused on kids, a parent or teacher would be listening in the background or using it to try to get their kids to listen.

00:29:37.160 --> 00:29:45.714
I wanted to make it age appropriate for young people, but also not to the point where adults would be like I don't want to listen to this, this is like too kiddish or whatever.

00:29:45.714 --> 00:29:55.482
So I was trying to strike a helpful balance, and I always tell people before we interview, I'm like, there are going to be sound effects, I apologize in advance if you don't like where they are, and things like that.

00:29:55.542 --> 00:30:04.397
But the goal was, because kids, one, already have a short attention span, how can we keep them involved and remind them to listen again?

00:30:04.397 --> 00:30:08.775
When we're doing episodes for elementary school kids, five minutes, that's a long time.

00:30:08.775 --> 00:30:10.181
So what does that mean?

00:30:10.181 --> 00:30:10.663
To make sure?

00:30:10.663 --> 00:30:11.226
So, that one,

00:30:11.306 --> 00:30:13.813
I wanted to have music playing throughout the whole episode.

00:30:13.813 --> 00:30:16.799
So there is music throughout the whole episode and then there's also sound effects.

00:30:16.799 --> 00:30:27.780
I understand that middle and high schoolers, which is the longer-form content, may not want to hear music the whole time, but they also need that, like, reminder of, hey, listen in, or they may have zoned out.

00:30:27.780 --> 00:30:39.556
How many of us listen to podcasts and then we zone out and then we pull back in? So I was trying to make sure there were sound effects in it that pulled a person back in, to get them to listen or refocus and then possibly rewind if they missed it.

00:30:39.556 --> 00:30:49.755
So that was all intentional, understanding my audience was no longer only adults, but also wanting to make sure adults didn't feel like it was too much. And I'm glad to hear from you it doesn't feel like it's too much.

00:30:50.470 --> 00:30:51.354
No, not at all.

00:30:51.354 --> 00:30:52.852
Actually, it's really like you mentioned.

00:30:53.773 --> 00:30:58.162
It's really engaging, because you know, you're driving, you're listening, and then like you're absolutely right.

00:30:58.162 --> 00:31:04.171
I mean, sometimes you get out of work and you're trying to decompress, so you've got something going on, and then all of a sudden, it's like you hear like ding.

00:31:04.171 --> 00:31:19.878
You're like, oh, okay, you know, and then you're back and interacting with it. And that's the one thing that I do love, and I'm going to go ahead and bring this up onto the screen.

00:31:20.240 --> 00:31:21.811
So this is the podcast.

00:31:21.811 --> 00:31:24.538
We'll definitely be linking this up in the show notes.

00:31:24.538 --> 00:31:31.711
It's AI for Kids, and you can find this on Buzzsprout, and I know that you can listen to it on your favorite podcast player as well.

00:31:31.711 --> 00:31:40.858
But what I do want to do, actually, is share with our audience just some of the titles that you have here and, of course, some of the recent guests.

00:31:40.858 --> 00:31:54.002
But I love this, like where you have R is for Reinforcement, and then you go through the alphabet, like there's N is for Neuro Learning, and then Q is for Quantum Computing, and I love the way that you label this.

00:31:54.002 --> 00:31:55.432
You know, this is for elementary.

00:31:55.432 --> 00:32:08.992
This is for middle school, you know, and so I think that this is a fantastic resource for teachers to be able to have in their classrooms when they have a little bit of extra time or they can do a little mini assignment.

00:32:09.012 --> 00:32:20.837
This could be a station just to even just go through the alphabet and understand this a little bit more, tying it into digital literacy and you know things of that sort, and this is great.

00:32:20.837 --> 00:32:32.512
I mean, to me it's weird, because this is, to me, as interactive of a podcast as you can find, you know, even through Buzzsprout, because of the sounds, you know.

00:32:32.512 --> 00:32:38.297
It really engages you and really captures your attention, making sure that you're not missing those important bits of information.

00:32:38.297 --> 00:32:45.821
So I think that's great, and I also love your format, where you have the little game show, questions for your guests and so on.

00:32:45.821 --> 00:32:54.717
So, guys, please, I promise you you will not be disappointed Make sure that you check out AI for kids, and I promise you it's not just for kids.

00:32:54.717 --> 00:33:15.034
You yourself, as an adult, as an educator, as a professional, will definitely be engaged with the wonderful conversations that you have there, and just the format is fantastic, and I think Amber really knocked it out of the park, as far as really going back and digging in deep, to that KISS philosophy, you know, and just really putting some great stuff out there.

00:33:15.034 --> 00:33:16.651
So, Amber, thank you for that.

00:33:17.073 --> 00:33:24.538
So just a couple more questions, Amber, and then we'll go to the final three questions that I always ask our guests as we get ready for closing.

00:33:24.538 --> 00:33:33.576
But I want to ask you here, and I always got to bring it up, you know, because it's part of my research as well, about navigating the ethics of all of this.

00:33:33.576 --> 00:33:39.618
So, of course, as you know, as it continues to move forward, we're gonna get new large language models.

00:33:39.618 --> 00:33:43.675
You know we had Manus, we had DeepSeek and we have many, many more.

00:33:43.675 --> 00:33:52.690
So I wanna ask you, as far as, you know, the ethics of AI, it's really pressing that we do have these conversations.

00:33:52.690 --> 00:34:02.059
So how do you address the balance between that innovation and responsibility in your work, through AI DigiTales and through your podcast?

00:34:02.990 --> 00:34:14.264
So one thing that's beyond important for me and I'm glad this is something that I know you care about as well is that ethics has to be at the forefront of this conversation.

00:34:14.264 --> 00:34:21.628
To the point, even in my podcast I added a pre-show note that's basically like, hey kids, AI is great, but also it can cause harm.

00:34:21.628 --> 00:34:38.019
Because I don't want to be responsible, as someone who's helping kids understand AI, for not leading with that up front every single time. And even in some of the younger kids' episodes, I make sure I say, as always, there are also bad things associated with this, and this is what you should look out for.

00:34:38.019 --> 00:35:03.418
It's not to make them afraid of it, but to make them understand that this is a not-necessarily-new technology, it's been around for a while, since the 50s, in application, right, but it's now new in a way that we have access to it, and it's in our hands, in our homes, and places where we live, we go to school, we walk, we drive. Like, it's literally everywhere, and embedded in a way that we now see what's behind the scenes.

00:35:03.418 --> 00:35:12.588
So, because we can now see it and we as human beings are now allowed to interact with it, we are then held responsible for what we do with it and what happens with this technology.

00:35:12.588 --> 00:35:20.347
So for me I also say to kids it's like I don't want you to just be users of AI, like, if that's what you want to do, that's also totally fine.

00:35:20.347 --> 00:35:30.275
But I want them to understand they can be the next creators of it, and being the next creator of it means that we may have an ethical responsibility to take care of our fellow man who are using these tools.

00:35:31.257 --> 00:35:41.431
Often people who are making these tools are not at the intersection of different spaces and different communities, different cultures, different data sets, different backgrounds, where they can see everything.

00:35:41.431 --> 00:35:45.900
I, as a human being, would be wrong to say I know everything about every community or everything I should think about.

00:35:45.900 --> 00:35:56.070
And tools developed by one type of person, or one person from one background, or me as an individual?

00:35:56.070 --> 00:36:00.681
If I'm the only person there developing this, I have no view outside of myself.

00:36:00.681 --> 00:36:11.021
That's probably not the best way to build anything. We want to build for the person who has the least access, so that we all have access and we make sure that everyone's protected.

00:36:11.614 --> 00:36:13.342
Too many times, tools have gone out.

00:36:13.342 --> 00:36:15.336
Social media is a big one, right.

00:36:19.826 --> 00:36:28.018
Too many tools, phones or things have gone out, like social media, and then we just let it run, and then we're like, oh my gosh, bullying is up, suicide is up, depression is up.

00:36:28.018 --> 00:36:40.088
In kids, they're now dealing with imposter syndrome, because we as a society didn't decide to take ownership of the shared resource that we're all using and make sure our most vulnerable, mostly kids and those who cannot protect themselves, are protected.

00:36:40.088 --> 00:36:43.059
So I talk about AI always with AI ethics.

00:36:43.059 --> 00:36:46.186
If we do not do this right, we will continue to widen the gap.

00:36:46.186 --> 00:36:48.902
The gap is already widening as we speak.

00:36:49.324 --> 00:36:57.016
Different AI tools, as we've seen, have had adverse impacts for people who look closer to you and me than those who do not look like us.

00:36:57.016 --> 00:36:58.961
For women, what does that look like?

00:36:58.961 --> 00:37:01.835
For people who speak different languages, what does that look like?

00:37:01.835 --> 00:37:08.516
The AI large language models were really built off of Western data sets, so what does that mean?

00:37:08.516 --> 00:37:14.820
That means they may think a little bit differently than I would think on a day-to-day basis, and we got to make sure that,

00:37:14.820 --> 00:37:24.496
if that is happening, we want to know about it, we want to make a decision on how we're going to improve it, and that the kids who are growing up have an opportunity to say, because they're really good at it,

00:37:24.878 --> 00:37:28.567
That's not right, or this should be done another way, or you should think about this.

00:37:29.414 --> 00:37:38.809
Kids, young people, need to be at the table because as we get older, we often forget about all those things that we were passionate about when we were younger and forget about the questions and the curiosity.

00:37:39.315 --> 00:37:50.496
So we need to make sure those people are at the table and the people we're building these for are at the table, so that we have a more ethical AI, understanding that bias is inherently built into how the models work.

00:37:50.916 --> 00:37:53.898
Like you're literally doing weights and things like that, we understand that.
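
(To make that concrete, here is a toy sketch of what "bias is built into the weights" can mean: a model's learned weights just mirror its training data, so skewed data produces skewed predictions. The example words and counts below are invented for illustration, not taken from the episode.)

    # Toy illustration: learned "weights" are just statistics of the training
    # data, so a skewed dataset produces a skewed model. Data is invented.
    from collections import Counter

    training_data = [
        ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
        ("engineer", "he"), ("engineer", "he"), ("engineer", "she"),
    ]

    def learned_weights(word: str) -> dict:
        """Return the pronoun probabilities the 'model' learned for `word`."""
        counts = Counter(p for w, p in training_data if w == word)
        total = sum(counts.values())
        return {p: n / total for p, n in counts.items()}

    print(learned_weights("nurse"))     # ~{'she': 0.67, 'he': 0.33}
    print(learned_weights("engineer"))  # ~{'he': 0.67, 'she': 0.33}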

00:37:53.898 --> 00:38:04.867
But it shouldn't be built in a way that harms someone, takes away their liberty, freedom or any of the rights that they are granted being in this country. We got to make sure we protect people.

00:38:04.867 --> 00:38:19.166
I don't care if it tells you to pick a red dress over a blue one, but if it takes your kid away from your home because of some analytic tool that's in a government child welfare office, that needs to be as close to right as possible if you're using it to make those type of decisions.

00:38:19.166 --> 00:38:20.391
So AI

00:38:20.391 --> 00:38:25.358
ethics is always at the front of this conversation for me, and it should never go away, and we're never going to get it fully right.

00:38:25.358 --> 00:38:33.500
So we got to keep working on it all the way through, until this thing becomes AGI or whatever is going to be at the next level.

00:38:33.539 --> 00:38:38.065
Yes, no, I agree with you. Thank you so much for sharing that, and just being really, you know, very passionate about it.

00:38:38.065 --> 00:38:55.811
You know, you and I are very in line, you know, with your response, and this is really one of the things that I've definitely been voicing. And oftentimes it seems that our voices are really drowned out by a lot of the hype and the excitement that is out there, because of people that are all about, hey, let's move fast and break things.

00:38:55.811 --> 00:39:05.755
But we know that move fast and break things doesn't work, and it's okay to slow down to go fast. But, you know, it's one of those things, too, that we see on social media, we see on LinkedIn.

00:39:05.755 --> 00:39:17.402
I know you and I both interact on LinkedIn, and it just seems like those voices are the ones that are heard the most. But then the ones that are, you know, just really asking you to be more cautious and just be more ethical about this,

00:39:17.402 --> 00:39:28.364
it's like, oh, no, no, no, you're the ones that are going to cause us to fall behind in this race, you know, we're moving ahead and we need to go do these things. But again, I agree with you 100%.

00:39:28.364 --> 00:39:42.518
So thank you so much for being very honest, genuine and authentic with that answer. All right, well, this has been an amazing, amazing episode, and it's definitely filled my bucket.

00:39:42.518 --> 00:39:55.465
It's amazing. Okay, so this whole time, I'll be honest with you, I've been very, like, starstruck, just because, and it's weird, because I'm listening to your podcast and I'm a fan, and now I'm interviewing you as the host, which is really cool, and because, also, of the work that you're doing.

00:39:55.465 --> 00:40:03.362
So it's been fantastic and it's great, and thank you for spending a little bit of your morning with me and, of course, with our audience members, which I kept repeating.

00:40:03.623 --> 00:40:17.306
Please make sure that you go visit AI DigiTales, make sure you visit AI for Kids, make sure you subscribe, make sure that you sign up for the email newsletter, check out the books, check out the resources, and all of that will be linked in the show notes.

00:40:17.306 --> 00:40:23.650
And, of course, you'll be able to follow Amber on LinkedIn as well, and all her socials will be posted too.

00:40:23.650 --> 00:40:24.936
So make sure that you connect with her.

00:40:24.936 --> 00:40:33.905
She's a wonderful resource, and I promise you that she will take anything that is out there and she'll, you know, make it simple for you with her KISS philosophy.

00:40:33.905 --> 00:40:41.335
I promise. All right, well, Amber, let's go ahead and wrap up with the last three questions that I always send our guests before the show.

00:40:41.715 --> 00:40:43.280
So hopefully you're ready.

00:40:43.280 --> 00:40:44.282
So here we go.

00:40:44.282 --> 00:40:48.016
Question number one: in the current state of education...

00:40:48.016 --> 00:40:56.666
Well, I should start this the way I normally do: as we know, every superhero has a weakness, and for Superman, that kryptonite just weakened him.

00:40:56.666 --> 00:40:58.699
It was just like, ah, such a pain point.

00:40:58.699 --> 00:41:07.844
So I want to ask you, in the current state of, we'll say, AI, what would you say is your current AI kryptonite?

00:41:08.847 --> 00:41:20.943
My current kryptonite is the amount of models, and feeling like, every time a new one comes out, that I don't know what to do with myself, because it's so many coming out all the time.

00:41:20.943 --> 00:41:27.945
So it's like, if you're trying to tell everyone you're helping them with AI literacy, but then you don't know or haven't played with the newest model,

00:41:27.945 --> 00:41:29.701
It's like did I mess up?

00:41:29.701 --> 00:41:30.664
Am I not where I need to be?

00:41:30.664 --> 00:41:32.963
But I had someone actually tell me this week.

00:41:32.963 --> 00:41:40.181
They were like, you don't need to know all the models, Amber, it's okay.

00:41:40.181 --> 00:41:43.373
Like the underlying foundation is what you need to know, and that's helping me feel a little bit less like it's my kryptonite, but for right now it is.

00:41:43.373 --> 00:41:45.298
The number of models stresses me out daily.

00:41:46.280 --> 00:41:50.137
So I guess it's kind of almost like that fear of missing out, like, oh...

00:41:50.217 --> 00:41:53.922
Oh my gosh, I just learned this one, and now here's the next one, and so on.

00:41:53.922 --> 00:41:54.643
But you know what?

00:41:54.643 --> 00:41:58.429
That was some great sound advice that you just shared, or that they shared with you.

00:41:58.429 --> 00:42:05.938
It's like, hey, it's okay, everything under the hood is still very similar, just slightly different. But really, it's just this:

00:42:05.938 --> 00:42:13.554
As long as you focus on sharing those wonderful skills that help students, educators, and professionals navigate that space,

00:42:13.554 --> 00:42:16.079
I think that, in the long run, is what's going to help them.

00:42:16.079 --> 00:42:18.565
So just keep doing what you're doing, my friend. All right.

00:42:18.565 --> 00:42:24.666
Question number two: if you could have a billboard with anything on it, what would it be and why?

00:42:25.536 --> 00:42:26.077
I'm cheating.

00:42:26.077 --> 00:42:31.947
It would say: AI for kids, you need to be literate in this technology.

00:42:31.947 --> 00:42:39.568
The reason I say that is because I gave my podcast its very simple name, AI for Kids, after messing up on the AI part, because that was not clear.

00:42:39.568 --> 00:42:52.081
But I want people to understand that AI is something kids should know and should learn. And that would be my billboard: simply, AI for kids, something you should learn, and make sure your kids are literate in it.

00:42:52.262 --> 00:43:04.451
Love it. And don't forget, you've got to put your QR code on there, so if people drive by they can scan it, and it goes straight to the show and they get linked and connected with you. You know, it started because of this question. Yeah, all right, here we go.

00:43:04.451 --> 00:43:12.648
And the last question, Amber: if you could trade places with one person for a day, who would it be and why?

00:43:14.284 --> 00:43:18.302
So this one was a hard question, because there were a bunch of people I was thinking about.

00:43:18.302 --> 00:43:22.289
I'm like, should I change places with someone in AI?

00:43:22.289 --> 00:43:24.862
Would I change places with someone in government?

00:43:24.862 --> 00:43:45.862
Because I have this weird intersection. And I decided on, it's going to feel like kind of a weird answer, I guess, but I would love to trade places with Sal Khan of Khan Academy, only because of everything they're doing right now, and even beforehand, like they had early access to OpenAI for the tutoring, which is now Khanmigo.

00:43:46.545 --> 00:43:51.599
I love what he did, like his story of starting out by helping his cousins,

00:43:51.599 --> 00:44:02.764
I believe it was to understand math, and breaking it down. And I love that he takes complicated topics and breaks them down in a way that everyone understands, and grew that into a company that we all use now.

00:44:02.764 --> 00:44:08.824
I remember when it first came out, my mom sending it to me, I think I was in college, saying, hey, have you heard of Khan Academy? Check it out.

00:44:08.824 --> 00:44:21.440
And it was just a guy up there with his videos, and to see what he's turned that into is just so impressive to me. Just to get into his shoes for a second, to see how he's running such an amazing company, really is an inspiration to me.

00:44:21.440 --> 00:44:30.018
Other folks would likely be politicians or other AI leaders, but that's who I'd probably change places with, because it's most aligned with what I would love to do and aspire to become.

00:44:30.594 --> 00:44:31.699
Yeah, and I love it.

00:44:31.699 --> 00:44:36.086
I mean, it kind of falls in line with what you said, just taking something complex and making it simple.

00:44:36.086 --> 00:44:39.985
But I just want to let you know that you are already doing that, my friend.

00:44:40.556 --> 00:44:50.385
And I really do appreciate it. I only say that because sometimes, as podcasters, with the work that we do, we may not get that immediate feedback; it's only every once in a while that you get that feedback.

00:44:50.385 --> 00:45:03.469
But I just want to let you know that your podcast has really made an impact on me in the way that you present things. Again, taking those hard concepts and making them easy, not only for us as adults but for the littles too.

00:45:03.469 --> 00:45:05.762
So keep doing what you're doing, my friend.

00:45:05.762 --> 00:45:10.025
And again, guys, if you haven't followed Amber yet, please make sure that you do so.

00:45:10.025 --> 00:45:14.385
I know I've repeated that several times, but I promise you this is something great

00:45:14.385 --> 00:45:15.427
that's going to be beneficial.

00:45:15.427 --> 00:45:22.614
The resources and the knowledge that you're going to get from the interviews and, of course, from the website are definitely going to be wonderful.

00:45:22.614 --> 00:45:24.221
So make sure that you follow Amber.

00:45:24.221 --> 00:45:28.666
And again, guys, thank you so much as always for all of your support.

00:45:28.666 --> 00:45:49.463
Please make sure that you visit our website at myedtechlife, where you can check out this amazing episode and the other 300 and, I believe it's 15 or 16, episodes, one of those. But make sure that you go check out all the episodes, because I promise you, guys, you will find a little something to sprinkle onto what you are already doing.

00:45:49.463 --> 00:45:51.380
Great, so please make sure that you do that.

00:45:51.474 --> 00:45:58.047
If you haven't followed us on all socials, make sure you follow us at myedtechlife on all socials, and jump over to our YouTube channel.

00:45:58.047 --> 00:46:01.436
Make sure that you hit the thumbs up and subscribe to our channel.

00:46:01.436 --> 00:46:07.219
We're this close to a thousand subscribers, and we'd appreciate it if you were our thousandth subscriber.

00:46:07.219 --> 00:46:08.503
That would be amazing.

00:46:08.503 --> 00:46:15.943
And again, as always, thank you to our amazing sponsors, as I mentioned at the very beginning. Book Creator, thank you, as always, for your support.

00:46:15.943 --> 00:46:18.286
Edu8, Yellowdig, thank you.

00:46:18.286 --> 00:46:35.376
Thank you for making this show possible and for believing in our mission to bring on amazing educators, professionals, authors, and everybody from this space to share more about AI and share our experiences, so that we can continue to grow together and really adapt to this landscape.

00:46:35.376 --> 00:46:37.536
So thank you, as always, for all of your support.

00:46:37.536 --> 00:46:41.356
And until next time, my friends, don't forget, stay techie.

00:46:41.356 --> 00:47:10.704
Thank you.

Amber Ivey

Creator of AiDigiTales and the host of the AI for Kids podcast

Amber Ivey “AI” is currently a Vice President at a non-profit, where she leads a team that helps governments drive impact. In her prior role as Senior Director for the Bloomberg Philanthropies Center for Government Excellence at Johns Hopkins University, she led a team that assisted governments in utilizing data and performance management for decision-making. She also played a key role in the design and launch of the Bloomberg Philanthropies City Data Alliance, a program that aims to train 100 mayors and their senior leaders throughout the Americas in using data to achieve better outcomes. Formerly, she worked at The Pew Charitable Trusts, a nonprofit focused on solving today's challenges through data-driven, nonpartisan analysis. There, Amber led the data collection and organization efforts for a first-of-its-kind research study on how all 50 states and the District of Columbia use data to solve complex problems, improve the delivery of government services, manage resources, and evaluate effectiveness. Most recently at Pew, Amber led a team that provided technical and strategic assistance to states and counties working to streamline their business processes and launch technology, like legal assistance websites and online courts, to modernize and improve access to the legal system.

Before joining Pew, Amber served at Maryland StateStat, a performance-measurement and management office established by former Governor Martin O'Malley (D). Following the change in administration, she helped facilitate the transition by demonstr…