As AI rapidly advances, educators grapple with how to adopt and use AI tools in K-12 classrooms responsibly. In this insightful expert panel hosted by Fonz of the My EdTech Life podcast, educational leaders and an AI EdTech founder dive deep into the ethics, challenges, and opportunities surrounding AI in education.
Join the conversation with panelists Adam Sparks, Karle Delo, Ed Dieterle, Bonnie Chelette, and Maurie Beasley as they discuss key considerations for educators, including:
→Protecting student data privacy and security with AI platforms
→Challenges with AI companies' terms of service for schools
→Best practices for AI professional development and teacher training
→Overcoming barriers to AI adoption in schools
→AI's potential impact on learning outcomes and critical thinking skills
The panelists emphasize the importance of digital literacy, empowering teachers, using AI to enhance rather than replace human capabilities, and keeping students' best interests at the forefront. Each expert also shares their biggest concern, their "AI EduKryptonite," in the current K-12 AI landscape.
This discussion offers valuable insights and perspectives for any educator or administrator navigating the complex issues surrounding AI in schools. Don't miss this timely conversation about one of the most important issues in education today.
Timestamps:
00:00 Intro
02:20 Meet the expert panelists
08:10 Protecting student data with AI platforms
20:10 AI terms of service challenges for schools
27:12 AI professional development best practices
35:08 Overcoming barriers to AI adoption
42:54 AI's impact on learning outcomes & skills
57:40 Panelists' biggest AI concerns - "AI EduKryptonite"
1:09:13 Wrap-up
--- Support this podcast: https://podcasters.spotify.com/pod/show/myedtechlife/support
Thank you for watching or listening to our show!
Until Next Time, Stay Techie!
-Fonz
🎙️ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨
Guests: Adam Sparks, Karle Delo, Maurie Beasley, Bonnie Chelette, Ed Dieterle
[00:00:30] Fonz: Hello everybody, and welcome to another great episode of My EdTech Life. Thank you so much for joining us on this wonderful day, wherever it is that you may be joining us from around the world. We thank you, as always, for making us part of your listening pleasure and definitely making us part of your day.
As you know, we do what we do for you, to bring you some amazing conversations here in the education space with amazing educators, founders, and practitioners from all different kinds of backgrounds. So thank you so much, as always, for all of your support. And we definitely want to give a big shout out to our show sponsor, EduAid.
Thank you so much for sponsoring our show and, of course, supporting the work that we do in bringing amazing conversations into our space. Well, everybody, I am excited for today's show. As you notice on the screen, we have a full house. Usually it's one on one, but today we're bringing in some amazing guests who have amazing backgrounds.
And as you know, today's topic is going to be about AI and AI in education. I'm going to go ahead and turn it over to my guests so they can introduce themselves and give us a little bit about what their context is within the education AI space. So we'll go ahead and get started with Adam.
Adam, how are you this evening?
[00:01:52] Adam: I'm doing well. I'm super excited to be here with all of you. Um, yeah, my name is Adam Sparks. I was a teacher for seven years. Um, I just finished my master's in learning design and technology at Stanford, and as part of that work, um, built a platform called Short Answer.
It's a K-12 writing platform that we launched as a startup about a year ago. Um, so I work with schools now on helping them adjust writing instruction and assessment in the wake of AI, and, um, super excited to talk with you all today. So, uh, yeah, this is great.
[00:02:20] Fonz: Thank you, Adam. I appreciate it. And Adam is a two time guest.
This is his second time here on the show. So definitely excited when we get a two-time guest here on the show. Next up we have Karle. Karle, how are you doing this evening?
[00:02:33] Karle: I'm great. Um, thank you so much for having me. So my name is Karle Delo, and I'm the director of curriculum and instruction at a small rural public school in mid-Michigan.
Prior to that, I was an instructional coach for three years, and I taught middle school science for 10 years. And I have my master's in ed tech. So really excited to be here.
[00:02:54] Fonz: Thank you, Karle. And next up we have Ed. Ed, how are you doing this evening?
[00:02:59] Ed: I'm great. Uh, thank you so much for inviting me onto your show.
Uh, my name is Ed Dieterle. I've been in education now for almost 30 years, wearing a variety of different hats, from high school chemistry teacher to university instructor, social scientist, funder, as well as executive. I got involved with AI in education around the year 2006, back before AI in education was such a popular term, and during that time I was studying video games as well as digital tools: not the front end, but the digital files they automatically generate, and how we can use those to make guesses or predictions about learning and instruction and so forth.
For the past five years, I've been writing a lot about ethics and AI in education, and I'm really excited to share what I've discovered and what I've uncovered in today's episode.
[00:03:51] Fonz: Excellent. Well, thank you so much, Ed. I really appreciate it. Next up we have Bonnie. Bonnie, how are you, Bonnie?
[00:04:02] Bonnie: That was me. I feel like this is my fourth time here, right?
[00:04:05] Fonz: Yes, it is your fourth time on the show. Our first four-time guest.
[00:04:09] Bonnie: I know. Are you serious? Yeah. I love the show. Um, I'm Bonnie Chelette. I'm the director of ed tech for the Louisiana Department of Education. Um, I have my PhD from LSU.
Got it recently. My dissertation was on cybersecurity in K-12 institutions. I'm going to adjunct at Nicholls State University in the fall in their master's program. Um, and regarding AI, you know, at the policy level, it's a little different. We recently stood up a task force to create AI policy for our state,
um, which I was lucky enough to be a part of. Um, so I'm really looking forward to this conversation, because I think that, like, conversations are data, right? And so I really think this is how we're going to figure it out. Um, I don't have the answer, but I think we do. So looking forward to hearing what everybody has to say.
[00:05:04] Fonz: Excellent. Well, thank you so much, Bonnie. I really appreciate it. And Maurie, how are you this evening?
[00:05:10] Maurie: I'm doing awesome. So happy to be here. So, uh, my background's kind of different. I was in the private sector for quite some time. And then when I grew up, I decided I wanted to be in education. So I've been in education for probably 20 years.
I was in a classroom as a technology teacher, ran a makerspace, got to do all sorts of fun things with, um, 3D printers and laser cutters and just all that great stuff. And then I decided to get a master's in counseling. So I have a master's in counseling, and went into, uh, a counseling position for some time.
And then they, uh, talked me into, slash tricked me into, becoming an assistant principal. So I was an assistant principal on a campus for a while. And then two years ago, my husband, who's the technology director at our district and has been for 20 years, convinced me to come back over to the technology side.
And so I have been the system network administrator at my district for the last two years. He really got into AI when it first started out, kind of like Ed here; he's been messing with AI for a long time. And so I guess about a year and a half ago, we started a little side gig, because if you're in education, you have to have a side gig, you know; it's either Scentsy candles or you start your own business.
So, uh, we have AI Education Professionals. We go around to conferences and to districts all over the states of Texas, Louisiana, and Mississippi, and we do AI training.
[00:06:36] Fonz: Excellent. Well, thank you so much, every single one of you for being here. And I love the way Bonnie put it, you know, conversations are data. And it's great just to have this conversation at this point in time.
Just, you know, it's been a while since, obviously, November 2022, and a lot has changed. And, you know, having some great backgrounds here today, it's wonderful to get your insight. So I would love to get the conversation started, uh, talking a little bit about ethics. Now, for myself, and if you're a fan of the show or have seen the show, a lot of the conversations centered around AI, for myself, I definitely consider myself a cautious advocate.
So big shout out to Dr. Nneka McGee; she coined that term, that's the first time I heard it, and I use it myself now. Just being a very cautious advocate, because as good as the technology is for me as an adult to use and streamline my workflows, speaking just for myself, my concern is the considerations when it's being used by students.
So I want to, you know, ask you the question as far as educational institutions. And I know that we've got some founders here as well as practitioners, and many of you work in educational institutions: what are some considerations? What are some things that an educational institution should look for
before they decide to adopt a platform? And I'm going to go ahead and start with Bonnie first, and then we can go ahead and, uh, circle around to, uh, anybody else that would like to add. So Bonnie, we'll start with you.
[00:08:10] Bonnie: I had to shorten my answer; I typed it up, because I think this is, like, for me, number one: watching what data
these programs are taking in. I think, in the past, a lot of, uh, school systems, through no fault of their own, have signed data sharing agreements and been like, we're safe, and that kind of stuff. And I think now, with AI, they're going to have to dive in deeper than they ever have. Um, they're going to have to read those terms and conditions really deeply and almost create, like, a cyclical system to review, because,
you know, as this stuff gets deeper and deeper, you're just going to have to keep reviewing what data these companies are taking, you know, and make sure your students aren't putting in that personal data, right? Like, training is going to be huge. Um, but yeah, I think this is just going to be, once again, cybersecurity is my thing,
but like, I think this is just something that's going to be huge. And I think one question I have, and maybe you guys can answer it: if students put, you know, their information into that large language model and train that AI, is that, like, proprietary to the school system? Like, I think we just have problems we haven't faced before.
Um, so I can't wait for everybody to answer this question, because we're taking notes.
[00:09:30] Fonz: So, okay. So maybe I'll go to Adam, because Adam, obviously you are the founder of Short Answer, and I know that you and I met at TCEA, uh, thank you so much for meeting with me and having some breakfast and talking about how you're doing things differently.
And we know that many platforms do things differently and, uh, obviously try to put safety first and keep safety in mind, but, you know, just to get that reassurance, just to make sure that our students are safe: what are some of the steps or frameworks or guidelines that you follow to make sure that, um, student data or teacher data is being protected?
[00:10:04] Adam: Yeah. And again, I can only speak to how we're doing it, because everybody does it differently. So it's hard for me to answer your question, Bonnie, at a kind of writ-large level, because everybody's got their own thing, and some are more sketchy than others, to be quite honest. Um, the way that we've approached it is, um, anytime users interact with AI on our platform, and right now we're not an AI platform in the same way that some of the large players are, like a MagicSchool or a Diffit, or, you know, name your big AI tool.
Uh, we're not an AI tool in the same way that they are, but we do have AI features. So if and when a user interacts with AI, they're actually interacting with a copy of the AI. We use a service called Amazon Bedrock, which is basically just a wrapper for a variety of different foundation models.
Um, and the way that Amazon Bedrock works is they make an instance of a large language model. So you can think of it like Google Docs, when you take a doc and click File, Make a copy, and then you're interacting with that copy. When our users interact with the large language models that we've plugged into, they're interacting with a copy, so that we can control the data that's being shared,
um, and essentially not share data with those foundation models. Because if and when you're interacting directly with the foundation models, like ChatGPT or Claude, um, or, you know, name the one that you prefer, um, oftentimes that data is being used to train the model. So I think that's the first step: interacting with an instance of it.
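To make the "copy of the AI" idea concrete, here is a minimal, hypothetical sketch of calling a foundation model through Amazon Bedrock from an application's own AWS account. The region, model ID, and prompt are illustrative assumptions, not Short Answer's actual configuration; the point is simply that the application controls exactly what text gets sent, and AWS states that prompts sent through Bedrock are not used to train the underlying foundation models.

```python
import json
import boto3

# Hypothetical client setup; region and model ID are illustrative only.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask_model(prompt: str) -> str:
    """Send a prompt to a Bedrock-hosted Claude model and return the text reply."""
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",  # request schema for Claude models on Bedrock
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    })
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
        body=body,
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]

# The app, not the student, decides exactly what text is sent to the model.
print(ask_model("Give two short feedback comments on this anonymous student paragraph: ..."))
```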
And then, when it comes to having kids interact with it directly, I think the safe move there is to have the kids sign in using a code, so you can think Kahoot. Instead of creating an account with the platform, the kid just does a temporary, one-time sign-in with a code and puts in their first name.
And then they're not sharing personally identifiable information, or PII, at least not through that interaction. Now, this is the important part for schools: we've got to coach kids that you should not be putting your personally identifiable information into these platforms, um, because in our case, even if the kids did, and kids can't interact with AI right now in Short Answer, but even if they could, that data wouldn't be shared with those third-party
foundation model creators, because of the way we've set up that copy. Um, but it could be with other providers. So it's really important that kids learn not to share personally identifiable information with those models. So those are kind of three steps. There's a lot more to it, but I'm going to watch my space, because there are a lot of brilliant people on this panel and I don't want to take up all the time.
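And here is a small, hypothetical sketch of the Kahoot-style, code-based sign-in Adam describes. The session object and the six-character code format are invented for illustration and are not Short Answer's implementation; the idea is that a session only ever holds a join code and first names, so there is no student account and no PII to protect or delete afterward.

```python
import secrets
import string
from dataclasses import dataclass, field

CODE_ALPHABET = string.ascii_uppercase + string.digits

@dataclass
class Session:
    """One classroom activity; stores only a join code and first names."""
    code: str
    students: list[str] = field(default_factory=list)  # first names only, no PII

def create_session() -> Session:
    # Teacher starts an activity and reads this short random code to the class.
    return Session(code="".join(secrets.choice(CODE_ALPHABET) for _ in range(6)))

def join(session: Session, code: str, first_name: str) -> bool:
    # Student joins with the code and a first name; nothing else is collected,
    # so there is no account to manage and nothing sensitive to delete later.
    if code != session.code:
        return False
    session.students.append(first_name)
    return True

session = create_session()
print(session.code)                        # e.g. "K7Q2ZD"
print(join(session, session.code, "Ava"))  # True; temporary identity for one activity
```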
[00:12:26] Fonz: Excellent. So what I'll do, just to give you all a heads up, is I'm going to go with Maurie next, kind of going through, uh, the network side of it, and then we'll go with Karle, and then Ed, we'll go with you as far as an overarching view of, uh, the ethical side of it and things of that sort. So, Maurie, uh, I often hear the phrase, and I actually saw it in a post today, that access past the firewall is a privilege for a lot of applications.
So I want to know, from the experience that you have, with the trainings that you do and the districts that you work with, what are some things that you keep an eye out for to make sure that student data privacy and teacher privacy are safe?
[00:13:10] Maurie: Well, from my standpoint, I think that teachers have been trained for a very long time about, you know, FERPA and COPPA and all of the different things that they've had to go through.
They know, uh, not to put that in, or they should. If they don't know it by now, then you've got a problem that's not just being caused by artificial intelligence; it's, you know, because of the teacher. But, um, you know, you don't put the name in. My husband has a Texas phrase: you file off the serial numbers. So it's one of the things that they have to learn.
And I agree, I think digital literacy, and including AI in our digital literacy programs, is absolutely critical when it comes to not interacting that way. But let's face it, they give personal data to TikTok, they give personal data to Snapchat, they give personal data everywhere. So for us to all of a sudden step in and say, oh no, no, it's AI,
don't give your personal data? No, it's a training thing that we have to do across the board about safety in a digital world, not just AI.
[00:14:15] Fonz: Excellent, great answer. Thank you so much. Karle, on your side, working with teachers, because I know that you do a lot of one-on-ones and you do a lot of trainings as well,
what are some of the things that you keep an eye on to make sure that what you're presenting to your teachers as a useful tool is something that is safe and falls in line with your district's policy as well?
[00:14:36] Karle: Yeah, I'm looking at this from two different lenses. So one is as a PD provider. I just think that there needs to be stress and emphasis on, um, what the ethics are and what some basic, bare-bones guidelines are that teachers should have in mind and should communicate to students when they're using AI.
So I've been to a lot of ed tech conferences, and there are a lot of sessions that are just, look at all these amazing tools, you know, 20 AI tools that will change your classroom. And a lot of times, and I have been guilty of this in the past too, there might be a quick section about, um, some basic guidelines, but then you move on.
And I just think, as a PD provider, it needs to be a bigger part of the conversation, and it needs to be repeated and revisited over and over again. Um, and then when we think about it from the district standpoint as well, I think that, first of all, it's very important for districts to have guidelines, um, and not just necessarily take them from somewhere else.
Something's better than nothing, but also, um, you know, try to think about what is going to work for you. So I highly encourage you to involve teachers in that process and potentially even students. Um, but also, you know, there are great, really comprehensive lessons out there on digital literacy and AI literacy for students, but I also think there should be a short version, a summary, that teachers can point to: maybe these are three expectations every time we use AI, and go back and revisit those with kids every time.
And I think we need to provide those for teachers and not just expect them to figure it out on their own.
[00:16:20] Fonz: Excellent. Great point. And that's very true as far as going to the conferences. I know this last year at TCEA, it was more about the platforms and not so much about going into detail as far as privacy, data concerns, and bias.
So, Ed, I'll turn it over to you, as far as your experience. And again, in the 30-plus years that you've been doing this, working with districts and seeing what it is that's out there right now, what are some things that you feel very cautious about as far as ethics and bias within AI, to help keep our students and our teachers safe as well?
[00:16:58] Ed: Well, I think first and foremost, it really comes back to grounding and kind of pulling back the curtains so that students really understand what is happening. It shouldn't be: AI is something that's used by other people, or the impacts of AI happen to other people. It should be: this is something that I contribute to and I have the ability to shape.
Um, I would begin by first and foremost thinking about the algorithms that underpin all AI systems. There's a wonderful historic example. I won't name the company, but you can probably figure it out. They were a pioneer in technology tools that you could talk to. Now, imagine a major corporate event where this thing is being released.
And there is a vice president who is up on stage with a beautiful Southern accent, and she is going to talk to this device and ask it a question. It has no idea what she just said. So she asks the question a second time, a third time, and a fourth time. And in the, kind of, um, review of how the technology was made,
it turned out that the people who were in the room making the algorithm didn't share any diversity in terms of the way that they pronounce things, the way that they were trained, and so forth. And so when they looked to the training data, they weren't being particularly critical of where that data came from.
And they weren't oversampling and making sure that representative populations from a variety of groups and ages and genders were included in the training set. So that would be the first thing I would say: think about the algorithm. But then there are also the downstream effects. So, um, what we know from the learning sciences is that if you put a graph or a chart in front of an educator, if it's a pretty straightforward chart or graph, they'll understand it.
They're smart people. They'll maybe even know what to do with it. But if you start putting something that's more and more sophisticated in front of someone, and do not provide training on how to interpret it or how to use it to inform their pedagogy, people can really make some weird choices. For example, imagine if you went to your dentist every six months and your dentist loved to give X-rays.
Imagine if that dentist was using the technology correctly but was misreading the X-ray. Bad things could happen. In fact, the use of the technology could potentially do more harm than good. So I think with AI in education, not only is it important for educators and decision makers and those who are making purchasing decisions to really dig into everything that folks have said thus far, into how the algorithm was made, but also what kinds of trainings are available, so that educators can understand what the technology is sharing with them, so that it becomes a partnership and they can use it to make
wise choices and not be left to their own devices to figure out what's going on or how to use it.
[00:19:59] Fonz: Excellent. Thank you so much. Great answers, everybody. Thank you so much. This first round went really well and I absolutely love your insight. So thank you for that. So the next question that I want to go into, okay.
So we talked a little bit…
[00:20:10] Bonnie: I've got to thank Ed for making his point about the Southern accent being beautiful. Like, nailed it. I can really relate to this story. I appreciated it.
[00:20:20] Fonz: I love it. I love it. Go Bonnie. There we go. I love it. All right, guys, so going a little bit, um, still in line with, uh, the ethical considerations and so on.
Now, and I'm going to be honest and, uh, straightforward with you: my biggest concern, when I go and see some of these applications that are being used within classrooms, is when I go and dig into the terms of service. One of the things that really scares me, and I want to go in there, and, you know, hopefully, maybe Adam, you can speak to this, I know I've read yours, and many are very different, and like you said, there are some that are kind of questionable, but many of them will say, like, hey, you know, we don't share any of your data,
it's never kept, and so on. But if you keep digging deeper and you go further down, the more you scroll, the more you find out that it'll say: however, the information can be shared with a third party, and if anything were to happen, we will not be held responsible; you will go up against our third party.
Okay, now that to me is a big red flag in the sense of taking accountability and making sure that, you know, the application is safe. Now, I don't know. I know Maurie, you brought up a great point, I mean, our students are already doing this with TikTok, they're doing it with Snapchat as well, but I don't know,
it's just something that doesn't quite sit well with me. So I'll go ahead and start with Adam. As far as terms of service, you know, what are some things that you would expect? And I know you guys do things very differently, and that's one thing that I do appreciate, and I applaud your efforts and the things that you do. But what are some things that you feel could be done a little bit better to ensure a district that, if I go into a contract with you for one year, I am going to be okay as a district, that I can feel safe, that our relationship is strong, and that if something were to happen, I'm not going to have to go up against OpenAI because of a data breach?
[00:22:28] Adam: I'm sorry for how basic this sounds, but I would look for COPPA and FERPA compliance. Um, and I guess before I say all of this, I am not a lawyer, so don't take my advice. And the scary thing for me has been, we have consulted with our legal counsel, we have legal counsel, obviously, as a company, and oftentimes they don't know the answers to these questions, because a lot of these questions are still being settled in courts, um, many, many times because of OpenAI, the company behind ChatGPT.
So I'm not a lawyer, so all of this needs to go with a caveat. What I would say is, um, COPPA and FERPA compliance should be top of mind. And a big part of FERPA compliance is that schools own their data. Um, and so, if a company is going to train a model, and this is an open question in the legal community, I think,
if a company is going to train a model on your data, um, and that data is coming from students and teachers, um, and part of that data privacy agreement is that the school owns the data, the company doesn't own the data, it's my understanding that if and when the school district reaches out to the company and says, hey, we need you to delete the data because we don't want you to have it anymore,
it's my understanding the company would have to delete their model. So that means that companies can't train models, or at least fine-tuned models, on this data. Um, so I'd look for COPPA and FERPA compliance. And then I would also say, I would just double-click on: create data privacy agreements with these companies, because if and when your district creates a data privacy agreement with the company, that's the binding, um, document for arbitration. It's not
whatever terms of service the company has come up with. Um, again, I am not a lawyer, so don't take my advice, but that's kind of, at a high level, what I would think. Um, and I have actually thought a lot about this. Um, and there are so many brilliant minds on this panel right now, I don't want to take up too much space.
But, um, you know, I think that, you know, I'm going to sound like a bad guy saying this, and I just want to acknowledge that up front. Like, we want to protect student data 100 percent; that should be our number one concern. At the same time, I also think about, and this is coming directly from a panel I saw with the head of Learning Economy Academy, who has talked about this, and then another panel I went to with Geoffrey Canada, who's
done the Harlem Children's Zone and done a lot of amazing work: what if, by using the data, we can create a better product that helps improve learning more? Like, if we can fine-tune a model on, like, all of Khan Academy's math data and in so doing create a thousand-times-better math tutor, um, shouldn't we share the data?
Like, if we can anonymize it and share it in a healthy way, isn't that a good thing? Um, now, obviously, when we think about data sharing, we think about all the awful things like Facebook and Cambridge Analytica, like, sharing all of our social media data for targeted marketing, and obviously that stuff's horrible.
But, like, in the context of education, if we can build a better learning product using that data, I also sometimes wonder, like, are we overly paranoid about this stuff sometimes? So I know that you could rightfully challenge me on that, because obviously there's a give and take here. Um, but I don't know.
That's what's top of mind for me. I gave you more information than you asked for there.
[00:25:39] Fonz: No, but you know, I just wanted to add, before I turn it over to our next panelist, just to kind of go off of that. Like I said, for me, I always dig through those terms of service, but you hit on something there, you know, saying, if you can use the data in a safe, safeguarded, anonymous way, like you said, to build a better product in the end. Like I said, I tell people, although I may sound like I'm so anti, it's just because of that fear at the beginning, when I read terms of service
that say this should only be used by those 13 years or older, or with parental consent. And then this is my fear, and I'm going to be honest with you: let's say a school district or a teacher, actually, let's say a teacher says, hey, I went to the conference, I'm using this product, but it says it must have parental consent.
And then the terms of service say that the app has to verify that it has the parental consent signed by the parent. I'm like, how many of these platforms are actually verifying this with the district, and how many district leaders are verifying this with teachers, and how many parents actually know that this is being used?
And then what if something bad does happen, just one thing that may go wrong? Then who's at fault here? The teacher? The company? And I just want to avoid all of that. And that's why, I know we're in the very early stages, and like you say, we're still trying to figure things out, but I'm just being honest: that to me is my biggest fear when it comes to that.
So let me go to, uh, let's go with Bonnie. Bonnie, what would you like to add to this particular question?
[00:27:12] Bonnie: Yeah, I love this, and like what Adam was saying about, like, you know, who owns that data that trains that model. Like I was saying earlier, I think it's a huge question, and, um, I'm interested to see what comes out of those legal cases.
Now, I am not a lawyer, but I deal with legal a lot, um, because everything we push out at the state level goes through this lovely lady, um, who is pretty great. So when we do statewide data sharing agreements, you know, they have to comply with laws. I know Texas has some pretty intense, uh, privacy laws.
And I think that's what people are really going to have to, like I said, know more about. I know we at one point had the most restrictive privacy laws; I think Texas passed us up, and you'd have to check me on that. But making sure, like, legally, you're covered. And if anybody needs a template for a data sharing agreement, you're welcome to
look at ours, because it is very clear who is responsible, um, and, like, in no uncertain terms. And so when I do statewide data sharing agreements with vendors, I know exactly what they're going to flag, and it's like, no, it's the state law. So I think, too,
I know state privacy laws are, you know, something way out of the realm of teachers, um, but like, some of this stuff is illegal. Um, so we're really going to have to, like, drill down and make sure nobody's doing what they shouldn't be doing, um, and just be a lot more aware of what's illegal in your region.
Like, I don't know, COPPA, FERPA, I know they're updating COPPA, I'm so happy about it. Um, you know, COPPA 2.0 is supposed to cover a lot of this. Um, I think the current one, or whatever it is, is about to expire. But, like I said, I love what you said about the terms and conditions.
I am always on that soapbox; you were speaking my language, um, because people talk to me about student use all the time, and I'm like, what do you mean, you shouldn't be using it with the students, like, you should not have it with your second grade class. Um, so I panic, I'm always on alert, because I just don't think we know.
And like you said, we're in the early stages. We just don't know.
[00:29:25] Fonz: Yeah, and that's the thing that really causes the panic with me too, just terms of service. But it's so hard to oftentimes be heard, because there's so much push, and, you know, on Twitter, on LinkedIn, it's almost like, if I'm not using it, I'm not the cool kid on Twitter, and I want to be on Twitter and I want people to know that I'm using this, but you're using it in an incorrect manner and in a fashion it shouldn't be used in.
You know, and then when you do question it and say, hey, uh, terms of service? Oh, well, they said it's okay, they said it's safe. But really, is it? Is it really, I should say. So like I said, those are some of the things that I'm concerned about. And so let me see, Maurie, how about your experience? What has your experience been with this?
What are your thoughts behind it?
[00:30:14] Maurie: Well, my thoughts, and what we're actually doing at my school district, is we've developed our own large language model. So we have a thing called Agnes, and basically we're doing an internal AI. We started out a year ago with just a chatbot, because, frankly, we were tired of training teachers over and over and over again on how to get their phone on Wi-Fi.
So we decided to do a chatbot, and then when large language models came out, it kind of grew from there. So we have Agnes, which we have ingested with, you know, the handbooks and the TEKS and the extension list and things like that. And now our teachers ask Agnes, and that's where we're focusing first, on the teachers.
Because, you know, kids are going to use AI, not just at school; I mean, they're going to use AI all the time from home. So we're thinking, okay, our teachers really need to be aware of what it is, how it works, how to use it, how to make it into a tool and an assistant. And then we're going to roll out Agnes for student use.
And then we will let them litigate until they figure out what we're going to be doing in the United States when it comes to artificial intelligence and K-12, because we in the U.S. are reactive, not proactive. And so once everyone has had a chance to react to everything, then we will start
really assessing those tools that are meant for student use. That's what we're doing.
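As a rough illustration of the kind of internal, document-grounded assistant Maurie describes with Agnes, the sketch below shows the basic retrieval idea: ingest district documents, find the passages most relevant to a staff question, and hand only those to a language model to compose the answer. The sample documents, the scikit-learn TF-IDF retrieval, and all names here are illustrative assumptions, not the district's actual system.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical ingested district documents (handbook excerpts, tech FAQs, etc.).
documents = [
    "To join staff Wi-Fi, open Settings > Wi-Fi, choose DISTRICT-STAFF, and sign in with your district email.",
    "Student council sponsors must hold at least one meeting per month and file minutes with the front office.",
    "Substitute requests must be entered in the absence system by 6:00 a.m. on the day of the absence.",
]

vectorizer = TfidfVectorizer().fit(documents)
doc_vectors = vectorizer.transform(documents)

def retrieve(question: str, k: int = 1) -> list[str]:
    """Return the k document chunks most similar to the question."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, doc_vectors)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

# The retrieved chunk(s) would then be passed to a language model as context,
# so answers stay grounded in district-approved documents rather than the open web.
print(retrieve("How do I get my phone on the staff Wi-Fi?"))
```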
[00:31:41] Fonz: Excellent. Thank you for sharing. And Ed, what, what are your thoughts on this as far as your overall perspective and experience?
[00:31:48] Ed: Yeah. So, uh, as a historic example, if we go back in time a little bit more than 10 years ago, there were two family foundations that put together a fund of about a hundred million dollars to start a new organization called In Bloom.
And there were multiple states that signed up. So that within a handful of months, millions of student records were associated with this particular database. The thinking was that this would not only personalize learning and help teachers, but also allow education technology companies to gain access to data in order to optimize and improve their solutions.
So, like, technically it was light years ahead of pretty much everything else that was happening in education. Economically, it was incredibly well funded and supported. Culturally, with, um, you know, the idea of what this was, there were various trainings, uh, provided at the state level, but teachers were still often very confused by, like, what exactly is happening and how does this work? Where things really started to break down, though, is on the psychological dimension: a handful of parents started to ask questions about, what are you doing with my kid's data?
And then this in turn became a political lightning rod, and a hundred-million-dollar investment closed up shop within 12 months' time. And so when I think about all of this, I just want to make sure that we don't repeat history. And I think it begins with, you know, as a parent of two high school kids,
just being open and honest. What are you doing with my kid's data? Why do you need it? And how will my student benefit from this and not be harmed because of participation with your solution? Just being human about it, being honest, upfront, and transparent will go a long way to quell a lot of the anxiety.
It comes from parents just wondering, like, what is this black box doing? And how could it potentially harm my kid?
[00:33:54] Fonz: Great. Thank you so much, Ed. I think you brought up some great points, and really it's just a communication issue: being transparent, being open. And I think it's just having those conversations with all stakeholders, including parents.
And oftentimes, I know that as educators we always say, you know, the learning community, but oftentimes the learning community excludes the parents. We need to make sure that the learning community does include the parents, and that it's not just, you know, administrators, superintendents, teachers, and students; it includes the parents as well.
So now I'm going to go to the next question, Karle, and I'm going to start off with you, uh, talking about adoption, AI adoption, and some of the challenges that come with it. Now, I know we've talked a little bit here about some of the challenges as far as, you know, terms of service, safety, AI privacy and data, but let's set that aside. In your role as, you know, a PD provider for your district, what are some challenges that come up when you are providing teacher training?
Are teachers willing to just try this and go for it? Or do you still have some of those that are just quite hesitant to use the technology?
[00:35:08] Karle: Yeah, so I do provide professional development for our district, but I also provide PD for other districts as well, so I've had a variety of different experiences. And what I've noticed is that when you actually address the skepticism and you talk about some of the risks associated with AI right from the get-go, there's a lot more buy-in from teachers, because they trust you; they realize, oh, you're not just going to show us the 20 AI tools or how this is going to immediately change everything.
A lot of times it is introduced that way, like this one thing is going to change everything, and then, you know, they get on and try it, and sometimes if it doesn't do exactly what they expect, they're kind of disheartened. So I just think that education is so important when approaching this, and having a realistic approach. Myself, I am an ed tech person;
I've always been drawn towards technology, so of course I was an AI enthusiast personally. However, as a district leader, I'm somewhere in the middle, because that's where I should be as a district leader. And I think the skeptics have a point, and so it is just important to address that from the beginning.
Um, and I've noticed, I've delivered PD in different districts, different areas, and I've taken that approach where, you know, it's always been part of the conversation, but I've noticed when you talk about it upfront, um, I actually, in those situations, get teachers saying, we need to talk to our kids about this.
Um, that comes up sooner when you start with some of the risks and you're just transparent about that. And, um, another challenge, though, that I want to point out with adoption in general is that, as humans, we tend to be resistant to change, right? And the thing about AI is it's advancing and changing so rapidly.
You know, I think about when generative AI and ChatGPT first came out: you had to go to it to experience generative AI. And already now you open up a new Google Doc and there's a Help Me Write tool at the top. Um, Amazon reviews give you an AI-generated summary. Um, so generative AI is already coming
into your world and into your workspace, whether you know it's there or not and recognize that's what it is, and whether you want it there. So I just think the rapid change is something that is also going to be a challenge. Not impossible, but just something to keep in mind.
[00:37:49] Fonz: Excellent. All great points, and especially in the role that I have, very similar to yours, where I do provide training for all our district leaders and teachers as well. And those are some of the same things: just being fully transparent. Now, when I do work with adults, it's like, hey, we're okay;
you have the choice to either use it or not, because you are the adults here. But when it comes to using it with students and things of that sort, like you said, it has helped to have those conversations, and then you're able to open up and say, hey, you know, these are some of the plus sides for us as adults,
but then, you know, when it comes to terms of service and students, these are some of the things that we face. And, you know, the conversations do go very well, and then those that are hesitant are still kind of like, hey, I don't mind trying it, just for my sake, to know the way this works and the way that this can help my workflow, but still being very cautious with student use.
So now, Maurie, how about yourself? Uh, what experiences have you had or seen, you know, within your district as far as adoption, implementation, and professional development?
[00:38:57] Maurie: Well, um, I am with you on the professional development side. We address it immediately; you know, we're going to get the elephant out of the room.
The very first thing we're going to talk about is cheating, because, um, that's what I've heard from all the teachers for the last year. That's all I hear: you know, my kids are going to cheat, they're going to lose their critical thinking skills, and all that stuff. So we just get that out of the way, and we teach a process to address cheating.
You know, when I was a principal or assistant principal, I had to handle all those phone calls. And so I'm always like, okay, here's your process. We're going to get our ducks in a row. You're going to do this, this, this, and this, and then you're going to tell me to call the parent. And then I'll make that phone call.
Because, you know, the single tool that catches AI does not exist, and I don't think it ever will exist. And not just that, but I can beat those tools; give me five seconds and I'll beat that tool. And if I can do it, believe me, a 17-year-old kid can do it a heck of a lot faster. So we get that out of the way.
The one thing about adoption for me, though: um, I did a Medium article not too long ago that I didn't think was going to be that popular, but, pardon my French, I'll say the A instead of the actual word, but it is, uh, teachers are up to their A in alligators. Um, they don't have time to worry about AI. Teachers in the classroom,
guys, we are dealing with chronic absenteeism, we are dealing with, um, you know, mental health, we are dealing with language barriers, we are dealing with so much in the classroom. A lot of teachers see artificial intelligence as just something else they're going to have to go up a huge learning curve with.
So, when I go in to teach teachers, it's like, okay guys, this is different, because it's a large language model, a human-voice type of interaction. You ask it a question. That's it. You just ask. And if you don't get the answer you want, you ask again. So that, to me, is how you're going to get teachers to, you know, engage and embrace it.
Um, we do a challenge at Atlanta ISD. My husband will take me into a classroom without telling me. He did this to me a couple of weeks ago. He said, hey, I'm going to take you over and we're going to visit with our Spanish teacher. And I was like, okay. He goes, if you cannot show her in 30 minutes a way that AI can save her time, you have to buy her breakfast.
Well, I showed her. You know, immediately I said, okay, what's taking up the most of your time? Was it lesson planning? No, she's a veteran teacher; she has been lesson planning for 10 years. You know what it was? She had been asked to be the president of the student council. It was an other-duty-as-assigned.
She didn't know; she'd never been the president of student council. So we took all the documentation from the state website and fed it into Claude. It gave her the meetings that she needed to have, it gave her whether or not she was meeting all the requirements, it gave her an outline. I saved her so much time.
Does that teacher use AI in her classroom now? Absolutely. So it's that hook. You've got to hook them. Just like you have to hook your kids when they walk in the door to keep them engaged, you've got to hook your teachers too.
[00:41:58] Fonz: There you go. And I think, you know, as far as the marketing aspect of it, a lot of applications do play into that,
like, hey, we're going to save you some time. And a lot of them do a great and phenomenal job at it. But obviously, like you said, it takes proper training, because like Karle says, sometimes when you're promised that it's going to do something but then it doesn't perform the way it should, teachers just shut down. And, you know, it's not to say that what they were doing before isn't good.
I always say that a lot of these tools are things that you can sprinkle onto what you're already doing great. That's all it is. It's going to enhance what you're doing, but don't expect it to completely come up with something fresh and new that's going to be spot on, you know. You'll still have to do some work, but at least you're not starting from scratch, you're not reinventing the wheel, and you can enhance some of the stuff that you already have.
So those are some of the things that I feel are very important. So now I'm going to go ahead.
[00:42:54] Maurie: Well, I was going to say, one of the things that we do not do whenever we go in to train the teachers is teach a tool. We teach the back end. We teach the big four. Okay, because I don't know if anybody has had experience with a tool in your classroom that you put all this time and effort and everything into, and then you come in one day and it's not there anymore.
That happened to me personally. I lost everything overnight. So I'm like, okay, learn the top four. Guys, billions of dollars are going into these; they're not going to go away. So learn those top four, and then, once you're familiar with them, find those tools that give you that great user interface and put that package around it for you, and then start learning those tools. Because that way, if that tool disappears, you're not going to, like, you know, have a mental breakdown because all your information's gone.
[00:43:44] Fonz: There we go. All right, Adam, how about yourself? I know that you personally do a lot of professional development and you work with teachers one on one, and not only that, but you're the creator and co-founder of the app that you're presenting.
What are some of the biggest takeaways that you see, as far as, you know, maybe just that hesitancy or some of the barriers that you run into?
[00:44:10] Adam: Um, well, I would just double-click on what everyone else said. I think it's important to start from a place of empathy, um, and just acknowledge that some people are on the one end of the spectrum where AI is the grim reaper and it's going to be the end of the world.
And on the other end, it's rainbows. I live in Silicon Valley; it tends to be this end of it, where AI is rainbows and it's going to solve everything, and we won't have any more problems because AI will solve all of them. Um, and everybody is somewhere on that spectrum; most people are somewhere in between. So just acknowledging that.
Um, so I lead workshops around the country on adjusting writing instruction and assessment, and that's where we start all the workshops: no matter where you're at on the spectrum, it's okay. And in some ways, the way you feel, like, if you're on the far end and you think it's the grim reaper, you might be missing out on opportunities.
Alternatively, if you're on the other end of the spectrum and you think this is going to solve every problem we ever have, you're going to be ignoring some flaws that are going to lead to some problems in your life. Um, so just acknowledging that, and then framing everything around, um, and I appreciate the things that were just shared, like,
okay, starting with the problem. It can be so tempting to start with the solution, given that the first time you used a tool like ChatGPT, or whatever tool you prefer, it's magical. It is mind-blowing. I'm to the point where I feel, and again, I encourage you to disagree with me, that AI might be the most important invention since, like, the written word.
So with a technology that important, it can be very, very, very tempting to start with the solution, because it's so incredible; the solution is all these amazing things. Um, but when you start with the solution, it leads to so much misalignment. And so what I liked about what you guys just said is: start with the problem.
So, like, what is your problem? Like, yeah, you're the Spanish teacher, but it turns out your problem is you're exhausted from all these other duties as assigned. So now, okay, you know, what does the research say is the best way to solve that problem? Okay, great. And now, what are the tools that we have that can actualize that research-based best practice in solving that problem?
So it's: what's the problem? What does the research say about the problem? And then the solution comes in on how we can actualize, um, the research to solve the problem. Um, that's the framing we use in all of our workshops. Um, but the most important part of all that is just starting with empathy and listening to where teachers are at.
[00:46:32] Fonz: Excellent. Thank you so much for sharing that, Adam. And we're going to switch it up here. Ed, I'm going to go ahead and change the question here, so that way we can maximize our time and hit on a couple more questions that I really want to share with many of you, so we can get some of your insight. I wanted to ask you about learning outcomes, uh, and AI's impact on learning outcomes.
So I want to ask you, through the experience that you've had, you know, now that we've seen AI really forward-facing since November of '22, and obviously you've had a lot more experience with it prior to that, what have been some of the things that maybe you have seen
or have noticed as far as learning outcomes, knowledge, and retention? Does it really affect critical thinking? What are some of the things that you have experienced yourself and seen?
[00:47:27] Ed: Yeah. So Fonz, I think that's a critical question that we in education, you know, education writ large, are going to have to wrestle with for the next five, 10, or more years.
Uh, I started in the classroom in the 1990s, when the internet was just coming into classrooms, and there was this really fun and interesting and healthy tension between using technology for instruction and using technology for creation. For instruction, it was the idea of using technology to do things better, right?
Faster, louder, serving up more examples, flashier PowerPoint presentations, sending kids to a website to read something. But then there was also this opportunity of doing better things, right? So collecting data at your local stream and uploading that to an authentic database and asking questions about the health of a particular community.
And I think if you start to look at the trajectory across time, from say around the year 2000 up until today, when you go to ASU GSV or ISTE or EDUCAUSE or some of the other conferences, you'll see that instruction is still dominant in this space, much more so than it was in the 90s or even the early 2000s. And so one of the things I worry about with this idea of impacts and learning outcomes is that if we let the status quo continue going forward,
instructionism is going to win out, and that means the kids will get an even louder Khan Academy. They'll get an even louder collection of multiple-choice questions. And what that does, apologies, coming from the academic perspective, is it perpetuates legitimate knowledge, the knowledge that has been endorsed by institutions.
And it lessens students' inquiry into alternatives and into critical thinking. So I think one thing that we really need to bring back front and center of the conversation is the idea of construction of knowledge, and also digital and physical artifacts that address local issues and global problems, that allow people to really demonstrate what they know and are able to do, so that they're solving things today, not trying to just piece together abstract ideas that they might use at some point in the future.
And there are lots of examples. And Fonz, I'd be more than happy to share with you summaries of the literature that talk about examples of the idea of instructionism and also the idea of constructing knowledge.
[00:50:05] Fonz: Sorry, yes, that would be great. If you do have that available, we can definitely put it in the show notes, and, uh, that could also even be a whole different show altogether as well. But Maurie, I'm coming over to you now. What are your thoughts on this, as far as, you know, what Ed just talked about and mentioned, in your experience?
[00:50:24] Maurie: Well, I 100 percent agree with Ed when it comes to: we just haven't had enough time. I mean, you know, AI is very new to the scene, so we just need more time to collect the data and to look at it. But I will tell you that I think AI is going to have a huge impact on learning outcomes, if you use it correctly, because I'm just looking at one instance here, like a language barrier
in a classroom. Now, I don't know about, you know, bigger schools or whatever, but I know in rural America we have a very hard time finding teachers, much less bilingual teachers, but we have a lot of bilingual students. And if I can have my AI create a story that's in multiple languages, but has the same characters, the same setting, the same plot, and then have that classroom discussion
without that student being singled out, from a, uh, mental health standpoint, you know, think of the power of that. They're not different anymore. They can have the discussions across the board about all of that information that's all the same, but yet it's different because it's in their language, and then have the English version versus the Spanish version and be able to do a one-to-one correlation of the words.
To me, it's going to be huge with learning outcomes, but we just don't have enough data yet to be able to say what that's going to be.
[00:51:49] Fonz: And that's a great point, as far as not having enough data yet. However, I think that right now we are the data. These conversations are the data; your experiences, even though they're anecdotal, serve as data as well,
along with any documentation that you may have. And I agree wholeheartedly with what you're saying and with everything that you guys have been adding to this. And so this time I'm going to go over to Karle. Karle, how about you? What are some insights, uh, as far as this is concerned, um, you know, as far as critical thinking skills and learning retention? As far as you providing professional development, what do you hear back from the teachers that you have trained, or what do you yourself see when you go into the classrooms?
[00:52:32] Karle: Yeah, so, you know, good teaching is good teaching, and I think it's important, and I know we've touched on this already, um, but it's really important to talk about good teaching strategies and not just the tech tools. And so I think there is a world where, and I'm just talking about when teachers use AI, uh, to help with the planning process, when teachers use AI correctly, it not only can save them time, but it can really be transformative for students. It can help with things like student engagement, which is really challenging right now for a lot of people, um, and starting to incorporate those skills that students need for the future of work.
So those things like creativity and critical thinking, problem solving, um, collaborating. Even think about, and especially I'm thinking about, secondary classrooms that have textbooks. You know, in many classrooms, teachers are still lecturing. It's just a very, you know, traditional style of teaching.
And we're not necessarily always pushing students to create and critically think. Um, and so I see huge potential, uh, in teacher planning, to use these tools to figure out how can I take this textbook and make it something that's really meaningful for kids. Um, and, you know, someone earlier was talking about starting with the problem, um, and thinking about, you know, making learning stick.
There are a lot of popular ideas in cognitive science right now about that, and I think there's huge potential for AI to help create multiple practice sets for retrieval practice, to create activities that have spacing and interleaving. These are things that have been around and don't necessarily need technology, but teachers can use AI to really make them a more accessible and realistic possibility. So I am excited to see what those impacts are on learning outcomes, and I think there's just a ton of potential there.
[00:54:33] Fonz: Excellent. And Bonnie.
[00:54:35] Bonnie: Uh, yeah. And like she said, good teaching is good teaching.
Um, technology is not a lesson, right? It isn't a teacher. It supports teaching across the board. Um, and for me, it's an amazing thought partner. But at the same time, when we're thinking about students demonstrating their learning, we also have to think about the questions we're asking. Like, if students can just generate it with a click of AI, do we need to change our questions and think about how we're asking students to demonstrate their learning?
Like, maybe we try more presentations, or debating the bot, just two examples. Um, and I also think that iterative process is essential and important. Like, we teach everybody this, right? That iterative process of not getting it right, um, adjusting your question. Um, I know it's a computer science skill.
Um, it's a digital literacy skill and it's a life skill, and it's something that really translates to a lot of subjects. Um, and then Maurie was talking about bilingual students, um, and things like that. And somebody once said to me, I think our words are going to become more valuable than ever, because you're going to have to, um, you're going to have to be able to say the right words, right,
to get your output. And those of you who use it already know, you can change one little word in your prompt and it completely changes your output. And I think now we're going to have a whole different vocabulary to support students. So I think it's just going to be different. It's going to be fascinating, right?
Like we're going to be here in a year and we're just going to have like completely different conversations. So I'm excited.
[00:56:11] Fonz: Absolutely. And I'm really excited about that. And I just want to say that for every single one of you, you all have been wonderful and great panelists. And like I said, I definitely respect your time and this has been a very fruitful conversation and especially getting a lot of insight from each of your backgrounds.
And this is really exciting. And just to let you know, you should definitely be excited, because this conversation is definitely going into my dissertation for sure. So that's going to be great too. So thank you all for your contributions. But before we wrap up, I know this was not a question I gave you beforehand, and, you know, I've kind of got you here on the spot, totally improvising this, but I'll go ahead and start with Adam. Here's the question, just to kind of set the mood.
I always end my shows with the last three questions, but everybody always loves the first question, which is the edu kryptonite. As we know, every superhero has a weakness. For Superman, his greatest weakness or pain point was that kryptonite. So I want to ask you, and we'll start with Adam, then Karle, then we'll go with Maurie, Bonnie, and then we'll end with Ed.
I want to ask you: what, in the current state of AI, and again, current state, it could be what was released recently, you know, from ChatGPT and Google or anything else, any kind of practice or anything. This is your time to just be honest and straightforward with everybody here.
So Adam, we'll start with you in the current state of AI in education. What would you say is your current AI kryptonite?
[00:57:40] Adam: I want to make sure I understand the question. So when you say my kryptonite, like the thing that drives me the most crazy, is that kind of question? Yeah.
[00:57:46] Fonz: Kind of drives you crazy, kind of just makes you weak. Like when you hear it, it could be edu-speak, it could be words like equity and all this other stuff or whatever. So what is your AI kryptonite?
[00:57:59] Adam: Um, wow. Oh, there's a lot. Um, I live in Silicon Valley, so it's just rainbows and unicorns out here. Um, Sal Khan is a hero of mine, like straight up hero. He's on my Mount Rushmore of educators.
For those that don't know, he's the creator of Khan Academy. Um, their flagship AI chatbot is called Khanmigo. Um, for my money, it is the flagship AI chatbot tutor. And there's a lot of conversation in Silicon Valley at the moment about how AI chatbot tutors are going to be a revolution.
Um, and everyone's always very careful to say they're not going to replace teachers, but they're gonna, uh, and then they just kind of trail off. And I just think it's ridiculous. I think the idea of a chatbot tutor being a silver bullet for education is absurd, because it assumes so much of a student.
It assumes so much in terms of motivation, in terms of self-regulated learning skills, in terms of metacognition, that our kids, especially as novices, just don't have. Um, and it requires not just a human in the loop. Um, that framing is another kryptonite of mine. Human-in-the-loop framing is broken.
It needs to be AI in the loop, teacher in charge. Um, and so if we've got a chatbot and a teacher sitting next to a student, or a parent sitting next to a student and an AI chatbot, I think that triangle can be really powerful, like parent and student working together with a chatbot.
But the idea of a student going home and working one-on-one with the chatbot, and that that's going to revolutionize education, I think that's flawed thinking for a million different reasons. So that's one of many AI kryptonites that I have right now. Um, but again, I want to hear what everyone else has to say.
So I'll stop there.
[00:59:47] Fonz: That was a great one, Adam. Thank you so much for, uh, really sharing that. I really loved it. Now I'll go with Karle. Karle, what in the current state of AI in education is your current AI kryptonite?
[00:59:59] Karle: So I'm thinking back to one of the first PD sessions I provided about AI. It was at a different district, and at the end, um, one teacher said, wow, it's like I don't even have to think anymore.
And yeah, I was crushed. I'm like, no, that's not what we were trying to accomplish. And I think that line of thinking is dangerous for teachers, but also for students, because, you know, when students see, oh my gosh, it can generate my assignment for me, it can do my work for me,
um, and I don't have to think anymore, that scares me. So I totally changed my approach to delivering professional development after that moment. And I love this quote. It comes from Holly Clark, uh, one of her students. She was using AI with students, and, um, one of her students said to her, AI is not thinking for me,
it's making me a better thinker. And so I want to start with that, I want to end with that, I want to revisit it in the middle, because that's how I think teachers should be looking at it: how can it enhance what I'm doing? How can it make me a better thinker? And if our teachers are internalizing that and communicating it to students, I think that can be powerful.
[01:01:26] Fonz: Excellent. Thank you so much, Karle. I really appreciate it. Maurie, how about yourself?
[01:01:30] Maurie: Well, I have two things real quick. The first thing that's really kind of bothering me right now about artificial intelligence is the number of people who don't have enough information but claim to know everything about AI.
I run into it all the time. It's like, oh, you know, artificial intelligence has been trained on all of the World Wide Web. No, it's 0.009 percent as of, like, 3.5 or whatever. So it's that lack of clarity in the information people are putting out there, um, for other people to learn from, and a lot of it is misinformation. The big kryptonite for me, though, is cost.
I work in a small school district. I have 2,000 students. If you're going to charge me $9 per student per month, I can't do it. Even $4 per student per month, I can't do it. So I get on the phone and I try to, like, you know, bargain: hey, you know, I'm a small school district, can you do $2 per student per month?
And then they come back, oh, well, we'll do $2.50. No, I still can't do it. So I was having a conversation today with a guy from Weaver Technologies, who is a big, huge server farm person, and he made a comment to me that I hadn't thought about. The reason a lot of the companies aren't doing this stuff for free for K-12 education in the United States is because the United States doesn't have the capacity yet to give it for free, because of the token cost and the servers it takes up and the GPUs it takes. To be able to do all the stuff AI does, you have to have all of this power, and so they're selling it because they haven't built up the capacity to give it away for free. So that kind of made me think, maybe eventually I can get it in K-12 for less cost.
I hear Ed typing. Ed, are you going to tell me something great that I can take?
[01:03:29] Adam: I can tell you something great, uh, directly to that point. Um, Khan Academy just released a series of chatbots for free, and they're able to do that because they have the backing of the Bill and Melinda Gates Foundation and Microsoft.
[01:03:44] Maurie: I saw that on LinkedIn. As soon as I saw it, an hour after I saw it, I forwarded it to my husband, the, you know, the AI and technology director. I was like, Jim, Khan Academy, he's coming through. But, um, yeah, it's the cost. It's cost prohibitive right now for K-12 education.
[01:04:04] Bonnie: I'm pretty sure you said the name. Yeah. Um, I know exactly what mine is, because it's driving me crazy. Um, it's how AI, just adding those two letters, has become such a marketing tool. And it drives me crazy, because I get a million emails from a million different vendors, and I'm like, that's not AI.
And it's almost like the "all natural" brand. There's no rule about it, you can put that label on everything. And you know, there are so many things where I'm like, that's just a really complicated if-then statement. Um, and I really think that's preying on, like what you guys are all saying, people not knowing enough.
Um, so they're slapping these letters on it, and people are like, sounds good, yeah, it has AI on it. It is driving me crazy, because I'm seeing people who want to do the right thing for their students, right? And they're told this is good.
And like, this is going to improve things. And I'm like, you're not getting the whole story just because they added that little label. And to what you're talking about, though, with money, it's going to be interesting on 9/30 when ESSER III dries up. It's going to be a different landscape. I'm personally excited about it.
Cause I can't wait for all the ed tech to get cut down for teachers. But yeah, that's mine. It's just the marketing of it, and using it to prey on people who don't know.
[01:05:29] Fonz: Yeah. And going with that too, right before we get to Ed, I just want to add that, uh, the marketing behind it has definitely been there.
I did notice there is a school, that, uh, you know, is in Texas, and they're like, hey, students just come to school for like four hours and, um, they're just on the computer, and the teacher there is not really a teacher, it's a facilitator. But the interesting thing is you get to go and see some of their curriculum, and you can go ahead and download it, they just share it with you on a Google Doc, just so you can kind of see what they use. And some of the platforms that they use, I wouldn't say they are AI at all whatsoever, because we use them in our district and they don't label them as AI.
I understand that there's a little algorithm in the back that, you know, uh, is not really forward-facing, but, you know, it helps you level-set things. But they really sell the school as this AI school, and a lot of the tools that they use aren't necessarily AI at all whatsoever. Um, so it was very interesting.
And I mean, for $40,000 tuition per year per student, I mean, talk about branding and selling that. So it's very interesting. All right, Ed, to close up with you, what is your current AI EduKryptonite?
[01:06:42] Ed: Yeah. So I was going back and forth between two ideas, so I'll just start with one. The first is the idea that there is a whole world of traditional AI that has been the driving force behind education technologies for like 20 years, but there's also generative AI, and there is a break between the two. However, when people talk about AI and education, they lump everything together. And I think that, you know, going back to things like the instructional core and the relationship between the student, the teacher, and the content, and how the student is able to demonstrate what they know and are able to do, um, this race toward the most sophisticated generative AI isn't necessarily what each person needs.
And so my kryptonite is really around this idea that technology should be a tool for enhancing relationships: to bring people closer together, to make things for yourself, for other people, with other people, in order to do good in the world, to question, to think more deeply, to be creative, so that we as a society can grow and develop over time, and not find ourselves in a situation where everyone in the room can write the most beautiful five-paragraph essay using the tool, but has no idea what to do when they go outside and see their garden withering away.
Or they understand, you know, uh, how to solve an esoteric math problem. Like, I love Khan Academy. I was around when it was first launched and so forth. But I think that alongside those tools for instruction, we also need tools for students to be creative, and to allow students to think about problems over an extended period of time, problems that have deep meaning for them and the important people in their lives, so that we can leave a world in the future that's not just as good as what we have today, but even better.
[01:08:47] Fonz: Thank you so much, Ed. That was great. And to everybody, this has been a wonderful conversation. Again, I really appreciate you giving your time this evening, not only to me, and I consider this my own professional development, one hour with some of the brilliant minds that I follow on LinkedIn and on socials, but I know that all the educators who are going to be listening to this are definitely going to take a lot of great knowledge nuggets and sprinkle them onto what they already know and already do so well. So thank you, Adam, Karle, Maurie, Bonnie, Ed,
so much. Thank you so much for spending your time here this evening, uh, with me and obviously with all our listeners who are going to be checking this out on the replay and listening to it on the podcast. Thank you so much for all of your support. Please make sure, guys, that you visit our website at myedtech.life, myedtech.life, where you can check out this amazing episode and the other 277 episodes that we have with creators, professionals, uh, AI founders, we've got a little bit of everything for you, that you can go ahead and pull some of those knowledge nuggets from and add them to your teacher tool belt.
So thank you as always for your support. And if you're not following us on all socials, please make sure you follow us; we are at My EdTech Life on pretty much everything except LinkedIn, uh, but you can find those links on the website as well. And again, a big favor: if you haven't done so already, hop over to our YouTube channel, give us a thumbs up, and subscribe. Our goal is to hit a thousand subscribers this year. We're very close, but any little thumbs up and likes will definitely help our channel continue to grow and obviously help the algorithms put this into all of y'all's feeds and everything. So thank you as always for all of your support, and my friends, as always, don't forget, until next time, stay techie.
Director of Educational Technology
Bonnie Chelette has more than 14 years of experience in education as a classroom teacher, administrator, district leader, and now at the state level. As the Director of Educational Technology at the Louisiana Department of Education, Bonnie supports the development and implementation of technology strategies that support teaching and learning across the institution. She is passionate about leveraging technology to enhance student engagement, increase access to learning resources, and improve academic outcomes.
She is committed to ongoing professional development and staying up-to-date on the latest advances in technology and education. Bonnie holds an M.A. and an Ed.S. and is close to completing a Ph.D. at Louisiana State University. #GeauxTigers
CEO & Teacher
My name is Adam Sparks. I grew up on a farm outside of the small town of Louisville, NE. I graduated from Creighton University in 2014 with a degree in history and secondary education. Since then, I've taught in the most varied contexts imaginable: rural Nebraska for 3 years, urban China for 1, and the west side of Chicago for 2. During the pandemic, I worked in a district technology office helping out with their 1:1 Chromebook rollout in response to COVID. For the last year, I was a full-time graduate student in Stanford's Learning Design and Technology program, where my work focused on using peer feedback to improve formative assessment pedagogy. For my master's capstone project, we built a web app called Short Answer. I now work full time on Short Answer with my wife Alexa. At Short Answer, we help teachers build better K-12 writers through engaging peer feedback activities for any subject.
K12 District Sys Admin/Founder & CEO
Maurie, an entrepreneurial spirit with a BA in Education and a Master's in Counseling, has a professional journey that spans 17 years in K-12 education. Her resume is a blend of diverse experiences, including owning a tech company, owning a restaurant, and holding various educational roles from a primary and secondary school teacher to a tech coach, school counselor, and assistant principal. Her voice has echoed in state conferences, resonating with her numerous professional development training sessions and maker space curriculum presentations in over 50 schools. Maurie's expertise transforms learning experiences, inspiring students and educators to embrace new technologies and discover innovative ways to enrich their classrooms.
Chief Learning Scientist
Dr. Edward Dieterle, the Chief Learning Scientist at Kovexa, leads research efforts on innovative learning solutions grounded in learning science principles. With a career spanning nearly three decades in learning and teaching, Ed is driven by a passion for improving the daily experiences and life outcomes of students and workers through his research, teaching, scholarship, and product development.
Prior to joining Kovexa, Ed was an Executive Director at Educational Testing Service (ETS) where he oversaw ETS’s externally funded research portfolio, R&D Consulting Services, and R&D’s merger and acquisition activities. Before joining ETS, Ed served as the Director at Summit Consulting, where he and his team of econometricians and data scientists specialized in litigation analytics and program evaluation using administrative data. Prior to that, he worked as a Senior Program Officer for Research at the Bill & Melinda Gates Foundation, developing and administering the Foundation’s personalized learning research portfolio. From 2008-2011, Ed conducted research at SRI International, with expertise in educational technology, including cross-disciplinary applications of statistical, computational, and machine learning tools. He served as a PI or co-PI on research projects for organizations such as the Department of Defense, NASA, and the Andrew Mellon Foundation.
Early in his career, Ed worked as a high school chemistry teacher, an instructor at Johns Hopkins University, and a curriculum developer for Public Television and the National Park Service.
Director of Curriculum and Instruction, Speaker and PD Provider
Karle Delo (@coachkarle) is a Curriculum Director, Instructional Coach, and education speaker with over a decade of experience teaching middle school science. She has a Master of Arts in Educational Technology and was named one of the top 30 K-12 EdTech Influencers to follow in 2023 by EdTech Magazine. Karle offers training on AI tools and classroom technologies through PD and short-form content on social media as @coachkarle.