Episode 313: Angeline Corvaglia
Feb. 19, 2025

AI Safety, Ethics, and the SHIELD Conference with Angeline Corvaglia 

In this episode of My EdTech Life, I sit down with Angeline Corvaglia, a global leader in online safety, AI ethics, and digital literacy. We dive into some of the most pressing issues in AI today, including:

πŸ›‘ AI-powered predators and chatbots—are they automating child grooming?
πŸ” The hidden dangers of AI relationships—a shocking story of a 13-year-old's first abusive chatbot "boyfriend"
πŸ“’ The SHIELD Global Online Safety Conference—why this worldwide event is breaking barriers and amplifying unheard voices
πŸš€ The AI arms race—who is really in control, and where are we headed?

Angeline shares eye-opening insights on the urgency of AI literacy, why parents need to rethink their approach to digital safety, and how the SHIELD Conference is uniting 16 countries to take action.

πŸ’‘ Join the conversation and be part of the movement! Sign up for the SHIELD Global Online Safety Conference here:
πŸ”— https://shieldthefuture.com/

πŸŽ™οΈ Discover more incredible conversations on AI, EdTech, and the future of learning at:
🌍 www.myedtech.life

πŸ”” Don’t forget to like, comment, and subscribe for more thought-provoking discussions!

Yellowdig is transforming higher education by building online communities that drive engagement and collaboration. My EdTech Life is proud to partner with Yellowdig to amplify its mission.

See how Yellowdig can revolutionize your campus—visit Yellowdig.co today!

Support the show

Thank you for watching or listening to our show! 

Until Next Time, Stay Techie!

-Fonz

πŸŽ™οΈ Love our content? Sponsor MyEdTechLife Podcast and connect with our passionate edtech audience! Reach out to me at myedtechlife@gmail.com. ✨

 

Chapters

00:30 - Digital Safety and AI Literacy

07:52 - Parental Involvement in Digital Literacy

14:11 - Recognizing Signs of Digital Danger

25:47 - The Future of AI Education

32:13 - Online Safety Conference and Future Plans

45:12 - Digital Education Conference Promotion

Transcript

Episode 313: AI Safety, Ethics, and the SHIELD Conference with Angeline Corvaglia

 [00:00:30] Fonz: Hello everybody, and welcome to another great episode of My EdTech Life.

[00:00:33] Fonz: Thank you so much for joining us on this wonderful day and wherever it is that you're joining us from around the world. Thank you as always for all of your likes, your shares, your follows. Thank you so much for just interacting with our content. Thank you so much for all the great feedback. We really appreciate it because as you know, we do what we do for you to bring you some amazing conversations that will help nurture our education space.

[00:00:58] Fonz: And today I am really excited [00:01:00] to have on a wonderful guest that I have followed and then connected with on LinkedIn. She is putting out some amazing things and things that will really get you through the day. You know, at least stopping and thinking and kind of meditating because sometimes, you know, we move too fast, we break things.

[00:01:16] Fonz: And then we're like, ooh, if I would have just thought about that a little bit more. So today I would love to welcome to the show Angeline Corvaglia, who is joining us today. Angeline, how are you doing today? I'm doing great. Thank you. How are you? I am doing excellent. And thank you so much for connecting with me.

[00:01:34] Fonz: Like I mentioned earlier, I know Saturday mornings are very precious, but I do appreciate you taking the time out of your day to share your passion, share your world with us, and really share what you have been amplifying on LinkedIn and through all your various posts. And we're going to be talking about an amazing conference that you will be speaking at as well.

[00:01:58] Fonz: So there's a lot to [00:02:00] cover, but before we get into the meat of things, Angeline, if you can give us a little background introduction and just what your context is within the digital safety space, education space, and even parent space. So just that way, our audience members can get to know you a little bit more.

[00:02:19] Angeline: Okay. Thank you so much for having me. There are some things that are always worth giving our time to, and spreading messages, your own and others', is definitely worth the time. I have a very eclectic background; it would take me a while to tell the whole thing, so I'll just start with how I began working in the whole online safety space. It actually found me, because of when my daughter was born. She's nine now.

[00:02:44] Angeline: When she was born, I was a CFO in a financial institution; I worked there for 15 years total. After that, you know, I knew I never wanted that to be my future. So I quit, and I went to work for a software provider, in sales. They put me in sales. That wasn't the best decision, but I did that for around three years.

[00:03:07] Angeline: And then I said, I have to do my own thing. My intention was actually to do digital transformation, and help a little bit with the kids on the side, because that was the part of the CFO work that I liked. But then I noticed people on LinkedIn posting things for kids, and I felt like the medium wasn't something that kids were going to

[00:03:27] Angeline: listen to, or it was long articles, for example. So I thought, oh, I'll just make some fun and creative videos. I started asking these experts, can I create the videos? And they're like, yeah, sure. And I got a very big positive reaction to these videos. And then, this was about a year and a quarter, a year and a half ago, I realized, you know, there's so much that needs to be done.

[00:03:56] Angeline: After spending around three months mainly focused on that, I realized I have to do this. I have to do this full time. I have to help with that. And AI is one thing I'm especially interested in. The reason I first got interested in it is actually from my time at the software provider. They were doing work for customers, and AI was the thing for the developers.

[00:04:23] Angeline: It was the thing for the tech experts. And I wasn't a tech expert; I was the bridge between tech and business. And I felt, you know, this was when ChatGPT had just come out, and the world was changing, and people who weren't in tech didn't feel like they were part of it. So I was like, I need to help these people understand.

[00:04:44] Angeline: So that's how it started. And then I had two characters. My activity is called Data Girl and Friends. I had Data Girl, and then someone suggested, why don't you have AI? So I have ALA, AI Girl, too. ALA talks about AI, and Data Girl talks about online safety and all the other privacy aspects.

[00:05:08] Fonz: That is wonderful. What I love, though, when I get to talk to guests, and this is what I love most about amplifying people's voices and their work, is just to hear the background they're coming from, what they're seeing, and how they saw something and tried to find a solution, or, in this case, were working alongside a company, like you say, making videos, and then all of a sudden seeing, hey, there's a need

[00:05:34] Fonz: for this now, because now you're seeing some things. And this is, I think, fantastic, and Data Girl and Friends is something that I know I would love to share with my parents. So that's why I'm thankful that you are willing to be here today, so I can learn a little bit more about that, because as part of my job, I do get to work with parents on a monthly basis.

[00:05:54] Fonz: We talk and have these conversations about data and data privacy. The most recent conversation that I did with them, I posted on LinkedIn; we were doing one on sharenting, where parents are just oversharing, you know, pictures and so on. And then I talked to them also about these AI platforms that can now take some of those pictures and basically undress them, and then there's the extortion aspect of it.

[00:06:20] Fonz: And so we go deep into those conversations, because although it's a tough topic, you know, it's important to inform parents and let them know the dangers, and also talk to them about AI and chatbots. So kind of going down that path now in the conversation.

[00:06:39] Fonz: I know that you have spoken very much about AI and you're very vocal about it, but I want to ask you: as far as AI literacy is concerned, I know that AI is moving at a very rapid pace, and it just seems like every day or every second there's a new app, a new company, something new, and all these models are coming out that are reasoning and all that good stuff.

[00:07:04] Fonz: But I just want to ask you, with things moving as fast as they are, do we need to focus more on that AI literacy side, or do we need to focus more on implementing more robust AI safety regulations?

[00:07:20] Angeline: I'm going to say AI literacy, because I no longer trust government or industry to solve it.

[00:07:30] Angeline: I don't like to make political statements, and I don't want to get political because that often distracts from the message, but I don't trust any government to fix this in time. We do need regulation, and we need responsible industry, but you said it exactly: it's moving very fast.

[00:07:48] Angeline: And if we wait, then we're just going to keep waiting. There are enough examples of parents who have been fighting. I know one parent, Jesper Graugard in Denmark, who's been fighting for the privacy of his kids for five years. He's clearly in the right, but he can't really move very fast. In other countries, like Norway, they managed to actually get change from the government.

[00:08:16] Angeline: But it took a year and a half for the government to actually make rules about privacy for kids in schools, and in a year and a half, you know how much has happened. So I think the literacy is most important. Long answer to a short question. And the issue with this literacy is exactly that: it's moving so fast.

[00:08:38] Angeline: And I was just thinking of what you said about working with parents. One whole aspect of my concept, my way of working, is that parents need to understand that they're not going to know; their kids are going to know, and they're not. So the way to keep kids safe, and what I try to bring about with the short videos, is that parents and kids should watch them together, talk about them together, and teach each other, because there needs to be trust. The kids are going to know. They're going to know parental controls.

[00:09:10] Angeline: There's always a way around the parental controls, and their friends are going to tell them how they're using AI, and they're going to try it out. So there needs to be trust, and the parents just aren't going to figure it all out on their own. That's kind of my way of seeing it.

[00:09:26] Fonz: Yeah. And you know what? I love that you brought it back to the parent aspect of it, because I know with my work with parents, and I'm coming into education actually from business and marketing.

[00:09:41] Fonz: And I know that there's a term that is used quite often: oh yeah, you know, our learning community, our learning community. And sometimes what I feel is that we don't include the parents as much in that learning community. It just seems like it's the upper management, and then of course the mid level, and then the teachers, and then students. But I love that you touched on the fact that parents need to know the students are already using it.

[00:10:08] Fonz: The students are already, obviously because of their friends and they see things on, you know, social media and things of that sort, they are already familiar with a lot of the apps, but the parents aren't. And so I love the way that you bring that together and saying these short videos are for parents and their, um, you know, children.

[00:10:26] Fonz: to watch together and have those conversations. And that's really the job that I get to do with our parent liaison specialist, or our parent engagement specialist, I should say. We tell them, we're having these conversations, but I'm also giving you these resources, both in English and Spanish, because those are the two predominant languages here where I live along the border.

[00:10:49] Fonz: But these are resources to have those conversations with your son or daughter, at least to get them to think for 0.5 seconds, you know, before they click send, or whatever it is that they're going to do or share, because of the long-term consequences that might happen later on. And also even talking to parents about that as well, like, hey, when you're posting something about your child, is this something that you would like posted about yourself?

[00:11:16] Fonz: Because later on, you know, with students and with AI, like I said, there's even more of a danger now, I think, or at least it's heightened because of what can be done with these apps. So I love the work that you're doing in bridging that gap between parent and student, or child in this case, and bringing them together.

[00:11:38] Fonz: So let's talk a little bit more about that parent side, because I would love to pick your brain and learn more and see how I may also share what you're doing with parents as well. I know you've spoken about AI-powered predators and chatbots and the automating of child grooming.

[00:11:57] Fonz: Can you walk us through an example of what are some of the flags, or some of the things to look for, when this might be happening?

[00:12:06] Angeline: Well, obviously it's about change in behavior, right? And just before I go into more detail, there's one thing that I really want to mention in terms of how, in my view, what I'm trying to achieve needs to be different: parents need to be able to admit more that they don't know things, that they don't have the answers.

[00:12:33] Angeline: It's a societal thing, right? If you're in a meeting at work, who's going to be the one to say, I don't understand what you're talking about, can you please explain it in a simpler way? It's hard. In general, in all the societies that I've lived in, and I've lived in six different countries, it was always like you're supposed to know, and asking questions, admitting you don't know, is hard.

[00:12:54] Angeline: But with the tech world and kids and parents, we have to admit we don't know, because part of the problem is kids think they know better, especially in terms of privacy. So, before I get to the signs, I would say the first sign is openness. I recently have been speaking a lot with Megan Garcia, who recently lost her son, Sewell.

[00:13:19] Angeline: And she's going to speak at the conference; we're going to talk about that later. We've been talking about her experience, and one of the things that she noticed was a change in behavior, in the sense that he was talking less to her and was less honest, less open. That's a first sign, you know, that something is wrong.

[00:13:42] Angeline: And another thing is just if they want to be alone with their device. You know, it's tempting to let them be in the bedroom or be alone, but I've heard a lot of experts say the worst things happen in the bedroom. I talk a lot about the online world, but I don't spend much time on all these platforms,

[00:14:04] Angeline: like Discord, or Roblox, where they can have the games, and you think it's not dangerous, but actually it can be. So it's good, especially if they're younger kids, to have them always in the room with you. Those are the signs, basically: just a change in behavior.

[00:14:29] Fonz: And, you know, that's very interesting, because that's something that does come up in the talks that I have with parents. Many times they may think, well, you know, it's just puberty, it's just the age, they're in that awkward stage, and they start isolating themselves.

[00:14:47] Fonz: And I always just say, you know, if there's a sudden change, that is something that should be noted, and you should start asking and just doing the parent thing: observing, checking in, are you okay? Noticing some of those behaviors. And like you mentioned, you mentioned Megan Garcia, and that's something that I did bring up

[00:15:09] Fonz: with our parents when we had our meeting this past year, I think it was the November meeting, talking about how easy it is to access these chatbots on your devices, on computers, and how easy it is to even open up an account. And so I played a clip from when Megan was getting interviewed, where she mentioned that move fast and break things should not be something

[00:15:34] Fonz: that is done when it deals with students, and especially the lives of children. So going into that, through your work and what you've been doing, and I guess in your opinion, what are some things that these companies, these AI companies that are putting out these chatbots and so on, could do better than just move fast and break things?

[00:15:58] Fonz: If you could have that CEO or CFO in front of you and say, hey, this is what needs to change, what would be some of those things that you would ask to be changed?

[00:16:12] Angeline: I would ask them to have their products looked at and created together with experts like psychologists and psychiatrists, behavioral experts, even teachers, because they're largely left out of the discussion, and this would already be a big step forward, right?

[00:16:33] Angeline: I mean, I recently learned about the existence of grief bots. When I found out about this, I was speechless for 20 minutes. For people who don't know, these are AI chatbots that are actually created as a copy of a person who's passed away, and they're supposedly for grief. But the psychologists that I know are like, obviously we weren't involved in this, because this is extremely dangerous and risky, right?

[00:17:05] Angeline: The way it's being done, especially towards kids. So this is what I would ask: can you just get non-technical experts to assess your product for whether it's safe or risky? This just needs to be done more, across different industries and expertise levels.

[00:17:29] Fonz: No. And that's something that is very powerful that you mentioned.

[00:17:32] Fonz: I did have a guest a couple of months ago, Dr. Sonia Tavari, who was on here also, and one of the things that she mentioned was how important it was for her that you have co-creation of these applications, not just with your end goal in mind of getting people on the app and keeping them on the app at any age level, but also, if it's something that's supposed to be used by young adults or children for that matter, that they do get that feedback.

[00:18:06] Fonz: And so for me, what I see many times is the influencer market. You know, you get people that have a heavy following, and they get used: hey, we'll give you our product, or we'll pay you this much to promote our product. And really, sometimes it's like, well, are they even taking into account the privacy, the data, the dangers that might occur, or is this just

[00:18:32] Fonz: simply a paycheck for them, and I'm just going to put it out there without any regard to their own personal beliefs or views or anything. It's just like, hey, this is what I do, I'll just share it out there. But I do believe that there is something that's very important, and that's making sure that everybody is at the table, because it kind of brings it back to the ethics of it.

[00:18:53] Fonz: And as far as ethical use of AI, and going into the different biases and the outputs and the uncertainty of those things, I mean, just getting more people involved in giving that feedback, I think that's something that's fantastic. And obviously we talk a lot about guardrails. Now, my big viewpoint

[00:19:13] Fonz: has always been, how can you put a guardrail on something that you don't own? Because a lot of these applications are plugging into a system, that large language model, and they're pulling that data from there. So if you don't own that, yet many companies say, oh, well, we're putting in guardrails and these safety rails, and I'll hear it in all the education platforms.

[00:19:35] Fonz: Well, we've got guardrails in place. I'm like, but how, if you don't own this? Is it just somebody putting in code that says, if this, then don't do this? And that's your guardrail? I don't think that's very safe at all, or ethical. What are your thoughts on AI ethics, and, in this case, what could these companies do

[00:20:01] Fonz: To improve that.

[00:20:03] Angeline: Well, I think exactly as you said, these companies overestimate their ability to control things, and that's giving them the benefit of the doubt. Yeah, giving them the benefit of the doubt that they honestly believe that what they're putting out there can be controlled.

[00:20:23] Angeline: Then they need to trust, you know, that there are people on the other side. And I think part of the problem is actually that the AI industry, the creators, are largely in a little clique. And I sometimes feel, and I'm probably pretty right, I don't know them, so I'm just saying, they probably live in their own little world in San Francisco or something.

[00:20:53] Angeline: And honestly, I have no idea what their kind of distorted reality is; it's just what I hear them talking about, creating new beings or some strange things or religions. So, yeah, I would tell them: talk to normal people, go see where people spend time outside of Silicon Valley. And I do believe, going back to something you said before, that in the end, and I don't know how long this end is going to be, and sometimes it's hard to keep believing this, but in the end the winner will be the one that brings the most people to the table, because the more information AI has, the more useful it's going to be.

[00:21:40] Angeline: I work with a lot of people from Africa, and I have yet to see an AI system, I would love someone to show me one, that can produce a non-biased image of an African. It's gone so far that I had to ask my African partner, can you just send me pictures of Africans, because I can't trust that any system I use isn't biased.

[00:22:08] Angeline: I'm giving this as an example because it's the one I know best, but in general, the AI system isn't going to be useful for all those people. And it's not going to be useful for students, for example, who need to learn about Africa, if it hasn't been fed proper information about the continent and the countries in the continent.

[00:22:30] Angeline: So the winner is going to be the one that figures out, I have to bring more people to the table so my system is really fair and useful for more people.

[00:22:42] Fonz: And I agree with that. What you said really brings to mind, recently at a presentation, or actually at a conference, I should say, myself and Dr. Nika McGee, big shout out to her.

[00:22:55] Fonz: She's fantastic also, and she's a cautious advocate of AI, and she's out there also spreading the word. We did a presentation together, because here in the state of Texas they are slowly rolling out the use of AI for grading constructed responses, or short essays, as opposed to using manpower

[00:23:16] Fonz: to read through these essays. Obviously it would take a lot of time to do that in person with more people, so now they're saying, okay, we're going to do a small percentage just to kind of test it out. And going back to what you were saying, so for example, an AI model being used in Africa and an AI model being used here, I know that

[00:23:38] Fonz: even today, when I've gotten into some of the image generators, you put in, you know, show me just a janitor, and you get a certain look; for doctors, you get a certain look; for a lot of things. And I'm like, wait a minute, this is very unusual, this is very weird. And so by country, even, it's like, how are they perceiving us?

[00:24:04] Fonz: Like if they put in an American, you know, what does that look like to them as well? So going back to that, it's that information: is it accurate information? And that's kind of very scary too, because even when you use an image generator, where it'll be like, hey, put yourself in here or put in a prompt, and I describe myself and I'll put

[00:24:24] Fonz: there, you know, Hispanic male, every single output that I get for Hispanic male always gives me a beard or a mustache, and it makes me look a little bit bigger, a little bit more filled out, a little bit more robust. So I'm like, this is very interesting. As you're putting in these prompts, you know, there still needs to be a lot of work done with this. But the fact is, people around the world, educators especially, are like, oh my gosh, this is the greatest thing in the world, because now we can do this quickly.

[00:25:07] Fonz: Now I'm able to do this in 20 seconds. But my biggest concern is, yes, it can do it in 20 seconds, but how accurate is it if it's just statistically predicting the next word? The other thing is the knowledge cutoff date, which is something that we brought up there at that conference too, because there are a lot of applications that teachers are using and that are being purchased for teachers.

[00:25:32] Fonz: And in the terms of service, it'll tell you the knowledge cutoff date is 2023. We are already well into 2025, so how accurate is this going to be if the data there is from 2023, and the state standards have now been updated for a lot of our content areas, here in Texas at least? Those are a lot of the things that I know many people don't look into, and maybe they just want to turn a blind eye, because they're like, oh, the magic, the shiny object that's going to create my lesson for me, and I'm done.

[00:26:10] Fonz: And that's what really concerns me too. So kind of touching on that a little bit, like what we were talking about a little earlier, those that bring more people to the table. It's almost like we're in an AI race, and it's definitely a competition, with everybody just all hands on deck, go, go, go.

[00:26:36] Fonz: So I want to ask you, looking ahead, in your perception and in your experience, from the lens of the world that you live in, which is Data Girl and Friends and all the amazing people that you're connected with in your network, how do you envision AI?

[00:26:55] Fonz: Do you envision it as a force for good, maybe 10 years from now, or are there many more pitfalls coming that we should be worried about?

[00:27:06] Angeline: I try to be positive. I need to be positive. I need to believe that the good is possible. AI can be a force for good; it can be used well.

[00:27:19] Angeline: It doesn't look like it's necessarily going in that direction right now, because of exactly these massive problems. You know, we were discussing before the image generators, the ones that create pornography. Kids are obviously interested in this, so they use it, they create it.

[00:27:41] Angeline: They don't understand the weight of what they're doing. So, all sorts of things. And also these AI relationship chatbots; they're completely overwhelming and influencing, especially if you give them to young kids. I was talking to, I think it was Megan, who said that she met someone whose daughter had had her first relationship, with an abusive AI chatbot boyfriend, at 13.

[00:28:11] Angeline: So this is a person whose first relationship was that. I mean, this influences their whole world going forward. So there's a lot of reason to be negative about it. But on the other hand, the world I'm trying to create is one where tech connects us all over the whole world in a way that we've never been connected. They've figured out

[00:28:35] Angeline: how to make one product and sell it in the entire world. What we haven't figured out is the other side of it, right? Can we take this connection that tech gives us and push together for responsible tech? Because as individuals, and also as individual countries, we're not really achieving as much.

[00:28:58] Angeline: But if collectively we can do that, then AI can help in really a lot of ways. It can help us to be very efficient, and it can help us to be more creative. It can help us to know each other better. And I need to call out Bill Schmarzo, because he's the first person I heard say this: that AI can help us

[00:29:22] Angeline: to be more human, and I think that's what's most important. Some people hate that statement and some people like it. I like it because there are things that we can do as humans that AI, I don't want to say probably, won't be able to do: be sentient, understand emotions, understand context, real context, life experience.

[00:29:47] Angeline: And if you use AI, then you understand which parts of you are uniquely human, and kids can learn that from a younger age, right? They actually should understand: who am I, what makes me unique, what kind of person am I? Because if you're using AI, and they are, and you don't know who you are, then you can more easily be influenced.

[00:30:11] Angeline: And this is something that kids can learn earlier, and then you're actually going into the world stronger, because you know yourself better. So that can be a positive outcome of AI, but we have to be more intentional with it. We have to kind of force that use, because, as you say, the tech companies obviously

[00:30:30] Angeline: have billions in funding that they have to get a return on, so they're going to go for the money first.

[00:30:39] Fonz: Absolutely. So I want to turn the conversation over now to talking about the SHIELD Global Online Safety Conference. This is something that I saw recently posted on LinkedIn.

[00:30:51] Fonz: I have already signed up for it as well, and just looking at the list of speakers, this is going to be an interesting conference. So can you tell us a little bit more about it? First of all, if you can give some background, how did this conference idea come about? And then tell us a little bit about what the goal of the conference is and why people should sign up for it.

[00:31:14] Angeline: So the idea came about just after a year of being in this space. I met some amazing people, a lot of amazing people. This online safety and responsible AI community that I have somehow built on LinkedIn is just amazing. It's full of what I call individual warriors, really passionate people.

[00:31:40] Angeline: A lot of them are individuals or small companies, small organizations fighting to survive, making a real difference. And I was thinking, these people could actually achieve a lot more if they were working together, if they knew each other more. So I said, let's do a conference.

[00:32:01] Angeline: And I talked to a few nonprofits that I work with: will you support me to do this conference? This was in November, and the timing is urgent, so I said, I'm going to do it in three months. In three months, we're going to do this conference. And we talked about it with the partners, and also with Andy Briarcliff, who has been a lot of support as well.

[00:32:27] Angeline: He's been in the space for a lot longer. How were we going to define it? We decided to be very general: we're going to call it an online safety conference, and it has to be global, because, as I said before, we have to work together more. And we just put it out there to see what would come back.

[00:32:47] Angeline: What are people interested in? Who wants to talk? And we got this massive response; so many people, so much energy came back. I was just putting out the message: we're stronger together, stronger together, we have to know each other. And it was like every day something would come in, and I'd say, I can't believe this person is speaking.

[00:33:09] Angeline: I can't believe this person wants to speak. Ever since I heard of the existence of AI data labelers, I always wanted to meet an AI data labeler or a content moderator, and there was a Facebook content moderator from South Africa who contacted us and wanted to speak. And I'm like, yes, that's exactly it. So all sorts of people, from 16 countries, different ages, across the spectrum of different topics and experiences, are going to talk.

[00:33:48] Angeline: And what's important is that we did not go for any influencers.

[00:33:58] Angeline: We don't have, you know, the keynote speaker who is going to bring in the audience. No, we want to hear from the people who need to be heard, and it's quite unique. We also made the conference 12 hours a day, so that people from all over the world can speak in their time zone. And we made it free and fully online, because then the barriers to actually attending are gone.

[00:34:30] Angeline: Because obviously a lot of university students, people in poorer regions, or people like me, I'm actually in Europe, so it's Saturday afternoon here, would love to attend conferences in the U.S., but it's a really long way and really expensive. So we're like, no, we want to have the voices who want to speak, and we want anybody to be able to listen.

[00:34:58] Angeline: Yeah. So that's, [00:35:00] that's how it came about.

[00:35:02] Fonz: Well, that's wonderful. And, you know, looking at the lineup, there's definitely some amazing, amazing speakers and people that I actually follow on LinkedIn too, as well. Like I said, I'm a follower of your work and everything that you're doing, because I love what you're doing and your mission.

[00:35:16] Fonz: And so this is definitely something that's going to be worth the view. And like I said, I've signed up for it, so I'm really excited to gain some more knowledge and different perspectives and different lenses from people in other countries who maybe are like-minded but are seeing things differently, or may have a different perspective.

[00:35:33] Fonz: And like I said, for me, it's always about looking for something different, something to think about that may change my perception on many things. And so this is an exciting opportunity for everybody to sign up. And I know the conference starts February 19th, so there's still time to sign up.

[00:35:50] Fonz: Correct? You can say, yes?

[00:35:51] Angeline: Absolutely. Yeah.

[00:35:53] Fonz: Perfect. Excellent. So for all our audience members that are checking this episode out, please make sure that you check out the link [00:36:00] in the show notes also as well, it'll be there. We'll definitely be posting it on LinkedIn too, and all our socials to make sure that you sign up for this, because this would be a great event for you to learn more and see things from different perspectives and different lenses.

[00:36:15] Fonz: And of course, like I said, it's only going to nurture our growth within this space, and help us see how, as a collective, we can improve this space as well. So thank you for sharing that. Now, Angeline, before we wrap up, I just want to ask you about your projects: what are some of the things that are in store for the future, maybe for Data Girl and Friends?

[00:36:41] Angeline: Well, for Data Girl and Friends, as I said, I create the content, and I realized early on that if I were to actually try to sell my content, then I wouldn't have the creative juices anymore. So I'm really building out partnerships with amazing organizations who maybe have the ideas and the knowledge but not the medium to bring it about, or who have the schools, classes, and parents who could use the content.

[00:37:14] Angeline: So that's what I'm building out with Data Girl. That was the original idea for the conference, to help with that, and luckily it has grown into much more. I'm also working on some online courses that will be ready soon. Basically, the whole concept, which we didn't really speak about, is that

[00:37:37] Angeline: I think that kids and teens should have clear, basic knowledge, something like, you can't drive a car without a driver's license, so you actually shouldn't be using a device without basic safety. Just really basic, like you've learned this, you've heard this at least once. So I'm working on some online courses on this that will be ready soon, and I'm really excited

[00:38:05] Angeline: for when those are ready. And also, the SHIELD conference is actually just a kickoff. We are going to do a yearly conference, but it's really intended to build collaboration. We're also going to do working groups and smaller meetups and conferences. The idea is that it will be the platform to help people come together.

[00:38:31] Angeline: So that's another reason to come, even if you can't be there live. I know I picked a week when a lot of people are on vacation; I didn't know, because when I left the U.S. there was no vacation in February. So, yeah, just sign up so you can listen to the recordings afterwards and be a part of the movement and the community going forward, because there will be other meetups and other more specialized conferences.

[00:39:00] Fonz: Wow. Well, that is fantastic, Angeline, and thank you so much for joining me today, taking a little bit of time out of your day to share your passion, your mission, your vision, and for getting people excited about the SHIELD Conference as well. So again, for all our audience members, make sure that you check out the episode notes, because all the links will be there, and the conference is coming up really quickly.

[00:39:24] Fonz: It'll be February 19th. So if you're watching this, uh, you know, please make sure you click on that link, sign up and check out all the amazing speakers and just to help us learn more. And obviously now that we're hearing from Angeline too, that this is something that is a community that's going to continue to grow.

[00:39:39] Fonz: Maybe in the near future there will be a meetup or a conference around your area, but it's just something great to be part of, something where you can find like-minded individuals and folks coming together, like I mentioned, just to continue to nurture these conversations and continue to grow together.

[00:39:56] Fonz: So Angeline, thank you so much. I really appreciate your time being here. But before we wrap up, we always end the show with our final three questions, and I always give my guests those ahead of time, so hopefully you're ready to answer them, or at least had a little bit of time to think about them.

[00:40:14] Fonz: So question number one: every superhero has a weakness. For example, Superman and kryptonite; it kind of weakened him a little bit, it was a pain point for him. So I want to ask you, Angeline, in the current state of, I guess we'll say AI, or it could be education, it doesn't matter, what would you say is your current kryptonite?

[00:40:39] Angeline: My current kryptonite: I'm a creator, and I am not good at selling my creations, in the sense of even getting them out there and approaching people with them.

[00:40:53] Angeline: That is my biggest pain point, because, as you say, I have a lot of ideas and a lot of creative juices, and I create things that a lot of people say are nice, but I'm not good at getting them out there, which is a big problem, obviously. So I think it gets out there slowly through other people, but it could be a lot faster, more efficient, and more useful if I were better at

[00:41:18] Fonz: that.

[00:41:19] Fonz: There you go. All right, that's a perfectly great answer. Just like we were talking about earlier, maybe a little bit of that imposter syndrome, because I suffer from it too. You know, I have great ideas, but just getting them out there seems a little bit difficult many times. But yeah, that's a great answer.

[00:41:36] Fonz: Thank you so much for that. All right, here's question number two: if you could have a billboard with anything on it, what would it be and why?

[00:41:47] Angeline: The billboard would be: every individual can make change, but the individuals need to work together.

[00:42:04] Angeline: In the sense that it's an individual thing, but it's about standing together. That's what it would be. And the why is simply because we often feel powerless, for all sorts of reasons. I mean, there are tech billionaires; recently they were all standing behind the American president when he was being sworn in, and then jetting off to Europe and managing to get regulations changed overnight.

[00:42:32] Angeline: You can feel powerless, but we're not powerless. And if you look back in history, this tech world that has all sorts of dangers is new. It's new. A really short time ago, our world was different, and it can change again. So we can insist: no, it doesn't have to go in that direction.

[00:42:56] Angeline: And it's going to be individuals making individual decisions that can make that happen. That said, if you find other individuals who are on the same path, then you have more inner power and inner strength.

[00:43:16] Fonz: Great answer. Thank you, Angeline. And the last question to wrap up our wonderful conversation: if you could trade places with one person for a single day, and it could be anybody, who would it be and why?

[00:43:32] Angeline: I don't know if I would really want to do this to myself, but I would like to put myself in the middle of those Silicon Valley conversations for one day and kind of figure it out. Maybe all or some of these things that I hear them say would make more sense if I spent a day there, understanding what they were doing, how they were spending their day. I probably wouldn't understand many of their conversations, but, you know, at least I'd get a feel for it.

[00:44:02] Angeline: So I think that would be it. Yeah. Just to really understand better this whole other side of the

[00:44:08] Fonz: AI tech universe. That's a great answer. Well, Angeline, thank you so much again for spending a little bit of time with me on this wonderful day, and thank you so much for all you shared. And for our audience members,

[00:44:22] Fonz: like I mentioned, there's a conference coming up, the SHIELD Conference, so please make sure that you check our show notes for that link. That way you can go ahead and sign up, and also find these wonderful speakers on LinkedIn as well. Follow them on socials, because they are putting out some amazing things that really help you learn more, but also make you stop and think, and that's the wonderful part about it.

[00:44:45] Fonz: It's not all about going fast and breaking things. You can go fast, but then also take a pause and really reflect on some of those things that maybe we're already coming in with our own perceptions about, and this would be a great way to continue those conversations. So for our audience members, make sure you check those out.

[00:45:02] Fonz: And also, please don't forget to check out our other 312 episodes, where I promise that you will find something just for you that you can sprinkle onto what you are already doing great. So make sure you go over to our website at myedtech.life, and if you haven't done so yet,

[00:45:19] Fonz: Make sure you follow us on all socials. That way you can keep in touch with us, but also see all the wonderful guests that are coming on through the show, the wonderful conversations, and then that way you can go ahead and just get a little glimpse of the amazing, amazing work that is being done. So thank you so much.

[00:45:37] Fonz: And as always, my friends don't forget, stay techie. [00:46:00]

 


Angeline Corvaglia

With my experience of living in six different countries, I have been exposed to a wide range of cultures and mindsets. My journey began in the world of arts, which laid the foundation for my creative approach. This background proved invaluable as I ventured into the financial sector and later information technology. In these industries, I have thrived by fostering environments that value individual contributions, flexibility, and resilience.
However, the highlight of my career has been my transformation into a digital changemaker since I started my activity Data Girl and Friends. A personal passion for understanding the nuances of online safety and artificial intelligence and its implications for the next generation drives my dedication to this field. This commitment has led me to become an advocate for AI awareness, particularly concerning its influence on our everyday lives. By continuously seeking knowledge and education in this area, I not only have shaped my professional skills but also have contributed meaningfully to discussions on technology and society.
I consider educating and empowering others about the skills needed to be safe and responsible in an AI-driven digital world my most important contribution.