Leveraging AI to Protect Children #167 | Embracing Digital Transformation | Intel Business


Hello, this is Darren Pulsipher, chief solution architect of public sector at Intel, and welcome to Embracing Digital Transformation, where we investigate effective change leveraging people, process, and technology. On today's episode, leveraging AI to help protect our children, with returning special guest Rachel Driekosen. Rachel, welcome back to the show. Thanks, Darren. So glad to be back. Rachel, you and I have been working together for four years now, and it's been wonderful. We've gotten to work on a couple of projects together, but one really is dear to our hearts.

It's around protecting children online and protecting children against child predators. It's dear to both our hearts, and we've done a lot of work in this space. You've recently written a paper on the state of things here, so we're going to talk a little bit about that today, and I'm very excited to share some updates. So, Rachel, before we get started with that deep topic, tell everyone a little bit about yourself and your background. It's been a while since you've been on the show.

It's been a couple of years, so people want to hear a little more about what you've been doing and how you got to where you're at. Oh boy, I don't know if people actually want to hear it, but I'll tell you anyway, since you asked.

I'm currently a technical director for our state, local, and education practice here at Intel, in the sales and marketing group. The charter there is to explore options for scaling with our partners: new and innovative solutions alongside movements in the policy and regulatory area. So we're acting not just as a technical instrument, but also as a bellwether for things to come, helping our procurement agents and our strategists in state, local, and education think a little more future-proof as they're selecting architectures and options for their digital infrastructure. Well, you had an interesting way of coming to public sector. You did not start in public sector at all.

In fact, being from Michigan, where would you guess you would have started? Of course, in automobiles, right? Yeah, that's where I got started. I was born and raised in Michigan, and I did a brief jaunt down to Texas, but it was too hot, so I came back specifically to work in automotive. I did embedded automotive sales, both OEM and Tier 1, and aftermarket stuff, just putting those black squares on the green board and making sure that, hey, we're doing really cool stuff with the latest and greatest capacitors and power management. Then I slowly crept into Intel because I wanted to work on memory and storage for autonomous vehicles. And that's how I started to get interested in the policy piece of technology.

And then some fine gentlemen in California released something called Autopilot and called it Full Self-Driving, and people started dying because they didn't realize they still had to keep their hands on the wheel and stay awake while driving their car. So who's responsible when an autonomous vehicle crashes, and what safeguards should be in place? How do we make sure the correct policies are structured so that they don't stifle innovation but actually keep people safer, and that we're demonstrably using tech in a thoughtful way? That's how I got started thinking about the impact of regulation and policy. And I got super lucky that this role on the public sector team came up; four years ago I came over, and ever since I've been having a blast.

Well, and you've taken this whole idea of how policy can influence technology to the next level, because you went back to school. Yes, exactly. I am currently pursuing my Master of Science in Law at Northwestern, and I cannot say enough good things about this program. The tagline is the intersection of business, technology, and law, and that might as well be on my resume. That's what I want to work on.

They have a couple of different options: you can do it in person, part-time online, or hybrid. I have chosen to go the part-time online route, and I just pick from this gorgeous menu of classes: biometric privacy policy, which I just got through; responsible data and artificial intelligence, which I took in January; visual communication. It's an excellent, excellent program.

It's really allowing me to quench my thirst for all these different areas that might be applicable to our future. Well, what I find interesting is that you've used that passion you have for policy and technology to tackle the really tough problem we're talking about today, and that is not just the prosecution, but also the discovery and investigation of online child predators. Absolutely. And surprisingly, when you brought me into this, I was shocked at how disorganized the industry is. You can't really call it an industry.

It's just that the jurisdictions in our governments worldwide are completely discombobulated when it comes to technology and online crimes. Absolutely. I mean, when I brought you into this particular project, that was in 2021, so about two years ago at this point, I was not sure what the actual Intel play might be at the time. But the most important part, in my mind, is technology companies leaning in alongside law enforcement to not just come up with the technical solution, but come up with uniform language across different states and different municipalities, on a national level and on a global scale, so that the actual prosecution and investigation are not stymied because of a technicality in the language.

That's what I found most interesting: the technology exists to do some of the things they needed, like chain of custody for digital evidence, for example, managing that evidence. I thought, oh, surely the technology is there, so why haven't we done this already? Yeah, exactly. Well, it turns out that the technology is there, but the understanding of how to use that technology differs in every single jurisdiction, even within the same state.
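As a rough sketch of what chain of custody for digital evidence can mean technically, here is a minimal hash-chained log in Python, assuming a simple append-only design where each entry commits to the evidence hash and to the previous entry, so any later tampering breaks the chain. The class and field names are hypothetical, not taken from any real case-management product.

    import hashlib
    import json
    from datetime import datetime, timezone

    def sha256_hex(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    class CustodyLog:
        # Append-only chain-of-custody log. Each entry is hashed together
        # with the previous entry's hash, so altering any earlier record
        # invalidates every later link.
        def __init__(self):
            self.entries = []

        def record(self, evidence_bytes: bytes, handler: str, action: str):
            prev_hash = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
            entry = {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "handler": handler,    # who touched the evidence
                "action": action,      # e.g. "imaged drive", "transferred custody"
                "evidence_hash": sha256_hex(evidence_bytes),
                "prev_hash": prev_hash,
            }
            entry["entry_hash"] = sha256_hex(json.dumps(entry, sort_keys=True).encode())
            self.entries.append(entry)

        def verify(self) -> bool:
            # Recompute every link; an edit to any field breaks the chain.
            prev = "0" * 64
            for e in self.entries:
                body = {k: v for k, v in e.items() if k != "entry_hash"}
                expected = sha256_hex(json.dumps(body, sort_keys=True).encode())
                if e["prev_hash"] != prev or e["entry_hash"] != expected:
                    return False
                prev = e["entry_hash"]
            return True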

Oh, yeah. And I was blown away. Yeah, something I did not understand at all. I think that was pretty mind-blowing for me too: depending on which state you're in, the evidence-handling requirements may be dictated by one house, one judge, one party. And I've got to be careful what language I use when I'm talking about law enforcement and regulatory stuff.

"House" has one meaning, but depending on where you're at, the meaning of the same word can be entirely different. It can be the difference between a slap on the wrist, a "don't do that again," and ten years in jail. Same word, different meaning.

So it's really important, as you're trying to solve problems in this arena, that language and awareness are part of the campaign. I love how you brought awareness in there. I think awareness is probably one of the things we overlook the most when we're trying to deploy new technologies or new solutions into a space we haven't worked in before. So how do you go about doing that with something like evidence management, and I don't even know if that counts as an industry or a jurisdiction? Say more about that. Well, I'm just wondering, because when we tackled this, we were tackling the problem of how do I handle evidence of child pornography? Because it's illegal to look at child pornography. Oh, yeah, and it's illegal to transport it.

There are lots of rules around it then. Good. Yes, but how do I handle this evidence, this digital evidence?

How do I make sure it's not tampered with? How do I make sure the victims are not revictimized by the material getting loose? All these problems arise. But how do I teach awareness to law enforcement and to the judicial part of our governments about what is available out there today that will protect people? That, to me, was the hardest part: the awareness. So how do we go about doing that broadly? Because, like you said, a single judge can make these determinations. Yeah, exactly.

I mean, it's very hard. It is very hard, and I don't have a solution for that yet. I have lots of speculation, and the research I've done so far has indicated that there's really not an easy way to build that awareness. There are still challenges around differences in judicial philosophy that would say, hey, we're going to look at how the original language was structured, and we're going to ignore anything that's trying to be said right now. So unless we've got active interest from that judicial system and from the prosecutor's office, we're stuck. They've got to have the willingness and interest to solve the problem, and they've also got to have the bandwidth and the resources to address it, which is something I go into a little bit in the paper that we're referencing today.

There's a statistic in there: in 1998, when the CyberTipline was set up by the National Center for Missing and Exploited Children, they said, hey, please, if you are seeing any incidences of child pornography, report it here. And in 1998, they received north of 4,500 reports. As technology evolved and advanced, as people became more aware of that reporting line, and as the perpetrators of that illicit material also got smarter and were able to move faster because they weren't encumbered by red tape, that number ballooned to 32 million in 2022. So do you think that's due to the ease of reporting, the awareness people have that they can report something?

And also the bad guys are generating more of it. Yeah. I mean, there's a lot of repeat content, right? Content that's repurposed, and that's where AI tools are in use right now.

That's part of why it's 32 million: these AI tools deployed by IWC or Facebook or Google or whomever else, they're on public platforms. They've got the ability to automate the scanning of some of this content now, so they're able to automate those reports to some extent.

But once it hits the folks who need to validate it, the folks on the law enforcement side, they need to say, oh yes, this is what I thought. This is child pornography. This is an illicit image of a minor, not just, you know, Mom taking a picture of her toddler in the bathtub, or consensual sex among adults, which, I don't want to see it, but it's perfectly legal.

Once it hits the people who need to validate it, those tools are, for one, technically not mature enough to handle that task. There's got to be a human in the loop, and that job is traumatizing. So, yeah. Do you think we'll get to the point where AI will be able to do that more readily and react faster? Because once something's up there and been reported, it could be weeks or months before any investigation.

Yeah, absolutely. That site could have been taken down or moved on. I mean, the list goes on and on. Oh yeah, for sure. Keep going. Sorry.

So I'm just guessing: are the AI tools getting closer now to where they can turn things around faster without a human in the loop? Can we train the AIs to do a better job? I have a two-part answer for that. One, I don't ever want no human in the loop when it comes to making a decision about whether or not a person goes to jail. I think that's going to continue to be the case. Oh, thank you for saying that out loud, Rachel.

That's way too scary, and that's not appropriate. Whether or not the tool is flawless, we need a human in the loop to be accountable for somebody's livelihood, for whether or not someone goes to jail. That has got to be a piece of it. So the question is not whether the AI can take over, but how much faster the AI can make it for us. How much can AI protect the analyst who's got to go through and look at this stuff and verify how harmful it is? How much can the AI actually say, okay, that looks just like that other thing, so I'm going to blur it and indicate that it's pretty similar to this one? Or, hey, this has the same fingerprint as one we've already seen. It's the same.
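As a rough sketch of that "same fingerprint" idea, here is what matching against a database of already-reviewed images can look like. Real systems use perceptual hashes such as Microsoft's PhotoDNA so that re-encoded or resized copies still match; this minimal sketch uses a plain SHA-256 digest, which only catches exact duplicates, and the function names and dummy hash are hypothetical.

    import hashlib

    # Hashes of images that analysts have already reviewed and classified.
    # In practice this would be a large database of perceptual hashes,
    # not an in-memory set of exact digests. The value below is a dummy.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(image_bytes: bytes) -> str:
        # Exact-match fingerprint. A perceptual hash (PhotoDNA, pHash)
        # would instead tolerate re-encoding, cropping, and resizing.
        return hashlib.sha256(image_bytes).hexdigest()

    def triage(image_bytes: bytes) -> str:
        # Route previously seen material away from human review, so analysts
        # spend their time (and trauma) only on genuinely new content.
        if fingerprint(image_bytes) in KNOWN_HASHES:
            return "known: attach prior classification, blur by default"
        return "unknown: queue for human analyst review"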

Right. But how much can we prevent the revictimization of those children? How much can we speed through that process and reduce the trauma incurred on the extremely brave people who are investigating, who are also being vicariously traumatized by going through this kind of material? So it just crossed my mind: at the same time that we can improve these tools that do a better job at classification, are we going to start seeing AI-generated images? Oh, yeah.

Yeah. And is there any law? Have you read any policies or laws around the argument that, well, it's not a real person, so it's not a child?

That's an interesting question. I saw something out of Louisiana earlier this year that was along those lines: hey, if it's an image of a child, it's still a child. But I think that one's still to be determined, because it's the old question of whether a tree falling in a forest with no one around makes a sound.

If someone creates child pornography that is not actually of a real person, is there any harm? I refuse to comment on that, because it's just disgusting. Yeah, no, I agree.

But you can see how AI can now be used in both of these cases. So what do we do? Do we legislate that? Do we regulate it? Do we educate? Well, I think there are a couple of things to look at, because you can only legislate in one spot.

You could legislate federally to try to accomplish prevention of harm, and then everybody here says, oh well, we'll just move our stuff to Russia, because CSAM, child sexual abuse material, is decriminalized in Russia, so let's just move it to those servers. So unless there's a uniform agreement, and I don't know that we ever get a uniform global agreement on the language of what is permissible and what is not, I think we're going to be hamstrung for a long time. But is there any movement toward getting global policy or global laws on this?

So there's this organization, INHOPE, that I became aware of via one of the partners we're working with on our project. They just launched a universal classification schema for categorizing different images, and the intent is that it gets fed into artificially intelligent models so that they can assist along this investigative path.
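To make the idea of a shared classification schema feeding AI models concrete, here is a minimal sketch of a common label vocabulary as structured data. The categories, fields, and values below are illustrative assumptions, not INHOPE's actual Universal Classification Schema.

    from dataclasses import dataclass
    from enum import Enum

    # Illustrative labels only; INHOPE's actual Universal Classification
    # Schema defines its own categories and definitions.
    class LegalStatus(Enum):
        ILLEGAL_UNIVERSAL = "illegal in all participating countries"
        ILLEGAL_SOME = "illegal in some jurisdictions"
        NOT_ILLEGAL = "not illegal"

    @dataclass
    class Classification:
        schema_version: str        # lets models reject labels from other versions
        category: str              # drawn from a controlled vocabulary, not free text
        legal_status: LegalStatus
        assessed_by: str           # hotline/analyst identifier, for auditability

    # Because every hotline uses the same controlled vocabulary, labels from
    # 100-plus countries can train one model instead of many incompatible ones.
    example = Classification(
        schema_version="1.0",
        category="category-A",     # hypothetical placeholder value
        legal_status=LegalStatus.ILLEGAL_UNIVERSAL,
        assessed_by="hotline-NL-042",
    )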

I think they're headquartered out of the Netherlands, which used to have the largest concentration of hosted child pornography, not on purpose, but just because of the language in the laws; now it's different. So between the organization in the Netherlands and the US being a founding partner, there's absolutely a global effort. There are 100-plus countries that have analysts tied directly to INHOPE, that's capital I, capital N, HOPE, more than a hundred countries involved in trying to conquer this problem, because, again, it's a global problem.

So that's really exciting. And the technology, like you said, because AI is so much more advanced now, I can pass these normalized classifications, which will be changing, into AI models so that a majority of the work can be done by them, hopefully.

Yeah, a majority of that work can be done. Certainly content can be classified and identified, and even, in some cases, if the language will allow, the AI can triangulate locations based on repeat images and associate them with known perpetrators. There are real possibilities for what can be done, because right now it's like trying to shovel the sidewalk during a blizzard. The possibilities when we take full advantage of these tools, it's like being able to finally take a breath and clear that sidewalk a little bit so people can walk. Are you starting to see organizations like Interpol, the FBI, and the DOJ starting to work together technology-wise? Because these child pornographers, they're spanning the globe.

Right. And like you said, they're just going to move to another server or whatever. Are we starting to see these global police organizations work together, collaborate, and share technology to help catch these perpetrators? Absolutely. I mean, absolutely.

We're seeing the interest, the head-nodding, the ambition to do it, still faced with the challenge of a lot of ingrained process. Everybody acknowledges it's a big problem. Interpol, all these organizations, are trying to work together, band together, to get something done. So that's definitely happening. That's exciting. There are just a lot of hurdles around who's on first, who gets to dictate what the definitions actually are, and who's going to pay for it. Because after all, I don't want my tax dollars being spent on something we're not sure is going to work.

There's certainly politics around defunding the police, which is heartbreaking, especially for issues like what we're talking about. They're putting their heads together to come up with a solution, but who is going to execute, and who's going to pay for that execution? That really concerns me.

Yeah. So let's talk about the technical areas. Are there any technical barriers you're seeing with these organizations working together and coordinating, or is it all a political, policy, and process problem? Well, a lot of them are political and process problems, and if you'll permit me, I'll give one more example of that before I go into the technical. Yeah, yeah. Apple had announced, about 18 months ago, that they were going to do CSAM detection in iCloud, and 18 months later they said, never mind, we're not doing this, because the technology's not quite there.

Or, excuse me, the technology is close enough that we could potentially detect and flag things. But after a couple of incidences of people having their accounts flagged for pictures they were taking to send to their doctor, for example, of their three-year-old son having a rash in his groin, the public outcry was: stop, I can't lose my entire digital life. You can't lock down my account because there was a false positive.
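A quick back-of-the-envelope calculation shows why false positives dominated this debate: when the target content is extremely rare, even a very accurate classifier flags mostly innocent material. The rates below are made-up illustrative numbers, not Apple's actual figures.

    # Illustrative numbers only; not Apple's actual figures.
    photos_scanned = 1_000_000_000    # photos scanned per day on a large platform
    prevalence = 1e-7                 # fraction of photos that are actually illicit
    false_positive_rate = 1e-4        # classifier flags 0.01% of innocent photos
    true_positive_rate = 0.99         # classifier catches 99% of illicit photos

    true_hits = photos_scanned * prevalence * true_positive_rate
    false_alarms = photos_scanned * (1 - prevalence) * false_positive_rate

    precision = true_hits / (true_hits + false_alarms)
    print(f"true hits:    {true_hits:,.0f}")      # ~99
    print(f"false alarms: {false_alarms:,.0f}")   # ~100,000
    print(f"precision:    {precision:.2%}")       # ~0.10%: almost every flag is innocent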

So I guess that is a technology issue; the technology actually has to be better. Yeah, that's a technology issue then. But it's hard for an AI to discern the difference. Yeah, right.

So there are still some limitations on the technology side, it sounds like, on detection. Yeah, and classification, certainly. What about... well, okay.

What about the prosecution side? The big hang-up we ran into there was not a technical one; it was the education of politicians and judges on what's possible for protecting evidence. Yeah, that's another really good point. When we think about evidence handling specifically, the known process is physically hand-carrying this material, material that nobody is supposed to handle; it's a felony to do so.

And you think about hand-carrying a hard drive or a mobile phone that you suspect has this evidence, and there are only four people in the state who are qualified to analyze it. Not only are they the only ones qualified to analyze it, they're also the only ones who have the equipment to deal with it. Right.

So that system, though it is slow, is what has been known. The software that manages those case files after the image of that hard drive is extracted has been in place as a standard for a long time. But again, the way an officer processes their case, the way the digital forensic analyst processes the case, the language they use, how they tag things, how they put their notes in, it's not necessarily uniform. It's not like you and I get into a spreadsheet, click a button, and choose red, blue, or green, and those are the only three options.

It's free text. And it's the same problem we talk about with health and electronic medical records: how do we define standards so we can share information across boundaries? Yeah, that's what I found most interesting: the sharing of information across boundaries is zero, because there's zero safe way to do it. So if I have a serial perpetrator who's crossing multiple jurisdictions, I have no idea. There's no way to correlate that data safely. And to your point, what we've been working on has been much more specific to securing that evidence, the way it's handled today, because you're not supposed to share it. We can't just send it, here's your email, here's your attachment; no, that's instantly to jail, do not pass Go. So how does law enforcement get through those hoops, so to speak, in a way that they can guarantee to the judge, hey, this was handled properly according to the rules that have been set forth for the handling of our most sensitive types of evidence?
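As a sketch of the free-text versus "red, blue, green" point, here is what constraining case metadata to a shared controlled vocabulary might look like, so that records from different jurisdictions could at least be correlated. All field names and tag values below are hypothetical, not any real case-management schema.

    from dataclasses import dataclass
    from enum import Enum

    # Hypothetical controlled vocabulary: like a spreadsheet dropdown with
    # fixed options, instead of free text that differs per analyst.
    class EvidenceTag(Enum):
        DEVICE_IMAGE = "device_image"
        NETWORK_CAPTURE = "network_capture"
        CLOUD_EXPORT = "cloud_export"

    @dataclass
    class CaseRecord:
        case_id: str
        jurisdiction: str           # e.g. country plus state code
        evidence_tag: EvidenceTag   # constrained, so "HDD img" vs "disk image"
                                    # can't fragment the data
        sha256: str                 # content hash enables cross-case correlation

    def correlate(records: list[CaseRecord]) -> dict[str, list[str]]:
        # Group case IDs by evidence hash: the same file surfacing in two
        # jurisdictions links two cases that free-text notes never would.
        by_hash: dict[str, list[str]] = {}
        for r in records:
            by_hash.setdefault(r.sha256, []).append(r.case_id)
        return {h: cases for h, cases in by_hash.items() if len(cases) > 1}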

I mean, those laws were not designed for digital evidence; the evidence handling was designed for the handling of physical evidence. We're still seeing things evolve to handle the digital side. But how do you teach somebody with no technical background what's more secure from a cyber standpoint? Do you think this will eventually change? It has to, as judges and lawmakers become more educated on the capabilities. But we as technologists need to push harder.

From my perspective, we need to be actively out there educating, helping our judicial system worldwide understand the art of the possible, and showing that we can protect people and data at the same time. I agree. That's part of why I get so geeked about the program I'm in: to serve in a translational role. You're in a really critical spot for that.

Being able to have this conversation, being able to speak the language of both sides of the fence, or multiple sides of the party, you serve as a negotiator and mediator as we figure out what makes sense. And it takes a long time to reach consensus, especially if you're speaking German and I'm speaking Mandarin. We've got to find a common way to say, no, we're talking about the same thing. What is our end objective? If our end objective is to keep children safe and prevent the spread of this material, without getting into political theory, we're not going to stop 100% of everything. But how can we keep children safer? How can we deter criminals from acting in this direction? If that's our common goal, great. Now let's translate: all right, here's what's technically feasible, here's how fast we can get it done, here's how much it's going to cost, and how do we write that into the language faster than waiting for the next political administration to get hired and want to make changes? Yeah. Rachel, fascinating topic today.

We don't talk about policy and people nearly enough on the show, so thank you for your insight. You're welcome any time, Darren; it's a pleasure to get back together on the podcast. Thank you for listening to Embracing Digital Transformation today.

If you enjoyed our podcast, give it five stars on your favorite podcast site or YouTube channel. You can find more information about Embracing Digital Transformation at embracingdigital.org. Until next time, go out and do something wonderful.
