Radio Kempe

21st Century Child Abuse: A conversation with Emily Cashman Kirstein, who leads child safety public policy at Google

The Kempe Center

Emily leads efforts to keep children safe online at one of the world’s largest and most powerful technology companies, Google. Previously, she led the policy team at Thorn, a tech-focused nonprofit working to end online child sexual abuse. And earlier, she spent seven years both in her home state of New Hampshire and in Washington, DC as an assistant to US Senator Jeanne Shaheen (D-NH). 

In this Radio Kempe podcast, Emily outlines the comprehensive steps Google is taking to keep children safe online, including technology that matches the “hash values” (digital fingerprints) of known child sexual abuse material (CSAM) so images can be quickly reported and removed, and “classifiers” that identify never-before-seen content. Google is not only using these tools on its own platforms but is making them available to other companies. Emily also discusses Google's work on parental education, a collaboration with the Royal College of Paediatrics and Child Health in the UK to better identify children harmed online and intervene, and the work Google is doing to identify and mitigate the risks to children from generative AI.

00:00:03:02 - 00:00:05:11
You're listening to Radio Kempe.

00:00:05:11 - 00:00:08:06
We value the sense of community that connects

00:00:08:06 - 00:00:11:06
people and helps them find ways to move forward.

00:00:11:14 - 00:00:14:02
Join us on our journey to prevent child abuse

00:00:14:02 - 00:00:17:02
and neglect.

00:00:19:17 - 00:00:21:20
Welcome to Radio Kempe. 

00:00:21:20 - 00:00:26:19
Today's podcast is the latest in the series 21st Century Child Abuse.

00:00:27:11 - 00:00:30:10
I'm Ernie Allen and I will be your host today.

00:00:30:10 - 00:00:33:10
I've spent many years in the fight to keep children safe.

00:00:33:17 - 00:00:36:18
I'm an advisor to governments, law enforcement, technology

00:00:36:18 - 00:00:39:18
companies, and others, including the Kempe Center.

00:00:40:13 - 00:00:45:16
Today our guest is Emily Cashman Kirstein, who has led efforts

00:00:45:16 - 00:00:49:15
to keep children safe online as an aide to a United States senator,

00:00:50:06 - 00:00:54:03
as head of public policy for an important nonprofit organization,

00:00:54:16 - 00:00:58:00
and now as head of child safety public policy

00:00:58:09 - 00:01:03:07
for one of the world's most important and powerful tech companies, Google.

00:01:04:10 - 00:01:05:07
Prior to joining

00:01:05:07 - 00:01:09:15
Google, Emily led the policy team at Thorn, a tech-focused

00:01:09:15 - 00:01:14:13
nonprofit working to end online child sexual abuse and exploitation.

00:01:15:08 - 00:01:18:04
And previously, she spent seven years,

00:01:18:04 - 00:01:22:09
both in her home state of New Hampshire and on Capitol Hill,

00:01:22:13 - 00:01:25:13
as an assistant to US Senator Jeanne Shaheen.

00:01:26:13 - 00:01:30:17
Emily is helping to guide and shape policy in other ways as well.

00:01:31:06 - 00:01:35:22
She serves on the boards of three major initiatives to protect children online:

00:01:36:15 - 00:01:41:02
the Family Online Safety Institute, the Technology Coalition,

00:01:41:15 - 00:01:45:15
and the WeProtect Global Alliance, a joint effort of governments,

00:01:45:15 - 00:01:50:01
tech companies, civil society and international organizations.

00:01:50:23 - 00:01:54:19
Her role at Google focuses on an overall approach

00:01:54:19 - 00:01:58:17
to safety, including raising awareness of online risks,

00:01:59:01 - 00:02:04:03
advising on policies and features, and engaging with audiences worldwide.

00:02:05:07 - 00:02:06:10
Emily,

00:02:06:10 - 00:02:09:18
your vital and strategic role at Google places

00:02:09:18 - 00:02:14:09
you at the center of a global effort to keep children safe online.

00:02:14:19 - 00:02:17:19
It is broad and wide-ranging.

00:02:18:02 - 00:02:21:09
Tell us what you're doing and the impact you're having.

00:02:22:15 - 00:02:23:09
Well thank you Ernie.

00:02:23:09 - 00:02:26:08
First of all, you're quite too kind.

00:02:26:08 - 00:02:30:15
but also, it's really such a pleasure to be with you.

00:02:31:09 - 00:02:34:23
and, with Radio Kempe and the Kempe Center,

00:02:35:19 - 00:02:39:04
it really means a lot to talk to you,

00:02:39:18 - 00:02:43:12
being a true pioneer of online child safety and an advocate for kids,

00:02:43:12 - 00:02:47:01
and you and I have worked together in different ways over the years,

00:02:47:01 - 00:02:51:21
and it's just always a pleasure to to catch up, and discuss these issues.

00:02:51:21 - 00:02:54:21
So appreciate that for sure.

00:02:55:20 - 00:03:00:00
at Google, you know, the role I have really supports how,

00:03:01:07 - 00:03:02:04
the company builds

00:03:02:04 - 00:03:05:04
for younger users and that obviously means,

00:03:05:10 - 00:03:09:02
you know, everything from baseline protections that we'll talk about today.

00:03:09:11 - 00:03:13:05
combating CSAM and keeping kids safer online across our products.

00:03:13:16 - 00:03:16:16
But it's also about respecting,

00:03:16:24 - 00:03:19:10
each family's unique relationship with technology.

00:03:19:10 - 00:03:20:10
How do we do that?

00:03:20:10 - 00:03:25:20
How do we empower younger users to more safely navigate the online world?

00:03:26:07 - 00:03:29:01
and really, that's about building skills for the future.

00:03:29:01 - 00:03:32:24
It's about, you know, them wanting to be part of the online world.

00:03:33:00 - 00:03:38:18
We know that, you know, certainly different from my generation that kids,

00:03:39:03 - 00:03:43:10
you know, I did not think of the online and offline world as the same thing.

00:03:43:10 - 00:03:45:06
They were two very separate things for me.

00:03:45:06 - 00:03:47:20
That's not how kids view it today. That's not how teens view it.

00:03:48:21 - 00:03:49:14
and so how do we

00:03:49:14 - 00:03:52:14
make sure they have the skills to navigate that?

00:03:53:08 - 00:03:55:15
and as you mentioned, I've really been proud

00:03:55:15 - 00:03:58:15
to be part of the fight against CSAM from,

00:03:59:04 - 00:04:01:18
a variety of different ways from government side,

00:04:01:18 - 00:04:05:06
from the nonprofit side, and now on the industry side

00:04:05:06 - 00:04:09:06
and on the industry side, I'm in a position to advise,

00:04:09:06 - 00:04:12:18
like you said, on policies and protections and,

00:04:13:14 - 00:04:16:06
you know, being at a company

00:04:16:06 - 00:04:19:17
that has such broad scale as Google.

00:04:20:20 - 00:04:24:21
even the smallest positive change, really affects billions of users.

00:04:24:21 - 00:04:28:22
And I'm very lucky that, we're not talking about small changes here.

00:04:29:12 - 00:04:32:00
which I'm looking forward to discussing, but,

00:04:32:00 - 00:04:35:13
it's certainly an exciting and really always interesting position to be in.

00:04:36:18 - 00:04:39:20
Well, it just occurred to me, as you were talking,

00:04:39:20 - 00:04:44:10
that this is a very diverse audience, on our podcast today.

00:04:44:10 - 00:04:47:18
So we should probably let them know exactly

00:04:47:18 - 00:04:51:23
what CSAM is that you're talking about: child sexual abuse material.

00:04:52:00 - 00:04:54:16
What we used to call child pornography.

00:04:54:16 - 00:04:58:20
But tell me a little bit about what Google is doing to combat that,

00:04:59:02 - 00:05:02:02
which is definitely so prevalent and so pervasive.

00:05:02:14 - 00:05:03:06
Absolutely.

00:05:03:06 - 00:05:06:06
So and that's a really important distinction.

00:05:06:17 - 00:05:10:14
you know, I think the legal definition, of course, is child pornography in the US.

00:05:11:07 - 00:05:13:08
but certainly from,

00:05:13:08 - 00:05:15:22
you know, a specialist perspective as the folks

00:05:15:22 - 00:05:18:24
who are working on the ground on this, we prefer the term child

00:05:18:24 - 00:05:22:14
sexual abuse material, which really leans into what

00:05:22:14 - 00:05:25:11
this is. Right? It is abuse material.

00:05:25:11 - 00:05:29:13
There is no consent, which the term pornography can sometimes imply.

00:05:30:23 - 00:05:33:23
you know, from Google's perspective, we are,

00:05:34:14 - 00:05:36:11
doing a variety of things and have been

00:05:36:11 - 00:05:39:15
very proud to have done this for such a long time.

00:05:39:15 - 00:05:42:15
Right. Things like,

00:05:43:01 - 00:05:45:24
we've been using hash matching since 2008.

00:05:45:24 - 00:05:49:11
And to your point, as a reminder for our audience here,

00:05:49:20 - 00:05:52:20
hash matching allows us to,

00:05:53:18 - 00:05:55:18
detect known CSAM

00:05:55:18 - 00:05:58:18
material in an easier way.

00:05:59:13 - 00:06:01:22
and then we have also been using classifiers

00:06:01:22 - 00:06:04:22
which allow us to detect new content. Right.

00:06:04:22 - 00:06:06:24
The theory being,

00:06:06:24 - 00:06:09:16
when you have known content

00:06:09:16 - 00:06:13:15
that has, of course, been previously identified, you

00:06:14:13 - 00:06:16:02
can find that more easily.

00:06:16:02 - 00:06:20:20
There's a digital fingerprint. Whereas classifying new material could

00:06:21:09 - 00:06:23:23
potentially lead to,

00:06:23:23 - 00:06:26:23
a child in need of immediate,

00:06:27:08 - 00:06:30:04
assistance and recovery,

00:06:30:04 - 00:06:34:00
and also can potentially stop a file from becoming viral

00:06:34:00 - 00:06:34:23
if you're able to catch it

00:06:36:02 - 00:06:38:21
earlier on. And I would actually, Ernie,

00:06:38:21 - 00:06:42:12
please jump in if I'm getting any of this wrong, or if you have any additions here.

00:06:42:12 - 00:06:46:14
But I think, you know, those are two of the most basic pieces

00:06:46:14 - 00:06:49:14
of the puzzle when it comes to the tech side of,

00:06:50:22 - 00:06:54:06
combating this sort of abuse.
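
To make the distinction between hash matching and classifiers concrete, here is a minimal, illustrative Python sketch of the hash-matching idea Emily describes: compute a digital fingerprint of a file and check it against a set of fingerprints of previously identified material. The hash set and function names are hypothetical, and real detection systems rely on more sophisticated perceptual hashing and machine-learned classifiers rather than this simplified exact-match approach; nothing here represents Google's actual implementation.

```python
# Illustrative sketch only: matching files against known digital fingerprints.
# Real CSAM-detection systems use perceptual hashing and ML classifiers;
# the hash set and helper names below are hypothetical.
import hashlib

# Hypothetical set of fingerprints of previously identified material.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(file_bytes: bytes) -> str:
    """Compute a simple digital fingerprint (SHA-256) of a file's bytes."""
    return hashlib.sha256(file_bytes).hexdigest()

def is_known_match(file_bytes: bytes) -> bool:
    """Return True if the file's fingerprint matches previously identified content."""
    return fingerprint(file_bytes) in KNOWN_HASHES
```

Exact hashes like this only catch identical copies, which is why, as Emily goes on to explain, classifiers are needed to flag never-before-seen material that has no fingerprint on record.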

00:06:55:17 - 00:06:58:20
the one thing I was going to jump in on, I think,

00:06:59:04 - 00:07:03:01
I think it's important to note how Google is addressing

00:07:03:08 - 00:07:06:08
not just historic content that's at the core,

00:07:07:05 - 00:07:12:00
but new content, because in this digital age,

00:07:12:21 - 00:07:15:21
the abusers are creating their own.

00:07:15:21 - 00:07:19:19
And so I think there are two key points I'd like you to

00:07:19:19 - 00:07:24:09
elaborate on. One is, even if it's older content,

00:07:25:06 - 00:07:28:14
you know, you and I have heard from lots of kids, absolutely.

00:07:28:14 - 00:07:31:20
As they grow up that it never goes away.

00:07:32:03 - 00:07:33:07
That's right.

00:07:33:07 - 00:07:37:17
That it's as long as it's out there, they're being revictimized.

00:07:37:17 - 00:07:39:21
This is not victimless content.

00:07:39:21 - 00:07:43:21
And secondly, the work that Google is doing and you are doing to

00:07:43:21 - 00:07:46:21
identify new content quickly

00:07:46:22 - 00:07:49:22
and get it off the internet, I think is really important.

00:07:50:01 - 00:07:51:03
It's so important.

00:07:51:03 - 00:07:55:00
And just to really dive into that point you mentioned about

00:07:56:07 - 00:07:58:02
known content and content

00:07:58:02 - 00:08:02:01
that has been around for years, right?

00:08:02:01 - 00:08:05:04
Those same files of a child,

00:08:06:12 - 00:08:08:13
who is now an adult,

00:08:08:13 - 00:08:13:08
they're still victimized every time that file is reshared.

00:08:13:08 - 00:08:16:08
And I think sometimes it's easy to not

00:08:16:08 - 00:08:19:08
think about that angle to it.

00:08:19:09 - 00:08:21:18
and it's really important to keep that top of mind in terms of

00:08:21:18 - 00:08:27:02
why it's so important to continue to detect that kind of material.

00:08:27:12 - 00:08:30:11
And, you know, obviously we're doing this sort of work,

00:08:30:11 - 00:08:34:05
as you mentioned, both for new content and, known content.

00:08:34:05 - 00:08:38:07
But one of the things we take very seriously at Google is,

00:08:38:13 - 00:08:42:21
you know, we recognize, that we are in a position to develop

00:08:42:21 - 00:08:46:22
best in class technology and to do this sort of work

00:08:47:08 - 00:08:50:23
and therefore are in a position to share that with others.

00:08:51:05 - 00:08:53:24
And so we have, made,

00:08:55:04 - 00:08:58:20
classifiers and actually video hashing, available.

00:08:58:20 - 00:09:01:20
Right, the video hashing was developed by YouTube,

00:09:01:23 - 00:09:05:10
and made that available through what's called our Child Safety Toolkit.

00:09:05:20 - 00:09:09:23
We make that available, free of charge to eligible partners because I think,

00:09:10:05 - 00:09:13:19
you know, through work with WeProtect and the Tech Coalition and,

00:09:13:24 - 00:09:17:08
and others, you know, it's so important

00:09:17:09 - 00:09:21:08
to make sure that we're lifting all boats.

00:09:21:08 - 00:09:23:04
This isn't just about the big companies;

00:09:23:04 - 00:09:26:04
a lot of it is about the big companies, but it's also about

00:09:26:13 - 00:09:29:24
smaller companies who may not have the,

00:09:31:07 - 00:09:33:22
engineering capacity or,

00:09:33:22 - 00:09:37:23
the ability of a large company like Google to develop these tools.

00:09:37:23 - 00:09:42:00
And what we also know from research is how

00:09:42:01 - 00:09:45:21
this is not a platform specific concern or issue

00:09:45:21 - 00:09:50:04
that this content, is platform agnostic.

00:09:50:22 - 00:09:53:17
offenders are platform agnostic.

00:09:53:17 - 00:09:56:04
And in order to better,

00:09:57:05 - 00:09:58:11
protect not only our

00:09:58:11 - 00:10:01:11
platforms, but of course, vulnerable users,

00:10:01:17 - 00:10:06:06
the best thing is for everyone to be using best in class technology, not just us.

00:10:06:16 - 00:10:09:16
And so we take that very seriously as well.

00:10:10:08 - 00:10:12:24
Well, let me, I'd like to pursue this a little bit

00:10:12:24 - 00:10:17:01
based on your, your background and your history, you know, at Google.

00:10:17:01 - 00:10:21:18
But clearly you have a unique experience at the center,

00:10:21:18 - 00:10:25:06
at the very center of these challenges in the United States Senate.

00:10:26:04 - 00:10:29:04
through your efforts at Thorn and now at Google,

00:10:29:04 - 00:10:33:10
you have seen the challenges we face firsthand.

00:10:34:05 - 00:10:37:22
How serious are the risks and what more needs to be done?

00:10:39:03 - 00:10:40:06
That's a loaded question.

00:10:40:06 - 00:10:42:19
There's so much there.

00:10:42:19 - 00:10:45:06
you know, and certainly in terms of,

00:10:45:06 - 00:10:48:22
child sexual abuse and exploitation online, it is very serious.

00:10:48:22 - 00:10:51:18
And there's no there's no sugarcoating that.

00:10:52:20 - 00:10:55:03
I think awareness is key.

00:10:55:03 - 00:10:59:21
And one of the things I'm most grateful to have learned at Thorn

00:11:00:03 - 00:11:02:21
is how to really understand

00:11:02:21 - 00:11:06:03
the issue and how to communicate effectively about it.

00:11:06:03 - 00:11:08:13
That is key as well.

00:11:08:13 - 00:11:12:05
learning directly from law enforcement officers working on these cases,

00:11:12:12 - 00:11:17:10
learning from engineers who understand the technical challenges

00:11:17:10 - 00:11:22:08
of what it means to detect this content and how hard it can be,

00:11:23:01 - 00:11:26:24
and then framing that truth of the threats,

00:11:27:08 - 00:11:30:10
to children in a way that both respects,

00:11:31:02 - 00:11:34:08
you know, the terrible realities that they lived through as victims

00:11:34:08 - 00:11:41:11
and survivors, but can also call leaders, and the public to action.

00:11:41:18 - 00:11:41:24
Right.

00:11:41:24 - 00:11:47:11
Because it's so easy and understandable to shut out an issue

00:11:47:11 - 00:11:51:16
like this, to shut down hearing about it because it is so awful.

00:11:52:20 - 00:11:54:05
but being able to have

00:11:54:05 - 00:11:57:07
clear understanding and communication, I think, is key.

00:11:57:24 - 00:12:00:08
and in terms of what needs to be done,

00:12:00:08 - 00:12:03:08
obviously you have a whole podcast,

00:12:03:16 - 00:12:05:21
to talk about these things.

00:12:05:21 - 00:12:07:23
and certainly can't be wrapped up in an episode.

00:12:07:23 - 00:12:11:03
But, you know, I've been reflecting, you know, end of year,

00:12:12:09 - 00:12:13:15
about certain things.

00:12:13:15 - 00:12:18:21
And I'm finding energy from the lessons we've collectively learned over the years.

00:12:18:21 - 00:12:19:18
Right.

00:12:19:18 - 00:12:25:12
We know that tech can be used for good, just as it can be used to harm.

00:12:25:21 - 00:12:28:23
That's where technology like hash matching came from,

00:12:29:09 - 00:12:32:20
like classifiers came from, those sorts of things.

00:12:32:20 - 00:12:38:04
We can leverage technology to combat these crimes and get ahead.

00:12:39:15 - 00:12:42:00
you know, we also know and, you know, the Kempe

00:12:42:00 - 00:12:45:24
Center audience, knows this just as well in the offline world.

00:12:45:24 - 00:12:46:06
Right.

00:12:47:15 - 00:12:48:23
we always have to evolve.

00:12:48:23 - 00:12:50:24
Just as in the offline world,

00:12:50:24 - 00:12:54:04
offenders are always going to find a way to abuse the system online.

00:12:54:12 - 00:12:55:20
And we have to stay ahead of it.

00:12:55:20 - 00:12:57:17
That's a truth we know.

00:12:57:17 - 00:13:01:08
And we also know that we cannot do it alone.

00:13:01:08 - 00:13:04:18
And that's, you know, something I say at Google all the time.

00:13:04:18 - 00:13:06:24
We cannot and should not do it alone.

00:13:06:24 - 00:13:10:18
this is not something that can be addressed in a silo,

00:13:11:19 - 00:13:13:18
collaboration, multi-stakeholder

00:13:13:18 - 00:13:17:11
engagements, like, you know, the WeProtect Global Alliance, certainly.

00:13:17:17 - 00:13:20:17
And within industry. Right. You know,

00:13:21:01 - 00:13:24:00
different companies might have different feelings about each other,

00:13:24:00 - 00:13:27:00
but not when it comes to combating child sexual abuse.

00:13:27:24 - 00:13:32:21
and I think on the multi-stakeholder side, it means being ready and willing

00:13:33:05 - 00:13:37:07
to have tough but respectful conversations amongst each other.

00:13:38:01 - 00:13:39:11
at least in my experience.

00:13:40:13 - 00:13:43:14
everyone at the table is there for the right reasons.

00:13:43:14 - 00:13:45:24
They're there to do better for kids.

00:13:45:24 - 00:13:49:14
And, you know, that does include really dedicated advocates within industry.

00:13:49:24 - 00:13:54:03
And I think, you know, we've gained a lot of insights

00:13:54:03 - 00:13:57:24
and paths forward from having some of those tough but

00:13:57:24 - 00:14:02:08
respectful, conversations and one of the things,

00:14:03:03 - 00:14:06:13
I wanted to mention, especially for the, the Kempe community,

00:14:07:09 - 00:14:12:08
that's directly relevant, and in terms of collaboration.

00:14:12:08 - 00:14:15:24
And one of the points that that we've prioritized at Google is,

00:14:16:11 - 00:14:20:19
we're actually partnering with the UK Royal College of Paediatrics and Child

00:14:20:19 - 00:14:26:01
Health to conduct a comprehensive review of its pubertal status guidelines.

00:14:26:12 - 00:14:29:13
And why is that important, to technology?

00:14:29:13 - 00:14:35:13
It's critical because we really rely on expert guidance like that.

00:14:35:13 - 00:14:38:17
In terms of CSAM review, when

00:14:38:23 - 00:14:44:13
CSAM material comes across our platforms. And this partnership is going to help

00:14:44:13 - 00:14:49:11
revise and update that internal guidance for our teams so that,

00:14:50:13 - 00:14:52:01
pubertal assessments are as

00:14:52:01 - 00:14:55:17
accurate as possible, using clinical, evidence-based guidance,

00:14:56:07 - 00:14:58:18
and to help distinguish, again,

00:14:58:18 - 00:15:02:20
as accurately as possible between medical and non-medical content.

00:15:03:03 - 00:15:04:09
And working with obviously,

00:15:04:09 - 00:15:07:09
the Royal College of Paediatrics and Child Health is huge,

00:15:07:09 - 00:15:10:07
in terms of making sure we're getting that right.

00:15:10:07 - 00:15:13:07
And the other piece that I mentioned earlier is,

00:15:13:11 - 00:15:16:13
you know, we have the benefit of a lot of this

00:15:16:20 - 00:15:19:20
and being able to have these partnerships, and we are prioritizing,

00:15:20:22 - 00:15:23:10
how to, share that.

00:15:23:10 - 00:15:26:10
So I talked about our Child Safety Toolkit, you know,

00:15:26:14 - 00:15:29:12
sharing that technology with other companies.

00:15:29:12 - 00:15:34:09
And our goal is to help set an industry standard on age determination by sharing

00:15:34:09 - 00:15:38:07
these guidelines, again with eligible partners, to raise all boats.

00:15:39:19 - 00:15:42:24
Well, I think that's a really exciting development.

00:15:42:24 - 00:15:47:14
And, you know, as you know, our audience includes pediatricians

00:15:48:01 - 00:15:51:15
and includes other kinds of physicians and includes social workers,

00:15:51:19 - 00:15:57:23
child welfare, workers, researchers, average citizens.

00:15:58:05 - 00:16:04:00
So the whole issue of identifying how old the victim is, yeah,

00:16:04:11 - 00:16:07:20
it affects the intervention that's available and could take place.

00:16:07:20 - 00:16:12:16
So we're excited about the fact that Google is at the cutting edge

00:16:12:21 - 00:16:15:18
on that and is working with the Royal College

00:16:15:18 - 00:16:19:09
of Paediatrics to develop those kinds of guidelines.

00:16:21:15 - 00:16:21:23
you know,

00:16:21:23 - 00:16:24:23
obviously, we know that prevention is key.

00:16:25:08 - 00:16:26:13
Yeah.

00:16:26:13 - 00:16:29:04
how do we better equip

00:16:29:04 - 00:16:32:21
the people on the front lines with the prevention methods,

00:16:33:16 - 00:16:37:17
that make this less of a massive problem than it is today?

00:16:38:13 - 00:16:39:02
Yeah.

00:16:39:02 - 00:16:43:24
You know, one thing I'm always conscious of when talking about prevention

00:16:43:24 - 00:16:47:17
is acknowledging how truly overwhelmed

00:16:48:03 - 00:16:51:03
folks on the front lines already are, right?

00:16:51:08 - 00:16:54:13
whether that's law enforcement or social workers

00:16:54:18 - 00:16:57:18
or teachers or folks in the medical field.

00:16:58:17 - 00:17:02:23
you know, it's a difficult challenge to resolve in the short term.

00:17:03:09 - 00:17:06:20
And, you know, as this ecosystem of online

00:17:06:20 - 00:17:10:07
child safety, we really need to engage more

00:17:10:17 - 00:17:13:08
to determine

00:17:13:08 - 00:17:15:06
how to meet that moment that we're in,

00:17:15:06 - 00:17:18:10
that it's not an ideal place to be, that folks are overwhelmed.

00:17:18:10 - 00:17:22:06
But how do we engage with them and equip them with what they need, given

00:17:23:04 - 00:17:24:02
that's where we are.

00:17:25:07 - 00:17:27:08
and, you know, I think,

00:17:27:08 - 00:17:30:08
at least from my perspective, we've done,

00:17:30:12 - 00:17:34:01
a lot of work with law enforcement to understand that side of the house,

00:17:34:01 - 00:17:38:04
but I really see a lot of opportunity in engaging more deliberately

00:17:38:04 - 00:17:42:24
with educators and medical practitioners to bring them more into the fold.

00:17:42:24 - 00:17:45:24
How do we take your,

00:17:46:03 - 00:17:50:00
baselines, your realities, into account

00:17:50:00 - 00:17:53:16
when talking through online protections and things like that? And,

00:17:54:16 - 00:17:57:07
you know, that's just one of those

00:17:57:07 - 00:18:00:12
I wish I had a more positive, you know,

00:18:01:18 - 00:18:02:10
outlook there.

00:18:02:10 - 00:18:07:02
But I think the, the positivity comes from knowing where we are

00:18:07:04 - 00:18:10:17
and trying to move forward in that reality.

00:18:11:20 - 00:18:14:00
but on a more positive note,

00:18:14:00 - 00:18:17:09
and you and I have talked about this a lot, and it's always top of mind,

00:18:18:08 - 00:18:20:10
in multi-stakeholder discussions

00:18:20:10 - 00:18:23:10
is digital citizenship, right?

00:18:25:00 - 00:18:27:12
Education and programming,

00:18:27:12 - 00:18:30:00
obviously is a huge part of prevention

00:18:30:00 - 00:18:34:04
and a huge part of this, you know, addressing

00:18:34:05 - 00:18:37:21
child sexual abuse and exploitation, I think ensuring that youth,

00:18:39:12 - 00:18:41:16
at all developmental, excuse me at all

00:18:41:16 - 00:18:45:11
developmental levels have age-appropriate resources,

00:18:47:03 - 00:18:50:18
that can instill life lessons about navigating the online world.

00:18:50:18 - 00:18:51:11
Like we talked about.

00:18:51:11 - 00:18:54:01
They don't see a distinction between online and offline.

00:18:54:01 - 00:18:56:07
This is just their world that they live in.

00:18:56:07 - 00:18:59:21
And I know, you know, certainly from the tech company

00:18:59:21 - 00:19:03:03
perspective, you know, companies offer these resources.

00:19:03:03 - 00:19:07:08
We have, at Google, the Be Internet Awesome program for younger kids

00:19:07:08 - 00:19:11:06
that talks about, you know, having strong passwords,

00:19:11:06 - 00:19:14:13
how to think critically, being aware of what you share, things like that.

00:19:15:10 - 00:19:16:17
and also just launched,

00:19:18:07 - 00:19:19:01
a teen-focused

00:19:19:01 - 00:19:23:03
AI literacy guide, that has similar lessons but geared toward

00:19:23:03 - 00:19:26:03
this new moment we're in when it comes to generative AI.

00:19:27:03 - 00:19:28:21
and that's just on the tech side, right?

00:19:28:21 - 00:19:29:10
Of course.

00:19:29:10 - 00:19:33:00
NCMEC, the US National Center for Missing and Exploited Children,

00:19:33:11 - 00:19:36:11
has really important programming in this space.

00:19:37:01 - 00:19:39:12
the Department of Homeland Security's Know2Protect

00:19:39:12 - 00:19:42:12
campaign has

00:19:42:16 - 00:19:44:20
prevention, messaging

00:19:44:20 - 00:19:47:20
specific to child sexual abuse and exploitation online.

00:19:48:07 - 00:19:51:07
And I know Thorn does a ton of impressive work,

00:19:51:18 - 00:19:54:18
speaking directly to teens and,

00:19:54:18 - 00:19:57:13
I think, you know,

00:19:57:13 - 00:19:59:13
these are all different pieces of the puzzle, right?

00:19:59:13 - 00:20:00:21
We're talking about the ecosystem.

00:20:00:21 - 00:20:01:19
And, you know,

00:20:01:19 - 00:20:06:02
this side of it has one piece of prevention, that one has another piece of prevention.

00:20:06:02 - 00:20:09:05
And I think the challenge is putting all those pieces together.

00:20:09:14 - 00:20:09:22
Right.

00:20:09:22 - 00:20:12:24
And making this be sort of an all of society approach.

00:20:12:24 - 00:20:19:00
How do we get this to be second nature to parents and teachers and teens?

00:20:20:03 - 00:20:21:15
you know, I think it'll

00:20:21:15 - 00:20:25:20
take time, but I'm hopeful that we're getting to the right place.

00:20:25:20 - 00:20:27:11
What do you think?

00:20:27:11 - 00:20:28:20
Well, I hope so.

00:20:28:20 - 00:20:33:01
I mean, in the last session of the Colorado Legislature, the Kempe Center was

00:20:33:01 - 00:20:37:05
heavily involved in discussions about this because what we heard

00:20:37:05 - 00:20:41:00
in various hearings is parents are feeling overwhelmed.

00:20:41:01 - 00:20:44:11
Yeah, they feel like, for one,

00:20:44:11 - 00:20:45:21
they are not,

00:20:45:21 - 00:20:48:21
you know, the digital experts; the kids are.

00:20:49:08 - 00:20:50:18
Yeah. Not them.

00:20:50:18 - 00:20:54:21
So we're living in a time in which parents need help.

00:20:55:15 - 00:20:55:21
Yeah.

00:20:55:21 - 00:21:00:20
And, so one of the challenges, part of the debate in the Colorado legislature

00:21:01:02 - 00:21:05:20
is, you know, is it enough to offer

00:21:06:03 - 00:21:09:18
just information and guidance for parents to meet the need?

00:21:09:22 - 00:21:13:05
I mean, the Australian eSafety Commissioner spoke

00:21:14:00 - 00:21:16:09
at one of these podcasts and she talked about,

00:21:17:08 - 00:21:20:03
how this is our seatbelt moment.

00:21:20:03 - 00:21:23:22
You know, we need to figure out how we can have basic standards

00:21:23:22 - 00:21:27:18
that ensure that there's the equivalent of a seatbelt available.

00:21:28:03 - 00:21:33:04
So I know Google has been at the forefront of the effort

00:21:33:04 - 00:21:37:01
to bring about prevention and greater parental knowledge and engagement.

00:21:37:09 - 00:21:38:14
Where's the balance?

00:21:38:14 - 00:21:41:14
How do we get to the point we need to get to?

00:21:41:21 - 00:21:43:13
Yeah. No, it's tough.

00:21:43:13 - 00:21:48:06
There's no easy answer because I think there's no one child.

00:21:48:17 - 00:21:49:11
Right.

00:21:49:11 - 00:21:51:18
Every child is different.

00:21:51:18 - 00:21:54:18
Every,

00:21:55:08 - 00:21:58:08
you know, socioeconomic background,

00:21:58:20 - 00:22:02:05
or, you know, wherever you are in the world, is different.

00:22:02:05 - 00:22:04:08
How are we building

00:22:04:08 - 00:22:07:05
flexibility for parents, for teens?

00:22:07:05 - 00:22:10:05
Not all children have parents who can,

00:22:11:14 - 00:22:13:02
you know,

00:22:13:02 - 00:22:16:24
work on parental tools or have those sorts of protections in place.

00:22:16:24 - 00:22:19:14
Not all parents are in a position to even have time.

00:22:19:14 - 00:22:22:04
You know, there are parents who have 2 or 3 jobs and,

00:22:24:03 - 00:22:27:20
how are we building for those types of kids,

00:22:27:20 - 00:22:31:11
as well as the kids who have parents who are super engaged,

00:22:32:07 - 00:22:35:21
and, and are using the, the parental tools actively.

00:22:35:21 - 00:22:38:21
So, you know, I think a lot of it is,

00:22:39:18 - 00:22:41:22
ramping up and making sure people know

00:22:41:22 - 00:22:45:00
what exists, and also building in protections.

00:22:45:00 - 00:22:47:23
There's no one silver bullet to any of this.

00:22:47:23 - 00:22:51:14
This is building in the right product protections.

00:22:52:14 - 00:22:55:16
with I think, you know, from our perspective, making sure

00:22:55:23 - 00:22:58:23
it's reflective and respectful of,

00:22:59:10 - 00:23:01:06
you know, the different developmental stages

00:23:01:06 - 00:23:04:06
that either kids or teens are at.

00:23:04:15 - 00:23:07:15
and making sure parents have a variety

00:23:07:15 - 00:23:10:15
of ways to,

00:23:10:21 - 00:23:13:21
decide what's best for their child, because,

00:23:14:06 - 00:23:17:13
you know, a 13 or 14 year old in one

00:23:17:13 - 00:23:21:03
household is much different than a 13 or 14 year old in another.

00:23:21:09 - 00:23:24:15
And, wanting to make sure that, again,

00:23:24:15 - 00:23:28:15
it's not a one size fits all that, that parents,

00:23:29:08 - 00:23:32:24
are able to be engaged, as much as they can be.

00:23:32:24 - 00:23:34:23
But also, you know, of course, we have baselines.

00:23:34:23 - 00:23:36:18
That's what I was talking about previously.

00:23:36:18 - 00:23:39:19
The baseline protections are critical.

00:23:39:19 - 00:23:40:11
We can't,

00:23:41:23 - 00:23:43:22
you know, respect,

00:23:43:22 - 00:23:47:11
or empower our users if we're not first protecting them.

00:23:47:11 - 00:23:50:11
And I think that's where a lot of the

00:23:50:22 - 00:23:55:11
child sexual abuse and exploitation work rests:

00:23:55:11 - 00:23:58:11
we have to get that right before we can

00:23:58:15 - 00:24:01:07
be working on the other pieces.

00:24:01:07 - 00:24:05:08
Well, a few minutes ago you mentioned in passing generative AI.

00:24:05:16 - 00:24:06:00
Yeah.

00:24:06:00 - 00:24:11:07
So, like, we can't have a discussion on safety and not address AI.

00:24:11:16 - 00:24:15:13
So let me ask you, I know Google has been at the forefront of efforts,

00:24:16:03 - 00:24:18:15
how do we balance

00:24:18:15 - 00:24:22:14
the enormous potential of generative AI

00:24:23:07 - 00:24:26:07
with the risks and the harms, particularly to kids?

00:24:27:01 - 00:24:27:21
Absolutely.

00:24:27:21 - 00:24:30:23
And, you know, it's certainly top of mind for me

00:24:30:23 - 00:24:34:17
and top of mind for, the folks listening, I imagine,

00:24:36:03 - 00:24:37:05
you know,

00:24:37:05 - 00:24:40:15
I don't say this, I promise, as a corporate talking point,

00:24:40:22 - 00:24:44:01
but I've really been proud to be part of the team

00:24:44:01 - 00:24:47:01
that has been thinking through from the beginning

00:24:47:05 - 00:24:51:21
threats, mitigations, opportunities, and how does this all fit in?

00:24:51:21 - 00:24:55:23
Because I think you're right in saying that it is both

00:24:56:14 - 00:24:59:20
a challenge and, you know, an opportunity here.

00:24:59:20 - 00:25:03:12
And how do we make that work in this, this new world we're living in?

00:25:04:24 - 00:25:08:00
you know, we started from thinking through the threats, right?

00:25:08:06 - 00:25:10:20
Here's this new kind of technology.

00:25:10:20 - 00:25:13:02
Where do we think it's going to go?

00:25:13:02 - 00:25:17:18
How do we think offenders could abuse this system?

00:25:18:05 - 00:25:20:19
And that falls under a variety of categories.

00:25:20:19 - 00:25:23:19
But I think we thought about it, threefold.

00:25:23:22 - 00:25:29:10
So the first was obviously the creation of CSAM, of child sexual abuse material.

00:25:29:10 - 00:25:33:09
And, you know, that even is divided further when you think about,

00:25:34:12 - 00:25:36:00
you know, completely synthetic,

00:25:36:00 - 00:25:39:03
completely new material of a,

00:25:41:03 - 00:25:44:03
not a real child, right.

00:25:44:09 - 00:25:48:02
or the morphing or editing of either

00:25:48:13 - 00:25:51:13
a completely innocent photo or,

00:25:52:01 - 00:25:54:09
the morphing of existing CSAM,

00:25:54:09 - 00:25:57:09
of an abuse photo or series.

00:25:58:12 - 00:26:01:17
second would be the sexualization of children

00:26:01:17 - 00:26:04:04
across a range of modalities, right.

00:26:04:04 - 00:26:07:23
Like this could be, graphic stories,

00:26:08:17 - 00:26:12:09
sexual stories involving kids, images that aren't,

00:26:12:16 - 00:26:15:16
that don't reach the threshold of being illegal but are still,

00:26:16:08 - 00:26:19:08
but still objectify and sexualize children.

00:26:19:17 - 00:26:23:13
And third, you know, how can AI

00:26:25:09 - 00:26:27:02
potentially be misused

00:26:27:02 - 00:26:30:02
to support other kinds of,

00:26:30:10 - 00:26:32:07
abusive behaviors? Right?

00:26:32:07 - 00:26:35:10
Is it amplifying or scaling

00:26:35:23 - 00:26:38:16
text instructions on how to carry out abuse

00:26:38:16 - 00:26:42:17
or, supporting offenders in grooming or sextortion

00:26:43:02 - 00:26:47:11
or, really normalizing the sexual interest in children?

00:26:47:11 - 00:26:50:17
And I think that's one of the pieces that

00:26:51:00 - 00:26:53:22
tends to come up, is,

00:26:55:08 - 00:26:56:06
you know, if this is not a

00:26:56:06 - 00:26:59:19
real child, questioning the level of harm.

00:27:00:06 - 00:27:05:16
And I think this is a key point. Having been in the field,

00:27:05:16 - 00:27:10:02
you are certainly aware of the harm here. It's very much

00:27:10:02 - 00:27:13:23
in terms of normalizing this behavior, that consuming

00:27:14:10 - 00:27:17:04
content of

00:27:17:04 - 00:27:21:05
a fake child can lead to hands-on abuse

00:27:21:05 - 00:27:25:03
and making sure that that threat is also addressed.

00:27:25:23 - 00:27:30:06
and I want to just be clear, of course, we have policies and protections

00:27:30:06 - 00:27:33:11
in place for all of these, but it's really important to think through,

00:27:34:06 - 00:27:35:08
and that was part of our process.

00:27:35:08 - 00:27:39:06
How do we understand the threats before we figure out what the protections are?

00:27:39:14 - 00:27:41:07
So what do those protections look like?

00:27:41:07 - 00:27:44:04
Those are what we talked about previously.

00:27:44:04 - 00:27:45:12
Right. Things we've learned.

00:27:45:12 - 00:27:47:04
We're not starting from scratch.

00:27:47:04 - 00:27:48:21
We're using hash matching. Right.

00:27:48:21 - 00:27:51:21
We are using that, for example, to

00:27:52:14 - 00:27:55:14
identify and remove CSAM from training data.

00:27:55:14 - 00:27:55:23
Right.

00:27:55:23 - 00:28:00:03
If the model doesn't know what it is, if we're not feeding it,

00:28:00:11 - 00:28:03:18
then it is much less likely to produce it on the output.

00:28:03:21 - 00:28:04:13
Right?

00:28:04:13 - 00:28:08:06
So how are we using the tools that we have now to reduce that risk?

00:28:09:14 - 00:28:12:15
it's also protecting against

00:28:13:03 - 00:28:17:04
prompts, right, when you go into a chatbot or a gen AI tool.

00:28:17:14 - 00:28:20:07
And, you know, we have tools to determine

00:28:20:07 - 00:28:23:07
if this query is,

00:28:24:02 - 00:28:27:02
likely looking for CSAM or not, right?

00:28:27:20 - 00:28:31:05
and then not only that, but if we get that part wrong,

00:28:31:09 - 00:28:33:22
the failsafe is on the other end,

00:28:33:22 - 00:28:37:14
making sure that we have different kinds of detection methods in place

00:28:37:19 - 00:28:41:19
to prevent the output of something that would sexualize

00:28:42:02 - 00:28:45:02
or be, child sexual abuse material.

00:28:45:07 - 00:28:50:10
And this is all obviously on top of, robust adversarial testing.
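
To illustrate the layered approach Emily describes, filtering training data, screening incoming prompts, and then checking outputs as a failsafe, here is a hedged Python sketch. Every function, classifier, and parameter name is a hypothetical placeholder added for illustration; it does not represent Google's tooling or any specific product's safeguard stack.

```python
# Illustrative sketch of layered safeguards around a generative model.
# All names are hypothetical placeholders, not any company's real implementation.

def filter_training_data(examples, known_hashes, fingerprint):
    """Layer 1: drop training examples whose fingerprint matches known abuse material."""
    return [ex for ex in examples if fingerprint(ex) not in known_hashes]

def guarded_generate(prompt, model, prompt_classifier, output_classifier):
    """Layers 2 and 3: screen the prompt, run the model, then re-check the output."""
    if prompt_classifier(prompt):          # Layer 2: is this query likely seeking abuse material?
        return None                        # refuse (and report where appropriate)
    output = model(prompt)
    if output_classifier(output):          # Layer 3: failsafe check on what was produced
        return None                        # block the violating output
    return output
```

The point of the sketch is the defense-in-depth idea in the conversation: even if one layer misses, a later layer can still catch the problem before anything reaches a user.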

00:28:50:10 - 00:28:53:10
And I think one of the things that,

00:28:54:06 - 00:28:55:17
I'm really proud of at Google

00:28:55:17 - 00:28:58:17
is the variety of,

00:29:00:24 - 00:29:03:21
specialists and the people that are on our child safety team.

00:29:03:21 - 00:29:04:03
Right?

00:29:04:03 - 00:29:07:23
There's former law enforcement, former social workers,

00:29:08:10 - 00:29:11:07
former child development experts, and they all have

00:29:11:07 - 00:29:16:19
a different perspective on how this could be abused differently.

00:29:17:19 - 00:29:19:00
and making sure we're

00:29:19:00 - 00:29:23:03
leveraging that wealth of knowledge when, when thinking through,

00:29:24:20 - 00:29:27:04
how robustly these, these things are tested

00:29:27:04 - 00:29:30:04
and in what ways and,

00:29:31:00 - 00:29:35:13
you know, I'd be remiss not to say that collaboration is a key part of this.

00:29:35:13 - 00:29:36:06
It is.

00:29:36:06 - 00:29:39:17
These are, again, the lessons learned we talked about from the beginning.

00:29:39:17 - 00:29:42:10
We can't do it alone. We shouldn't do it alone.

00:29:42:10 - 00:29:44:10
And I think this is one of those moments.

00:29:44:10 - 00:29:48:09
It was really great to see not only industry coming together

00:29:49:01 - 00:29:52:06
to talk through what are best practices here.

00:29:52:22 - 00:29:55:14
but beyond that, we've been having multi-stakeholder conversations

00:29:55:14 - 00:29:58:16
with government, with civil society, certainly with NCMEC,

00:29:58:16 - 00:30:02:04
making sure we're in close touch to understand what they're seeing.

00:30:02:12 - 00:30:04:21
and how do we get ahead of it?

00:30:04:21 - 00:30:06:06
you know, the Tech Coalition,

00:30:08:00 - 00:30:10:05
which is a consortium of, of

00:30:10:05 - 00:30:14:13
companies who, collectively are working against child sexual abuse.

00:30:14:13 - 00:30:18:02
They've been hugely involved in making sure

00:30:18:02 - 00:30:21:14
best practices get out there quickly in terms of generative AI.

00:30:21:14 - 00:30:24:14
And Thorn, of course, has been

00:30:24:21 - 00:30:26:00
really involved in this.

00:30:26:00 - 00:30:30:15
And we were really proud to sign their Safety by Design generative AI principles.

00:30:30:15 - 00:30:34:05
And those are laying, again, key baselines for all companies.

00:30:34:05 - 00:30:36:14
Right? This isn't just about,

00:30:36:14 - 00:30:38:01
it is about the big ones.

00:30:38:01 - 00:30:40:23
but it's also about making sure startups have resources.

00:30:40:23 - 00:30:41:15
Right?

00:30:41:15 - 00:30:45:06
It's not always easy for a startup to know where to look for these things.

00:30:45:06 - 00:30:50:09
So how do we make these resources available to everyone so that we're

00:30:50:09 - 00:30:54:19
all starting from the same page and with, you know, the baseline protections.

00:30:54:19 - 00:30:57:19
So, you know, there's a lot of work going on.

00:30:57:19 - 00:30:58:20
You know, it's

00:30:59:23 - 00:31:01:17
just like any change.

00:31:01:17 - 00:31:05:22
Right, Ernie, you've seen all the changes in how this crime manifests, from,

00:31:06:21 - 00:31:08:23
you know, in the mail

00:31:08:23 - 00:31:12:12
through imagery and then from imagery to video.

00:31:12:12 - 00:31:15:17
And now we're at this next nexus of,

00:31:17:09 - 00:31:20:16
you know, technology and, you know,

00:31:21:02 - 00:31:24:07
we know there's more work to be done and that this will continue to evolve.

00:31:24:07 - 00:31:29:19
But I think, this really was a safety by design moment

00:31:29:19 - 00:31:34:16
in terms of really thinking things through and learning lessons,

00:31:35:13 - 00:31:39:07
that we've, you know, worked really hard on for the past several years.

00:31:40:17 - 00:31:43:11
Well, I think this is really timely.

00:31:43:11 - 00:31:47:24
I mean, the Kempe Center has established as a priority

00:31:48:10 - 00:31:51:23
focusing on what they're calling 21st century child abuse,

00:31:52:17 - 00:31:55:17
because the fundamentals may have changed a little.

00:31:55:17 - 00:31:58:16
But the basics haven't.

00:31:58:16 - 00:32:03:04
And so, your leadership in this has, has been great.

00:32:03:04 - 00:32:07:15
Are there any other messages you want to deliver on behalf of Google

00:32:07:22 - 00:32:08:22
for this audience?

00:32:08:22 - 00:32:12:21
And again, this audience is a diverse audience, includes advocates, includes

00:32:12:21 - 00:32:15:21
people working in the child welfare system,

00:32:15:22 - 00:32:19:17
in various settings where even if you're not in

00:32:19:17 - 00:32:23:10
a traditional home, most of these kids are online today.

00:32:23:19 - 00:32:24:06
That's right.

00:32:24:06 - 00:32:27:00
You know, most of these kids have mobile devices.

00:32:27:00 - 00:32:30:06
So for the pediatricians and the social workers and the,

00:32:30:21 - 00:32:34:21
and the researchers and the others, what would you have them do?

00:32:36:09 - 00:32:37:02
Absolutely.

00:32:37:02 - 00:32:42:13
You know, I think, again, this is a collaborative moment.

00:32:42:22 - 00:32:46:04
This is I think, you know, certainly us,

00:32:46:23 - 00:32:51:14
working with the Royal College of Paediatrics

00:32:51:14 - 00:32:54:14
and Child Health,

00:32:54:17 - 00:32:58:16
is showing us how important it is.

00:32:58:16 - 00:32:59:24
I mean, we know how important it is to work

00:32:59:24 - 00:33:02:13
with the medical community, but how can we build on that?

00:33:02:13 - 00:33:06:16
I think we are very open to having conversations with folks.

00:33:07:04 - 00:33:11:00
we do work with, health experts, mental health experts.

00:33:11:00 - 00:33:13:11
Certainly.

00:33:13:11 - 00:33:16:00
beyond the CSAM space.

00:33:16:00 - 00:33:19:21
And I think making sure we have a good understanding of what they're seeing.

00:33:20:04 - 00:33:22:18
And, you know, we talked about this earlier.

00:33:22:18 - 00:33:25:18
If we're not collaborating,

00:33:25:20 - 00:33:28:00
you know, we don't know how we can fix things.

00:33:28:00 - 00:33:31:00
So I think what I would say is,

00:33:31:02 - 00:33:34:07
you know, certainly to, to be in touch to,

00:33:34:20 - 00:33:37:17
you know, and I think from the action I would like to take certainly

00:33:37:17 - 00:33:40:20
is making sure that through these multi-stakeholder,

00:33:42:05 - 00:33:45:09
events, whether it's through the Tech Coalition, whether it's through WeProtect,

00:33:45:16 - 00:33:49:04
I think there's just so much more we can be doing to,

00:33:50:04 - 00:33:52:05
bring this community

00:33:52:05 - 00:33:55:08
into the online child safety community because,

00:33:56:16 - 00:33:58:15
again,

00:33:58:15 - 00:34:01:14
working in a silo is not helpful.

00:34:01:14 - 00:34:06:14
And we do need to know what they are hearing from patients, what they are

00:34:06:14 - 00:34:10:00
seeing on the ground, and how can we help

00:34:10:08 - 00:34:13:03
with our learnings and share,

00:34:13:03 - 00:34:16:05
what we know, and have a, you know, collaborative discussion.

00:34:16:05 - 00:34:20:02
So I think it's probably very in line with, with a lot of the themes

00:34:20:02 - 00:34:20:24
we've discussed today.

00:34:20:24 - 00:34:24:05
But I think, learning more about the Kempe Center and the work

00:34:24:05 - 00:34:28:05
that you are doing, has really opened my eyes to,

00:34:29:13 - 00:34:32:19
you know, all the work we have to do to make sure that we're

00:34:32:19 - 00:34:36:00
being inclusive of this community when we're talking about online child safety.

00:34:37:14 - 00:34:40:15
Well, Google certainly doesn't have to do much.

00:34:40:15 - 00:34:43:23
You're only one of the world's most powerful, influential companies.

00:34:44:07 - 00:34:47:07
You have to keep children safe on your platforms,

00:34:47:11 - 00:34:52:04
and you have to protect user privacy at the same time and a lot more.

00:34:52:04 - 00:34:58:02
So, you know, thank you to Google for for what you've done and continue to do.

00:34:58:02 - 00:35:01:08
And Emily, thank you for being with us today

00:35:01:15 - 00:35:05:14
and for your remarkable vision and leadership over so many years.

00:35:05:14 - 00:35:11:21
You've had incredible impact, on policy in the Congress of the United States

00:35:11:21 - 00:35:16:04
on technology innovation at Thorn, and now helping to shape

00:35:16:17 - 00:35:20:07
the direction that Google takes on these, serious issues.

00:35:20:07 - 00:35:23:22
So, thank you for everything you've done and are doing.

00:35:24:11 - 00:35:25:16
Well, you're very kind.

00:35:25:16 - 00:35:29:04
And it's always an honor and a pleasure to be chatting with you.

00:35:29:04 - 00:35:29:17
Of course.

00:35:29:17 - 00:35:30:08
And thank you.

00:35:30:08 - 00:35:32:10
And thanks to the Kempe Center for

00:35:32:10 - 00:35:35:16
for putting on this podcast series, it's really, really important.

00:35:36:06 - 00:35:37:07
Well, great. Thank you.

00:35:37:07 - 00:35:40:07
Thank you to our listeners for joining us today.

00:35:40:08 - 00:35:42:24
We hope you will tune in again to Radio Kempe.

00:35:42:24 - 00:35:47:07
as we continue this podcast series on 21st century child abuse.

00:35:53:14 - 00:35:56:03
Thank you for listening to Radio Kempe.

00:35:56:03 - 00:35:59:09
Stay connected by visiting our website at KempeCenter.org

00:35:59:09 - 00:36:02:14
and follow us on social media.