Radio Kempe

Ensuring Child Safety Online in the 21st Century: Policy Implications

The Kempe Center

Summary of Key Findings  

In the current state of technology, we have failed to create any meaningful legal framework to ensure our children are safe in the digital world. Legal protections for children online lag far behind those of the physical world. Children online are at risk of experiencing poor mental health outcomes, being victims of predatory behavior, being exposed to harmful content, and being illegally sold firearms and/or drugs. Without age assurance or verification in the digital world, adults have unfettered access to children. Currently, most child sex trafficking victims report being contacted via text and internet platforms such as social media and gaming. Additionally, the number of images of child sex abuse material (CSAM; formerly referred to as child pornography) sent to authorities has increased exponentially in the 21st century, and the vast majority of CSAM images come from social media. Children are also at greater risk of being exposed to adult content, including pornography, whether the exposure is sought out or unwanted.

Protecting children online would require platforms to verify the ages of their users and to put restrictions in place so children can navigate the digital world safely. Various advanced age verification processes have been developed and are widely available. Despite this, many platforms have failed to implement age protections even when they know children frequent their sites. Thus, lawmakers are compelled to make clear that the same protections for children that exist in the physical world are also required in the digital world. There are many policy changes needed to further ensure technology safety, and a starting point is to: a) require pornography websites to verify user age to ensure only adults are accessing the sites; b) require social media companies and gaming platforms to verify a new account holder’s age; and c) require social media platforms to enable maximum default privacy settings for users who are children.

 

Maura Gissen Bio

Maura Gissen is a fifth-year Clinical Psychology doctoral student at the University of Colorado Denver and holds a master’s in counseling psychology. Maura currently works with the Farley Health Policy Center (FHPC) at CU Anschutz, engaging in research and program implementation. More specifically, she has been focused on youth mental health related to diversifying the workforce pipeline, and on child health and safety in digital spaces. Maura has been working in the mental health field for ten years and focuses on the intersection of trauma and systemic disparities for individuals across the lifespan. She is passionate about engaging in clinical therapeutic practice, along with research focused on policy, advocacy, and systems-level change.

 

00:00:03:02 - 00:00:05:11
You're listening to Radio Kempe. 

00:00:05:11 - 00:00:08:06
We value the sense of community that connects

00:00:08:06 - 00:00:11:06
people and helps them find ways to move forward.

00:00:11:14 - 00:00:14:02
Join us on our journey to prevent child abuse

00:00:14:02 - 00:00:17:02
and neglect.

00:00:21:09 - 00:00:23:22
Welcome and welcome back.

00:00:23:22 - 00:00:26:08
This is Radio Kempe.

00:00:26:08 - 00:00:27:04
I'm Kendall Marlowe

00:00:27:04 - 00:00:30:12
with the Kempe Center for the Prevention and Treatment of Child Abuse and Neglect.

00:00:31:05 - 00:00:33:15
Thanks for joining us.

00:00:33:15 - 00:00:35:21
Hi, Kendall. Hi, there.

00:00:35:21 - 00:00:37:14
It's Maura Gissen.

00:00:37:14 - 00:00:41:21
Maura is going to be with us today and help us explore

00:00:41:21 - 00:00:45:17
a new and rapidly changing landscape.

00:00:45:18 - 00:00:48:02
Maura, thanks for being here.

00:00:48:02 - 00:00:49:16
Yeah. Thanks so much for having me, Kendall.

00:00:49:16 - 00:00:52:16
I'm really excited to dive into this topic with you.

00:00:54:14 - 00:00:58:24
So in child welfare, haven't we all noticed

00:00:58:24 - 00:01:02:04
that it can feel like nothing ever changes?

00:01:03:03 - 00:01:07:17
It can seem like the same set of problems, the same calls for reform,

00:01:08:18 - 00:01:11:18
the same initiatives, maybe dressed up with new branding

00:01:11:21 - 00:01:15:24
and it can feel like the same old battles and nothing ever

00:01:15:24 - 00:01:18:24
changes.

00:01:19:10 - 00:01:21:15
But is that true?

00:01:21:15 - 00:01:24:15
When our society's technology changes,

00:01:25:09 - 00:01:28:12
does that change the harms experienced

00:01:28:20 - 00:01:31:20
by our society's kids?

00:01:32:01 - 00:01:34:05
Is there such a thing

00:01:34:05 - 00:01:38:07
as 21st century child abuse?

00:01:39:15 - 00:01:42:09
Maura Gissen, our guest:

00:01:42:09 - 00:01:46:00
Who are you and what brought you to this work?

00:01:47:08 - 00:01:50:10
Yeah, thank you so much and so interesting to hear.

00:01:51:15 - 00:01:53:19
You talk about child welfare in that way.

00:01:53:19 - 00:01:56:19
I'd love to touch back on that at some point, but,

00:01:56:22 - 00:01:58:11
so my name is Maura Gissen.

00:01:58:11 - 00:02:02:05
I'm a fifth year Clinical Psychology PhD student.

00:02:03:00 - 00:02:04:10
Just a little bit of background.

00:02:04:10 - 00:02:07:18
I got my master's in Counseling Psychology,

00:02:08:18 - 00:02:12:16
2015 to 2017, and then I worked at the VA hospital here

00:02:12:16 - 00:02:17:10
in Colorado for two years doing research in, suicide risk and prevention,

00:02:18:04 - 00:02:21:03
and then went on to grad school.

00:02:21:03 - 00:02:23:22
My passion really lies,

00:02:25:02 - 00:02:26:05
at the intersection

00:02:26:05 - 00:02:30:00
of experiences of trauma and systemic disparities.

00:02:30:14 - 00:02:36:06
So thinking about the ways in which our systems and policies and environments

00:02:36:18 - 00:02:41:16
impact us all, and impact lots of us in disproportionate ways,

00:02:42:00 - 00:02:45:07
and then the ways in which that interacts with experiences of trauma,

00:02:45:07 - 00:02:48:19
especially for kids and then just individuals across the lifespan.

00:02:48:19 - 00:02:54:12
So I've done a lot of work with children and adolescents and also adults.

00:02:54:12 - 00:02:57:09
And what I've learned is that even when I'm working with adults,

00:02:57:09 - 00:03:00:21
a lot of the work we're doing clinically, meaning when I'm doing therapy

00:03:00:21 - 00:03:04:08
and providing therapy, is that much of

00:03:04:20 - 00:03:08:07
their distress is rooted in childhood experiences.

00:03:09:12 - 00:03:13:20
And so I care deeply about childhood

00:03:13:20 - 00:03:16:03
well-being and,

00:03:16:03 - 00:03:19:19
the development, the, the kind of crucial aspects of development.

00:03:20:10 - 00:03:23:23
And I love this work, especially being able to get my doctorate

00:03:23:23 - 00:03:27:04
because it allows me to do individual therapy.

00:03:27:12 - 00:03:31:17
So kind of reaching at that individual level and then to do research

00:03:31:17 - 00:03:36:09
in policy and advocacy spaces that are really touching on,

00:03:37:11 - 00:03:39:21
more of that systems level work.

00:03:39:21 - 00:03:44:10
And so, for so long, you know, when I think about trauma

00:03:44:10 - 00:03:47:16
and when we've been taught about trauma in school, it's

00:03:47:16 - 00:03:50:16
been about things like abuse and neglect and,

00:03:50:20 - 00:03:54:10
different experiences that happen to individuals in their physical world.

00:03:55:06 - 00:04:00:15
But what we're seeing today is this huge, fast paced advancement

00:04:00:15 - 00:04:04:20
taking place in terms of the digital space and the online world.

00:04:05:10 - 00:04:10:13
And that's, you know, even myself being a pretty young adult,

00:04:10:22 - 00:04:14:05
not something that I experienced to the degree that kids are today.

00:04:14:20 - 00:04:18:04
We're seeing, different forms of abuse and high risk,

00:04:18:20 - 00:04:22:04
situations occurring for kids that I think need to be addressed.

00:04:23:07 - 00:04:25:22
And, and I care deeply about this.

00:04:25:22 - 00:04:29:22
So how did you get involved with something

00:04:29:22 - 00:04:33:12
called the Farley Health Policy Center?

00:04:34:07 - 00:04:39:03
How did you end up writing a policy brief along with your coauthors

00:04:40:05 - 00:04:42:00
for the Farley Health Policy Center?

00:04:42:00 - 00:04:45:00
Folks listening to this, I think, Maura,

00:04:45:07 - 00:04:48:02
many folks know that the Kempe Center,

00:04:48:02 - 00:04:51:12
which is a multidisciplinary center at the University of Colorado,

00:04:51:15 - 00:04:54:02
we're about the prevention and treatment of child abuse and neglect.

00:04:54:02 - 00:04:59:08
We've got some 80-some professionals: educators, researchers,

00:04:59:08 - 00:05:04:20
health care professionals, social workers and attorneys, all addressing, again,

00:05:04:20 - 00:05:07:20
prevention to treatment of child abuse and neglect.

00:05:08:13 - 00:05:10:07
Also at the University

00:05:10:07 - 00:05:13:18
of Colorado is something called the Farley Health Policy Center.

00:05:13:18 - 00:05:16:18
What is that and how did you get involved?

00:05:17:15 - 00:05:21:05
Yeah, the Farley Health Policy Center is an incredible center.

00:05:21:05 - 00:05:25:05
So I've been working there for about two and a half years now,

00:05:25:05 - 00:05:29:02
and they are a, health policy center

00:05:29:02 - 00:05:33:01
located at the University of Colorado Anschutz Medical Campus.

00:05:34:02 - 00:05:36:24
Their team is extremely interdisciplinary.

00:05:36:24 - 00:05:39:16
So there are,

00:05:39:16 - 00:05:42:21
faculty who are medical doctors, individuals

00:05:42:21 - 00:05:47:07
in the public health space, psychologists, economists, lawyers.

00:05:48:02 - 00:05:51:15
The real goal at the Farley Health Policy Center is to develop

00:05:51:15 - 00:05:55:20
and translate evidence to advance policies and integrate systems

00:05:55:20 - 00:05:58:20
that improve health equity and well-being.

00:05:59:10 - 00:06:03:24
So we have a lot of projects focused on things like payment reform

00:06:03:24 - 00:06:09:06
in the medical space, integrated primary care, a lot of work in the child,

00:06:09:16 - 00:06:12:18
health and well-being space, and I'm involved in a lot of that work.

00:06:12:18 - 00:06:17:01
And so I was working closely with Shale, who is our

00:06:17:11 - 00:06:20:16
director at the Farley Health Policy Center, Dr. Shale Wong.

00:06:21:11 - 00:06:26:03
And she, Well, the Farley Center has had a long standing

00:06:26:03 - 00:06:29:03
collaborative relationship with the Kempe Center,

00:06:29:03 - 00:06:34:00
and we share the vision with them of feeling so strongly

00:06:34:00 - 00:06:37:00
about childhood safety, well-being,

00:06:37:01 - 00:06:40:01
addressing childhood environments

00:06:40:05 - 00:06:45:06
and both the Farley Center and the Kempe Center recognized that there

00:06:45:06 - 00:06:49:12
is this very rapidly changing landscape in terms of the digital

00:06:49:12 - 00:06:51:01
space and how that's changed

00:06:52:03 - 00:06:55:13
children's life experiences pretty broadly.

00:06:55:13 - 00:07:01:01
And so together we decided it was important to, address

00:07:01:01 - 00:07:04:17
this, especially when thinking about the regulations that need to be in place.

00:07:05:18 - 00:07:07:21
So we, we partnered with,

00:07:07:21 - 00:07:12:20
Doctor Warren Binford and, I was very fortunate to be able

00:07:12:20 - 00:07:17:04
to take the lead on this brief and feel really lucky to be doing this work.

00:07:18:00 - 00:07:19:24
So what are we addressing?

00:07:19:24 - 00:07:21:01
What's the issue here?

00:07:21:01 - 00:07:24:01
Kids on the internet,

00:07:24:07 - 00:07:27:07
folks who have kids in their home

00:07:27:15 - 00:07:30:16
currently, Maura, know all about this,

00:07:32:07 - 00:07:36:06
but for us civilians or retired veterans

00:07:36:06 - 00:07:39:21
in that branch of service, of child rearing,

00:07:41:19 - 00:07:46:06
how pervasive is the internet in kids' lives?

00:07:46:06 - 00:07:48:13
Have we actually quantified that at all?

00:07:48:13 - 00:07:49:04
We can.

00:07:49:04 - 00:07:54:04
So we know that nationally, up to 95% of youth age 13 to 17

00:07:54:12 - 00:07:57:11
and approximately 40% of children aged

00:07:57:11 - 00:08:00:11
8 to 12 report using social media.

00:08:00:15 - 00:08:03:24
So when we think about those numbers, I mean, that's the majority

00:08:03:24 - 00:08:04:24
of kids, right?

00:08:06:14 - 00:08:09:14
And I would actually even argue that while,

00:08:10:01 - 00:08:14:10
parents, or caregivers,

00:08:14:10 - 00:08:18:08
are probably much more in tune to children's use of the internet,

00:08:18:18 - 00:08:22:05
that there's actually still pretty significant gaps in terms of like,

00:08:22:15 - 00:08:26:04
what parents are able to monitor and how, you know, the ways

00:08:26:04 - 00:08:30:24
in which they may know or not know that their children are using the internet,

00:08:30:24 - 00:08:34:19
which is actually part of the problem that we're trying to address here.

00:08:34:19 - 00:08:37:19
And so,

00:08:38:00 - 00:08:40:13
I think we need to recognize that

00:08:40:13 - 00:08:43:13
the digital space, social media, gaming,

00:08:44:00 - 00:08:46:14
the online world is extremely pervasive.

00:08:46:14 - 00:08:48:10
It is a part of children's lives.

00:08:48:10 - 00:08:51:10
There is no separating the two.

00:08:52:05 - 00:08:54:10
And so it's something that we need to look

00:08:54:10 - 00:08:57:24
at as integral to childhood,

00:08:59:03 - 00:09:00:08
overall, that

00:09:00:08 - 00:09:03:08
there is no separating of the two.

00:09:05:00 - 00:09:08:00
This could be a positive thing for a lot of kids, right?

00:09:08:08 - 00:09:08:22
That's it.

00:09:08:22 - 00:09:13:11
I've got a granddaughter who keeps in touch with her cousins, in a way

00:09:13:11 - 00:09:16:11
that, you know, never would have been a part of my life growing up.

00:09:16:22 - 00:09:18:22
Yeah. What are the harms?

00:09:18:22 - 00:09:21:22
Are there potential harms that come from this?

00:09:22:04 - 00:09:22:16
There are.

00:09:22:16 - 00:09:24:01
And I think I do want to emphasize

00:09:24:01 - 00:09:27:01
what you just said before I go into those harms that,

00:09:28:17 - 00:09:31:08
well, I'll be focusing more on the harms.

00:09:31:08 - 00:09:33:11
And that's what we really focused on with the brief.

00:09:33:11 - 00:09:35:22
I think it's so important to acknowledge that.

00:09:35:22 - 00:09:41:15
That doesn't mean that there aren't benefits and, ways in which children

00:09:41:22 - 00:09:45:10
are engaging in the online world that are really great, right?

00:09:45:18 - 00:09:50:08
Community building, I think especially if we think about marginalized communities,

00:09:50:23 - 00:09:56:21
the LGBTQ+ population, who might not get that same support at home, are getting that,

00:09:57:20 - 00:09:59:13
maybe through community online.

00:09:59:13 - 00:10:02:10
So that's a different conversation, perhaps.

00:10:02:10 - 00:10:05:22
But I think it's important to acknowledge and,

00:10:07:13 - 00:10:11:10
we're not ignoring that within this. The harms that we're discussing,

00:10:12:03 - 00:10:15:24
however, the risks are significant

00:10:16:05 - 00:10:19:09
and they are, only getting worse.

00:10:19:14 - 00:10:24:04
So we know that social media use

00:10:24:11 - 00:10:29:04
can lead to just overall mental health distress and mental health concerns

00:10:29:04 - 00:10:32:18
like depression, anxiety, addiction and all of these different things.

00:10:32:18 - 00:10:36:06
The Surgeon General, put out an advisory a couple years ago

00:10:36:15 - 00:10:39:15
where he really dove into that.

00:10:40:02 - 00:10:43:11
We touched on that, but for the purpose of this brief,

00:10:43:11 - 00:10:47:03
we really wanted to get into the more extreme risks and extreme harms.

00:10:47:03 - 00:10:50:03
And these are extremely pervasive as well.

00:10:50:14 - 00:10:53:12
So what we're talking about, and I think you used this term

00:10:53:12 - 00:10:57:13
earlier, is really 21st century child abuse.

00:10:58:10 - 00:11:01:10
So, excuse me, that can look like,

00:11:03:00 - 00:11:04:02
children

00:11:04:02 - 00:11:08:01
essentially what's happening is adults have unfettered access to children

00:11:08:01 - 00:11:12:05
in a way that they wouldn't necessarily in the physical world.

00:11:12:05 - 00:11:12:11
Right.

00:11:12:11 - 00:11:15:19
In the physical world, we have regulations in place.

00:11:16:04 - 00:11:19:08
Children can't walk into a liquor store and buy alcohol.

00:11:19:17 - 00:11:22:24
Kids don't get their permits until they're maybe 15, licensed

00:11:22:24 - 00:11:25:24
when they're 16. They can't go gamble.

00:11:26:16 - 00:11:31:20
If an adult goes onto a school ground, they have to sign in.

00:11:32:04 - 00:11:34:20
There's all sorts of protections we have in the physical world

00:11:34:20 - 00:11:37:20
for kids for a reason.

00:11:38:04 - 00:11:39:14
We don't have that online.

00:11:39:14 - 00:11:43:24
There are minimal regulations online which lead to things like,

00:11:45:14 - 00:11:49:07
predatory adults being able to get in contact with children,

00:11:49:07 - 00:11:52:11
which can lead to things like what's called sextortion,

00:11:52:11 - 00:11:56:08
which is sexual extortion of children, the production

00:11:56:08 - 00:11:59:19
of child sexual abuse material.

00:12:01:08 - 00:12:03:09
We see illegal sales

00:12:03:09 - 00:12:06:24
online of firearms and drugs to kids.

00:12:07:18 - 00:12:10:18
Kids are able to access

00:12:10:22 - 00:12:14:15
adult content such as pornography, especially violent pornography

00:12:14:15 - 00:12:18:18
or harmful pornography, even sometimes when they're not seeking it out,

00:12:18:18 - 00:12:21:18
sometimes it's being sent to them.

00:12:22:11 - 00:12:25:11
And these have huge implications for kids,

00:12:25:11 - 00:12:29:06
development for kids, mental health, for kids well-being and their safety.

00:12:29:15 - 00:12:32:15
So with that kind of predatory behavior,

00:12:32:22 - 00:12:35:13
what you're reminding me of is,

00:12:35:13 - 00:12:38:13
in my experience as a child welfare administrator,

00:12:39:02 - 00:12:42:22
it was not a matter of if, it was just a matter of when

00:12:43:17 - 00:12:46:20
after somebody opened some kind of congregate

00:12:46:20 - 00:12:50:16
care facility for kids in the child welfare system,

00:12:51:17 - 00:12:54:17
group, home, residential treatment center, whatever we call it.

00:12:54:18 - 00:12:57:06
It was just a matter of time

00:12:57:06 - 00:13:00:06
before I started to get reports

00:13:00:10 - 00:13:02:14
of there are men hanging out

00:13:02:14 - 00:13:06:05
outside this facility, waiting in cars, approaching

00:13:06:05 - 00:13:10:14
kids on the sidewalk, offering them favors, taking them places.

00:13:11:13 - 00:13:13:11
It was not, you know, it

00:13:13:11 - 00:13:16:14
was a feature, not a bug, of congregate care.

00:13:17:10 - 00:13:19:18
And the sidewalk

00:13:19:18 - 00:13:22:18
was that arena.

00:13:23:06 - 00:13:26:10
So has that sidewalk now moved to the digital space?

00:13:27:06 - 00:13:28:18
Absolutely, absolutely.

00:13:28:18 - 00:13:32:22
So that's what we've seen as just like you said, like when we think about like,

00:13:32:22 - 00:13:36:08
bus stops and sidewalks like that is now the digital space.

00:13:36:08 - 00:13:41:18
I mean, so just kind of one statistic, one example to paint a picture here for you.

00:13:41:18 - 00:13:47:04
So there was a survey done in 2020 of undergraduate students,

00:13:47:04 - 00:13:50:04
and they were asked about their childhood experiences online.

00:13:50:10 - 00:13:53:05
What that one survey showed

00:13:53:05 - 00:13:56:24
was that 25% of respondents engaged with adult

00:13:56:24 - 00:14:02:24
strangers online when they were children, and 33% of those who communicated with adult

00:14:02:24 - 00:14:06:01
strangers reported sexual solicitation from them.

00:14:07:20 - 00:14:10:13
Some groups, were shown to be at greater risk.

00:14:10:13 - 00:14:14:21
So LGBTQ plus youth, respondents were shown to be three times

00:14:14:21 - 00:14:17:08
more likely to face high-risk interactions.

00:14:17:08 - 00:14:20:13
And if we think about this survey, the survey was administered

00:14:20:13 - 00:14:23:12
in 2020, five years ago.

00:14:23:12 - 00:14:27:04
And it was done, on undergraduate students.

00:14:27:04 - 00:14:30:04
So these weren't even kids at the time of the survey;

00:14:30:06 - 00:14:33:06
these were respondents 18 and older, most likely.

00:14:33:13 - 00:14:37:05
So we're seeing the numbers go up, which means that we can assume

00:14:37:05 - 00:14:40:05
that those stats are likely even higher.

00:14:41:10 - 00:14:43:02
And something important

00:14:43:02 - 00:14:46:02
to note too, as to what you said, is that,

00:14:46:11 - 00:14:51:03
you know, there is no child who is not at risk.

00:14:51:07 - 00:14:54:08
So if your child is online,

00:14:54:20 - 00:14:58:08
they are at risk of having these potentially

00:14:58:14 - 00:15:00:21
predatory adults engage with them.

00:15:00:21 - 00:15:03:21
So maybe, you know,

00:15:04:00 - 00:15:08:12
however long ago, parents felt some sort of security if they had

00:15:08:12 - 00:15:14:18
the privilege of not having their children in these more kind of at-risk spaces.

00:15:14:18 - 00:15:18:12
But the online world, the digital space, is an at-risk space.

00:15:18:12 - 00:15:19:23
So your child is at risk.

00:15:19:23 - 00:15:21:21
And I think we need to be aware

00:15:21:21 - 00:15:25:08
of how high these risks are and how prevalent this is.

00:15:25:14 - 00:15:28:10
Do those online interactions

00:15:28:10 - 00:15:32:07
then actually lead to in-person contact?

00:15:32:22 - 00:15:36:10
Are some of the predators who are engaging with kids online,

00:15:37:01 - 00:15:41:03
then meeting them in person and doing things to them in person?

00:15:41:16 - 00:15:43:23
Yes. And that's what's extremely concerning.

00:15:43:23 - 00:15:48:03
So, many of these interactions have led to

00:15:49:08 - 00:15:50:15
meeting in person

00:15:50:15 - 00:15:54:01
and then some kind of sexual assault or abuse occurring.

00:15:54:11 - 00:15:58:14
So some of the pathways that lead to that we see are,

00:15:59:21 - 00:16:03:24
you know, an adult stranger gets in contact with a child online.

00:16:03:24 - 00:16:08:01
They begin to form a relationship, they decide to meet in person,

00:16:08:01 - 00:16:11:09
and then the adult is able to harm this child.

00:16:11:23 - 00:16:14:18
We also see sexual, extortion.

00:16:14:18 - 00:16:20:07
So, adults may end up kind of persuading a child

00:16:20:07 - 00:16:24:23
to send them what we call CSAM, or child sexual abuse material, where a child

00:16:25:07 - 00:16:28:20
takes a video or a photo of themself in a sexual nature,

00:16:30:09 - 00:16:33:15
sends it to this person and now this person has kind of,

00:16:33:15 - 00:16:36:15
for lack of better words, like blackmail over them and is able

00:16:36:15 - 00:16:40:08
to use that to coerce the child

00:16:40:17 - 00:16:45:03
to meet with them, or to continue to send pictures to send money.

00:16:46:05 - 00:16:48:16
There's all sorts of different pathways like these.

00:16:48:16 - 00:16:50:09
These are just a couple examples.

00:16:50:09 - 00:16:54:12
But, these online interactions are absolutely translating

00:16:54:12 - 00:16:59:01
into physical interactions, which are then leading to harm

00:16:59:01 - 00:17:03:19
and trauma and, further abuse in, in the real world.

00:17:03:19 - 00:17:07:20
In the real world. You used the term a moment ago,

00:17:07:20 - 00:17:10:20
child sex abuse materials.

00:17:11:12 - 00:17:12:20
Isn't that child porn?

00:17:12:20 - 00:17:15:00
Why don't we call it child porn?

00:17:15:00 - 00:17:15:21
It's a great question.

00:17:15:21 - 00:17:18:21
So that was changed very intentionally. So,

00:17:20:06 - 00:17:23:06
pornography has a lot of,

00:17:23:21 - 00:17:25:14
we can make assumptions about pornography.

00:17:25:14 - 00:17:27:23
Pornography assumes consent.

00:17:27:23 - 00:17:31:23
So, between two adults. A child, anyone

00:17:32:01 - 00:17:35:01
younger than 18, cannot consent.

00:17:35:16 - 00:17:38:06
They cannot engage in this behavior

00:17:38:06 - 00:17:41:06
with full autonomy and decision making.

00:17:41:15 - 00:17:44:19
Pornography also assumes some level of,

00:17:45:18 - 00:17:49:20
entertainment, watching for forms of pleasure.

00:17:50:07 - 00:17:52:01
This is a crime.

00:17:52:01 - 00:17:53:14
This is abuse.

00:17:53:14 - 00:17:56:07
So child sexual abuse material

00:17:56:07 - 00:18:00:02
is really taking a trauma-informed and also a legal lens.

00:18:00:02 - 00:18:01:20
It has legal implications, right.

00:18:01:20 - 00:18:02:20
If we think about

00:18:02:20 - 00:18:06:08
these are illegal behaviors and actions that are taking place.

00:18:06:19 - 00:18:08:20
And so we don't want to use the word pornography.

00:18:08:20 - 00:18:10:15
That's not accurate.

00:18:10:15 - 00:18:12:01
So that's why that was changed.

00:18:14:03 - 00:18:17:05
So this starts in the digital world.

00:18:17:05 - 00:18:21:19
But then leads to actual physical encounters, physical harm.

00:18:22:18 - 00:18:25:18
I know that I have heard in Colorado,

00:18:26:07 - 00:18:29:07
from what we call, if I'm getting the term

00:18:29:07 - 00:18:32:07
right, sexual assault nurse examiners,

00:18:32:13 - 00:18:34:20
these are highly trained

00:18:34:20 - 00:18:40:10
professionals in health care settings like hospitals who are treating victims

00:18:40:10 - 00:18:44:04
of sexual assault, sexual exploitation and sexual trafficking.

00:18:45:03 - 00:18:47:20
I know that they're now reporting

00:18:47:20 - 00:18:50:09
that back in the day,

00:18:50:09 - 00:18:54:21
those initial contacts that a victim might have with their abuser

00:18:54:21 - 00:18:59:16
were somewhere in person, again, the bus stop, the sidewalk.

00:19:00:08 - 00:19:03:08
But now, the majority of the time

00:19:03:11 - 00:19:06:22
they're seeing that the initial contact was online.

00:19:09:11 - 00:19:10:10
Yeah.

00:19:10:10 - 00:19:11:22
Yeah. Exactly.

00:19:11:22 - 00:19:15:12
And so, I mean, the number,

00:19:15:21 - 00:19:19:02
I don't have it, you know, directly in front of me, but the vast

00:19:19:02 - 00:19:22:10
majority of these interactions are happening online.

00:19:23:10 - 00:19:24:19
So we need

00:19:24:19 - 00:19:28:07
to be thinking about this in a much different way.

00:19:28:16 - 00:19:32:05
And I don't want to undermine the harms that occur even if,

00:19:32:16 - 00:19:36:18
a child doesn't end up meeting someone who's a predator.

00:19:37:12 - 00:19:42:19
In person, there are still extreme harms that occur from those online interactions

00:19:42:19 - 00:19:47:07
that do need to be addressed at the same kind of level of, concern.

00:19:47:07 - 00:19:52:08
So, absolutely, it's really shifted into the digital space.

00:19:52:21 - 00:19:55:21
So the policy brief that,

00:19:56:13 - 00:20:00:09
you helped write is called Ensuring Child

00:20:00:09 - 00:20:05:01
Safety Online, Ensuring Child Safety Online.

00:20:05:01 - 00:20:09:00
And it is available online

00:20:09:24 - 00:20:13:17
at farleyhealthpolicycenter.org.

00:20:13:17 - 00:20:17:16
That's Farley, F-A-R-L-E-Y:

00:20:17:16 - 00:20:21:15
farleyhealthpolicycenter.org. Again, Ensuring Child Safety Online.

00:20:22:19 - 00:20:25:07
You also then

00:20:25:07 - 00:20:27:18
start to talk in that brief

00:20:27:18 - 00:20:31:04
about solutions policy solutions

00:20:32:04 - 00:20:33:19
changes that we can make

00:20:33:19 - 00:20:36:24
in both practice and law.

00:20:37:23 - 00:20:42:03
There's more than one bill currently trying to work its way

00:20:42:03 - 00:20:45:09
through the Colorado State General Assembly

00:20:46:03 - 00:20:48:17
on all of this.

00:20:48:17 - 00:20:50:11
So what do we do about all this?

00:20:50:11 - 00:20:53:11
And can we start with

00:20:54:00 - 00:20:57:00
whose responsibility is it

00:20:57:06 - 00:21:01:07
to protect kids from these particular harms?

00:21:02:07 - 00:21:04:11
We rely on parents, don't we,

00:21:04:11 - 00:21:07:14
to keep kids safe in our society to a great degree.

00:21:08:09 - 00:21:12:05
And I know that parents take that very seriously.

00:21:13:01 - 00:21:15:09
Can parents solve this?

00:21:15:09 - 00:21:16:13
No. Not alone.

00:21:16:13 - 00:21:19:13
And that's what I,

00:21:20:09 - 00:21:22:07
that's a frustrating,

00:21:22:07 - 00:21:24:24
load and burden to put on parents alone.

00:21:24:24 - 00:21:28:23
You know, as I alluded to earlier, even, let's say,

00:21:28:23 - 00:21:32:19
like the most involved parent who has a fabulous relationship

00:21:32:19 - 00:21:35:19
with their child, great communication.

00:21:35:21 - 00:21:40:13
They still likely don't know all of what their child is engaging with online.

00:21:41:08 - 00:21:45:18
And children, just to backtrack a little bit, throughout childhood

00:21:45:18 - 00:21:49:03
and adolescence are going through extreme developmental changes, right?

00:21:49:03 - 00:21:51:09
Which put them in a very vulnerable place.

00:21:51:09 - 00:21:54:12
So when I say that, what I mean is that their brains are not fully developed.

00:21:54:12 - 00:21:58:11
They don't have a full sense of who they are,

00:21:58:11 - 00:22:00:06
what their likes are, what their dislikes are.

00:22:00:06 - 00:22:01:05
They're figuring that out.

00:22:01:05 - 00:22:01:23
That's often

00:22:01:23 - 00:22:05:10
part of what's fun about childhood, but also makes them more vulnerable.

00:22:05:18 - 00:22:08:10
It makes them more susceptible to,

00:22:08:10 - 00:22:11:20
seeking external validation, social comparison.

00:22:12:15 - 00:22:17:04
They are more impulsive and less able to understand

00:22:17:04 - 00:22:20:13
and comprehend the risk of certain behaviors that they engage with.

00:22:20:13 - 00:22:25:11
So we put all of these things together and this puts them at greater risk.

00:22:25:11 - 00:22:29:04
So even the most, kind of, for lack of better words

00:22:29:04 - 00:22:34:05
like, healthy, mentally well child is still at risk.

00:22:34:05 - 00:22:35:19
Right. And so

00:22:36:20 - 00:22:40:02
to put all the burden on parents, we can't do that.

00:22:40:02 - 00:22:44:00
This is a really, really collaborative effort that needs to take place.

00:22:44:10 - 00:22:47:10
And when I say that, I'm thinking about,

00:22:48:12 - 00:22:51:21
those in big tech and the digital space,

00:22:52:21 - 00:22:56:09
they are the ones who are profiting off of children and off of their users,

00:22:57:21 - 00:23:01:11
which we can assume may also be why they may be incentivized

00:23:01:11 - 00:23:06:08
to push back on some of these, regulations, but they absolutely

00:23:06:08 - 00:23:10:00
need to be a part of taking responsibility and not just put that on parents.

00:23:10:16 - 00:23:14:10
As a clinician, when I hear people say, well,

00:23:14:10 - 00:23:17:24
there are parental restrictions they can put in place, I mean, that's

00:23:18:05 - 00:23:22:13
that's not a good argument to me because I work with kids across the board,

00:23:22:23 - 00:23:26:22
some of whom are in families where parents are not paying close

00:23:26:22 - 00:23:30:22
attention to them, and in some where they are, those restrictions don't do enough.

00:23:31:17 - 00:23:33:21
Kids can get past those restrictions.

00:23:35:17 - 00:23:36:16
They're not enough.

00:23:36:16 - 00:23:39:19
So we need... Can't they just take the device away?

00:23:40:11 - 00:23:43:11
You know, if my kid keeps, you know,

00:23:43:11 - 00:23:45:16
like, going out on their bicycle, you know,

00:23:45:16 - 00:23:48:21
there's something I could do in the physical world,

00:23:49:11 - 00:23:52:20
whether it's a helmet on the kid's head or maybe helping him ride.

00:23:52:20 - 00:23:56:09
Or maybe we take a break from riding a bicycle for a day or so to heal up.

00:23:57:00 - 00:24:00:00
There's things I could do to physically protect my kid.

00:24:00:15 - 00:24:03:15
Can't a parent just take away the device?

00:24:04:16 - 00:24:05:10
Yeah, sure.

00:24:05:10 - 00:24:07:09
But, like, the internet still exists, right?

00:24:07:09 - 00:24:09:00
Like our social media still exists.

00:24:09:00 - 00:24:12:03
If you take away my phone, I can just access it on a different device

00:24:12:03 - 00:24:14:17
and even if you put some kind of lock on it,

00:24:14:17 - 00:24:16:08
I could try to create another social media account.

00:24:16:08 - 00:24:18:13
I mean, there are so many ways around that.

00:24:18:13 - 00:24:23:01
What's a burner phone for those of us who are older than 16 years old?

00:24:23:01 - 00:24:24:00
What's a burner phone?

00:24:25:07 - 00:24:27:02
So my understanding of a burner phone

00:24:27:02 - 00:24:32:19
is a phone that you can purchase, and it has a different phone number.

00:24:32:19 - 00:24:35:19
And basically, like a child can use that.

00:24:36:24 - 00:24:39:08
Parents can't, like, it can't be tracked.

00:24:39:08 - 00:24:41:10
It can't be traced.

00:24:41:10 - 00:24:42:06
It's a good question.

00:24:42:06 - 00:24:44:19
I don't know, I'm not a burner phone expert,

00:24:44:19 - 00:24:48:14
but my understanding is that it's like a a device that you can get.

00:24:48:14 - 00:24:51:23
So parents wouldn't know that you have another phone.

00:24:52:04 - 00:24:57:15
We heard some testimony in, recent Colorado General Assembly committee

00:24:57:15 - 00:25:03:00
hearings of kids who had their devices taken away by parents

00:25:03:11 - 00:25:06:08
and their friends, bought them a burner phone. Yep.

00:25:06:08 - 00:25:07:14
So there you go.

00:25:07:14 - 00:25:09:06
Yeah. They worked right around it.

00:25:09:06 - 00:25:12:24
And kids also testified there are actually videos online

00:25:13:15 - 00:25:18:09
for how you get around the different parental control tools.

00:25:20:11 - 00:25:21:24
It's all there.

00:25:21:24 - 00:25:23:22
Oh yeah I mean I've worked with kids

00:25:23:22 - 00:25:27:02
who their parents put in those protections.

00:25:27:02 - 00:25:30:01
And then the kid tells me that they're still accessing these things.

00:25:30:01 - 00:25:34:06
And so, you know, we may never have a perfect system.

00:25:34:06 - 00:25:37:06
And like individuals who,

00:25:37:18 - 00:25:39:19
are committing crimes

00:25:39:19 - 00:25:43:17
may still find ways, but we should absolutely be

00:25:43:23 - 00:25:48:10
trying much harder than we are to put much stronger regulations in place.

00:25:48:10 - 00:25:49:14
And so one of those

00:25:49:14 - 00:25:52:22
what do those regulations potentially do, if it's not just a matter

00:25:52:23 - 00:25:56:13
of grabbing the kid's phone and locking it in a drawer,

00:25:56:23 - 00:26:00:14
what can social media platforms, what can companies do?

00:26:01:01 - 00:26:04:14
Yeah, I mean, our brief was really focused on age verification.

00:26:04:14 - 00:26:07:22
So there are many different things, but we wanted to take a focus on age

00:26:07:22 - 00:26:11:03
verification; it feels like a really productive place to start.

00:26:12:09 - 00:26:13:05
What that looks

00:26:13:05 - 00:26:16:05
like is individuals having to verify their age

00:26:16:10 - 00:26:19:16
before being able to get on, social

00:26:19:16 - 00:26:22:16
media, gaming, or adult content websites,

00:26:22:18 - 00:26:26:13
and then that can look different, like what happens next can look different.

00:26:26:13 - 00:26:30:21
So if it's an adult pornography website, the child shouldn't be able to access it.

00:26:30:21 - 00:26:33:23
If it's social media, then next steps would be

00:26:33:23 - 00:26:35:16
how is that social media tailored?

00:26:35:16 - 00:26:38:20
So that it's a safer space for the child's developmental

00:26:38:20 - 00:26:41:20
and chronological age?

00:26:41:23 - 00:26:44:23
You know, some might say there's age verification

00:26:44:23 - 00:26:48:18
already, like you type in your age or you give your birth date.

00:26:48:18 - 00:26:50:20
I mean, that's not,

00:26:50:20 - 00:26:53:15
that's not, those regulations aren't working

00:26:53:15 - 00:26:54:16
the way that they need to be.

00:26:54:16 - 00:26:58:24
So with these, kinds of age verification systems

00:26:58:24 - 00:27:02:22
or, technologies, they now are very advanced.

00:27:02:22 - 00:27:06:21
So they have things like facial biometric recognition.

00:27:06:21 - 00:27:10:00
There are some where you can kind of like, even put your hand up and,

00:27:10:08 - 00:27:11:15
and wave your hand around.

00:27:11:15 - 00:27:15:17
And the technology is so advanced that it can pick up on specific

00:27:16:01 - 00:27:19:17
aspects of an individual's features, whether their face or their hand,

00:27:20:00 - 00:27:25:05
which tells them that they are within a certain age range, or above a certain age.

00:27:25:05 - 00:27:27:24
I mean, and they're really, really accurate.

00:27:27:24 - 00:27:30:24
So a lot of individuals will say, well,

00:27:31:08 - 00:27:36:05
there's, you know, concerns about privacy and all of these pieces, but

00:27:36:24 - 00:27:39:18
we have ways to verify age that don't necessarily

00:27:39:18 - 00:27:43:06
require you to submit a government-issued ID

00:27:43:07 - 00:27:44:01
or to,

00:27:45:06 - 00:27:45:15
give,

00:27:45:15 - 00:27:50:22
give your full identity and so they can verify age without knowing identity.
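
[Editor's note: for listeners who want a concrete picture of what "verify age without knowing identity" can mean, here is a minimal, hypothetical sketch in Python. It is not any real vendor's API; the names (estimate_age_from_selfie, AgeCheckResult, check_age) are illustrative assumptions. It shows the design choice Maura describes: the check returns only an over-or-under-threshold answer and discards the biometric input, so no identity is retained.]

```python
# Hypothetical sketch of identity-free age assurance (illustrative only; not a
# real vendor API). The platform learns a single yes/no answer, never who the
# user is, and the biometric input is discarded as soon as the check is done.

from dataclasses import dataclass


@dataclass
class AgeCheckResult:
    over_threshold: bool  # the only fact the platform receives
    # Deliberately, no name, birth date, ID number, or image is kept.


def estimate_age_from_selfie(selfie_bytes: bytes) -> float:
    """Stand-in for a facial age-estimation model.

    A real system would run a trained model here; this stub returns a fixed
    value so the sketch runs end to end.
    """
    return 34.5  # pretend the model estimated roughly 34 years


def check_age(selfie_bytes: bytes, threshold_years: int = 18) -> AgeCheckResult:
    estimated_age = estimate_age_from_selfie(selfie_bytes)
    result = AgeCheckResult(over_threshold=estimated_age >= threshold_years)
    # Discard the input and the estimate immediately: only the boolean leaves
    # this function, mirroring a "verify age, not identity" requirement.
    del selfie_bytes, estimated_age
    return result


if __name__ == "__main__":
    print(check_age(b"fake-image-bytes"))  # AgeCheckResult(over_threshold=True)
```

[In a real deployment the stub would be a trained age-estimation model, and the discard step would be backed by the kind of "discard that information immediately" requirement discussed later in the conversation.]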

00:27:50:22 - 00:27:54:09
Now, exactly how is this done?

00:27:55:06 - 00:27:58:06
I've been told this is done all over Europe.

00:27:59:16 - 00:28:01:13
And we already have traction, don't we,

00:28:01:13 - 00:28:04:13
dealing with things like gambling sites and whatnot.

00:28:04:17 - 00:28:05:03
Right.

00:28:05:03 - 00:28:09:20
We already have it on gambling websites and it's already being done in Europe.

00:28:09:20 - 00:28:14:01
I mean, the pushback, you know, at face value,

00:28:14:01 - 00:28:18:18
some of these pushbacks around like privacy make sense.

00:28:18:18 - 00:28:22:23
But then when you actually dig into it they don't because we have ways to do it

00:28:22:23 - 00:28:25:13
without somebody having to provide their identity.

00:28:25:13 - 00:28:28:16
And there can be regulations or laws in place

00:28:28:16 - 00:28:32:15
that require tech companies to then discard that information immediately.

00:28:34:01 - 00:28:36:19
So I don't think

00:28:36:19 - 00:28:39:21
that holds up. You know, these things already exist.

00:28:43:01 - 00:28:44:01
There are

00:28:44:01 - 00:28:47:01
a couple of bills directly addressing this,

00:28:47:02 - 00:28:51:17
Maura and I both know, in the Colorado State General Assembly.

00:28:52:20 - 00:28:54:13
Take a look if you're interested.

00:28:54:13 - 00:28:58:15
It's Senate Bill 25-086.

00:29:00:19 - 00:29:02:16
More comprehensively, House Bill

00:29:02:16 - 00:29:06:12
25-1287.

00:29:08:04 - 00:29:12:18
These do put responsibility on the platforms, don't they?

00:29:12:18 - 00:29:14:17
Are these changes that you're suggesting?

00:29:14:17 - 00:29:17:01
Age assurance.

00:29:17:01 - 00:29:20:01
Can Mark Zuckerberg afford it? Yes.

00:29:21:00 - 00:29:22:06
Yes, absolutely.

00:29:22:06 - 00:29:27:23
And also, that, like, you know... that's a difficult... Yes, he can afford it.

00:29:27:23 - 00:29:33:01
And, like, why we... You're not his accountant, but you think he can afford,

00:29:33:05 - 00:29:36:05
he can make a line item for age assurance.

00:29:36:06 - 00:29:37:03
Right, right.

00:29:37:03 - 00:29:39:09
I mean, maybe with all the money that they're profiting off

00:29:39:09 - 00:29:42:18
of children on the internet, and,

00:29:46:11 - 00:29:49:08
I was about to say something, and I've just lost it, but,

00:29:50:13 - 00:29:53:14
Yeah, I mean, they can absolutely afford this.

00:29:53:14 - 00:29:56:12
They absolutely can be taking these steps.

00:29:56:12 - 00:29:57:24
They absolutely should be.

00:29:57:24 - 00:29:59:19
We're taking them in the physical world.

00:29:59:19 - 00:30:02:19
We should be taking them online.

00:30:02:21 - 00:30:04:17
So, Maura, I guess,

00:30:04:17 - 00:30:07:17
And thank you for all of that.

00:30:08:00 - 00:30:10:17
If we get this right,

00:30:10:17 - 00:30:12:05
we get this right.

00:30:12:05 - 00:30:16:12
We address the valid concerns around privacy.

00:30:17:05 - 00:30:20:07
We adopt systems of age

00:30:20:07 - 00:30:24:07
verification or age assurance that do verify age

00:30:24:07 - 00:30:27:15
to a reasonable degree of accuracy without disclosing the identity.

00:30:28:08 - 00:30:30:06
We give people the right.

00:30:30:06 - 00:30:33:15
If you're an adult and you feel you've been unfairly identified

00:30:33:15 - 00:30:34:14
as a child or something,

00:30:34:14 - 00:30:36:23
there's a right to appeal that, all that kind of stuff.

00:30:36:23 - 00:30:39:23
We get it right.

00:30:39:24 - 00:30:42:24
Understanding there's no perfect system.

00:30:43:04 - 00:30:47:02
What's Maura Gissen's vision a few years from now?

00:30:47:02 - 00:30:48:04
If we get it right,

00:30:49:04 - 00:30:50:20
what's the world like?

00:30:50:20 - 00:30:52:23
How's the world better for kids?

00:30:52:23 - 00:30:54:24
Yeah, I think it's such a good question.

00:30:54:24 - 00:30:56:22
It's such a,

00:30:56:22 - 00:30:58:02
a meaningful question.

00:30:58:02 - 00:31:00:18
It's a big question, too.

00:31:00:18 - 00:31:04:08
You know, I think at more of a logistical level, it's,

00:31:04:08 - 00:31:09:08
you know, hopefully adults have far less unfettered access to children.

00:31:09:23 - 00:31:13:02
Hopefully children are more protected online, that they aren't able

00:31:13:02 - 00:31:15:15
to access this,

00:31:15:15 - 00:31:17:18
harmful content.

00:31:17:18 - 00:31:20:19
That children aren't being sold drugs

00:31:20:19 - 00:31:23:19
and firearms at the levels at which they are now.

00:31:23:22 - 00:31:26:22
Which means that we may see less,

00:31:28:12 - 00:31:30:23
addiction and death.

00:31:30:23 - 00:31:33:21
But ultimately, if we're looking big picture,

00:31:33:21 - 00:31:36:14
what I'm thinking about is the well-being of our children

00:31:36:14 - 00:31:42:01
and what that means to me is the physical and mental health safety of our children.

00:31:42:01 - 00:31:44:00
And so,

00:31:44:00 - 00:31:48:06
we're in a period where I think all of us are attempting to catch up

00:31:48:06 - 00:31:52:08
with the exposure that we have in the digital space. And,

00:31:54:16 - 00:31:57:22
our children deserve to be able to develop and thrive

00:31:57:22 - 00:32:03:07
in ways that they're protected and feel safe and can go

00:32:03:15 - 00:32:06:11
have childhood experiences, not adult experiences.

00:32:06:11 - 00:32:07:16
They're in their childhood.

00:32:07:16 - 00:32:11:10
And so my hope, on a larger scale, is that this really

00:32:11:17 - 00:32:14:10
has an impact for kids,

00:32:14:10 - 00:32:17:14
in their developmental stage, in their mental health growth

00:32:17:14 - 00:32:21:03
and their physical safety, and then that they become healthy adults

00:32:21:03 - 00:32:24:06
who can thrive and help with the next generation.

00:32:25:06 - 00:32:28:06
Internet's not going anywhere. So

00:32:28:21 - 00:32:29:20
I love that vision.

00:32:29:20 - 00:32:33:06
Thank you, Maura Gissen, for traveling with us

00:32:33:06 - 00:32:36:06
today through that digital space.

00:32:36:06 - 00:32:42:24
and seeing how it very much impacts the kids who are in our homes,

00:32:42:24 - 00:32:45:03
in our schools

00:32:45:03 - 00:32:49:05
today, keep in touch and let us know what you discover next.

00:32:49:05 - 00:32:51:07
Oh, I absolutely will do.

00:32:51:07 - 00:32:53:23
Thanks so much. I enjoyed this conversation. Thank you.

00:32:55:07 - 00:32:55:17
Thank you,

00:32:55:17 - 00:32:58:17
Maura, and thank you to our listeners.

00:32:58:20 - 00:33:01:20
Join us again soon and often.

00:33:02:03 - 00:33:04:01
We'll be glad to be with you.

00:33:04:01 - 00:33:07:01
This has been Radio Kempe.

00:33:10:05 - 00:33:12:18
Thank you for listening to Radio Kempe.

00:33:12:18 - 00:33:16:00
Stay connected by visiting our website at

00:33:16:00 - 00:33:19:04
kempecenter.org and follow us on social media.