Radio Kempe
Radio Kempe is here to connect you with the information you need to tackle current issues. Join us as we talk about difficult topics. Help us as we test assumptions to challenge traditional ways of thinking. Get curious, tune in, and join us on the journey to prevent child abuse and neglect every month of the year! Do you have a topic that you would like to hear about on Radio Kempe? Email us at kempe.center@ucdenver.edu and let us know.
21st Century Child Abuse: A conversation with Jacqueline Beauchere, first Global Head of Platform Safety at Snap Inc. and the tech industry's first Chief Online Safety Officer
Jacqueline Beauchere is the first Global Head of Platform Safety at Snap Inc. She heads Snap's online safety strategy and is working on these issues worldwide. Previously, Jacqueline spent more than 20 years at Microsoft, where she was the company's, and the industry's, first Chief Online Safety Officer. At Snap, she is raising awareness of online risks, advising on Snap's tools and policies, and engaging with audiences worldwide.
In this Radio Kempe podcast, Jacqueline discusses the steps Snap is taking to counter the risks to kids online. She also addresses how Snap is engaging young people in the search for solutions, including the new Teen Council for Digital Well-Being, and describes Snap's new Digital Well-Being Index, which measures the risks to kids country by country worldwide. Finally, she addresses the growing problem of sextortion and how to counter it.
Join Radio Kempe for an interview with social media innovator Jacqueline Beauchere, as we address the new risks and harms to children in this digital age, and work toward real practical solutions.
00:00:00:0 - 00:00:19:6
Unknown
You're listening to Radio Kempe. We value the sense of community that connects people and helps them find ways to move forward. Join us on our journey to prevent child abuse and neglect.
00:00:20:2 - 00:00:58:8
Unknown
Welcome to Radio Kempe. Today is the latest podcast in the series, 21st Century Child Abuse. I'm Ernie Allen, and I will be your host today. I've spent many years in the fight to keep children safe. I'm an advisor to governments, law enforcement, technology companies, and others, including the Kempe Center. Today our guest is Jacqueline Beauchere, the first Global Head of Platform Safety at Snap Inc., where Jacqueline heads Snap's online safety strategy and is leading Snap's engagement with partners worldwide.
00:00:58:9 - 00:01:31:6
Unknown
Previously, Jacqueline spent more than 20 years at Microsoft and was that company's, and the industry's, first Chief Online Safety Officer. Her role at Snap focuses on an overall approach to safety, including raising awareness of online risks, advising on Snap policies, tools, and features, and engaging with audiences worldwide. Jacqueline, your pioneering role at Snap is attracting attention and producing action worldwide.
00:01:31:7 - 00:01:58:2
Unknown
Tell us what you're doing and the impact you're having. Thank you so much, Ernie, and it's a pleasure to be with you today. That's also very generous. At Snap, we take a holistic approach to user safety. We leverage both proactive and reactive measures to help keep our users safe. And if you're new to learning about Snapchat, you may not know that we don't consider the service to be traditional social media.
00:01:58:3 - 00:02:31:4
Unknown
In fact, we refer to Snapchat as the antidote to social media. For instance, we don't have social comparison metrics. Those are like public friend counts or likes. And we don't have these seemingly endless feeds of unmoderated content. These are features that we know can make young people feel self-conscious, uneasy, or worse. So on Snapchat, you actually have to affirmatively accept someone else as a friend before you can begin communicating with them directly.
00:02:31:4 - 00:02:58:9
Unknown
And when it comes to safety, we're determined to make Snapchat a hostile environment for any illegal activity or any conduct that violates our community guidelines. So we conduct industry-wide research about the risks that teens and young adults are facing online. We have zero-tolerance policies in place for egregious harms like child sexual exploitation and abuse and illicit drug activity, meaning one strike and you're out.
00:02:58:9 - 00:03:29:4
Unknown
We take action at the content, the account and the device level. We're seeking to block those egregious offenders from being able to come back to the platform. And in many instances, we're reporting to authorities as well. We use innovative detection methods to proactively find bad actors and violating content. And we want to shut that down. We also offer multiple mechanisms for blocking and reporting, as well as a suite of tools that we call our family center.
00:03:29:5 - 00:04:09:2
Unknown
This gives parents and caregivers insight into who their teens are communicating with on Snapchat and who is communicating with them. We're also engaging with experts like yourself, Ernie, because I firmly believe that there's no one entity or organization that can really solve these novel and nuanced issues alone. We support law enforcement in their investigations, and we're really investing in awareness-raising and educational efforts, both in our app and online, because we want to inform teens and families as to the role that they play in helping to protect themselves on Snapchat and across the tech ecosystem.
00:04:09:2 - 00:04:32:3
Unknown
So you also asked about impact. We are seeing some progress. We're finding more bad content and accounts earlier and we're taking action. We're making proactive referrals, in some instances to law enforcement in the hopes of prompting an investigation. And we're reaching out to young people and parents directly for their input and their feedback. We're not perfect. Not by any means.
00:04:32:4 - 00:04:58:3
Unknown
And we routinely say that our work in this space may never be done, but we are encouraged. Well, the progress is impressive, Jacqueline, and let me say, we were honored to have you address the Kempe Center Conference in 2023, the international conference that brought together physicians and social workers and child welfare advocates and researchers and academics,
00:04:58:4 - 00:05:32:5
Unknown
because we share that concern about the growing risks and harms to kids online, a challenge we're referring to as 21st century child abuse. From your perspective, and in light of all the progress you talked about that you're making, how serious are the risks to kids online today, and how best can we counter them? Well, it was my pleasure to be at that conference and represent Snap, Ernie, and to engage with such esteemed leaders and luminaries like yourself. In terms of the risks:
00:05:32:6 - 00:06:11:9
Unknown
They are serious, Ernie, and I don't think we should minimize or diminish them in any way. Like you, I've watched the online threat landscape morph and evolve over the past couple of decades, and we're seeing risks that perhaps only the most fantastical sci-fi authors and filmmakers really could have imagined. There's financially motivated sextortion, where perpetrators pretend to be someone that they're not, like a potential love interest, and they're aiming to deceive their targets into sending sexually explicit images and then blackmail the individuals for money, or gift cards, or personal information, or something else of value.
00:06:12:0 - 00:06:39:8
Unknown
And this, of course, is in supposed exchange for not releasing those compromising photos and videos to the parents, and friends and family and the entire social network of those targets. There's also AI generated child sexual exploitation and abuse imagery. This is where perpetrators are putting someone else's face on synthetic sexual imagery, or they're generating wholly synthetic sexual images.
00:06:39:9 - 00:07:04:2
Unknown
And all of that leaves hotlines and helplines across the globe to determine if there are real children depicted in any portion of those photos or videos. Synthetic imagery is also being orchestrated to perpetrate sextortion scams. So in these instances, the perpetrators already claim to have a compromising image of the target. So they cut right to the chase and they contact that target.
00:07:04:3 - 00:07:27:6
Unknown
And then the threats move right away to blackmail and so forth. These are just a couple of examples, Ernie, that I think really dominated the landscape in 2023 across all platforms and devices. And unfortunately they continue today. And there are all types of scams and schemes and fraud, and those risks in some instances are more grave than others.
00:07:27:7 - 00:07:53:7
Unknown
But, with all of that, I think there is help. I think that the best defense, of course, is awareness raising and education. I firmly believe that if more teens and young people, and even adults in some instances, because they're not immune from some of these risks, were aware of the threats that are out there and were armed with some common-sense tools, I think they could ignore and counter them pretty effectively.
00:07:53:8 - 00:08:18:2
Unknown
I'm also a firm believer in instilling and promoting agency and critical thinking and resilience among young people. These are skills that are going to serve them well throughout their life and in all walks of life. But young people need to be ready for the online world. As you know, of course, social media services are for teenagers ages 13 and above.
00:08:18:3 - 00:08:48:4
Unknown
And let's be clear: there's nothing magical that happens on a young person's 13th birthday. Their readiness for social media should be based on a number of factors, and it should include parental or trusted-adult involvement in that decision. That means that families need to come together to consider, for instance, a teen's age and maturity level, the degree of agency and resilience that they demonstrate, their ability to emotionally self-regulate.
00:08:48:4 - 00:09:09:5
Unknown
And, of course, the individual family values. Teens also need to know that there are in-app tools available to them, so that if they come across someone or something that makes them feel uncomfortable, that they can take action. There are blocking tools. There are reporting and removal tools. We have helpful resources and of course, access to hotlines and helplines.
00:09:09:6 - 00:09:35:3
Unknown
So I think we can all do our part to help raise awareness of some of these online risks and, most importantly, be open to young people's questions and their concerns. Well, I think that's a really important point. And one of the things that I'm particularly excited about that I know you're doing is that you're involving kids themselves in the process of helping you confront these issues.
00:09:35:4 - 00:10:05:5
Unknown
You've created the Teen Council for Digital Well-Being. Tell us about that. Thank you for asking this question, Ernie, because you're catching me right after last week, which was our inaugural summit for this council. So the actual formation of the council goes back to earlier this year, when we held an open application process. And what we're trying to do here is formulate a cohort of young people, teens largely between the ages of 13 and 16.
00:10:05:6 - 00:10:37:1
Unknown
And we want to invite them to Snap and to be a part of this program for basically 18 months. And we want them to serve as ambassadors and messengers in their schools and in their communities, learning about digital well-being, learning about digital civility, learning about online safety, and the tools and resources that are available to them across the ecosystem so that they can play a role in better protecting themselves, having healthier online experiences, and perhaps lending a hand to others as well.
00:10:37:2 - 00:11:04:2
Unknown
So we formulated our cohort earlier this year, in the April-May time frame. We've had some online meetings, virtual meetings, with the teens leading up to last week, which was when we held our summit with them at Snap headquarters in Santa Monica, California. We brought not only the 18 teens, but also their chaperones and parents as well. And we had robust discussions.
00:11:04:2 - 00:11:33:2
Unknown
We had group activities and individual activities. We built some camaraderie and team building, and we also had some fun. And I think these teens are ready to go back into their communities and back into their schools and raise issues like the necessity of reporting and the need for reporting. So on a more private platform like Snapchat, we might not know that something is going on that might be somewhat untoward.
00:11:33:2 - 00:12:04:2
Unknown
And we rely on our community to tell us what's happening so that we can take action. But unfortunately, there is a sense among young people that these reports, whatever the platform, are not going to be addressed, that perhaps the perpetrators are not going to experience severe enough penalties, that the reports aren't going to be looked at, or they just might, you know, choose not to do it because they are afraid of tattling or snitching.
00:12:04:3 - 00:12:26:3
Unknown
And we tried to share with our young people that this is not snitching. This is not tattling. This is, in fact, protecting the community. Not only might you be saving yourself some issues, but you also might be protecting others. Because once we know something about a particular actor or an account or content being out there, we can do something about it.
00:12:26:5 - 00:12:47:1
Unknown
So we really shared with them: this is a confidential process. We're not going to tell anybody that you reported them, and we're going to take action. And again, you might have that knock-on effect of protecting someone else as well. That's just one example of the importance of one of the issues that teens have probably resisted; they prefer blocking individuals to actually reporting them.
00:12:47:2 - 00:13:10:0
Unknown
But going that extra step and taking that extra measure of reporting someone can really have a community-wide protective effect. So we're looking forward to the next year or so with these teens and to what they're going to be able to produce, their individual projects and their group projects. And I hope everyone watches this space for this pilot program, which right now is in the U.S., but we hope to expand as well.
00:13:10:1 - 00:13:36:1
Unknown
Well, thank you for taking that action and for that initiative, because clearly the most effective way to protect a teen is for that teen to protect themselves. So I think engaging them in solutions is really important. You also mentioned that the focus today is the U.S., but clearly these risks reach far beyond our borders.
00:13:36:1 - 00:14:08:0
Unknown
And I know that you launched a Digital Well-Being Index to measure the degree of risk country by country, worldwide. Can you tell us what you've learned and how our audience can apply it wherever they happen to be located? Absolutely, Ernie, and I would say that we created the Teen Council to animate some of this research. So we're in our third year of this research, and this is not just on Snapchat, but in fact on all platforms and services and devices.
00:14:08:0 - 00:14:38:5
Unknown
We've conducted this research annually for the last three years in Australia, France, Germany, India, the UK, and the US, and we're polling teens between the ages of 13 and 17, young adults between the ages of 18 and 24, and parents who have teenagers between the ages of 13 and 19. And we're asking them about the risks that they're facing online and the relationships that they're forging, particularly with their parents.
00:14:38:6 - 00:15:10:2
Unknown
And we're asking them about their reaction to what I call 20 different sentiment statements, across five categories. Those are the sentiment statements that make up our Digital Well-Being Index. So the index itself is based on something we call the PERNA model, P-E-R-N-A, where PERNA stands for positive emotion, engagement, relationships, negative emotion, and achievement.
00:15:10:3 - 00:15:37:9
Unknown
And that's actually based on a longstanding academic model about well-being, called the PERMA model, and we've adapted it for the online space. So we release these findings every year on international Safer Internet Day, which comes in early February. And I can share with you the year-two findings that we released earlier this year. So just at a very high level, a couple of key things.
00:15:38:0 - 00:16:10:9
Unknown
As I said, we're asking about risks. Some 78% of those respondents, Gen Z teens and young adults, said that they had experienced some online risk in early 2023, and that was up two percentage points from the previous year. And 57%, so nearly 6 in 10 of those respondents, said that they or a friend were involved with intimate or sexual imagery in the three months prior to the survey being fielded.
00:16:10:9 - 00:16:47:6
Unknown
That means they either received that imagery, they were asked for it of themselves, or they shared or distributed photos or videos of someone else. And the important point here in this sexual-imagery category is that 33%, a third of respondents, said that this imagery had spread beyond the intended recipient. And then finally, for a look at parents: half of the parents said that they were unsure about the best ways to actively monitor their teens' online activities.
00:16:47:8 - 00:17:12:5
Unknown
So it's incumbent upon us to give them more tools, more resources, more materials to help them understand the various platforms and what they can do to help protect their teens. As for the Digital Well-Being Index itself, it was unchanged in year two and it stands at a reading of 62. So on a scale of 0 to 100, it's a pretty average reading.
00:17:12:6 - 00:17:41:3
Unknown
It's not terribly good, but it's also not particularly worrisome. One other aspect of the index, which I think is important, is that we have all these teens, all these respondents, all these countries on this scale of 0 to 100. We've then broken out that scale into four buckets: people might be struggling online, middling, thriving, or flourishing. And it basically breaks down in kind of the 80/20 rule.
00:17:41:4 - 00:18:11:6
Unknown
We've got roughly 10% on either end of that spectrum, both struggling and flourishing. And then we've got roughly 80% in the middle, in the middling and thriving buckets. It's our obligation, it's our responsibility, to try to do what we can to move teens and young people up that continuum so that we have more young people thriving and flourishing in the digital environment and fewer who are actually struggling.
00:18:11:7 - 00:18:50:7
Unknown
Well, you talk about the impact. I, for one, was stunned by that 78% number, that 78% of respondents indicated they had encountered this kind of content. I think that helps us to understand the potential scope and the risk. And, you know, recently I know you joined with the US Department of Homeland Security in a new campaign called Know2Protect, focusing specifically on those online sexual risks to kids.
00:18:50:8 - 00:19:15:8
Unknown
Tell us about that and what you're learning. Certainly, Ernie. As I said earlier, I'm a big proponent of awareness raising and education. This was the focus of my earliest work in online safety some 25 years ago. And I think the most effective and impactful campaigns are those that span stakeholder groups and basically blanket the country.
00:19:15:9 - 00:19:43:6
Unknown
So what the Department of Homeland Security in the US has done is exactly what we need to help combat some of these child sexual exploitation and abuse related issues. We need that singular, galvanizing message that both the private sector and the public sector can get behind. So Snap was actually the first entity to support DHS's Know2Protect campaign.
00:19:43:8 - 00:20:10:5
Unknown
So we signed an MOU in support of the campaign at the start of this year. We donated advertising space to Know2Protect, and they're now posting educational material on Snapchat, so we're enabling them to reach teens where they are. We know that they're on Snapchat. We are featuring the campaign on our platform and on our Privacy and Safety Hub, which is our website for safety and privacy issues.
00:20:10:6 - 00:20:47:2
Unknown
We actually invested in and conducted some new research with teens and young adults, just in the United States, about the various dimensions of child sexual exploitation and abuse online. And that research is helping to further inform the campaign, and it's informing our own efforts to keep fighting this frankly appalling abuse across platforms and services. And Ernie, as I wrote in the blog that I posted about our support for this campaign, we all know that the sexual exploitation and abuse of children is illegal.
00:20:47:4 - 00:21:19:1
Unknown
It's vile. And as a topic of polite conversation, it's largely taboo. But these crimes can't be ignored. They need to be discussed in the halls of government, at boardroom tables, and at kitchen tables. And that's exactly what a campaign like Know2Protect is encouraging and inspiring. Well, I think that's really laudatory, and I hope our listeners to this podcast will learn more about it and help spread that message.
00:21:19:1 - 00:21:45:0
Unknown
I want to return for just a minute to a point you raised earlier, when I asked you about the elements of the problem, the new challenges we're facing. You specifically mentioned sextortion, which I know the FBI Director recently said is an explosive problem, an epidemic. You did multi-country research on sextortion.
00:21:45:1 - 00:22:12:8
Unknown
What are the highlights? How does the US compare with other countries, and what is Snap doing to protect your users from sextortion? So, Ernie, this is actually part of the Digital Well-Being Index research that we did last year. We also did a deep dive into sextortion as part of our second year of that research. And we learned quite a bit.
00:22:12:9 - 00:22:40:5
Unknown
We learned that nearly two thirds, 65%, of these Generation Z teens and young adults that we polled in these six countries, again Australia, France, Germany, India, the UK, and the US, said that they or their friends had been targeted in online sextortion schemes. Now, that doesn't mean that all those individuals fell for sextortion schemes, but they were targeted.
00:22:40:6 - 00:23:09:4
Unknown
And again, this is not just on Snapchat. This is all platforms, all services, all devices. They said that they or their friends were targeted in online catfishing scams. Now, that's where criminals pretend to be someone that they're not, and they try to lure that victim into sharing personal information or producing sexual imagery. They were also hacked by criminals.
00:23:09:4 - 00:23:35:1
Unknown
In these instances, those criminals are gaining unauthorized access to the target's devices or social media accounts, and they're actually stealing that intimate imagery or other private information. But in both scenarios, it's the photos and videos that are produced or accessed. They're then used to threaten or blackmail the young people. And again, the abusers want money. They want gift cards.
00:23:35:1 - 00:24:14:3
Unknown
They want more sexual imagery. They want other personal information, all again in supposed exchange for not releasing the material to the young person's family and friends and social circle. So we learned that basically half, 51%, of the respondents said that they or their friends were targeted for, or were victims of, sextortion by means of catfishing, and 47% said that they or their friends' devices or social media accounts were hacked, again for sextortion via these hacking efforts.
00:24:14:4 - 00:24:43:2
Unknown
And I think, Ernie, we are taking a lot of measures at Snap in terms of combating sextortion, and particularly financial sextortion, on our platform. For instance, about a year and a few months ago, maybe 12 to 14 months ago, we instituted a new reporting reason specifically around financial sextortion. Now, you might know that kids are not necessarily going to go in and say, yeah, I want to report financial sextortion.
00:24:43:2 - 00:25:13:0
Unknown
They might not even know what's happening to them. So we, in conjunction with some of our expert advisors and consulting with experts, came up with a new reporting reason, which basically says: they leaked or are threatening to leak my nudes. This has galvanized young people to come forward, and adults as well, to tell us that they are being extorted, blackmailed, or abused, and we can take action.
00:25:13:0 - 00:25:45:0
Unknown
We've also added some new in-app resources to learn about financial sextortion, to learn about the consequences of sexting and sharing nudes, to learn about child sex trafficking, to learn about grooming. All of these various aspects and dimensions of child sexual abuse online are a little bit interwoven, and they kind of overlap. So we're offering those resources by means of what we call our Safety Snapshot episodes in the app, and people can access those directly to learn a little bit more.
00:25:45:1 - 00:26:07:6
Unknown
We are also looking for signals and patterns across the platform to proactively get ahead of some of this. So when we talk to law enforcement, law enforcement always wants to, in their words, get to the left of boom. Boom, of course, being the threats, being the blackmail, being the culmination of the abuse that's taking place online.
00:26:07:8 - 00:26:50:8
Unknown
We want to move to prevention and protection. So we're actually looking for patterns and signals across the service that might be indicative of perpetrator activity, particularly linked to sextortion. These are just a couple of the things that we're doing. We really have a range of actions, both proactive and reactive. But I'd like to share with the audience, Ernie, if I may, some advice and guidance that we have for parents, caregivers, and other trusted adults in these scenarios, because sadly, we know that some young people feel that there's no way out when they're faced with sextortion, and some of them are making that ultimate sacrifice.
00:26:50:8 - 00:27:22:1
Unknown
And we have to show them that there is hope. This is just something that they need to get past, and they will get past it. So we want parents and caregivers and other trusted adults, clergy, counselors, coaches, to have regular, open, and honest conversations with teens. Of course, encourage them not to share their devices or their passwords with anyone, and that includes best friends, because by giving someone else access they're opening themselves up to potential sextortion by hacking.
00:27:22:2 - 00:27:47:7
Unknown
We also want to encourage them to exercise good judgment and think critically, and we want them to get help from technology, like I mentioned earlier with Snapchat's Family Center, which gives parents and caregivers, anyone over the age of 25, the ability to invite a teen to participate in Family Center, where they can see who the teen is connecting with on Snapchat, but they don't see any of the teen's messages or snaps.
00:27:47:8 - 00:28:16:0
Unknown
And that's really important: we are trying to strike a balance, protecting young people and their privacy as they're coming into this very critical developmental phase in their growth, but we also want to give parents that insight. And again, coming back to the advice and guidance: if these issues arise, report the incident to the platform, block the offender, don't respond to any of the demands, don't meet any of the demands.
00:28:16:0 - 00:28:44:4
Unknown
And if it's appropriate, report to local law enforcement. Well, I think that's great advice. Obviously, the law enforcement component of this is really troubling. And once again, from your research, the numbers of users who are being impacted by this are frightening. And I think the FBI Director also said that many of these sextortionists are rings operating from other countries.
00:28:44:5 - 00:29:06:4
Unknown
So building awareness of this is important, and I think your leadership in this area is extraordinary. Well, you've addressed a wide range of risks to kids online and the work you're doing. I guess my final question for you is, how serious do you think these risks are, and what more needs to be done?
00:29:06:4 - 00:29:29:0
Unknown
And particularly, what can our audience do to help you address this? I think we've seen by what we've discussed that the risks are pretty serious and the stakes are pretty high. But I think the most important thing that this audience can be open to is conversations with young people, with teens, asking them what they're doing online, really.
00:29:29:0 - 00:29:53:3
Unknown
And we have to come to those conversations, Ernie, without judgment, without pretext. And really listen with the intent of being moved and to really try to understand and bridge that gap between perhaps older generations that didn't grow up online and don't understand what young people are going through and young people themselves. This is exactly what we heard from our teen council last week.
00:29:53:4 - 00:30:12:4
Unknown
They're becoming more appreciative of what their parents are going through and how parents and caregivers are having to juggle and jostle so many different things, so many different apps, so many different devices, so much possibility for risk and exposure. Whereas for the teens, this is just their life. This is how they were brought up. This is how they're growing up.
00:30:12:5 - 00:30:50:8
Unknown
This is their very existence. So let's ask, let's inquire. Let's strike up a conversation to learn, and again to listen with the intent of being moved. And we have the ability to point out the potential for risk and where things could potentially go wrong. Because we've been around, we've been around the block, we've seen some of these things happen, and we can help enlighten young people just to be a little bit more critical, maybe be a little bit more skeptical, and approach these issues with eyes wide open.
00:30:50:9 - 00:31:20:6
Unknown
Well, thank you to Jacqueline Beauchere, the Global Head of Platform Safety at Snap Inc. Thank you for being with us today. Thank you for your remarkable vision and leadership over many years, including well before Snap and your many years at Microsoft. You've had incredible impact not just on one company, but on an entire industry. So we're grateful for all you've done to enable progress in the fight against 21st century child abuse.
00:31:20:7 - 00:31:33:8
Unknown
And thank you to our listeners for joining us today. We hope you will tune in again to Radio Kempe as we continue this podcast series on 21st century child abuse.
00:31:33:8 - 00:31:50:4
Unknown
Thank you for listening to Radio Kempe. Stay connected by visiting our website at kempecenter.org and follow us on social media.