
Content provided by the National Center on Sexual Exploitation. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by the National Center on Sexual Exploitation or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://no.player.fm/legal.

Why did Twitter Allow Child Sexual Abuse Materials (Child Porn) on its Platform?

Duration: 41:31
 

Child sexual abuse material (CSAM, also known as child pornography) surged by over 106% during COVID, according to the National Center for Missing & Exploited Children. And tragically, CSAM isn't confined to the Dark Web: it is also flourishing on mainstream social media platforms like Twitter.

Survivor John Doe was only 16 when he discovered that exploitative child sexual abuse material of himself, recorded when he was 13, had been posted on Twitter. The video accrued over 160,000 views before Twitter finally took it down, despite multiple reports from both John Doe and his mother verifying his status as a minor.

Lisa Haba, Esq., and Peter Gentala, Esq., joined this episode of the Ending Sexploitation podcast to share the story of John Doe and another male survivor, who are suing Twitter for facilitating the distribution of child sexual abuse material depicting them.

The discussion covers the legal challenges of the case and why Twitter argues it should be immune from any liability, despite fostering an environment that appears to allow child sexual abuse material to flourish.

Take Action:

If you or someone you know has been harmed by sexual exploitation via Twitter, please contact the Haba Law Firm and the NCOSE Law Center.

Learn more about this case and help spread the word about how Twitter is complicit in the distribution of child sexual abuse material (CSAM).

