
Twitter Tries To Dismiss Another Child Porn Case Citing Section 230: Even If Allegations ‘True,’ We’re Not ‘Liable’

Twitter is trying to dismiss a second child pornography lawsuit filed against it by an underage victim, citing the protections outlined in Section 230 of the Communications Decency Act — a controversial clause that has shielded Big Tech from liability for viewpoint discrimination and, in this case, the plaintiffs claim, for child pornography hosted on its platform.

In the court filing, Twitter argued that even if “all” of the minors’ “allegations” are accepted “as true, there is no legal basis for holding Twitter liable for the Perpetrators’ despicable acts.”

Twitter was hit with a lawsuit in January alleging that a young boy, known as John Doe #1, who was solicited and recruited for sex trafficking, had to endure his own sexual abuse material being promoted on Twitter even after attempts were made to remove the content.

A second alleged victim, known as John Doe #2, later joined the federal lawsuit. “Both plaintiffs were harmed by Twitter’s distribution of material depicting their sexual abuse and trafficking, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified by John Doe #1 and his parents,” read a press release from the National Center on Sexual Exploitation (NCOSE).

“To encourage platforms to moderate and remove offensive content without risking incurring potentially ruinous legal costs, in 1996 Congress enacted Section 230 of the Communications Decency Act (‘CDA § 230’), granting online platforms like Twitter broad immunity from legal claims arising out of the failure to remove content,” a motion to dismiss argued. “Given that Twitter’s alleged liability here rests on its failure to remove content quickly enough from its platform, dismissal of the FAC with prejudice is warranted on this ground alone.”

Section 230 of the U.S. Code states:

No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.

Twitter noted that the company eventually “did remove the videos and suspend the accounts that had posted them,” adding, “That the offending content was not taken down immediately does not make Twitter liable under any applicable law.”

“Mistakes or delays, however, do not make Twitter a knowing participant in a sex trafficking venture, as Plaintiffs here have alleged,” the court filing said. “Plaintiffs do not (and cannot) allege, as they must, that Twitter ever had any actual connection to the Perpetrators, took any part in their crimes, or benefitted from them. Thus, even accepting all of Plaintiffs’ allegations as true, there is no legal basis for holding Twitter liable for the Perpetrators’ despicable acts.”

On the heels of the suits, advocates are pushing a petition on Change.org to create an easier two-step reporting process on Twitter for victims of child sexual abuse material.

“The reporting process for victims of child sexual abuse material and Twitter users should be a two step process. Clear, direct, and easy for children to report,” reads the petition, which has racked up more than 13,000 signatures.

“Twitter is currently being sued by two minor survivors of child sexual exploitation. John Doe #1 and John Doe #2 were both 13 years of age in the video shared on Twitter,” the petition adds. “Their abuse was watched 167,000 times and retweeted 2,223 times. John Doe #1 was a minor when he started reporting the video. He provided government ID to the platform showing that he was a minor.”
