Reddit faces a lawsuit over sexually suggestive images of a minor.
A woman, identified only as Jane Doe, is suing Reddit for allowing her ex-boyfriend to post pornographic images of her as a 16-year-old. Her lawsuit argues that “Reddit knowingly benefits from lax enforcement of its content policies, including for child pornography.” She alleges that in 2019, an ex-boyfriend posted sexually explicit photos and videos that he’d taken without her knowledge or consent. Reddit’s moderators sometimes “waited several days” before removing the content, and the site allowed her ex to create a new account after his old one was banned, according to the plaintiff.
“Because Reddit refused to help, it fell to Jane Doe to monitor no less than 36 subreddits – that she knows of – which Reddit allowed her ex-boyfriend to repeatedly post child pornography,” the complaint reads. “Reddit’s refusal to act has meant that for the past several years Jane Doe has been forced to log on to Reddit and spend hours looking through some of its darkest and most disturbing subreddits so that she can locate the posts of her underage self and then fight with Reddit to have them removed.”
Jane Doe’s case relies on a set of bills signed into law by President Trump, referred to as FOSTA-SESTA, which make it easier to combat illegal sex trafficking online. The legislation was specifically intended to take down escort sites like Backpage, which allegedly made it easy for sex traffickers to operate. The move was deemed controversial because it creates an exception to Section 230 of the 1996 Communications Decency Act, which states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
The suit accuses Reddit of “distributing child pornography,” “failing to report child sexual abuse material, and violating the Trafficking Victims Protection Act.” It says that Reddit knew this type of content was being posted, stating, “Reddit has itself admitted that it is aware of the presence of child pornography.” The filing also cites now-defunct subreddits with titles such as “jailbait.”
“Child sexual abuse material (CSAM) has no place on the Reddit platform. We actively maintain policies and procedures that don’t just follow the law, but go above and beyond it,” a Reddit spokesperson responded. “We deploy both automated tools and human intelligence to proactively detect and prevent the dissemination of CSAM material. When we find such material, we purge it and permanently ban the user from accessing Reddit. We also take the steps required under law to report the relevant user(s) and preserve any necessary user data.”
Reddit banned content sexualizing minors back in 2012, saying that interpreting “vague and debated legal guidelines over what counted as illegal child pornography risked pulling Reddit into a legal quagmire.” However, the current suit contends that the company did not enforce its own rules. “Reddit’s internal security has been compromised by its choice to rely on unpaid moderators, which fail to enforce the standards that are supposed to protect Reddit users and others,” the lawsuit states.