Hello everyone,
We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do, because the perpetrators just post from other instances now that we have changed our registration policy.
We keep working on a solution; we have a few things in the works, but they won’t help us right now.
Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.
Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. If it wasn’t his community, it would have been another one, and it is clear this could happen on any instance.
But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.
Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.
Doesn’t that send a clear message to the perpetrators that they can get any community shut down and killed, and all they have to do is post CSAM to it? What makes you or anyone else think that, upon seeing that lemmyshitpost is gone, the perpetrators will all just quit? Was lemmyshitpost the only community they were able to post in?
Yup. The perpetrators win.
If you were in their shoes, would you want to risk going to jail for kiddy porn, risk having your name associated with CSAM online, or drain your personal savings account to fight these folks?
These mods are not protected by a well funded private legal team. This isn’t Reddit.
You don’t have to explain how liability works. I get it. What I don’t get is how removing that specific community is going to limit their liability when the perpetrators will just target a different community.
Sign-ups now go through manual approval applications, so there are no more automated sign-ups from them. If they use existing accounts to target another community, that community will be closed as well and those accounts banned. But there isn’t a stream of new accounts, because all accounts going forward need to be manually approved.
One of the ways you avoid liability is you show that you’re actively taking measures to prevent illegal content.
The perps are taking a big risk as well. Finding and uploading CSAM means being in possession of it. So we can at least take solace in knowing it’s not a tool that just anyone will use to take down a community.
Uploading it to websites counts as distribution. The authorities will actually care about this. It’s not just some small thing that is technically a crime; it’s big-time crime being used for something petty.
So while the perps might win in the short term, they are risking their lives using this tactic. I’m not terribly worried about it becoming a common one.
If anything, if I were the one doing this, I’d be worried that I might be pissing off the wrong group of people. If they keep at it and become a bigger problem, everyone is going to be looking for them. And then that person is going to big-boy prison.
That is a great point. I don’t know if the admin team is proactively reporting that activity to law enforcement, but I hope they are.