Dear Lemmy.world Community,
Recently, posts were made to the AskLemmy community that violate not only our own policies but the basic ethics and morals of humanity as a whole. We acknowledge the gravity of the situation and the impact it may have had on our users. We want to assure you that we take this matter seriously and are committed to making significant improvements to prevent such incidents in the future. While I'm reluctant to say exactly what these horrific and repugnant images were, you can probably guess what we've had to deal with and what some of our users unfortunately had to see. I'll add what we're talking about in spoilers at the end of the post, to spare the hearts and minds of those who don't know.
Our foremost priority is the safety and well-being of our community members. We understand the need for a swift and effective response to inappropriate content, and we recognize that our current systems, protocols, and policies were not adequate. We are immediately taking steps to strengthen our moderation and administrative teams, implement additional tools, and build enhanced pathways for a more robust and proactive approach to content moderation, including making sure that reports reach mod and admin teams more quickly.
The first step will be limiting the image hosting sites that Lemmy.world will allow. We understand that this may frustrate some of our users, but we hope you can understand the gravity of the situation and why we find it necessary: not just to protect all of our users from seeing this, but also to protect ourselves as a site. That said, we would like your input on which image sites we should whitelist. While we run a filter over all images uploaded to Lemmy.world itself, that filter doesn't apply to images hosted on other sites, which is why whitelisting is necessary.
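For anyone curious what host whitelisting looks like in practice, here is a minimal sketch of the idea, not Lemmy.world's actual implementation. The host names in the set are placeholders, not an announcement of which sites will be allowed:

```python
from urllib.parse import urlparse

# Hypothetical whitelist; the real list would come from community input.
ALLOWED_IMAGE_HOSTS = {"imgur.com", "catbox.moe"}

def is_allowed_image_url(url: str) -> bool:
    """Return True if the URL's host is a whitelisted image host.

    Subdomains of a whitelisted host (e.g. i.imgur.com) are also accepted.
    """
    host = (urlparse(url).hostname or "").lower()
    return any(
        host == allowed or host.endswith("." + allowed)
        for allowed in ALLOWED_IMAGE_HOSTS
    )
```

A check like this would run on externally hosted image links, since the upload filter only covers images stored on Lemmy.world itself.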
This is a community made by all of us, not just the admins, which leads to the second step: we will be looking for more moderators and community members in more diverse time zones. We recognize that coverage is currently concentrated in Europe and North America, and we want to strengthen other time zones to limit response delays as much as humanly possible in the future.
We understand that trust is essential, especially when dealing with something as awful as this, and we appreciate your patience as we work diligently to rectify this situation. Our goal is to create an environment where all users feel secure, respected, and, above all, safe. Your feedback is crucial to us, and we encourage you to continue sharing your thoughts and concerns.
Every moment is an opportunity to learn and build, even the darkest ones.
Thank you for your understanding.
Sincerely,
The Lemmy.world Administration
spoiler
CSAM
Removed by mod
Ill adults and poor kids generate and sell CSAM. It's common to advertise on IG and sell on TG. It's a huge problem, as that Stanford report shows.
Telegram got right on it (not). Fuckers.
Removed by mod
As screenshotted, linked, and stated: Instagram (dot com), Telegram (dot org), and others.
Forgot a part of the chain: they’ll advertise to you on Instagram, message you on Telegram, and link you to Mega[upload].
But didn’t that shut down?
Not sure what you mean but check the Stanford study. The problem is enormous.
Nice try. That is something either an emerging pedo or an FBI agent would say.
… Or just someone who's academically curious, because it seems everyone's banning it left, right, and center, so I'm logically interested in how they even manage to get it.
You got something to hide, bud?
As an admin for another instance who had to clear some of it out (without wanting to go into specifics), a spammer doesn't necessarily have to post actual CSAM to cause consternation. If they posted legal porn stripped of its original context and gave it a title claiming it's CSAM, it would have the same effect without them having to download imagery that would get them thrown in jail. And that's what I tell myself so I can sleep at night.
Since our instance doesn't allow NSFW material, it's easy enough to see that something is NSFW and remove it without going into forensic detail. I have no idea how those running NSFW instances manage, because they have the opposite problem (potentially illegal material masquerading as legal porn), and I don't want to know. However, anyone frequenting those instances needs to keep that in mind.