We are federated with the paedophile instance people have been complaining about. And I don’t think I want to wait until their content starts showing in my timeline. We need an admin ASAP or we jump ship.
See here: https://sh.itjust.works/post/6400367
Hopefully we won’t see any of that here because nobody here is subscribing to that instance (yet). The lack of an admin does mean that we won’t get any new users who might want to fuck things up, but that’s a pretty small silver lining on a pretty ominous-looking cloud.
Lemmy.world’s defed list has a few other noncey instances blocked which we’ve not had any trouble with, any reason to assume this one will cause more problems?
Also paedophile, please, we don’t need American spellings adding insult to injury!
That’s a very good point, and sorry about the American spelling!
Is there a good alternative instance to this one that has an active admin? Seems like the writing’s on the wall for this instance if we don’t get our admin back…
If the admin doesn’t come back we will spin up a new instance. Hopefully, the new update will make Tom break cover and we can sort this one out, but we are actively working on a plan for a replacement.
Has anyone actually verified the claims, or are they taking the word “paraphilia”, which can be used to describe just about any fetish, to mean “this has pedophilia”?
I admit, I haven’t looked into the claim. But the only supporting evidence I have seen is the fact that pedophilia is a paraphilia.
And to be clear on why I am asking, the term paraphilia also covers things such as the entirety of BDSM, voyeurism, and exhibitionism.
Did you follow the link to the sh.itjust.works post?
This is not just some adult kink.
There is plenty of evidence of pedophilia there.
I did a dive on that instance. It’s about a month old and not super active, so you can pretty much browse all of the posts all the way back to the start. There is a diversity of discussion, ranging from weird trans identities to extreme sexual desires and kinks. And yes, a good percentage of the posts openly talk about being sexually attracted to minors, and it is filled with discussions trying to normalize and justify having sex with kids.
The problem runs much deeper than one instance, unfortunately. This thread is full of people insisting that AI-generated CSAM should be legal (and thus normalised).
To be clear, I’m quite prepared to believe that at least some of those posters have not thought it through and just don’t have a good grasp of structural influences on behaviour, rather than having any personal interest in this sort of material. That kind of simplistic ultra-individualism is common on the Fediverse, unfortunately. But there is a culture problem when this sort of undergraduate philosophising gets so little pushback, and provides cover for much more sinister motivations.
With apologies to Gramsci, social media is dying and its replacement cannot be born; in this interregnum a great variety of morbid symptoms appear.
I am not about to visit the website, but people posted screenshots on the post that I linked to.
Yeah, seems like the instance should be generally defederated from if they are fine leaving blatant stuff like that up.
(You didn’t include a link)
Edit: Seems you may have edited and I saw a cached version. I do now see a link.
Thanks for flagging it up. We are working on contingency plans but this kind of thing will bring the timetable forward.
Yeah, gonna have to find a new instance if this isn’t sorted out very, very quickly.
One concern is that apps tend to cache images locally, so troublesome photos merely appearing in your timeline could mean they’re stored temporarily on your device.
This happened the other week with those TikTok videos that were actually CSAM.
It really does need an admin to be on the ball.
Very well could be 4chan bait.