Other samples:
Android: https://github.com/nipunru/nsfw-detector-android
Flutter (BSD-3): https://github.com/ahsanalidev/flutter_nsfw
Keras (MIT): https://github.com/bhky/opennsfw2
I feel it’s a good idea for those building native clients for Lemmy to integrate projects like these and run offline inference on feed content for the time being, to cover content that isn’t marked NSFW but should be.
What does everyone think about enforcing further censorship on the client side, especially in open-source clients, as long as it pertains to this type of content?
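As a rough illustration of the client-side idea above, here is a minimal sketch of gating a feed on a local classifier score. The `Post` shape, threshold, and classifier callback are all hypothetical; a real client would wrap an on-device model from one of the libraries linked above (e.g. opennsfw2 exposes a per-image NSFW probability) behind the callback.

```python
# Hedged sketch: client-side NSFW gating for a feed.
# `classifier` is any callable returning a probability in [0, 1];
# the names and the 0.8 threshold here are illustrative, not from
# any of the linked projects.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Post:
    id: str
    image_path: str
    marked_nsfw: bool  # what the poster/instance already flagged


def filter_feed(posts: List[Post],
                classifier: Callable[[str], float],
                threshold: float = 0.8) -> List[Post]:
    """Hide posts the local model scores as NSFW but that are not marked."""
    visible = []
    for post in posts:
        if not post.marked_nsfw and classifier(post.image_path) >= threshold:
            continue  # suppress (or blur) unmarked NSFW content
        visible.append(post)
    return visible


# Stub classifier for demonstration: made-up per-image scores.
scores = {"a.jpg": 0.05, "b.jpg": 0.95}
feed = [Post("1", "a.jpg", False), Post("2", "b.jpg", False)]
print([p.id for p in filter_feed(feed, scores.get)])  # → ['1']
```

Posts already marked NSFW are passed through untouched here, on the assumption that the client handles those with its existing blur/hide setting.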
Edit:
There’s also this, which takes a bit more effort to implement properly but provides a hash that can be used for reporting: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
Python package (MIT): https://pypi.org/project/opennsfw-standalone/
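The hash angle mentioned above usually comes down to comparing a perceptual hash against a blocklist with some bit-distance tolerance. A sketch of that matching logic, with the hash values, bit width, and distance threshold all made up for illustration (in practice the hash would come from the ONNX NeuralHash model linked above):

```python
# Hedged sketch: matching a perceptual hash against a known-bad list.
# The hashes below are arbitrary demo values; real NeuralHash output
# is produced by running the ONNX model, which is out of scope here.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_blocklist(image_hash: int, blocklist: set,
                      max_distance: int = 4) -> bool:
    """True if the hash is within `max_distance` bits of a known-bad hash."""
    return any(hamming_distance(image_hash, bad) <= max_distance
               for bad in blocklist)


# Demo with made-up hash values.
blocklist = {0b1011_0110_0001}
print(matches_blocklist(0b1011_0110_0011, blocklist))  # 1 bit off → True
print(matches_blocklist(0b0000_0000_0000, blocklist))  # 6 bits off → False
```

The tolerance exists because perceptual hashes of near-identical images differ by a few bits; exact equality would be trivially defeated by re-encoding the image.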
IANAL, but as far as I know you don’t have to proactively remove illegal content, just the stuff you were made aware of.
So all this drama about federating illegal content is very much overblown.
Edit: sorry about calling it “drama”, I didn’t know the full extent of what’s currently happening. (malicious users spamming CP)
Speaking as an instance owner who has had to become very knowledgeable about this in the last few hours because of bad actors on Lemmy: this is absolutely not true.
I’m curious: what are the legal duties of a fediverse host regarding illegal content currently? Do you really have to remove illegal content proactively? Because as far as I know, that’s only the case in the EU, and only if you are one of the major digital services (which fediverse server hosts aren’t).
@toothbrush @scrubbles A webboard moderator was arrested in Thailand because she didn’t remove possibly illegal messages soon enough to satisfy the police. Meanwhile, I have never heard of any Facebook officers being arrested.
https://www.hrw.org/news/2012/04/23/thailand-internet-provider-faces-lese-majeste-conviction
I’m pretty sure the mods and admins of lemmyshitpost, one of the most popular communities in the Lemmy tiddeRverse, are fully aware of the illegal content being reported to them, so I’m not entirely confident the ‘proactive removal’ point is relevant in this situation.
If 10 volunteers can’t keep up with it, most of whom have now quit, I find it really hard to see this as “drama” personally. I see it as a serious issue with real-life consequences for both the instance owner (risk of being raided) and the moderators subjected to reviewing it.
I suspect you wouldn’t describe it as overblown if you were in the same situation as the mods. When I sift through the modlog, there are regularly some seriously vile takes in there, plus spam posts and abuse removed by these volunteers on a daily basis, all to keep our feeds clean. Add traumatic content on top of that, and it’s no surprise some mods have left and they’ve shuttered the comm.
Apologies if I come off as abrasive in this comment, but I just vehemently disagree with the take that this is some “overblown drama”.
Ah sorry, I didn’t know there was an attack going on currently; I just saw a bunch of posts about Lemmy being illegal to operate because of the risk of CP federation, and then this post, which seemed to imply that one needs constant automated filtering of illegal content. As far as I know that isn’t required by law unless you operate a major service reachable in the EU, and fediverse servers aren’t major enough for that.
Yeah, on top of that, it sounds like the people who did see it are pretty shaken up; apparently it was really fucked up. So the goal is not only blocking it from ever hitting the servers for legal reasons, but also making sure no one has to see it. There are third-party tools that will analyze it and block it automatically, and we’re hoping to get those online quickly.
Yea, just leave the CSAM till someone reports it, great solution!
Well, that’s how it has generally worked, as far as I know. I’m not saying you can host illegal stuff as long as no one reports it. I’m saying it’s impossible to know instantly when someone posts something illegal to your server; you’d have to see it first. Otherwise pretty much the entire internet would be illegal, because any user can upload something illegal at any time, and you’d instantly be guilty of hosting illegal content? I doubt it.