I posted the other day that you can clean CSAM out of your object storage using my AI-based tool. Many people expressed the wish to use it on their local filesystem-based pict-rs, so I’ve just extended its functionality to allow exactly that.
The new lemmy_safety_local_storage.py will go through your pict-rs volume in the filesystem, scan each image for CSAM, and delete anything it flags. The requirements are:
- A Linux account with read-write access to the volume files
- Private key authentication for that account
As my main instance is using object storage, my testing is limited to my dev instance, and there it all looks OK to me. But do run it with --dry_run if you’re worried. You can delete lemmy_safety.db and rerun to enforce the deletions afterwards (a method to utilize the --dry_run results is coming soon).
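For anyone curious how the scan-and-delete flow hangs together, here is a minimal sketch of that kind of loop, not the actual implementation: `scan_volume` and `is_flagged` are hypothetical names, `is_flagged` stands in for the AI classifier, and the checkpointing in `lemmy_safety.db` is simplified down to a single table of seen paths.

```python
import os
import sqlite3


def scan_volume(root, is_flagged, db_path="lemmy_safety.db", dry_run=False):
    """Walk a pict-rs volume, classify each file, and delete flagged ones.

    `is_flagged` is a placeholder for the AI classifier. Paths are recorded
    in a small SQLite database so a rerun skips files already examined;
    deleting that database forces a full rescan.
    """
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS scanned (path TEXT PRIMARY KEY)")
    flagged = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            # Skip files already recorded from a previous run
            if db.execute(
                "SELECT 1 FROM scanned WHERE path = ?", (path,)
            ).fetchone():
                continue
            if is_flagged(path):
                if dry_run:
                    print(f"[dry run] would delete {path}")
                else:
                    os.remove(path)
                flagged.append(path)
            db.execute("INSERT INTO scanned VALUES (?)", (path,))
    db.commit()
    db.close()
    return flagged
```

Note that in this sketch a dry run still records every file as scanned, which is why deleting the database and rerunning without the flag is what actually enforces the deletions.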
PS: if you were using the object storage cleanup, that script has been renamed to lemmy_safety_object_storage.py.
Oh come on. Being ND doesn’t mean your mind jumps to sharing child porn. That’s a fuckin cop out.
You’re focusing on “the mind jumps,” but that’s exactly the point I was addressing. I meant that being ND can create a desire for clarity in communication: a direct or terse style of argument.
Speaking from experience as an ND person, I think the “strong sense of justice” comes into play here as well. In this situation, not wanting to “let someone get away with” publicly asking for CSAM by pushing for a clarification that they aren’t. I get where Bamboo was coming from, but they stated in a response further down that they understood BitOneZero’s comment. I also understand BitOneZero being upset at the implication given the context.
Oh, gosh. You are right. I forgot to empathize with the other commenter I was arguing with. And BitOneZero as well. Thank you, I’ve learned something.