I posted the other day that you can use my AI-based tool to clean CSAM out of your object storage. Many people asked to use it on pict-rs instances backed by local file storage instead, so I’ve just extended its functionality to allow exactly that.

The new lemmy_safety_local_storage.py will go through your pict-rs volume on the filesystem, scan each image for CSAM, and delete any matches. The requirements are:

  • A linux account with read-write access to the volume files
  • Private-key authentication for that account

As my main instance is using object storage, my testing is limited to my dev instance, and there it all looks OK to me. But do run it with --dry_run first if you’re worried. You can then delete lemmy_safety.db and rerun to perform the deletions (a method to reuse the --dry_run results is coming soon).

PS: if you were using the object storage cleanup, that script has been renamed to lemmy_safety_object_storage.py
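To make the workflow above concrete, here is a hedged sketch (not the tool’s actual code) of what a local-storage pass like lemmy_safety_local_storage.py has to do: walk the pict-rs volume, classify each file, and only delete when the dry-run flag is off. The `is_flagged` callback stands in for the AI-based check.

```python
import os
from typing import Callable


def scan_volume(root: str, is_flagged: Callable[[str], bool],
                dry_run: bool = True) -> list[str]:
    """Return the flagged paths; delete them only when dry_run is False."""
    flagged = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if is_flagged(path):  # placeholder for the AI-based image check
                flagged.append(path)
                if not dry_run:
                    os.remove(path)
    return flagged
```

The two-pass shape mirrors the workflow described above: a --dry_run pass that only reports, then a second pass that actually removes files.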

  • andrew@lemmy.stuart.fun · 1 year ago
    Because of the way pict-rs organizes photos, which I believe is by hash (it could be a random ID, but I suspect not), you should be able to share filenames for cleanup by neighbors without having to share the contents.

    Even if it’s not organized that way automatically, though, you can pretty easily use sha256sum to get a shareable hash before deleting the content.
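A sketch of that "hash before delete" idea: record a shareable SHA-256 digest of each file (equivalent to sha256sum’s output) so the hash can be exchanged without ever sharing the content itself.

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()
```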

    • BitOneZero @ .world@lemmy.world · 1 year ago
      I think file timestamps would be one of the easier things to use to track back to the postings and comments that reference the upload… and ideally to the logged-in account (in a standard Lemmy install, only logged-in users can upload to pict-rs).
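An illustrative sketch of the timestamp idea: list files whose modification time falls inside a suspect window, as a starting point for tracing the matching posts and comments. The function name and time window are hypothetical, not part of Lemmy or pict-rs.

```python
import os


def files_modified_between(root: str, start: float, end: float) -> list[str]:
    """Return paths under root whose mtime lies in [start, end] (Unix time)."""
    hits = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if start <= os.path.getmtime(path) <= end:
                hits.append(path)
    return hits
```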

      • Norah - She/They · 1 year ago
        I think both are equally important. There’s potential for a centralised database of the metadata involved here. There is likely just one person, or a very small group, committing this current attack. It’s very likely a troll who has gotten a small handful of images from some dreadful shock site, rather than someone who knows how to access this content more broadly. Using the hashes, it might be possible to entirely block new accounts from reposting those images.