Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do to stop it: since we changed our registration policy, the attackers simply post from other instances.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with ten moderators. And if it wasn’t his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: Removed the bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

  • Striker
    330 points · 10 months ago

    I would like to extend my sincerest apologies to all of the users here who liked Lemmy Shitpost. I feel like I let the situation grow too far out of control before getting help. Don’t worry, I am not quitting; I fully intend to stay around. The other two mods deserted the community, but I won’t. DM me if you wish to apply for mod.

    Sincerest thanks to the admin team for dealing with this situation. I wish I had linked up with you all earlier.

    • @lwadmin@lemmy.world (OP)
      209 points · 10 months ago

      @Striker@lemmy.world this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this cannot be stopped. Lemmy needs better moderation tools.

      • Rob T Firefly
        53 points · 10 months ago

        Hopefully the devs will take the lesson from this incident and put some better tools together.

        • Whitehat Hacker
          50 points · 10 months ago

          There’s a Matrix room for building mod tools here; maybe we should bring this issue up there, just in case they aren’t already aware.

          • @Bread@sh.itjust.works
            34 points · 10 months ago

            It’s not easy to build a social media app, and forking it won’t make this particular problem any easier to solve. Joining forces to tackle an inevitable problem is the only solution. The Lemmy devs are more than willing to accept pull requests for software improvements.

          • @x1gma@lemmy.world
            30 points · 10 months ago

            And who’s gonna maintain the fork? Even fewer developers from a split community? You have absolutely no idea what you’re talking about.

    • gabe [he/him]
      74 points · 10 months ago

      Please, please, please do not blame yourself for this. This is not your fault. You did what you were supposed to do as a mod and stepped up and asked for help when you needed to; Lemmy just needs better tools. Please take care of yourself.

    • Dandroid
      31 points · 10 months ago

      This isn’t your fault. Thank you for all you have done in regards to this situation thus far.

    • Draconic NEO
      24 points · edited · 10 months ago

      It’s not your fault; these people attacked, and we don’t have the proper moderation tools to defend ourselves yet. Hopefully that will change in the future. As it stands, you did the best that you could.

    • @Mr_Blott@feddit.uk
      20 points · 10 months ago

      Definitely not your fault, mate. You did what anyone would do; it’s a new community and shit happens.

    • Flying Squid
      15 points · 10 months ago

      I love your community and I know it is hard for you to handle this but it isn’t your fault! I hope no one here blames you because it’s 100% the fault of these sick freaks posting CSAM.

    • Nerd02
      12 points · 10 months ago

      You don’t have to apologize for having done your job. You did everything right and we appreciate it a lot. I’ve spent the whole day trying to remove this shit from my own instance and understanding how purges, removals and pictrs work. I feel you, my man. The only ones at fault here are the sickos who shared that stuff, you keep holding on.

    • Alien Nathan Edward
      10 points · 10 months ago

      You didn’t do anything wrong, this isn’t your fault and we’re grateful for the effort. These monsters will be slain, and we will get our community back.

    • @MsPenguinette@lemmy.world
      6 points · 10 months ago

      You do a great job. I’ve reported quite a few shitheads there and it gets handled well and quickly. You have no way of knowing whether some roach is gonna die after getting squashed or keep coming back.

    • GONADS125
      5 points · 10 months ago

      You’ve already had to take all that on, don’t add self-blame on top of it. This wasn’t your fault and no reasonable person would blame you. I really feel for what you and the admins have had to endure.

      Don’t hesitate to reach out for support or speak to a mental health professional if you’ve picked up trauma from the shit you’ve had to see. There’s no shame in getting help.

    • @Becoming@lemmy.world
      3 points · 10 months ago

      As so many others have said, there’s no need for an apology. Thank you for all of the work that you have been doing!

      The fact that you are staying on as mod speaks to your character and commitment to the community.

    • ivanafterall
      192 points · 10 months ago

      This isn’t as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.

        • BitOneZero @ .world
          44 points · 10 months ago

          Yeah, what do people think the FBI is for… this isn’t crazy. They can get access to ISP logs, VPN provider logs, etc.

          • deweydecibel
            10 points · 10 months ago

            I think what they’re saying is that contacting the FBI may seem daunting to someone who has never dealt with something like this before, but that they don’t need to worry about it. Just contact them.

          • Aki
            3 points · 10 months ago

            Under US jurisdiction, yeah. It could be slightly more difficult depending on the country; LEGAT can’t conduct unilateral operations, so they’ll have to cooperate with foreign authorities. These assholes can get away with exploiting jurisdictional boundaries. Hopefully they will be caught, but oh well.

    • Ertebolle
      113 points · 10 months ago

      This is good advice; I suspect they’re outside of the FBI’s jurisdiction, but they could also be random idiots, in which case they’re random idiots who are about to become registered sex offenders.

          • @jarfil@lemmy.world
            8 points · 10 months ago

            I wish the American government went after minor CSAM cases as hard as it goes after copyright/IP violations.

            Easy: claim copyright/IP on the CSAM… uh, no, wait…

            • @jarfil@lemmy.world
              10 points · edited · 10 months ago

              There is no CP and no porn in Japan… add some tiny censor bars, and it’s just some wholesome family tentacle fun!

              That one backfired spectacularly.

        • @Resonosity@lemmy.ca
          2 points · 10 months ago

          Yeah, there was even that case where a citizen and resident of Mexico was arrested and detained in the US for breaking US law, even though it technically didn’t apply to them since they were under Mexican sovereignty… Borders mean little to the US.

      • The Picard Maneuver
        45 points · 10 months ago

        They might be, but I’d imagine most countries have laws on the books about this sort of stuff too.

        • @droans@lemmy.world
          19 points · 10 months ago

          And it’s something that nations usually have no issue cooperating on.

          The FBI has assisted in a lot of global raids related to CSAM.

          • @assassin_aragorn@lemmy.world
            4 points · 10 months ago

            There are few situations where pretty much everyone universally agrees to work together. This is one of those situations. Across cultures and nations, pedos are seen as some of the most vile people alive.

        • @jarfil@lemmy.world
          6 points · edited · 10 months ago

          Wait, is this like China having police offices in other countries?

          I knew the US collects taxes on their citizens no matter where they live, but isn’t this kind of excessive? Wasn’t INTERPOL supposed to take care of international crime?

          • @dylanTheDeveloper@lemmy.world
            3 points · edited · 10 months ago

            For more than eight decades, the FBI has stationed special agents and other personnel overseas. We help protect Americans back home by building relationships with principal law enforcement, intelligence, and security services around the globe.

            It is similar to China’s international police presence, but keep in mind quite a few other countries have a similar setup.

            • @jarfil@lemmy.world
              1 point · 10 months ago

              I’m just surprised that it’s FBI personnel, I thought the CIA was in charge of international affairs, with INTERPOL acting as liaison for the FBI with other countries.

              IIRC in the EU we have EUROPOL acting as liaison between the national law enforcement branches, and while there is nothing stopping personnel from one country from entering another, I don’t think they do. But maybe that’s more like state vs. federal jurisdiction in the US. On the other hand, it’s been some time since I’ve looked deeper into it, and things keep changing.

      • @CantSt0pPoppin@lemmy.world
        23 points · 10 months ago

        I have to wonder if Interpol could help with issues like this. I know there are agencies that work together globally to help protect missing and exploited children.

        • GeekFTW
          30 points · 10 months ago

          ‘Criminal activity should be reported to your local or national police. INTERPOL does not carry out investigations or arrest people; this is the responsibility of national police.’

          From their website.

            • GeekFTW
              1 point · edited · 10 months ago

              “Interpol provides investigative support, expertise and training to law enforcement worldwide, focusing on three major areas of transnational crime: terrorism, cybercrime and organized crime. Its broad mandate covers virtually every kind of crime, including crimes against humanity, child pornography, drug trafficking and production, political corruption, intellectual property infringement, as well as white-collar crime. The agency also facilitates cooperation among national law enforcement institutions through criminal databases and communications networks. Contrary to popular belief, Interpol is itself not a law enforcement agency.”
              https://en.wikipedia.org/wiki/Interpol

        • @TheTimeKnife@lemmy.world
          19 points · 10 months ago

          The FBI reports it to Interpol, I believe; Interpol is more of an international warrant system built from treaties.

        • Ab_intra
          7 points · 10 months ago

          FBI would be great in this case tbh. They have the resources.

      • synae[he/him]
        6 points · 10 months ago

        Perhaps most importantly, it establishes that the mods/admins/etc of the community are not complicit in dissemination of the material. If anyone (isp, cloud provider, law enforcement, etc) tries to shut them down for it, they can point to their active and prudent engagement of proper authorities.

      • @Railing5132@lemmy.world
        5 points · 10 months ago

        More importantly, and germane to our conversation, the FBI has the contacts and motivation to work with their international partners wherever the data leads.

  • @CantSt0pPoppin@lemmy.world
    167 points · 10 months ago

    This is seriously sad, and it’s awful that people would go this far to derail a community. It makes me concerned for other communities as well. Since they have succeeded in getting Lemmyshitpost closed, does this mean they will just move on to the next community? That said, here is some very useful information on the subject and on what can be done to help curb CSAM.

    The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.

    The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.

    The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.

    Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.

    Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.

    Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.

    Here are some tips to prevent CSAM:

    Talk to your children about online safety and the dangers of CSAM.

    Teach your children about the importance of keeping their personal information private.

    Monitor your children’s online activity.

    Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.

    Report any suspected CSAM to the authorities immediately.

  • @dragontamer@lemmy.world
    147 points · 10 months ago

    Not that I’m familiar with Rust at all, but… perhaps we need to talk about this.

    The only thing that could have prevented this is better moderation tools. And while a lot of the instance admins have been asking for this, it doesn’t seem to be on the developers roadmap for the time being. There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn’t inspire much faith for the future of Lemmy.

    Let’s be productive. What exactly are the moderation features needed, and which would be easiest to implement in the Lemmy source code? Are you talking about a mass ban of users from specific instances? A ban on new accounts from certain instances? Like, what moderation tool exactly is needed here?

    • Agamemnon
      118 points · 10 months ago

      Speculating:

      Restricting posting from accounts that don’t meet some adjustable criteria, like account age, comment count, prior moderation actions, or average comment length (an upvote quota maybe not, because not all instances use votes). A rough sketch of such a gate is below.

      Automatic hash comparison of uploaded images against a database of known illegal content.
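
      Purely as an illustration of the first idea, here is a minimal sketch in Python (the thresholds are made up and would be admin-adjustable; a real implementation would live in Lemmy’s Rust codebase):

      ```python
      from dataclasses import dataclass
      from datetime import datetime, timedelta, timezone

      @dataclass
      class Account:
          created_at: datetime      # assumed to be timezone-aware
          comment_count: int
          prior_mod_actions: int

      # Hypothetical thresholds; an instance admin would tune these.
      MIN_ACCOUNT_AGE = timedelta(days=7)
      MIN_COMMENTS = 10

      def may_post_media(account: Account) -> bool:
          """Gate image/video posts behind simple account-trust criteria."""
          now = datetime.now(timezone.utc)
          old_enough = now - account.created_at >= MIN_ACCOUNT_AGE
          active_enough = account.comment_count >= MIN_COMMENTS
          clean_record = account.prior_mod_actions == 0
          return old_enough and active_enough and clean_record
      ```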

      • @dragontamer@lemmy.world
        64 points · 10 months ago

        On various old-school forums, there’s a simple (and automated) system of trust that new users progress through (since any new user might be a spammer): every new user’s posts might need a manual “approve post” before they show up. (This existed on Reddit in some communities too.)

        Full powers are then granted to the user eventually (or, in the case of Stack Overflow, automated access to the moderation queue).

      • Mossy Feathers (They/Them)
        11 points · 10 months ago

        What are the chances of a hash collision in this instance? I know accidental hash collisions are usually super rare, but with enough people it’d probably still happen every now and then, especially if the system is designed to detect images similar to the original illegal image (to catch any minor edits).

        Is there a way to use multiple hashes from different sources to help reduce collisions? For example, checking both the MD5 and SHA-256 hashes instead of just one or the other, and only flagging the image if both match within a certain degree.
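
        For what it’s worth, a tiny sketch of that double-check (the block lists are hypothetical, and, as the reply below points out, MD5/SHA-256 only catch byte-identical files, so there is no “degree” of match with these):

        ```python
        import hashlib

        def is_known_bad(data: bytes, known_md5: set[str], known_sha256: set[str]) -> bool:
            """Flag a file only if BOTH digests appear on the respective block lists."""
            md5_hex = hashlib.md5(data).hexdigest()
            sha256_hex = hashlib.sha256(data).hexdigest()
            return md5_hex in known_md5 and sha256_hex in known_sha256
        ```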

        • @TsarVul@lemmy.world
          26 points · 10 months ago

          Traditional hashes like MD5 and SHA-256 are not locality-sensitive, so they can’t be used to detect a match “to a certain degree”. Otherwise, yes, you are correct: perceptual hashes can create false positives. Very unlikely, but possible. This is not a problem with a perfect solution; extraordinary edge cases must be resolved on a case-by-case basis.

          And yes, the simplest solutions must always be implemented first: tracking post reputation, a captcha before posting, waiting for an account to mature before it can post, etc. The problem is that right now the only defense we have is mods. Mods are people, usually with eyeballs. Eyeballs which will be poisoned by CSAM so the rest of us can post memes and funnies without issue. This is not fair to them. We must do all we can, and if all we can do includes perceptual hashing, we have a moral obligation to do so.
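
          To make “locality-sensitive” concrete, here is a rough sketch of one simple perceptual hash (a difference hash) plus a Hamming-distance comparison. It is only an illustration, assuming the Pillow library; production systems use vetted services such as PhotoDNA or CSAI Match, not home-grown hashes.

          ```python
          from PIL import Image  # assumption: Pillow is installed

          def dhash(path: str, hash_size: int = 8) -> int:
              """Difference hash: similar-looking images give hashes that differ in few bits."""
              img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
              px = list(img.getdata())
              bits = 0
              for row in range(hash_size):
                  for col in range(hash_size):
                      left = px[row * (hash_size + 1) + col]
                      right = px[row * (hash_size + 1) + col + 1]
                      bits = (bits << 1) | (left > right)
              return bits

          def hamming(a: int, b: int) -> int:
              """Number of bits in which two hashes differ."""
              return bin(a ^ b).count("1")

          # Two images "probably match" if their 64-bit hashes differ in only a few bits;
          # the threshold is a policy choice and the source of the false positives above.
          def probably_match(a: int, b: int, threshold: int = 10) -> bool:
              return hamming(a, b) <= threshold
          ```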

          • Mossy Feathers (They/Them)
            5 points · edited · 10 months ago

            Something I thought about that might be helpful is if mods had the ability to add a post delay on a community basis. Basically, the delay would be moderator adjustable, but only moderators and admins would be able to see the post for X number of minutes after being posted. It’d help for situations like ongoing attacks where you don’t necessarily want to have to manually approve posts, but you want a chance to catch any garbage before the post goes public.

            Edit: and yeah, one of the reasons I’m aware that perceptual hashes can have collisions is because a number of image viewers/cataloguing tools, like XnView MP or Hydrus Network, use hash collisions to help identify duplicate images. However, I’ve seen collisions between unrelated images when lowering the sensitivity, which is why I was wondering whether multiple hashing algorithms could reduce false positives without sacrificing the usefulness of the system.
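
            The hold-for-review delay described above is also tiny to express (illustrative only, not Lemmy code):

            ```python
            from datetime import datetime, timedelta, timezone

            def publicly_visible(posted_at: datetime, delay_minutes: int, viewer_is_mod: bool) -> bool:
                """Hide a new post from regular users until a mod-adjustable delay has elapsed."""
                if viewer_is_mod:
                    return True  # mods and admins see it immediately so they can screen it
                return datetime.now(timezone.utc) - posted_at >= timedelta(minutes=delay_minutes)
            ```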

    • @TsarVul@lemmy.world
      41 points · 10 months ago

      I guess it’d be a matter of incorporating something that hashes whatever is being uploaded, then checks that hash against a database of known CSAM. If it matches: stop the upload, ban the user, and report to the nearest officer of the law. Reddit uses PhotoDNA and CSAI-Match. This is not a simple task.
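
      Schematically, that flow might look like the sketch below. Exact SHA-256 matching against an assumed block list stands in for the perceptual matching PhotoDNA/CSAI-Match actually do, and the follow-up actions are placeholders rather than real Lemmy APIs:

      ```python
      import hashlib

      # Assumption: digests of known illegal files, obtained from a trusted clearing house.
      KNOWN_BAD_SHA256: set[str] = set()

      def handle_upload(image_bytes: bytes, uploader_id: int) -> bool:
          """Return True if the upload may proceed, False if it was blocked."""
          digest = hashlib.sha256(image_bytes).hexdigest()
          if digest in KNOWN_BAD_SHA256:
              # In a real system: reject the file, ban the user, and file a report
              # with NCMEC / law enforcement. Here we only log the decision.
              print(f"blocked upload from user {uploader_id}: hash on block list")
              return False
          return True
      ```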

      • @diffuselight@lemmy.world
        26 points · 10 months ago

        None of that really works anymore in the age of AI inpainting. Hashes and perceptual matching worked well before, but the people doing this are specifically interested in causing destruction and chaos with this content; they don’t need it to be authentic to do that.

        It’s a problem that requires AI on the defensive side, but even that is just going to be an eternal arms race. This problem cannot be solved with technology, only mitigated.

        The ability to exchange hashes on moderation actions against content may offer a way out, but it will change the decentralized nature of everything, basically bringing us back to the early days of Usenet, the Usenet Death Penalty, etc.

        • @dragontamer@lemmy.world
          50 points · edited · 10 months ago

          Not true.

          A simple CAPTCHA got rid of a huge set of idiotic script kiddies. CSAM being what it is, it could (and should) result in an immediate IP ban. So if you’re “dumb” enough to try to upload a well-known CSAM hash, then you absolutely deserve the harshest immediate ban, automatically.

          You’re pretty much like the economist in the story who refuses to believe a $20 bill can be lying on the sidewalk: “Oh, but if that $20 really existed on the sidewalk there, it would have been arbitraged away already.” Well, guess what? Human nature ain’t economic theory. Human nature ain’t cybersecurity.

          Idiots will do dumb, easy attacks because they’re dumb and easy. We need to defend against the dumb-and-easy attacks before spending more time working on the harder, rarer attacks.

          • @rolaulten@startrek.website
            3 points · 10 months ago

            I’m sorry, but you don’t want to use permanent IP bans. Most residential circuits use DHCP, meaning a ban by IP only has a short-term positive effect.

            That said, automatic scanning for known hashes, and automatic reporting to the relevant authorities with the relevant details, should be doable (provided there is a database somewhere; I honestly have never looked).
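
            A time-limited ban is easy to sketch (illustrative; a real instance would persist this rather than keep it in memory):

            ```python
            from datetime import datetime, timedelta, timezone

            ip_bans: dict[str, datetime] = {}  # IP address -> ban expiry

            def ban_ip(ip: str, duration: timedelta = timedelta(days=30)) -> None:
                """Ban an IP for a limited time, since residential addresses rotate via DHCP."""
                ip_bans[ip] = datetime.now(timezone.utc) + duration

            def is_banned(ip: str) -> bool:
                expiry = ip_bans.get(ip)
                if expiry is None:
                    return False
                if datetime.now(timezone.utc) >= expiry:
                    del ip_bans[ip]  # ban lapsed; the address has likely been reassigned by now
                    return False
                return True
            ```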

        • @TsarVul@lemmy.world
          21 points · 10 months ago

          Good question. Yes, and artefacts from compression can also throw it off. However, hash comparison returns a percentage match; if the match is good enough, it is CSAM: go ahead and ban. There is a bigger issue for the developers of Lemmy, though, I assume. It is a philosophical clusterfuck: if we elect to use PhotoDNA and CSAI Match, Lemmy is now at the whims of Microsoft and Google respectively.

            • @Serinus@lemmy.world
              11 points · edited · 10 months ago

              The bigger thing is that the hash-detection providers don’t want to give access to just anyone, and just anyone can run a Lemmy instance. The concern is that you’d effectively be giving the CSAM people a way to know whether they’ll be detected.

              Perhaps they can allow some of the biggest Lemmy instances to use the tech, but I wouldn’t expect it to be available to everyone.

          • @what_is_a_name@lemmy.world
            8 points · 10 months ago

            Mod tools don’t have to be part of Lemmy itself. Give admins and mods an option, even a paid one. Hell, the admins of Lemmy.world could have us donate extra to cover the cost of API services.

            • @TsarVul@lemmy.world
              16 points · 10 months ago

              I agree. Perhaps what the Lemmy developers can do is add a slot for generic middleware in front of whatever POST endpoint the Lemmy API uses for uploading content. That way, the owner of an instance can plug in whatever CSAM-scanning middleware they want, and we are not dependent on the Lemmy developers for a solution to the pedo problem.
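
              Conceptually, such a slot is just a chain of pluggable checks that run before the upload handler stores anything. A Python-flavoured sketch of the idea (the real thing would be Rust middleware in Lemmy’s API layer, and all names here are made up):

              ```python
              from typing import Callable

              # An upload scanner takes the raw bytes and returns True to allow, False to block.
              UploadScanner = Callable[[bytes], bool]
              scanners: list[UploadScanner] = []

              def register_scanner(scanner: UploadScanner) -> None:
                  """Instance owners plug in whatever third-party or home-grown checks they want."""
                  scanners.append(scanner)

              def upload_allowed(image_bytes: bytes) -> bool:
                  """Run every registered scanner before the upload endpoint stores the file."""
                  return all(scanner(image_bytes) for scanner in scanners)

              # Example wiring: a placeholder policy, standing in for a real CSAM-scanning service.
              register_scanner(lambda data: len(data) > 0)
              ```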

        • @Nollij@sopuli.xyz
          17 points · 10 months ago

          If they hash the file’s binary data, like CRC32 or SHA, yes. But there are other hash types out there which are more like “fingerprints” of an image. Think of how Shazam or SoundHound can recognize a song playing despite the extra wind, static, etc. There are similar algorithms for images and videos.

          No idea how difficult those are to implement, though.

          • @Railcar8095@lemm.ee
            7 points · 10 months ago

            There are FOSS applications that can do that (Czkawka, for example). What I’m not sure of is whether the specific algorithm used is available and, more importantly, whether the CSAM hashes are available to general audiences. I would assume that if they were, any attacker could check first and make just enough changes to slip past detection.

        • Alien Nathan Edward
          5 points · 10 months ago

          One bit, in fact. Luckily there are other ways of comparing images without actually showing them to human eyes that allow you to calculate a percentage of similarity.
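
          For example, with 64-bit perceptual hashes the “percentage of similarity” can be read straight off the bit difference (a sketch):

          ```python
          def similarity_percent(hash_a: int, hash_b: int, bits: int = 64) -> float:
              """Turn the bit-difference between two perceptual hashes into a similarity score."""
              distance = bin(hash_a ^ hash_b).count("1")
              return 100.0 * (bits - distance) / bits

          # e.g. hashes differing in 6 of 64 bits -> 90.625% similar
          ```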

    • @Serinus@lemmy.world
      13 points · edited · 10 months ago

      The best feature the current Lemmy devs could work on is making the process to onboard new devs smoother. We shouldn’t expect anything more than that for the near future.

      I haven’t actually tried cloning and compiling, so if anyone has comments here they’re more than welcome.

    • gabe [he/him]
      11 points · 10 months ago

      I think having a means for admins to view uploaded images would be helpful, as well as disabling external image caching. Something like an “uploads” gallery for admins to review that could potentially hook into PhotoDNA/CSAI-Match or whatever.

    • @MrPoopyButthole@lemmy.world
      10 points · 10 months ago

      I think it would be an AI autoscan that flags some posts for mod approval before they show up to the public, and perhaps more fine-grained controls over how media is posted, for instance only allowing certain image-hosting sites and no directly uploaded images.
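
      The allowlist part of that is simple to sketch (the example hosts are placeholders, not a recommendation):

      ```python
      from urllib.parse import urlparse

      ALLOWED_IMAGE_HOSTS = {"i.imgur.com", "files.catbox.moe"}  # hypothetical instance policy

      def media_url_allowed(url: str) -> bool:
          """Reject direct uploads (no URL) and any host not on the allowlist."""
          host = (urlparse(url).hostname or "").lower()
          return host in ALLOWED_IMAGE_HOSTS
      ```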

    • @BURN@lemmy.world
      3 points · 10 months ago

      Probably hashing and scanning any uploaded media against some of the known databases of CSAM hashes.

      IIRC that’s how Reddit/FB/Insta/etc. handle it.

        • @CoderKat@lemm.ee
          1 point · 10 months ago

          The sad thing is that all we can usually do is make it harder for attackers, which is absolutely still worth doing, to be clear. But if an attacker wants to cause trouble badly enough, there are always ways around everything. E.g., image detection can be foiled with enough transformation, and account-age limits can be gotten past by a patient attacker. Minimum karma can be botted (easier than ever with AI), and karma is especially easy to bot on Lemmy because you can just spin up an instance with all the bots your heart desires. If posts have to be approved, attackers can even hotlink to innocent images and then change the image after it’s approved.

          Law enforcement can do a lot more than we can, by subpoenaing ISPs or VPNs. But law enforcement is slow and unreliable, so that’s also imperfect.

  • @Pat12@lemmy.world
    147 points · 10 months ago

    There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn’t inspire much faith for the future of Lemmy.

    This doesn’t seem like a respectful comment to make. People have responsibilities; they aren’t paid for this. It doesn’t seem fair to criticize something when we aren’t doing anything to provide a solution. A better comment would be: “There are just two full-time developers on this project and they have other priorities. We are working on increasing the number of full-time developers.”

  • Ghostalmedia
    143 points · 10 months ago

    The amount of people in these comments asking the mods not to cave is bonkers.

    This isn’t Reddit. These are hobbyists without legal teams to a) fend off false allegations or b) comply with laws that they don’t have any deep understanding of.

  • @Poppa_Mo@lemmy.world
    141 points · 10 months ago

    This is flat-out disgusting. It’s extremely questionable that someone has an arsenal of this crap to spread in the first place. I hope they catch charges.

  • @utopianfiat@lemmy.world
    110 points · 10 months ago

    I hope the devs take this seriously as an existential threat to the fediverse. Lemmyshitpost was one of the largest communities on the network, both in AUPH and in subscribers. If taking the community down is the only option here, that’s extremely insufficient, and it bodes death for the platform at the hands of uncontrolled spam.

  • godless
    107 points · 10 months ago

    Fucking bastards. I don’t even know what beef they have with the community and why, but using THAT method to get them to shut down is nothing short of despicable. What absolute scum.

  • Margot Robbie
    77 points · 10 months ago

    We have been fighting the CSAM (Child Sexual Assault Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.

    It’s likely that we’ll see a large number of instances switch to whitelist-based federation instead of the current blacklist-based approach, especially for niche instances that do not want to deal with this at all (and I don’t blame them).

  • HexesofVexes
    73 points · 10 months ago

    Sounds like the 4chan raids of old.

    Batten down, report the offenders to the authorities, and then clean up the mess!

    Good job so far.

  • @krayj@sh.itjust.works
    73 points · edited · 10 months ago

    How does closing Lemmyshitpost do anything to solve the issue? Isn’t it a foregone conclusion that the offenders will just start targeting other communities, or was there something unique about Lemmyshitpost that made it more susceptible?

    • @Cabrio@lemmy.world
      47 points · 10 months ago

      It stops their instance from hosting CSAM and removes the legal liability of dealing with something they don’t currently have the capacity to handle.

      How would you respond to someone else forcibly loading your PC up with child porn over the Internet? Would you take it offline?

      • @krayj@sh.itjust.works
        34 points · edited · 10 months ago

        How would you respond to someone else forcibly loading your PC up with child porn over the Internet? Would you take it offline?

        But that’s not what happened. They didn’t take the server offline; they banned a community. If some remote person had access to my PC and was loading it up with child porn, I would not expect deleting a folder to fix the problem. So I don’t understand what your analogy is trying to accomplish, because it’s faulty.

        Also, I think you’re reading my question as some kind of disapproval. It isn’t. If closing a community solves the problem, then I fully support the admin team’s actions.

        I’m just questioning whether that really solves the problem or not. It was a community created on Lemmy.world, not on some other instance. So if the perpetrators were capable of posting to it, they are capable of posting to any community on lemmy.world. You get that, yeah?

        My question is just a request for clarification: how does shutting down one community stop the perpetrators from posting the same stuff to other communities?

        • Ghostalmedia
          21 points · 10 months ago

          Fact of the matter is that these mods are not lawyers, and even if they were not liable, they would not have the means to fight it in court if someone falsely, or legitimately, claimed they were. They’re hobbyists with day jobs.

          I also mod a few large communities here, and if I’m ever in that boat, I would also jump. I have other shit to do, and I don’t have the time or energy to fight trolls like that.

          If this was Reddit, I’d let all the paid admins, legal, PR, SysOps, engineers and UX folks figure it out. But this isn’t Reddit. It’s all on the hobbyist mods to figure it out. Many are not going to have the energy to put up with it.

          • @krayj@sh.itjust.works
            11 points · 10 months ago

            How does it limit liability when they could continue posting that content to any/every other community on lemmy.world?

            • @Cabrio@lemmy.world
              12 points · edited · 10 months ago

              But it does remove the immediate issue of CSAM coming from Lemmyshitpost, so lemmy.world isn’t hosting that content.

                • @stealthnerd@lemmy.world
                  3 points · 10 months ago

                  They’re taking a whack-a-mole approach for sure but it’s either that or shut the whole instance down. I imagine their hope is that either the bad guys give up/lose interest or that it buys them some time.

                  Either way, it shows they are taking action which ultimately should help limit their liability.

    • Whitehat Hacker
      25 points · 10 months ago

      They also changed account sign-ups to be application-only, so people can’t create accounts without being approved.

    • Ghostalmedia
      18 points · 10 months ago

      It doesn’t solve the bigger moderation problem, but it solves the immediate issue for the mods, who don’t want to go to jail for modding a community hosting CSAM.

      • @krayj@sh.itjust.works
        10 points · 10 months ago

        Doesn’t that send a clear message to the perpetrators that they can get any community shut down just by posting CSAM to it? What makes you or anyone else think that, upon seeing that Lemmyshitpost is gone, the perpetrators will just quit? Was Lemmyshitpost the only community they were able to post in?

        • Ghostalmedia
          21 points · 10 months ago

          Yup. The perpetrators win.

          If you were in the mods’ shoes, would you want to risk going to jail for kiddy porn, risk having your name associated with CSAM online, or drain your personal savings to fight these folks?

          These mods are not protected by a well funded private legal team. This isn’t Reddit.

          • @krayj@sh.itjust.works
            10 points · 10 months ago

            You don’t have to explain how liability works. I get it. What I don’t get is how removing that specific community is going to limit their liability when the perpetrators will just target a different community.

            • Whitehat Hacker
              11 points · 10 months ago

              Sign-ups now require a manually approved application, so there are no more automated sign-ups from them. If they have existing accounts and target another community, it will be closed as well and those accounts banned; but there won’t be a stream of new accounts, because every account going forward has to be manually approved.

            • @ttmrichter@lemmy.world
              2 points · 10 months ago

              One of the ways you avoid liability is you show that you’re actively taking measures to prevent illegal content.

        • @MsPenguinette@lemmy.world
          2 points · 10 months ago

          The perps are taking a big risk as well. Finding and uploading CSAM means being in possession of it. So we can at least take solace in knowing it’s not a tool that just anyone will use to take down a community.

          Uploading it to websites counts as distribution. The authorities will actually care about this. It’s not just some small thing that is technically a crime; it’s big-time crime being used for something petty.

          So while the perps might win in the short term, they are risking their lives using this tactic. I’m not terribly worried about it becoming a common one.

          If anything, if I were the one doing this, I’d be worried that I might be pissing off the wrong group of people. If they keep at it and become a bigger problem, everyone is going to be looking for them. And then that person is going to big-boy prison.

          • @krayj@sh.itjust.works
            1 point · 10 months ago

            That is a great point. I don’t know if the admin team are proactively reporting that activity to law enforcement, but I hope they are.

  • @nogrub@lemmy.world
    64 points · 10 months ago

    Good thing you did it the way you did; nobody should have to look at awful stuff like this. Keep your mind healthy; nobody should have to deal with that.

  • Leraje
    64 points · 10 months ago

    Is it possible to (at least temporarily):

    1. Turn off instance image hosting (disable pictrs)
    2. Disallow image and video posts across all communities
    3. As in Firefish, turn off caching of remote images from other instances.

    whilst longer-term solutions are sought? This would at least ensure the poor mods aren’t exposed to this shit, and an instance could be more confident it isn’t inadvertently hosting CSAM.