Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do, because they just post from another instance now that we have changed our registration policy.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. If it wasn’t his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.

    • Are people having a difficult time reading today? It’s not just you. Maybe it’s this topic and how it intermeshes with technology. Some people seem to think there’s already a technical solution for this (one that works as well as, if not better than, human moderators).

      No, I don’t think you personally are advocating for CSAM to be allowed. I think commenters are getting a little uppity about missing out on their favorite community while the admins deal with content that is:

      • harmful to children
      • damaging to the admin’s psyche
      • damaging to the user’s psyche
      • against the law

      Imagine you owned an instance and had found 100 moderators for your communities. You rest your head on the pillow and go to sleep. You wake up to find that some user has written a script to post CSAM to all your communities, because “fuck you, that’s why”. You get on the line with your moderators and they tell you they’ve been battling it all night, banning people and deleting comments on sight. They tell you they’ve had to turn off a few communities and that some users are complaining. Weeks and months of hard work to get this instance to a healthy place are being tested.

      Then you get an email from your hosting service saying they have reports that your site contains CSAM, which is against their ToS; they give you a day to get it under control before they boot your server or turn it over to the police. Imagine that, in this case, you make the drastic move to simply pull the plug, taking the entire instance offline until you can sort it out.

      Now imagine some users come in and start complaining about how you, dear admin, are killing the fediverse. Personally, I have no sympathy for users who complain about their community or instance being taken offline while the admins deal with real shit.

      • cubedsteaks@lemmy.today

        Maybe it’s this topic and how it intermeshes with technology. Some people seem to think there’s already a technical solution for this (one that works as well as, if not better than, human moderators).

        So 4chan has this problem a lot, but they’re also based in the US, where it’s most definitely illegal, and they IP-ban people, and I think for the most part it works. It did suck, though. I don’t go on there anymore, but in the last few years that I did, if I was on mobile I would often get hit with a region ban, because so many people in that area had been banned that they just decided to block the whole IP region to keep anyone else from posting illegal content.

        Maybe look into IP and region banning to prevent someone from just making new accounts.

        • You’re discussing how to ban people, but banning isn’t the problem.

          The problem is this: in the last hour, 10,000 images were uploaded. Some of those contain CSAM. Now you have 1 hour to find all the CSAM photos (anywhere from 0 to 10,000 of them). In the next hour, another 10,000 images will be uploaded, some of them containing CSAM… That’s roughly three images to review every second, around the clock.

          Unless you have a lot of human moderators, you’re going to use automated tools and get false positives or false negatives.
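
          To be fair, there is one standard automated approach that does work in practice: hash-matching uploads against databases of already-known material (this is what PhotoDNA does). A rough sketch of the idea in PHP; known_hashes.txt is a hypothetical stand-in for a vetted hash database, and note that real systems use perceptual hashes, since an exact SHA-256 match fails the moment an image is re-encoded, and nothing here catches brand-new material:

          // Sketch only: reject uploads whose hash appears on a known-bad list.
          // known_hashes.txt is a hypothetical stand-in for a vetted database.
          $known_hashes = array_flip( file( 'known_hashes.txt', FILE_IGNORE_NEW_LINES ) );

          function is_known_bad( string $path, array $known_hashes ): bool {
              return isset( $known_hashes[ hash_file( 'sha256', $path ) ] );
          }

          if ( is_known_bad( $_FILES['image']['tmp_name'], $known_hashes ) ) {
              // Quarantine and report rather than silently dropping it.
              die( 'nope!' );
          }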

          A site like 4chan banning whole regions isn’t a great example of handling this well. I don’t think I need to explain (but maybe I do) that one person in a region posting CSAM doesn’t mean the entire region posts CSAM. By that logic, you could just block all regions by pulling the site off the internet. Not to mention, does this mean that 4chan allows CSAM for certain regions? “Children can be abused only in these countries.” “I’m sorry, but your country’s laws prevent images of children being abused, so this content is banned.” Yikes.

          Maybe look into IP and region banning to prevent someone from just making new accounts.

          Again, the technical issue isn’t banning. Here’s the code to ban a user at IP 1.2.3.4:

          if ( $_SERVER['REMOTE_ADDR'] === '1.2.3.4' ) { die( 'nope!' ); }

          Here’s the code to ban a user at a specific region (pseudocode):

          $geoip  = new GeoIPDB();
          $region = $geoip->get_region( '1.2.3.4' );
          if ( $region === 'USA' ) { die( 'nope!' ); }

          This isn’t difficult.
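
          (If you want that runnable rather than pseudocode: MaxMind’s GeoIP2 library does the lookup against a local GeoLite2 database. The database path below is an assumption, and note the real ISO country code is 'US', not 'USA'.)

          // Region ban using MaxMind's GeoIP2 reader (composer require geoip2/geoip2).
          require 'vendor/autoload.php';

          use GeoIp2\Database\Reader;

          // Assumes you've downloaded the free GeoLite2 country database locally.
          $reader = new Reader( '/path/to/GeoLite2-Country.mmdb' );
          $record = $reader->country( $_SERVER['REMOTE_ADDR'] );
          if ( $record->country->isoCode === 'US' ) { die( 'nope!' ); }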

          Now, for the code to DETECT CSAM:

          • look for skin-tone tints (taking into account all skin tone colors),
          • look for the quantity of skin in the image (which would make close-ups of arms possible nude detections),
          • detect a person in the photo,
          • determine the person’s age from the photo,
          • don’t flag images of art or of artful nudes, etc…

          …or, you know, this is a lot of work; let’s make the humans detect instead.

          • cubedsteaks@lemmy.today

            Region banning would prevent anyone in the area from posting. I even mentioned that I used to come across bans meant for other people. In 4chan’s case, when they region-ban, it’s possible someone else will be prevented from posting.

            Not to mention, does this mean that 4chan allows CSAM for certain regions?

            No, it’s against their ToS entirely. The rules are readable on their site, and they do enforce them, even though they also enable people to be shitty in other ways.

            Now, if you want to talk about legality in other countries, that’s a different discussion. The internet is open to the WORLD. All I would be comfortable confirming is that it’s definitely illegal in the US, where I am. I’m not gonna get into other countries where it might not be illegal; I don’t know enough about those places to tell you more.

            Basically, a region ban would be similar to just pulling the instance down: blocking whatever region the person was posting from would stop them from posting, as well as from making new local accounts to try to post more.

            When I was downtown where I live and caught a ban that wasn’t meant for me, just because I was in the banned region, I was able to appeal it. In order to appeal, you have to be good with your words, because a person has to sit there, read the appeal, and decide whether or not to unban you. Mine always went through, but I’m also capable of talking things out and smart enough to know how to properly explain myself.

            Other people’s appeals didn’t go through, and I would see them complain about it elsewhere.

            Anyway, you don’t need to condescend to me. I’m not against what you’re saying. I agree with a lot of what you said in other comments.

            • I mentioned this before, but I’m sorry that I didn’t see who I was responding to. I usually respond to ideas on the internet, not people. Today I’ve been responding a lot to the idea that CSAM is easy to fix, that for reasons unknown it just hasn’t been done with lemmy, and that the way it’s being done with lemmy isn’t “the right way”.

              GeoIP databases aren’t perfect, which is another problem entirely. It’s better than pulling the plug on the entire internet, sure, but it has its own problems.

              I was responding “yikes” to the idea of gating CSAM content via GeoIP because I can’t see myself allowing CSAM in some countries just because it’s “legal” there. This is a moral argument I’m making, but I am happy imposing US law as it relates to CSAM being illegal (not US law such as FOSTA/KOSA, etc., which are a different can of worms entirely) on other countries. To put it another way: as an admin, if I got an email saying “actually bro, in country xyz we get to abuse children”, it would not sway me into allowing that content in that country. If someone in that country wants to put up a site for that country, that’s their problem (and if I could intervene and prevent them from doing so, I would).

              • cubedsteaks@lemmy.today

                Today I’ve been responding a lot to the idea that CSAM is easy to fix, that for reasons unknown it just hasn’t been done with lemmy, and that the way it’s being done with lemmy isn’t “the right way”.

                Right, it’s definitely not an easy fix, and Lemmy doesn’t even operate the way other sites do, but today I’m learning that these instances seem to be easily exploitable.

                The reason I mentioned region banning is that it definitely worked. There weren’t people uploading 10,000 images of CSA, because if you tried to, you’d get banned so hard that you’d ruin it for other people posting nearby.

                I was responding “yikes” to the idea of gating CSAM content via GeoIP because I can’t see myself allowing CSAM in some countries just because it’s “legal” there

                I agree. Honestly, if I were in charge in any way, those countries just wouldn’t be allowed access. And that does happen. I used to work for an app where we had people working in the Philippines who couldn’t access the app itself. We had to give them info, and they would relay it to the customers, because their country was blocked from viewing the app in the first place. They’re just straight up not allowed to use it there.

                Like, I’m totally with you. Fuck MAPs, fuck all of ’em. If some archaic country still participates in something that is obviously harmful to people - yeah, impose these laws on them. Tell them to fuck off until they stop this shit.

                And let’s be real: it’s gonna be years before they ever stop.

                • The reason I mentioned region banning is that it definitely worked. There weren’t people uploading 10,000 images of CSA, because if you tried to, you’d get banned so hard that you’d ruin it for other people posting nearby.

                  That’s an interesting point. I hadn’t taken into account that 4chan might use region banning as a way to shame other anons by removing access from their country. It’s an interesting approach, and I guess it’s something lemmy admins could keep in their toolbag. Users would absolutely hate it more than a simple community being taken down, but whatever works, or at least helps decrease the amount of this in existence.

                  • cubedsteaks@lemmy.today

                    Oh right, and it wasn’t country-based. I’m in a large city, and only the downtown region of my city was banned. If I went back home, I could easily get on the site and post.

          • dragontamer@lemmy.world

            Again, the technical issue isn’t banning. Here’s the code to ban a user at IP 1.2.3.4:

            How does an IP ban work when this attack came through a different, legitimate, federated Lemmy server?
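
            (To illustrate: a federated post is delivered by the peer server, so $_SERVER['REMOTE_ADDR'] is the other instance, not the person posting. The nearest equivalent to an IP ban is keying off the actor’s home instance. A sketch only; $activity and $blocked_instances are hypothetical stand-ins for however the server hands you the federated payload and its blocklist:)

            // In ActivityPub, the 'actor' field identifies the poster's home instance.
            // $activity and $blocked_instances are hypothetical stand-ins here.
            $actor = $activity['actor'];                 // e.g. https://example.social/u/spammer
            $host  = parse_url( $actor, PHP_URL_HOST );  // 'example.social'
            if ( in_array( $host, $blocked_instances, true ) ) {
                http_response_code( 403 ); // refuse the delivery
                exit;
            }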

      • utopianfiat@lemmy.world

        I think commenters are getting a little uppity

        What, pray tell, the fuck do you mean by this term specifically?

        • I’ve spent the better part of this morning explaining to people that a community needs to be shut down so that volunteers can clean it up in the time they have available.

          Commenters seem to be pretty upset that something as “drastic” as turning off a community needs to be done. Some have gone so far as to say that the policy of turning off communities in response to CSAM is what will “kill the fediverse”.

          I think the normal response to this is: “Wow, this sucks. Thanks, admins, for doing your best. I understand the community may not come back for a bit; take all the time you need!” Yet what I hear is “it’s the devs’ fault for not putting in the code to block CSAM, and taking a community offline is unacceptable”. I call that “uppity”, but there’s probably a better word for it.

      • Katana314@lemmy.world

        I don’t think the comment above was trying to express dissatisfaction with Lemmy’s hosts for failing to respond. They’re simply stating that the way things are set up, much as we might like it, has serious problems, ones that may end up being considered unsolvable. As you said, we might be heading for an eventual plug pull.

        It’s like pointing out that cars produce fossil fuel exhaust. It sucks, and we’re seeing it as unsustainable, but there’s no convenient alternative yet.

        • Things are set up the way they are because it’s the best way that admins (not just of lemmy instances, but of major sites like reddit and facebook) have found to handle these situations.

          You could take it a step further and give law enforcement their own backdoor to your site, as Facebook has done, but I would not advocate for that solution. We are in a special place on the internet where we can somewhat self-police our own content, assuming we actually do it. The way we do this is the way these admins are currently handling it.

          It may be reasonable to think that sites like reddit and facebook have it all figured out, but all they have is code similar to lemmy’s, plus a bit more money to pay content moderators on trust and safety teams to actually remove this content before users get a chance to see it. The difference between those sites and lemmy is $$$, and that’s not something that’s likely to change anytime soon.