Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to offer your opinion on how we are doing things in a way you don’t agree with, your post will be removed.

==

A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.

I approached the admins of lemmynsfw and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they’re not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.

I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.

  • kardum

    i had no problem distinguishing the models in the community from children.

    maybe it’s more difficult in some cases without looking for the model’s onlyfans link or something similar somewhere in the post, but that’s just human anatomy.

    that’s why the guy at the gas station asks for my ID card, because it is not always super clear. but apparently clear enough for reddit admins and PR people from ad companies.

    i agree playing into the innocent baby aspect is probably not great for sexual morals and i wouldn’t recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in some shape or form.

    i get it, if this instance wants to be sexually pure and removed from evil carnal desires tho. that’s kind of cool too for sure.

    • Ada (OP, admin)

      “i had no problem distinguishing the models in the community from children.”

      You didn’t see the content I saw. Content that was reported as CSAM by someone on this instance, who also thought it was CSAM.

      “maybe it’s more difficult in some cases without looking for the model’s onlyfans link or something similar somewhere in the post, but that’s just human anatomy.”

      Again, a group that is focused on models where that is the only way you can tell they’re not underage is a group that is focused on appealing to people who want underage models. That is a hard no.

      Spin it how you like, but I am not going to be allowing material that is easily mistaken for CSAM.

      • kardum

        I thought about this some more and I can feel a lot more sympathy for your decision now.

        It must be horrible to get a user report about CSAM and then see a picture which, at first glance, really could be CSAM.

        Even if every user report from now until infinity turned out to be wrong, that initial suspicion of CSAM triggered by a false report probably makes moderating a soul-crushing activity.

        It is great if admins from other instances are willing to deal with these horror reports just to give their users a bigger platform, but this service is not something that can be taken for granted.

        I’m sorry for coming across as ignorant; I just hadn’t really considered your perspective.

        • NuMetalAlchemist

          “Even if every user report from now until infinity turned out to be wrong, that initial suspicion of CSAM triggered by a false report probably makes moderating a soul-crushing activity.”

          Then they shouldn’t be doing it. If seeing something that looks even slightly off-putting causes this level of over-reaction, Ada doesn’t need to be moderating a community for marginalized/at-risk people. I myself am a CSA survivor, and seeing my trauma being equated to some legal adults playing pretend is fuckin’ bullshit. Seeing my trauma being equated to drawn pictures is fuckin’ bullshit. My trauma being equated to AI generated shit is fuckin’ bullshit. I’ll tell you one thing, as a CSA kid, one thing I cannot stand is someone making decisions on my behalf. To protect me. Fuck you, I’ll fuckin bite anyone that tries to take away my free agency again.

          • Ada (OP, admin)

            “I myself am a CSA survivor”

            FYI, so am I

            • NuMetalAlchemist

              Cool, welcome to the real world where one size does not fit all. We handle our trauma differently. But I don’t subject others to my hangups. I don’t use it as a cudgel to squash dissent. Your trauma is not your fault, but it is your responsibility, not ours, to deal with.

              • Ada (OP, admin)

                Anyway, we’re done here

                • NuMetalAlchemist

                  AKA you couldn’t think of a response that didn’t make you sound hateful. Look, I don’t have anything against you personally, Ada. We probably agree on 99.9% of shit. But you are definitely not well suited to admin. And now all the trolls on the fediverse know exactly what legal content to spam your inbox with to make you uncomfortable. Emotional moderators make for short-lived communities.

                  • Ada (OP, admin)

                    I’ve been moderating and community building for literal decades. I think I’ll be ok

        • gh0stcassette

          I totally get that and definitely don’t blame Ada for defederating (although I don’t think it’s likely it was actually CSAM, nor that the community it was on is Inherently Problematic; as long as everyone in the posts is 18+, people’s kinks are none of my business).

          The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all. That seems like a design flaw in Lemmy: instance mods have no power to moderate content on off-instance communities, so why would they be notified of reports? That seems like it would clutter mod-logs for no reason and cause unnecessary drama (as happened here). Like if every subreddit post report immediately went to the Site Admins, that would be Terrible.

          Though if Lemmy really is built like this for whatever reason, I would probably have done the same thing. I wouldn’t want to be Subjected to everything that could be reported on an NSFW instance; there’s probably some Heinous Shit that gets posted at least Occasionally, and I wouldn’t want to see all of it either. I just think it’s Really Stupid that lemmy is built this way; we need better moderation tools.

          • Ada (OP, admin)

            “The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all.”

            Reports go to the admins of the instance the reporter is from, to the admins of the instance the reported account is from, and to the admins of the instance hosting the community the post was made in. The report also goes to the moderators of the community that the content was posted to.

            Each instance only gets a single report, however many of those boxes it ticks, and that report can be dealt with by admins or moderators.

            However, the results federate differently based on who takes the action. For example, me removing content from a lemmynsfw community doesn’t federate; it just removes it from my instance. A moderator or an admin from lemmynsfw removing lemmynsfw content, on the other hand, will federate out.
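            To make that routing concrete, here is a minimal sketch of the rules described above. Lemmy itself is written in Rust, so the sketch uses Rust, but every type and function name here is hypothetical and not Lemmy’s actual API; it only models the instance fan-out (the community’s own moderators are notified as well, which the sketch doesn’t track):

            ```rust
            use std::collections::HashSet;

            /// A federated report, identified by the instances involved.
            /// Illustrative only; this is not Lemmy's real data model.
            struct Report {
                reporter_instance: String,
                reported_account_instance: String,
                community_instance: String,
            }

            /// Every instance that fills at least one of the three roles is
            /// notified, but each instance receives only a single report,
            /// however many of those boxes it ticks.
            fn instances_to_notify(report: &Report) -> HashSet<String> {
                let mut targets = HashSet::new();
                targets.insert(report.reporter_instance.clone());
                targets.insert(report.reported_account_instance.clone());
                targets.insert(report.community_instance.clone());
                targets // the HashSet deduplicates overlapping roles
            }

            /// A removal federates out only when the actor belongs to the
            /// instance hosting the community; anyone else removes locally.
            fn removal_federates(actor_instance: &str, community_instance: &str) -> bool {
                actor_instance == community_instance
            }

            fn main() {
                let report = Report {
                    reporter_instance: "lemmy.blahaj.zone".into(),
                    reported_account_instance: "lemmynsfw.com".into(),
                    community_instance: "lemmynsfw.com".into(),
                };

                // lemmynsfw.com fills two roles but still gets one report.
                for instance in instances_to_notify(&report) {
                    println!("notify admins of: {instance}");
                }

                // A blahaj admin removing lemmynsfw content only hides it locally...
                assert!(!removal_federates("lemmy.blahaj.zone", "lemmynsfw.com"));
                // ...while a lemmynsfw mod or admin removing it federates out.
                assert!(removal_federates("lemmynsfw.com", "lemmynsfw.com"));
            }
            ```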

          • Nia

            deleted by creator

      • NuMetalAlchemist

        “You didn’t see the content I saw.”

        Probably because it was removed for being against the rules?

      • kardum

        Context always matters. I always check whether adult material has actually been made by consenting adults. I would feel sick if not enough information had been provided for that, but fortunately I have never encountered CSAM.

        I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common (or at least were; I’m not on platforms where these are popular).

        Without context, porn will always be a morally grey area. Even commercialized hyper-capitalist porn is still an intimate affair.

        That’s why I didn’t use pornhub, for example, before every user had to verify themselves in order to post. Before that I only read erotica or looked at suggestive drawings.

        I understand your perspective tho. You hardly get paid to keep this instance running, and looking at pictures that without context could be CSAM could make this volunteer work very mentally taxing. This is how NSFW works tho.

        Without context, any pornographic material featuring real humans could in truth be evidence of a horrible crime.

        • Ada (OP, admin)

          “Context always matters. I always check whether adult material has actually been made by consenting adults. I would feel sick if not enough information had been provided for that, but fortunately I have never encountered CSAM.”

          If I can’t tell, if I have to look something up because the people I’m looking at look like they’re underage, then it doesn’t matter what the answer is, because the issue is that it looks like CSAM even if it’s not. And a community designed in a way that attracts people looking for underage content is not a space I’m willing to federate with.

          • NuMetalAlchemist

            Isn’t it kind of shitty to tell an adult woman she can never be attractive or sexy because she looks too young? Do you truly believe that said person should never be allowed to find love, because it’s creepy? Is she supposed to just give up because you think her body is icky?

            • Ada (OP, admin)

              I’ve covered this many times already.

              The issue isn’t individuals that happen to look younger than they are. The issue is with a community gathering sexual content of people that appear to be children.

              The community that initiated this isn’t even the worst offender on lemmynsfw. There is at least one other that is explicitly focused on this.

              • NuMetalAlchemist

                So we can rely on you to ban any twink community on this instance, right? Cause the whole idea behind twinks is looking smooth, young, and pubescent. So it is a community that glorifies boys that look underage. You feelin’ icky about that one? Or is that “different”?

              • candyman337@lemmy.world

                I personally find the subs a little weird, but they have rules explicitly stating that no one is to be under 18, and as others have said, it’s all clearly professionally taken, watermarked photos.

                Also, if the initial picture in question was of a verified adult who posted it themselves voluntarily, this all seems like a huge overreaction.

                IMO it’s pretty clear the lemmynsfw mods are doing a lot to remove any actual CP; they banned loli and shota and other types of content to avoid any legal or CP issues.