Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • sugar_in_your_tea@sh.itjust.works · 1 year ago

    I don’t think the OP ever said the bar was rape; the OP said the article and the person they responded to are treating drawn depictions of imaginary children the same as depictions of actual children. Those are not the same thing at all, yet many people seem to conflate them (apparently including US law as of the PROTECT Act of 2003).

    Some jurisdictions make a distinction (e.g., Japan and Germany), whereas others don’t. Regardless of the legal status in your area, the two should be treated separately, even if that means both are banned.

    • balls_expert · 1 year ago

      “Treating them the same” => the threshold for being refused entry into mainstream instances is already crossed at the lolicon level.

      From the perspective of the fediverse, pictures of child rape and lolicon should both get you thrown out. That doesn’t mean you’re “treating them the same”. You’re just a social network; there’s nothing you can do beyond defederating.

      • sugar_in_your_tea@sh.itjust.works · 1 year ago

        No, more like “treating them the same” => how the data is reported in the study. Whether they’re both against the TOS of the instance you’re on is a separate issue entirely; the problem is that the data doesn’t separate the two categories.

        Look elsewhere ITT for that exact perspective. Even US law (the PROTECT Act of 2003) treats them largely the same (i.e., in the same sentence), and it includes other taboo topics like bestiality, even if no actual animals are involved.

        It’s completely fine for neither to be allowed on a social network; what isn’t okay is for research to conflate the two. An instance inconsistently removing lolicon is a very different thing from an instance inconsistently removing actual CP, yet the article combines the two, likely to make the problem seem much worse than it is.

        • balls_expert · 1 year ago

          That’s an arbitrary decision to make, and it doesn’t really need to be debated.

          The study is pretty transparent about what “CSAM” means under their definition, and they even provide pictures; from a science-communication point of view, they’re in the clear.

          • sugar_in_your_tea@sh.itjust.works · 1 year ago

            And their definition kind of sucks. They’re basically saying it’s anything that SafeSearch or PhotoDNA flags, or anything that has hashtag hits (a rough sketch of that kind of pipeline is below).

            That said, there are absolutely some terrible things on Mastodon, including grooming and trading. I’m interested to know what the numbers look like for lolicon and similar vs. actual CP, which would give me a much better understanding of how bad the problem is. As in, are the things included in the report outliers, or typical of their sample set?

            I guess I’m looking for a bit more granularity in the report.
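
A minimal sketch of the kind of automated flagging the comment above (and the thread title) is talking about: perceptual-hash matching against a known-bad hash list, plus a hashtag watchlist. PhotoDNA itself is proprietary, so the open `imagehash` pHash stands in purely for illustration; `KNOWN_BAD_HASHES`, `FLAGGED_TAGS`, and `flag_post` are hypothetical names, not anything taken from the study or from Mastodon.

```python
# Illustrative sketch only: perceptual-hash + hashtag flagging for a single post.
# imagehash's pHash is an open stand-in for proprietary systems like PhotoDNA.
from PIL import Image
import imagehash

# Hypothetical inputs: a hash list of known material and a hashtag watchlist.
KNOWN_BAD_HASHES: set[imagehash.ImageHash] = set()  # sourced from a trusted list in practice
FLAGGED_TAGS = {"exampletag1", "exampletag2"}        # placeholder hashtags
HAMMING_THRESHOLD = 5                                # max bit difference to count as a match


def flag_post(image_path: str, hashtags: list[str]) -> dict:
    """Return which detectors (if any) flagged this post."""
    post_hash = imagehash.phash(Image.open(image_path))
    hash_hit = any(post_hash - known <= HAMMING_THRESHOLD
                   for known in KNOWN_BAD_HASHES)
    tag_hit = bool(FLAGGED_TAGS.intersection(t.lower() for t in hashtags))
    return {"hash_match": hash_hit, "hashtag_match": tag_hit}
```

In practice an instance admin would likely source the hash list from a trusted clearinghouse and treat a match as a report for human review and moderation or defederation decisions, rather than as an automatic removal.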