• @bleistift2@feddit.de · 29 points · 8 months ago

    Isn’t it a good thing for pedophiles to have an outlet for their desires that doesn’t involve harming children? Am I not seeing an obvious downside?

    • PorkRollWobbly · 36 points · 8 months ago

      Pedophilia is not a sexuality, and CSAM, AI-generated or not, is not a healthy outlet. Pedophilia should be treated as a disease, and pedophiles should receive treatment for it instead.

      • @bleistift2@feddit.de · 3 points · 8 months ago

        pedophiles should receive treatment for that instead

        In a world where many people cannot afford basic healthcare or – if they can afford it – where healthcare isn’t available in the required quantity, does your argument still hold?

        • @shea · 3 points · 8 months ago

          the treatment is daily merciless beatings

    • @PotatoKat@lemmy.world · 13 points · 8 months ago

      If I’m not mistaken, I remember reading that consuming CSAM increases the likelihood of offending, since it normalizes the act and makes the fantasies more vivid. It makes consumers more likely to act out what they see, rather than removing those desires.

    • @klingelstreich@feddit.de · 8 points · 8 months ago

      It depends on whether you hold a world view where every person is valuable and needs help and understanding to become their best self, or one where there are good and bad people, and the baddies need to be punished and locked away so everyone else can live their lives in peace.

            • Norgur · 2 points · 8 months ago

              That’s a rather useless contribution to the discussion. The initial argument was a line of reasoning for why artificial CSAM might be a benefit: people could vent their otherwise harmful urges without harming actual people. You just flatly responded “it is enabling and doesn’t stop distribution”. In other words, “no, u wrong”. Care to tell us the reasons behind your stance?

                • @bleistift2@feddit.de · 1 point · 8 months ago

                  “it is enabling and doesn’t stop distribution”

                  Norgur’s point is that you didn’t provide any reasoning why that should be the case.

            • @Kusimulkku@lemm.ee · 1 point · 8 months ago

              I’m not saying it’s a better alternative; I’m saying it might not make sense to talk about it “involving minors”.

                • Norgur · 1 point · 8 months ago

                  That’s not being picky about wording.
                  While I agree that stuff like that should not exist at all, in any way whatsoever, there is a vast difference between it existing because someone abused a child, recorded it, and thus scarred that child for life, and it existing because someone made a computer arrange pixels in a disgusting way.

    • @AspieEgg · 3 points · 8 months ago

      Don’t AI models need to be trained on the material they are trying to emulate?

      • @LemmysMum@lemmy.world · 2 points · 8 months ago

        No. I make AI-generated imagery, so let me clear this one up. AI does make up a facsimile based on its training data, but combinations of training data can create results not present in that training data.

        E.g. combining training data from legal femboys, petite women, and legal teens, along with non-nude images of generated children who don’t represent any real person, plus training data for increasing or decreasing apparent age.

        None of this uses CSAM or pedophilic abuse material, and all of it is legally and easily obtainable, but in combination it can generate pedophilic content.

  • @OsrsNeedsF2P@lemmy.ml · 10 points · edited · 8 months ago

    On one hand, yes, but on the other, Stable Horde developed a model to detect CSAM thanks to Stable Diffusion, and that’s being used to combat pedos globally.

  • neuropean · 5 points · 8 months ago

    What’s interesting is that mammals from mice to dogs don’t draw distinctions at arbitrary ages before trying to copulate. On the other hand, they don’t try to fuck the equivalent of pre-pubescent members of their species either, so there’s nothing natural about that.