• LadyAutumn · 6 months ago

    It should be illegal either way, to be clear. But do you think they're not training models on CSAM? Are you trusting in the morality and ethics of the people creating AI-generated child pornography?

    • Greg Clarke@lemmy.ca · 6 months ago

      The use of CSAM in training generative AI models is an issue no matter how these models are being used.

      • L_Acacia@lemmy.one · 6 months ago

        The training doesn’t use CSAM; there’s a 0% chance big tech would use that in their datasets. The models are somewhat able to link concepts, like “red” and “car”, even if they have never seen a red car before.

        • AdrianTheFrog@lemmy.world · 6 months ago

          Well, with models like SD at least, the datasets are so large and the employees so few that it is impossible to have a human filter every image. They scrape images from the web and try to filter them with AI, but there is still a chance of bad images getting through. This is why most companies also apply filters to the model’s output, in addition to filtering during the training process.
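          The two-stage setup described here (filter the scraped training set, then filter generations afterwards) can be sketched roughly like this. This is a minimal illustration, not any vendor’s actual pipeline; `unsafe_score` is a hypothetical stand-in for a real trained image classifier:

          ```python
          # Hedged sketch of two-stage safety filtering: once over the scraped
          # dataset before training, and again over each model output.
          # `unsafe_score` is a hypothetical placeholder -- real pipelines run
          # a trained classifier here, which can still miss bad images.

          def unsafe_score(item: str) -> float:
              """Placeholder classifier: returns a risk score in [0, 1]."""
              return 1.0 if "flagged" in item else 0.0

          def filter_dataset(items, threshold=0.5):
              """Training-time filter: drop scraped items the classifier flags."""
              return [x for x in items if unsafe_score(x) < threshold]

          def filter_output(generated, threshold=0.5):
              """Output-time filter: block a generation the classifier flags."""
              return generated if unsafe_score(generated) < threshold else None

          scraped = ["image_ok_1", "image_flagged_2", "image_ok_3"]
          print(filter_dataset(scraped))        # ['image_ok_1', 'image_ok_3']
          print(filter_output("image_flagged_2"))  # None (blocked)
          ```

          The point of the comment above is the gap between the two stages: because the dataset-side classifier is imperfect, the output-side check exists as a second line of defense.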

          • DarkThoughts@fedia.io · 6 months ago

            You make it sound as if such content were easy to even find on the open web. The point is, they do not need to be trained on such material. They are trained on regular kids, so they know their sizes, faces, etc. They’re trained on nude bodies, so they also know what hairless genitals or flat chests look like. You don’t need to specifically train a model on nude children to generate nude children.

    • mindbleach@sh.itjust.works · 6 months ago

      We’re trusting that billion-dollar corporate efforts don’t possess and label hyper-illegal images, specifically so people can make more of them. Because why the fuck would they.