A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how readily generative AI can be put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • emmy67@lemmy.world · 4 months ago

    Are you stupid? Something has to be in the training data for any generation to be possible. This is just a new way to revictimise kids

    • NauticalNoodle@lemmy.ml · 4 months ago

      So are you suggesting they can get an unaltered facial I.D. of the kids in the images? Because that would make it regular CSAM with a specific victim (as mentioned), not an AI-generated illustration.

      • emmy67@lemmy.world · 4 months ago

        No, I am telling you CSAM images can’t be generated by a model that hasn’t been trained on CSAM

        • NauticalNoodle@lemmy.ml · 4 months ago

          That’s patently false.

          I’m not going to continue this discussion; instead, I’ll direct you to the multiple other people who have already effectively disproven this and similar arguments elsewhere in this post’s discussion. Enjoy.

    • ameancow@lemmy.world · 4 months ago

      Not necessarily; AI can do wild things with combined attributes.

      That said, I do feel very uncomfortable with the amount of defense of this guy: he was distributing this to people. If he had just been generating fake images of fake people using legal training data in his own house, for his own viewing, that would be a different story. The number of people jumping in front of the bullet for this guy when we don’t really know the details is the larger problem.