• randon31415@lemmy.world
    8 months ago

    I am just wondering if they are referring to Civitai, which tries really hard to be seen as an art and base-model/LoRA site.

    • CrayonRosary@lemmy.world
      8 months ago

      AI-generated images aren’t “deep fakes”. Deep fakes came out long before image generation did: you take an existing video and swap out just the face.

      • kaosof@lemmy.world
        8 months ago

        Porn fakes are old news, yes, but what the post you replied to is talking about is “deep learning” fakes (remember that term, before the great deluge of “AI”?). These use generative networks to swap out or alter the face or body, or to generate simulacra images/video outright, as opposed to a human doing comparatively crude hack jobs from multiple sources, as in the past.

      • casual_turtle_stew_enjoyer@sh.itjust.works
        8 months ago

        Mmmmmm some of the models on CivitAI, with the right workflows, could effectively create the most versatile “deep fakes” possible. You can put someone’s face on a blank canvas and tell a model specifically trained for realistic pornography to paint a whole scene around the face. The only advantage here is that doing that is kinda pointless when you can just generate any random face to match your specifications. So ultimately much less harmful so long as the user isn’t obsessing over representing a distinct living person.
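        The “paint a whole scene around the face” step described above is essentially diffusion inpainting with an inverted mask: the pasted face is preserved and everything else is marked for generation. A minimal sketch of just the mask-construction part, using Pillow (the canvas size, face box, and function name are illustrative; the actual model call, e.g. a Stable Diffusion inpainting pipeline, is omitted):

        ```python
        from PIL import Image, ImageDraw

        def build_outpaint_mask(canvas_size, face_box):
            """Build an inpainting mask: white (255) marks regions the model
            should generate; black (0) marks the preserved face region.

            canvas_size: (width, height) of the blank canvas
            face_box: (left, top, right, bottom) where the face was pasted
            """
            mask = Image.new("L", canvas_size, color=255)  # generate everywhere...
            draw = ImageDraw.Draw(mask)
            draw.rectangle(face_box, fill=0)               # ...except the face
            return mask

        # Hypothetical usage: a 512x512 blank canvas with a face pasted
        # into the rectangle (192, 128, 320, 256).
        canvas = Image.new("RGB", (512, 512))
        mask = build_outpaint_mask(canvas.size, (192, 128, 320, 256))
        ```

        An inpainting model would then receive the canvas and this mask and fill in the white region, generating a scene around the untouched face.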

        Also, these are primarily still images. Some animation models exist, but that process is a lot more hit-and-miss. Overall though, I’d argue this whole use case is significantly less damaging than deep fakes in principle.