‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity
It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • originalfrozenbanana@lemm.ee · 11 months ago

    That the chat is full of people defending this is disgusting. This is different from cutting someone’s face out of a photo and pasting it onto a magazine nude, or from imagining a person naked. Deepfakes can be difficult to tell apart from real media. This enables the spread of nonconsensual pornography that an arbitrary person cannot necessarily tell is fake. Even if it were easy to tell, it’s an invasion of privacy to use someone’s likeness for these purposes without their consent.

    The fediverse’s high expectations for privacy seem to go right out the window when violating it gets their dick hard. We should be better.

    • ABCDE@lemmy.world · 11 months ago

      What are you arguing with here? No one is saying that. Stop looking for trouble, it’s weird.

    • RobotToaster@mander.xyz · 11 months ago

      > it’s an invasion of privacy to use someone’s likeness against their will

      Is it? Usually photography in public places is legal.

          • originalfrozenbanana@lemm.ee · 11 months ago

            I think it’s immoral to do street photography to sexualize the subjects of your photographs. I think it’s immoral to then turn that into pornography of them without their consent. I think it’s weird that you don’t. If you can’t tell the difference between street photography and manipulating photos of people (public or otherwise) into pornography, I can’t fuckin help you.

            If you go to a park, take photos of people, then go home and masturbate to them you need to seek professional help.

            • TrickDacy@lemmy.world · 11 months ago

              What’s so moronic about people like you is that you think anyone looking to further understand an issue beyond your own current thoughts is clearly a monster harming people in the worst way you can conjure in your head. The original person saying it’s weird that you’re looking for trouble couldn’t have been more dead on.

              • originalfrozenbanana@lemm.ee · 11 months ago

                This is an app that creates nude deepfakes of anyone you want it to. It’s not comparable to street photography in any imaginable way. I don’t have to conjure any monsters, bro. I found one, and they’re indignant about being called out as a monster.

                • PopOfAfrica@lemmy.world · 11 months ago

                  This has been done with Photoshop for decades. Photocollage for a hundred years before that. Nobody is arguing that it’s not creepy. It’s just that nothing has changed.

        • TrickDacy@lemmy.world · 11 months ago

          Am I violating privacy by picturing women naked?

          Because if it’s as cut and dried as you say, then the answer must be yes, and that’s flat-out dumb.

          I don’t see this as a privacy issue, and I’m not sure how you’re squeezing it into that. I’m not sure what it is, but you cannot violate a person’s privacy by imagining or coding an image of them. It’s weird and creepy, and because it can be mistaken for a real image, it’s not proper to share.

          Can you actually stop clutching pearls for a moment to think this through a little better?

          • originalfrozenbanana@lemm.ee · 11 months ago

            Sexualizing strangers isn’t a right or a moral act afforded to you by society. That’s a braindead take. You can ABSOLUTELY violate someone’s privacy by coding an image of them. That’s both a moral and a legal question with an answer.

            Your comment is a self report.

    • chitak166@lemmy.world · 11 months ago

      So it’s okay to make nudes of someone as long as they aren’t realistic?

      Where is the line drawn between being too real and not real enough?

      • originalfrozenbanana@lemm.ee · 11 months ago

        If you found out that someone had made a bunch of art of you naked, you’d probably think that was weird. I’d argue you shouldn’t do that without consent. Draw lines wherever you think is best.

        • chitak166@lemmy.world · 11 months ago

          I’d definitely think it was weird! And probably not hang out with them anymore (unless it was really good!)

          But I don’t think there should be a law against them doing that. I can moderate them myself by avoiding them, and my friends will follow suit.

          At that point, all they have are nudes of me that nobody I care about will pay attention to. It’s a good litmus test for shitbags!

          • echo64@lemmy.world · 11 months ago

            This is about producing convincing nude reproductions of other people, however. It has a very different psychological impact.

            This technology allows someone to make pornography of anyone else and spread that pornography on the internet. It can cause massive distress, trauma, and relationship issues, and it can impact people’s careers.

            Your freedom to make nude AI images of other people is not worth that. I don’t understand why anyone would think it was okay.

    • PopOfAfrica@lemmy.world · 11 months ago

      People need better online safety education. Why TF are people even posting public pictures of themselves?

    • Grangle1@lemm.ee · 11 months ago

      Unfortunately that sounds like par for the course for the internet. I’ve come to believe that the internet has its good uses for things like commerce and general information streaming, but by and large it’s bringing out the worst in humanity far more than the best. Or it’s all run by ultra-horny psychopathic teenagers pretending to be adults, living by “I’m 13 and this is deep” logic.

      • originalfrozenbanana@lemm.ee · 11 months ago

        I dunno why I am perpetually surprised about this, though. This is such a cut-and-dried moral area, and the people who say it isn’t are so clearly telling on themselves that it’s kind of shocking. But I guess it shouldn’t be.

        • PopOfAfrica@lemmy.world · 11 months ago

          I think the distinction is that half of the thread is treating it as a moral issue, and half of it is treating it as a legal issue. Legally, there’s nothing wrong here.