• RickRussell_CA@lemmy.world

    I suppose the only thing I disagree with is that the law can do anything about it. Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It’s the wild west.

    • Todd Bonzalez@lemm.ee

      I think it’s best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.

      You can’t target the technology or stop people from using AI to do perverted things, but if they get caught, we should at least be able to respond.

      I don’t know what a proactive response to this issue looks like. Maybe better public education and a culture that encourages more respect for others?

      • DarkThoughts@fedia.io

        > I think it’s best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.

        So… where do you draw the line, exactly? Does this include classic photo manipulation too? Written stories (fanfic)? Sketching or doodling a nude figure with a name pointing at it? Dirty thoughts that someone has about someone else? I find this response highly questionable and authoritarian. Calling it abuse also trivializes actual abuse, which I, as an abuse victim, find pretty objectionable. If I could swap what was done to me with someone making porn of “me” and getting their rocks off to it, I’d gladly make that exchange.

      • RickRussell_CA@lemmy.world

        I feel I was misconstrued. My points were: (1) a law will probably happen, and (2) it will do fuck all, because the toolchain and the posting/sharing process are going to be completely anonymous.

        Yeah, in specific cases where you can determine that deepfake revenge porn of Person A was posted by Person B, who had an axe to grind, you might get a prosecution. I just don’t think the dudes making porn on their Nvidia GPUs of Gal Gadot f*ckin Boba Fett are ever gonna get caught, and the celebrity cat will stay forever out of the bag.

    • Grimy@lemmy.world

      You can’t ban the tech, but you can ban the act, so it’s easier to prosecute people who upload deepfakes of their co-workers.

      • DarkThoughts@fedia.io

        That’s already illegal in most countries, regardless of how it was made. It also has nothing to do with “AI”.

        • Grimy@lemmy.world

          > Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It’s the wild west.

          I was referring to that part of his comment. It’s also not at all illegal in most countries. It’s only illegal at the state level in the US, for example, and not in all of them either. Canada only has 8 provinces with legislation against it.

          I do agree, though, that it’s not the software’s fault. Bad actors should be punished and nothing more.

  • Veraxus@lemmy.world

    I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real, or in a way that could be construed as true/real.

    Anything other than that narrow application is an infringement on the First Amendment.

    • Admiral Patrick@dubvee.org

      > I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real, or in a way that could be construed as true/real.

      I would love that solution, but it definitely wouldn’t have bipartisan support.

      • Veraxus@lemmy.world

        There are certain political groups that have a vested interest in lying, deceiving, manipulating, and fabricating to get what they want. So… yeah. 😞

        • maynarkh@feddit.nl

          I feel that’s just most political groups nowadays. Not implying both sides are the same, just that everyone likes their lies.

      • Veraxus@lemmy.world

        Exactly. Photoshop has been around for decades. AI is just more of the same. I find it weird how, as technology evolves, people keep fixating on the technologies themselves rather than the universal (sometimes institutional) patterns of abuse.

  • Player2@lemm.ee

    Anyone could run it on their own computer these days, fully local. What could the government do about that even if they wanted to?

    • jeffw@lemmy.worldOP

      Anyone can make CSAM in their basement; what could the government do about that, even if they wanted to?

      Anyone can buy a pet from a pet store, take it home, and abuse it; why is animal abuse even illegal?

      Should I keep going with more examples?

      • Player2@lemm.ee

        What do you want them to do, constantly monitor your computer to see what applications you open? Flag suspicious GPU or power usage and send police to knock on your door? Abusing people or animals requires real outside involvement. You’re equating something a computer generates with real life, when the two have nothing to do with each other.

        • jeffw@lemmy.worldOP

          Who is suggesting that?

          Murder is illegal; do we surveil everyone who owns a gun or knife?

          CSAM is illegal; do all cameras report to the government?

          Again, those are just 2 examples. Lmk if you want more.

          • Player2@lemm.ee

            Maybe my wording is unclear. I am wondering how they would be expected to detect it in the first place. Murder leaves a body. Abuse leaves a victim. Generating files on a computer leaves nothing of the sort, unless they’re shared online. What would a new regulation achieve that isn’t already covered by the illegality of “revenge porn”? Furthermore, how could they possibly detect anything beyond that without the massive privacy breaches I wrote about before?

    • Carrolade@lemmy.world

      The govt’s job is not to prevent crime from happening; that’s dystopian-tier stuff. Its job is to determine what the law is and to apply consequences to people after they are caught breaking it.

      The job of preventing crime in the first place belongs mainly to lower-level community institutions, starting with parents and teachers.

    • Todd Bonzalez@lemm.ee

      The issue is not with all forms of pornographic AI, but more about deepfakes and nudifying apps that create nonconsensual pornography of real people. It is those people’s consent that is being violated.

      • DarkThoughts@fedia.io

        I still don’t understand why this is an issue now, when decades of photo editing did not bother anyone at all.

        • CarbonIceDragon@pawb.social

          I mean, it did bother people; it just took enough skill and time with photo-manipulation software to make something look convincing that it was rare for someone to both have the expertise and be willing to put in the time, so it didn’t come up often enough to be a point of discussion. AI just makes it quick and easy enough to become more common.

            • Todd Bonzalez@lemm.ee

              It literally is a one-click solution. People are running nudifying sites that use CLIP, GroundingDINO, SegmentAnything, and Stable Diffusion to autonomously nudify people’s pictures.

              These sites (whose names I won’t even mention) just ask for a decent-quality photo of a woman, ideally wearing a crop top or bikini, for best results.

              The people who have the know-how to set up Stable Diffusion and all these other AI photomanipulation tools are using those skills to monetize sexual exploitation services. They’re making it so you don’t need to know what you’re doing to participate.

              And sites like Instagram, which are filled with millions of exploitable images of women and girls, have allowed these perverted services to advertise their warez to their users.

              It is now many orders of magnitude easier than it ever has been in history to sexually exploit people’s photographs. That’s a big deal.

              • DarkThoughts@fedia.io

                If you wanna pay for that, then you do you, lol. But at that point you could’ve also just paid a shady artist to do the work for you.

                Also, maybe don’t pose half-naked on the internet if you don’t want people to see you in a sexual way. That’s just weird, just like this whole IG attention-whoring nowadays. And no, this isn’t even just a women thing. Just look at how thirsty women get under the images of good-looking dudes who pose topless, or just your ordinary celeb doing ordinary things (Pedro Pascal = daddy, and yes, that includes more explicit comments too).

                This hypocritical fake outrage is just embarrassing.