Thanks ahead of time for your feedback

  • originalucifer@moist.catsweat.com
    3 months ago

    I think it’s ‘barrier to entry’.

    Photoshop took skills that not everyone had, keeping the volume low.

    These new generators require zero skill or technical ability, so anyone can do it.

    • Sanctus@lemmy.world

      Scale also, you can create nudes of everyone on Earth in a fraction of the time it would take with Photoshop. All for the lowly cost of electricity.

          • Melvin_Ferd@lemmy.world

            What if my goal is to constantly be led around by the media, made to fear things needlessly when they use the same lazy appeals every decade?

            The right have immigrant headlines. The left seem to hate AI and technology now.

            What’s mind-blowing is how the same headlines are used for both.

    • AbouBenAdhem@lemmy.world

      When Photoshop first appeared, image manipulations that would seem obvious and amateurish by today’s standards were considered very convincing—the level of skill needed to fool large numbers of people didn’t increase until people became more familiar with the technology and more vigilant at spotting it. I suspect the same process will play out with AI images—in a few years people will be much more experienced at detecting them, and making a convincing fake will take as much effort as it now does in Photoshop.

    • Toes♀@ani.social

      Have you tried to get consistent, goal-oriented results from these AI tools?

      To reliably generate a person you need to configure many components, fiddle with the prompts and constantly tweak.

      Doing this well is, in my eyes, a fair bit harder than learning how to use the magic wand in Photoshop.

    • GreatAlbatross@feddit.uk

      imho, not dissimilar to the jump from model planes to drones.

      To operate a model plane, there was a not-small amount of effort you needed to work through (building, specialist components, local club, access to a proper field, etc.).
      This meant that by the time you were flying, you probably had a pretty good understanding of being responsible with the new skill.

      In the era of self-stabilising, GPS-guided UAVs delivered next-day, ready to fly, the barrier to entry flew down.
      And it took a little while for the legislation to catch up, from “the clubs are usually sensible” to “don’t fly a 2 kg drone over a crowd of people at head height with no experience or training”.

    • HobbitFoot @thelemmy.club

      It would also take a lot more effort to get something even remotely believable. You would need to go through thousands of body and face photos to get a decent match and then put in some effort pairing the two photos together. A decent “nude” photo of a celebrity would probably take at least a day to make the first one.

  • EveryMuffinIsNowEncrypted

    Honestly? It was kind of shitty back then and is just as shitty nowadays.

    I mean, I get why people do it. But in my honest opinion, it’s still a blatant violation of that person’s dignity, at least if it’s distributed.

    • Zorque@lemmy.world

      It’s not that now it’s bad… it’s that now it’s actually being addressed. Whereas before it was just something people would sweep under the rug as being distasteful, but not worthy of attention.

  • Tarquinn2049@lemmy.world

    It’s a bit of a blend: it has always been a big deal, and it is indeed even more of a big deal now because of how easy, accessible, and believable the AI can be. Even nowadays, Photoshop hits only one point of that triangle. And it was even less capable back in the day; it could hit half of one of those points at any given time.

    Basically, a nude generated by a good AI has to be proven false. Because it doesn’t always immediately seem as such at first. If you have seen obvious AI fakes, they are just that, obvious. There are many non-obvious ones that you might have seen and not known they were fake. That is, of course, assuming you have looked.

    The other reason it can be more of a big deal now is that kids have been doing it of other kids. And since the results can be believable, the parents didn’t know they were fake to start with. So it would blow up as if it was real before finding out it was AI. And anything involving that is gonna be a big deal.

      • Tarquinn2049@lemmy.world

        I mean, that was an issue in the first month or so. Though I could see if the automated tools people use for this specific purpose might not stay up to date. I haven’t specifically interacted with those. But proper AI tools have in-filling to correct mistakes like that, you can keep the rest of the image and just “reroll” a section of it until whatever you didn’t like about it is fixed. Super quick and easy.
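A toy sketch of what the in-filling / “reroll” workflow described above amounts to conceptually: only the masked region is regenerated, every pixel outside the mask is kept verbatim, and you can re-run with a new seed until you like the result. The `regenerate` function here is a made-up stand-in for the model call, not any real tool’s API.

```python
import random

def regenerate(region_size, seed):
    """Hypothetical stand-in for the generative model: produce new pixel values."""
    rng = random.Random(seed)
    return [rng.randint(0, 255) for _ in range(region_size)]

def reroll(image, mask, seed):
    """Replace only the masked pixels; everything outside the mask is untouched."""
    fresh = iter(regenerate(sum(mask), seed))
    return [next(fresh) if m else px for px, m in zip(image, mask)]

# A 4-"pixel" image; reroll only the middle two entries.
image = [10, 20, 30, 40]
mask  = [0, 1, 1, 0]
out = reroll(image, mask, seed=1)
print(out[0], out[3])  # -> 10 40  (unmasked pixels preserved exactly)
```

The point of the sketch is the loop structure: if the result isn’t right, you call `reroll` again with a different seed rather than redoing the whole image.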

  • xmunk@sh.itjust.works

    Because previously, if someone had the skills to get rich making convincing fake nudes, we could arrest and punish them; people with similar skillsets would usually prefer more legitimate work.

    Now some ass in his basement can crank them out and it’s a futile game of whack-a-mole to kill them dead.

  • southsamurai@sh.itjust.works

    Well, I think everyone has already covered that it was a big deal at the time, it simply wasn’t something we could wipe out as a society.

    And it’s still a big deal.

    However, I don’t think anyone touched on why fake nudes, even ones that are obviously fake, or even labeled as fake by the creator are a problem.

    It comes back to the entire idea of consent. That’s for anyone, but women in particular are heavily sexualized, even well before they’re women. There is a constant, unending pressure on women of knowing that they are going to be sexually objectified. It might not be every day, by everyone alive around them, but it is inescapable.

    One can debate whether or not nudity should be a big deal, and whether it is sexualized because of the rules a given culture has around nudity, but the hard truth is that nudity is sexualized. Ergo, images of a woman’s body are something she deserves to have control over access to. If someone consents to images being available, great! If they don’t, then there’s a problem.

    Fakes, even obvious and declared fakes, violate that barrier of body autonomy. They directly ignore the person’s wishes regarding their naked body.

    The better the fake, the worse that violation is, because (as others said), once a fake is good enough, the subject of the fake is put in the position of having to deny it’s them. They shouldn’t have to ever be in that position, no matter who it is.

    Even a porn performer should have the ability to be free of fakes because they didn’t consent to those fakes. They also have a very valid claim on it infringing on their income as well. Now, I’m certain that legal fakes will someday be a thing. There will be contracts for likeness rights to produce fake porn. Bet on it. If I had free income, I would immediately invest in such an endeavor because I guarantee it will make money.

    But, as things stand, fakes are no better than someone taking a picture through a window shade, or using infrared to see through clothing. It’s digital, and it’s fake, but it is the direct equivalent of violating someone’s privacy and body autonomy.

    That’s why it’s a big deal to begin with.

    And, yeah, it is something that’s here to stay, it’s unavoidable. And someone is bound to comment that they wouldn’t care. Great, good for you. That doesn’t obligate others to not care too. But, put it to the test and provide a few pictures of yourself in your comment so that someone can make a fake nude of you, then plaster it online with zero context and labeled with at least your user name so everyone running across it can direct responses to it to you.

    It’s all about personal privacy, consent, and body autonomy.

  • DrownedRats@lemmy.world

    Because now, anyone can do it to anyone with zero effort and a single photo.

    Sure, before, anyone with decent Photoshop skills could put together a halfway convincing fake nude, but it’s still significantly more effort and time than most would be bothered with, and even then it’s fairly easy to spot and dispute a fake.

    Most people weren’t concerned if a celebrity’s fake nudes were spread around before, but now that a colleague, student, teacher, family member, or even a random member of the public could generate a convincing photo, the threat has become far more real and far more conceivable.

    • Randomgal@lemmy.ca

      To be fair, Photoshop has made tasks like this incredibly simple. With a “good” photo, the process is much less esoteric now than it once was.

  • Ziggurat@sh.itjust.works

    I have a similar opinion. People have been forging/editing photographs and movies for as long as the techniques have existed.

    Now any stupid kid can do it; the hard part with AI is actually not getting porn. Maybe it can teach everyone that fake photos are a thing, and make nudes worthless (what’s the point of a nude anyway? Genitals look like… genitals).

  • shastaxc@lemm.ee

    If AI is so convincing, why would anyone care about nudes being controversial anymore? You can just assume it’s always fake. If everything is fake, why would anyone care?

    • all-knight-party@kbin.run

      Specifically because it’s convincing. You may just assume everything is fake, that doesn’t mean everyone will. You may not care about seeing someone’s imperceptibly realistic nude, but if it’s depicting them they may care, and they deserve the right for people not to see them like that.

      Just because it’s not logistically feasible to prevent convincing AI nudes from spreading around doesn’t make it ethical.

  • lmaydev@lemmy.world

    It was always a big deal. But back then it was often pretty obvious when it was a fake. It’s getting harder and harder to tell.

  • Cyteseer@lemmy.world

    It’s always been a big deal; it just died down as Photoshop became normalized as a tool and people grew accustomed to it.

  • Snot Flickerman

    It was a big deal back then, too, but a lot harder to police, and a lot more obvious that they were fakes.

    Gillian Anderson fakes were real fuckin popular while The X-Files was on the air.

    EDIT: Searching for women from the time talking about the phenomenon in the 90’s is difficult because it mostly turns up… troves of fake nudes of these women. Of course.

      • Snot Flickerman

        I recall women heavily disliking it back then, but I also recall that people in general viewed the internet as just full of weirdos and creeps. The internet wasn’t mainstream, by any stretch of the imagination, so I think it likely “got swept under the rug” because of a general feeling of “who cares what weirdos do online? We’re real people and we never use the internet because we have lives.”

        Also, fewer lawyers understood the tech at the time, or how to figure out who was producing these images, and how to prosecute them. So I’d wager that part of going after them was held back by tech-unsavvy lawyers who were like “What’s happening where and how? Dowhatnow? Can you FAX it to me?”

        • Don_Dickle@lemmy.worldOP

          Did she ever unleash her wrath like the article says? Maybe it’s the nerd in me, but I never wanted to see her naked, just wanted to see her in a Princess Leia-esque outfit. IYKYK

  • atrielienz@lemmy.world

    How do you prove it’s not you in either case? Photoshop doesn’t make a whole video of you fucking a sheep. But AI can and is actively being used that way. With Photoshop it was a matter of getting ahold of the file and inspecting it. Even the best Photoshop jobs have some key tells. Artifacting, layering, all kinds of shading and lighting, how big the file is, etc.

    I want to add something. What if all of a sudden it’s your 12-year-old daughter being portrayed in this fake? What if it’s your mom? It would have been a big deal to you to have that image of your loved one out there back in the 90’s or early 2000’s. It’s the same kind of big deal now, just more widespread because it’s so easy. It’s not okay to use the image of someone in ways they didn’t consent to. I have a similar issue with facial recognition, regardless of the fact that it’s used in public places where I have no control over it.

  • I Cast Fist@programming.dev

    Doctored photos have always been a problem and, legally speaking, could lead to the faker being sued for defamation, depending on what was done with the person’s image.

    AI Photos are only part of the problem. Faking the voice is also possible, as is making “good enough” videos where you just change the head of the actual performer.

    Another part of the problem is that this kind of stuff spreads like wildfire within groups (and it’s ALWAYS groups where the victim is) and any voices stating that it’s fake will be drowned by everyone else.