Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.

Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

  • njm1314@lemmy.world · 4 months ago

    Good Lord, how naive are you that you don’t think there’s a difference? I mean, honestly, what fucking world do you live in? I am amazed that someone could still not understand this. Where have you been the last few weeks?

    • Samvega · 4 months ago

      Those who enjoy treating women as sexual objects will always be offended by the idea that this exploitation is somehow unfair. Furthermore, they will take great offence at this being pointed out, but will never explain that offence other than by becoming more offended.