• @ryathal@sh.itjust.works · 49 · 5 months ago

    If you replace all your online images, then AI can’t look at them. No one else can either, but you stop AI, I guess.

      • @stoy@lemmy.zip · 22 · 5 months ago

        Isn’t Nightshade defeated by just applying an anti-aliasing filter to the image?
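A minimal sketch of the kind of low-pass filtering the comment above describes: a 3×3 box blur over a grayscale pixel grid in pure Python. This only illustrates how such a filter smears a sharp, localized perturbation across its neighbors; it makes no claim about whether it actually defeats Nightshade.

```python
def box_blur(img):
    """Apply a 3x3 box blur (a simple anti-aliasing / low-pass filter)
    to a grayscale image given as a 2D list of pixel values."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            # Average over the in-bounds neighborhood.
            out[y][x] = total // count
    return out

# A sharp single-pixel "perturbation" gets smeared across its neighbors:
img = [[0, 0, 0], [0, 90, 0], [0, 0, 0]]
print(box_blur(img))  # → [[22, 15, 22], [15, 10, 15], [22, 15, 22]]
```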

        • littleblue✨ · 14 · 5 months ago

          Yeah, this is some “I don’t consent” FB post level of stupid shit. 🤷🏼‍♂️

            • @wildginger@lemmy.myserv.one · 9 · 5 months ago

              “it takes all the running in the world just to stay in place.”

              Normally it refers to biological arms races, where a poisonous animal and a poison-resistant predator play tit for tat, evolving stronger poisons and stronger resistances to try to outplay the other just to stay alive.

              Now, artists and AI are doing the same. AI companies want to take art without paying for it; artists don’t want their art taken. Artists come up with little tricks to poison the dataset if their art is used, and AI developers come up with little tricks to strip the poison from the data.

              The dance continues, the dancers straining and struggling, all to stand still.

              • littleblue✨ · 8 · edited · 5 months ago

                FFS. People. It’s “stealing” as much as visiting a museum and going home to sketch/sculpt/compose is theft.

                When did the chicken-little mindset of old fucks become the default reaction for the whole damn world? 🤮

                Edit: Ah, yes. More uneducated armchair experts yelling rhetoric. How surprising. Please, tell me you came from Reddit without telling me you’re from Reddit. 🤣

                • 1ostA5tro6yne · 4 · edited · 5 months ago

                  “AI” image “generation” has been known to spit back out more or less intact copyrighted works, complete with watermark. It doesn’t create anything; it’s just an outright plagiarism machine.

                • @Goldmage263@sh.itjust.works · 4 · 5 months ago

                  Except you don’t put in the effort to go and interpret the art yourself, or the effort to make it. The human element is completely removed. Besides, people should be able to do what they want with their art, including preventing AI from using it, to the best of their abilities.

              • @stoy@lemmy.zip · 2 · 5 months ago

                Ah, a similar struggle to one my dad described when negotiating sales of complex systems: both parties start with unrealistic demands, just to have things to give away to the other side during negotiations.

                He has told me several times that he just wishes that the process was way more streamlined and that the parties could start closer to the realistic goal.

                He has since retired, so he no longer needs to deal with it…

    • TheOneCurly · 20 · 5 months ago

      I believe this is suggesting an AI poisoning edit, not removing the image entirely. It should be mostly imperceptible. Plus, you could update with newer methods as they come out.

    • Justas🇱🇹 · 5 · 5 months ago

      You could generate a different temporary image URL every time and Nightshade the image after the link expires.


  • Margot Robbie · 15 · 5 months ago

    LAION-5B is so notoriously badly labeled that a bit of poisoned data, even if the poisoning worked as advertised, would literally not matter at all.

    Plus, it’s not doing anything to existing diffusion models that used LAION-5B. Many artists are under the mistaken impression that the models constantly scrape the Internet for new images and train on them automatically, when in fact training a model to learn new information without catastrophic forgetting is almost impossible (hence workarounds like LoRAs and such).

    Again, a reminder that the creator of Nightshade and Glaze, Ben Zhao of UChicago, is literally a code thief who stole GPL code for his closed-source product (warning: reddit link) to scam artists who don’t understand the tech behind ML models.

  • @31337@sh.itjust.works · 1 · edited · 5 months ago

    Hmm, it looks like it would also mess up classification, recommendation, captioning, and similar models that use these images. Maybe image search and duplicate search as well? Maybe it could be used to get around automated copyright strikes?
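On the duplicate-search point above: such systems often rely on perceptual hashes. Here is a toy pure-Python average-hash sketch (an illustrative assumption, not any particular search engine's actual method) showing that a perturbation has to push pixels across the mean-brightness threshold before matching breaks:

```python
def average_hash(img):
    """Compute a tiny perceptual hash: one bit per pixel, set when the
    pixel is above the image's mean brightness. Near-duplicate images
    yield hashes that differ in only a few bits."""
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]


def hamming(a, b):
    """Count the number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))


original = [[10, 200], [220, 30]]
perturbed = [[12, 198], [221, 29]]  # slight pixel-level noise
# Mild noise leaves every pixel on the same side of the mean,
# so the hashes still match exactly:
print(hamming(average_hash(original), average_hash(perturbed)))  # → 0
```

An adversarial edit aiming to dodge duplicate detection (or a copyright-strike bot) would need larger, targeted shifts that flip enough of these bits to push the Hamming distance past the matcher's threshold.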