• @agent_flounder@lemmy.one
    3 points • 10 months ago

    Generative AI training is not the same thing as human inspiration. And transformative work has thus far been performed only by humans, not by a machine used by a human.

    Clearly using a machine that simply duplicates a work to resell runs afoul of copyright.

    What about using a machine that slightly rewords that work? Is that still allowed? Or a machine that does a fairly significant rewording? What if it sort of does a mashup of two works? Or a mashup of a dozen? Or of a thousand?

    Under what conditions does it become ok to feed a machine with someone’s art (without their permission) and sell the resulting output?

    • R0cket_M00se
      7 points • 10 months ago

      That’s the point, it’s almost a ship of Theseus situation.

      At what point does the work become its own rather than a copy? How much variation is required? How many works does it need to sample before it's generating output based on large-scale sentence structures instead of just copying exactly what it's reading?

      Legislation can't happen until some benchmark is established, unless we just declare wholesale that AI is copyright infringement based purely on its existence and training.

      • @uriel238
        1 point • edited • 10 months ago

        To the contrary, litigation occurs all the time and is subject to the opinions of the judge; there is no consistency between cases. So when it's a contest between an artist's original work and a derivative product made by a generative AI (and then reworked by a human enough to qualify as a copyrightable work), it's going to come down to which side has the better legal team and war chest. Or they'll just settle out of court.

    • @uriel238
      1 point • edited • 10 months ago

      When it comes to intellectual property, it is the courts that decide what is sufficiently transformative. Granted, it is purely subjective, and falls into the same I know it when I see it paradigm that obscenity does. An example I commonly use is Ray Parker Jr.'s Ghostbusters, which has elements similar to I Want a New Drug by Huey Lewis and the News. It was settled out of court for an undisclosed sum.

      Typically, under current global intellectual property law, the outcome of a plagiarism dispute is not consistent with any objective standard. Usually judges rule for the larger company, and even more often such disputes get settled out of court. A friend of mine saw the rise of the Riot grrrl movement, which birthed the Girl Power motto and an associated line of t-shirts. Virgin simply started their own Girl Power fashion line, worn by the Spice Girls, knowing they could out-litigate any of the street artists who created the original concept and designs.

      But this highlights the central problem here. Big-name authors like George R. R. Martin or Stephen King might be able to raise a stink about their own work being used to train AI, but that has little to do with whether they're right to do so; it's that they (or their publishers) have a sizable war chest with which to fight a protracted legal battle, and that's what it comes down to.

      So what about small authors? What about Lindsay Ellis or Ben Croshaw? (They've both published some books; I've read none of them, but I've watched their videos on YouTube.) They're fucked long before AI training is an issue. In order to get published at all, they had to sign away all their intellectual property rights. They own nothing; their publishing house does, and unless their debut novel is a NYT bestseller, they're not going to get a better deal for later novels.

      So to answer your question, under what conditions does it become [acceptable] to feed a machine with someone's art (without their permission) and sell the resulting output? If you're asking about legality: generally when you're a small-time artist and some company wants their next pop star's album art to look like something you made. Their lawyers will tear you apart, or compel you to settle out of court for a pittance. Is it ethical? No. But capitalism never is. By the same logic, if you want to make derivative work from Disney, you'd better believe their lawyers are going to pore over your work to see if there's an angle by which they can shut you down or take your profits. Disney doesn't like day-care centers with Disney characters on the walls (despite it being free indoctrination), and it is well known for being overly litigious.

      And when it comes to something you took to a publishing house (or record label, or game publisher, or whatever), it's going to be up to them what they do about it, because you no longer own anything. That advance you got was all the money you'll ever see, and if you're like the rest of us small-time artists, you might even hope your stuff gets pirated and fed into private AI systems, since then you'll actually influence the world just a tad more.

      The AI controversies come on top of the demands of the scriptwriters, actors, special-effects workers, and (sometimes? hopefully?) game developers striking in Hollywood. When it comes to the b-list actors, tech crews, and scriptwriters, Hollywood has long since picked their work clean of anything worth taking. AI can't really hurt them any more than they've already been hurt.

      It's why I think we shouldn't be focusing on stopping AI; rather, we should burn the whole industry down and rebuild it anew. Maybe massive unionization will help.