Apparently, stealing other people’s work to create a product for money is now “fair use,” according to OpenAI, because they are “innovating” (stealing). Yeah. Move fast and break things, huh?

“Because copyright today covers virtually every sort of human expression—including blogposts, photographs, forum posts, scraps of software code, and government documents—it would be impossible to train today’s leading AI models without using copyrighted materials,” wrote OpenAI in the House of Lords submission.

OpenAI claimed that the authors in that lawsuit “misconceive[d] the scope of copyright, failing to take into account the limitations and exceptions (including fair use) that properly leave room for innovations like the large language models now at the forefront of artificial intelligence.”

  • Phanatik@kbin.social · 1 year ago

    It’s so funny that this is treated as something new. This was Grammarly’s whole schtick since before ChatGPT, so how different is Grammarly AI?

    • vexikron@lemmy.zip · 1 year ago

      Here is the bigger picture: the vast majority of tech-illiterate people think something is AI because, duh, it’s called AI.

      It’s literally just the power of branding and marketing on the minds of poorly informed humans.

      Unfortunately this is essentially a reverse Turing Test.

      The vast majority of humans do not know anything about AI, and a huge majority of them can barely tell the difference, in some but not all cases, between output from what is basically brute-force, total-internet plagiarism-and-synthesis software and actual human-created content.

      To me this basically just means that about 99% of the time, most humans are literally NPCs, and they only do actually creative and unpredictable things very, very rarely.