• pixxelkick@lemmy.world · 9 months ago

    This sort of ignores the fact that the advances in that technology are broadly applicable to all tasks; we literally just started with text and image generation because:

    1. The training data is plentiful and basically free to get your hands on

    2. It’s easy to verify it works

    LLMs will crawl so that ship-breaking robots can run.

    • EldritchFeminity · 9 months ago

      He’s ignoring it because he’s not complaining about the tech, but about the way it’s being used. Instead of being used to make it easier for artists and writers to do their jobs, it’s being used to replace them entirely so their bosses don’t have to pay them. It’s like when Disney switched to 3D animation. They didn’t do it because the tech was better and made the job easier. They did it because 2D animators are unionized and 3D animators aren’t, so they could pay the new guys less.

      And these are the kinds of jobs people actually want - to the point where they don’t pay anywhere near as well as they should because companies can exploit people’s passion for what they do.

      Imagine a world of construction workers and road crews, but no civil engineers, architects, or city planners. Imagination and creativity automated away in the name of the almighty profit margin.

      • gmtom@lemmy.world · 9 months ago

        Yep, and when we invented mechanical computers, we put human computers out of a job.

        When we invented the automatic loom, we put weavers out of a job.

        When we invented electric lights, we put lamplighters out of a job.

        When we invented digital art, we put many brushmakers, canvas makers, and paint makers out of a job.

        This is the cost of progress.

    • sturlabragason@lemmy.world · 9 months ago

      Second this.

      We’re in the first days, and every day I add a new model or tech to my reading list. We’re close to talking to our CPUs. We’re building these stacks. We’re solving the memory problems. You don’t need RAG once you have million-token context windows, the Gorilla model can talk to APIs, most models are great at Python (which is versatile as fuck), and I can see the singularity on the horizon.

      Try Ollama if you want to test things yourself.
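      For example, here’s a minimal sketch assuming Ollama is running locally on its default port and you’ve already pulled a model (the model name below is just an example):

      ```python
      # Query a local Ollama server through its REST API.
      import requests

      resp = requests.post(
          "http://localhost:11434/api/generate",
          json={
              "model": "llama3",  # example model name; use whatever you've pulled
              "prompt": "Explain ship-breaking in one sentence.",
              "stream": False,    # return a single JSON object instead of a token stream
          },
      )
      print(resp.json()["response"])
      ```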

      Use GPT4 if you want to get an inkling of the potential that’s coming. I mean really use it.
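      And if you’d rather poke at it from code than from the chat UI, here’s a minimal sketch using the OpenAI Python client (assumes the `openai` package is installed and OPENAI_API_KEY is set in your environment; the model name is just an example):

      ```python
      # Call GPT-4 through the OpenAI Python client.
      from openai import OpenAI

      client = OpenAI()  # picks up OPENAI_API_KEY from the environment
      chat = client.chat.completions.create(
          model="gpt-4",  # example model name
          messages=[{"role": "user", "content": "Summarize what RAG is in two sentences."}],
      )
      print(chat.choices[0].message.content)
      ```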