Just in case someone doesn’t know, LLM in this case means “Large Language Model”, which is just the technical term for things like ChatGPT.

  • AtmaJnana@lemmy.world · 10 months ago (edited)

    This is like saying people who use cars are “just too lazy to walk.” Or people who use their GPS navigation are “too lazy to use a map.”

    The amount of time and effort matters.

    • DoYouNot@lemmy.world · 10 months ago

      It’s like searching for a picture of Prague, seeing a drawing of Delhi, and then concluding you’ve been there. It’s not about laziness. It’s about accuracy.

      • BeAware@lemmy.dbzer0.com (OP) · 10 months ago

        Yeah, we’re not there yet, but the way things are going, I don’t see it being THAT far off. Maybe within 5 years it’ll be as accurate as anything else.

        • DoYouNot@lemmy.world · 10 months ago

          Yes, I think if we can get an LLM to work while citing high-quality, real-world sources, it will be a game-changing technology across domains. As it stands, though, it’s like believing a magician really does magic. The tricks they employ are incredibly useful in a magic show, but if you expect them to really cast a fireball in your defense, you’ll be sorely mistaken.

      • AtmaJnana@lemmy.world · 10 months ago (edited)

        Did you read my comment at all? I was replying to a comment about the level of effort, which is what my analogy addresses.

        Your hyperbole notwithstanding, if the accuracy isn’t good enough for you, don’t use it. Lots of people find LLMs useful even in their current state of imperfect accuracy.

        • DoYouNot@lemmy.world · 10 months ago

          Did you read mine? If you wanted a depiction of a city, it’s more than good enough. In fact it’s amazing what it can do in that respect. My point is: it gets major details wrong in a way that feels right. That’s where the danger lies.

          If your GPS consistently brought you to the wrong place, but you thought it was the right place, do you not think that might be a problem? No matter how many people found it useful, it could be dangerously wrong in some cases.

          My worry is precisely that people find it so useful for “looking things up”, combined with its tendency to confidently construct ‘information’ that feels true. That’s a real, serious problem people need to understand when using it that way.