• Peanut@sopuli.xyz
    9 months ago

    It’s like nobody cares to gain even a base-level understanding of the tools they’re using.

    Can we stop framing this as if LLMs have actual intent?

    This shouldn’t surprise me, given how many people think we have access to the literal word of God but don’t even read the damned book they base their lives and social directives on.

    Or is it that “news” sources intentionally leave out basic details to ramp up the story?

    Ignore the note on the page you are using that says info might not be accurate. Blame the chat bot for your unprofessional ineptitude.

    You shouldn’t even be putting that level of blind trust in human beings, or even in Wikipedia, without checking sources.

    Guess what: when I use bots for info, I ask for the sources and check the originals. Really not difficult, and I’m not being paid half as much as the people I keep seeing in these news articles.

    Maybe this should make it more obvious that wealth is not accrued through competence and ability.

    Or for having reliable news. I feel like I live in a world controlled by children.

    • Chahk@beehaw.org
      9 months ago

      Ignore the note on the page you are using that says info might not be accurate. Blame the chat bot for your unprofessional ineptitude.

      I’m sorry, but WTF?! If the page on a government website “might not be accurate” then what fucking use is said website in the first place?

      • Peanut@sopuli.xyz
        9 months ago

        ChatGPT website*

        That statement is more of an echo of previous similar articles.

        Anyone who uses the API or similar bots for their site, such as this one, should be responsible for doing the same. If they are using the API/bot without a similar warning, they also don’t understand basic use of the technology. It’s a failure on the human side more than the bot side, but that’s not how it tends to be framed.

        My point is that it doesn’t matter how good the tools are if people just assume what they are capable of.

        It’s like seeing a bridge that says “600 pound weight limit” and deciding it can handle a couple of tons just because you saw another bridge hold that much.

        Imagine if this situation led to a bunch of people angry at bridges for being so useless.