• FaceDeer@kbin.social
    9 months ago

    > I use quotation marks there because what is often referred to as AI today is not whatsoever what the term once described.

    The field of AI has been around for decades and covers a wide range of technologies, many of them much “simpler” than the current crop of generative AI. What is often referred to as AI today is absolutely what the term once described, and still does describe.

    What people seem to be conflating is the general term “AI” and the more specific “AGI”, or Artificial General Intelligence. AGI is the stuff you see on Star Trek. Nobody is claiming that current LLMs are AGI, though they may be a significant step along the way to that.

    I may sound nitpicky here, but this is the fundamental issue the article is complaining about. People are not well educated about what AI actually is and what it’s good at. It’s good at a huge range of things, and it’s genuinely revolutionary, but it’s not good at everything. It’s not the fault of AI when people fail to grasp that, any more than it’s the fault of a car when someone gets into it and is then annoyed that it won’t take them to the Moon.

    • t3rmit3@beehaw.org
      9 months ago

      > People are not well educated about what AI actually is and what it’s good at.

      And half the reason they’re not educated about it is that AI companies are actively and intentionally misinforming them. AI companies sell these products using words like “thinking”, “assessing”, “reasoning”, and “learning” — none of which accurately describe current AI, though they would describe AGI.

    • Scrubbles@poptalk.scrubbles.tech
      9 months ago

      The problem is that the average person and politician don’t know this difference, and they’re running around like Skynet is about to kick off any second.