• dactylotheca@suppo.fi · 17 points · 5 months ago (edited)

    > But it is, and it always has been. Absurdly complexly layered statistics, calculated faster than a human could.

    Well sure, but as someone else said, even heat is statistics. Saying “ML is just statistics” is so reductionist as to be meaningless. Heat is just statistics. Biology is just physics. Forests are just trees.

    • Pennomi@lemmy.world · 7 points · 5 months ago

      It’s like saying a jet engine is essentially just a wheel and axle rotating really fast. I mean, it is, but it’s shaped in such a way that it’s far more useful than just a wheel.

    • andyburke@fedia.io · 4 points · 5 months ago

      Yeah, but the critical question is: is human intelligence statistics?

      Seems no, to me: a human lawyer wouldn’t, for instance, make up case law that doesn’t exist. AI has done that one already. If it had even the most basic understanding of what the law is and does, it would have known not to do that.

      This shit is just MegaHAL on a GPU.

      • dactylotheca@suppo.fi · 9 points · 5 months ago (edited)

        > Seems no, to me: a human lawyer wouldn’t, for instance, make up case law that doesn’t exist. AI has done that one already. If it had even the most basic understanding of what the law is and does, it would have known not to do that.

        LLMs don’t have an understanding of anything, but that doesn’t mean all AI in perpetuity is incapable of having an understanding of, e.g., what the law is. Edit: oh, and also, it’s not like human lawyers are incapable of mistakenly “inventing” case law just by remembering something wrong.

        As to whether human intelligence is statistics, well… our brains are neural networks, and ultimately neural networks, whether made from meat or otherwise, are “just statistics.” So in a way I guess our intelligence is “just statistics”, but honestly the whole question sort of misses the point: the problem with AI (which right now really means LLMs) isn’t that they’re “just statistics”, and whether you think human intelligence is or isn’t “just statistics” doesn’t really tell you anything about why our brains perform better than LLMs do.
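        To make the “neural networks are just statistics” point concrete, here’s a minimal sketch of a single artificial neuron in plain Python with NumPy; the weights and inputs are made-up numbers, purely for illustration. A neuron is a weighted sum pushed through a squashing function, which is structurally the same thing as logistic regression.

            import numpy as np

            def sigmoid(z):
                # Logistic function: squashes any real number into (0, 1).
                return 1.0 / (1.0 + np.exp(-z))

            def neuron(inputs, weights, bias):
                # A weighted sum of the inputs followed by a nonlinearity;
                # the same functional form as logistic regression.
                return sigmoid(np.dot(weights, inputs) + bias)

            # Illustrative inputs and weights (not from any real model).
            x = np.array([0.2, 0.7, -1.3])
            w = np.array([0.5, -0.1, 0.8])
            print(neuron(x, w, bias=0.1))  # prints a value between 0 and 1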

      • candybrie@lemmy.world · 3 points · 5 months ago

        > Seems no, to me: a human lawyer wouldn’t, for instance, make up case law that doesn’t exist

        You’ve never seen someone misremember something? The reason human lawyers don’t usually get too far with made-up case law is that they have reference material and they know to go back to it, not because their brains don’t make stuff up.

        • andyburke@fedia.io · 4 points · 5 months ago

          I think you’re not aware of the incident I’m talking about, where the AI made up a case name from whole cloth? Because that’s not misremembering; it’s exactly what you would expect unintelligent statistical prediction to come up with.

          • candybrie@lemmy.world · 3 points · 5 months ago (edited)

            Have you ever graded free-response tests before? I assure you that some people do similar things when pressed to come up with an answer they don’t know. Often they know they’re BSing, but they’re still generating random crap that sounds plausible. One of the big differences is that we haven’t told AI to tell us “I’m not sure” when it has high uncertainty, though plenty of people don’t do that either.
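            For what it’s worth, “say I’m not sure when uncertain” is at least expressible in principle, because a model’s next-token probabilities already encode how spread out its guesses are. Here’s a rough Python sketch of that idea; the entropy threshold, the example probabilities, and the answer_or_abstain helper are all hypothetical, not any real model’s API.

                import math

                def entropy(probs):
                    # Shannon entropy (in bits) of a probability distribution.
                    return -sum(p * math.log2(p) for p in probs if p > 0)

                def answer_or_abstain(next_token_probs, threshold=1.5):
                    # If the distribution is spread out (high entropy), the model
                    # is effectively guessing, so say so instead of answering.
                    if entropy(next_token_probs) > threshold:
                        return "I'm not sure."
                    return "(answer with the most likely continuation)"

                # Peaked distribution, roughly 0.6 bits of entropy: it answers.
                print(answer_or_abstain([0.90, 0.05, 0.03, 0.02]))
                # Nearly uniform, roughly 2.0 bits of entropy: it abstains.
                print(answer_or_abstain([0.30, 0.25, 0.25, 0.20]))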