• The Snark Urge@lemmy.world · 92 points · 1 month ago (edited)

    One time I exposed deep cracks in my calculator’s ability to write words with upside down numbers. I only ever managed to write BOOBS and hELLhOLE.
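    The trick works because rotating a seven-segment display 180° turns certain digits into letters, and the rotation reverses their order. A minimal sketch, assuming the classic digit-to-letter mapping (which glyphs a given calculator actually renders is an assumption):

    ```python
    # Seven-segment upside-down alphabet: these digits read as letters
    # when the display is rotated 180 degrees.
    FLIP = {'O': '0', 'I': '1', 'Z': '2', 'E': '3', 'h': '4',
            'S': '5', 'g': '6', 'L': '7', 'B': '8', 'G': '9'}

    def word_to_calc(word):
        # Flipping the display reverses digit order, so map the word
        # back to front; letters with no digit twin raise KeyError,
        # which is exactly the "deep crack" the comment ran into.
        return ''.join(FLIP[c] for c in reversed(word))
    ```

    So `word_to_calc("BOOBS")` gives `"58008"`, and `word_to_calc("hELLhOLE")` gives `"37047734"` — type that in, flip the calculator, and read.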

    LLMs aren’t reasoning. They can do some stuff okay, but they aren’t thinking. Maybe if you had hundreds of them with unique training data all voting on proposals you could get something like genuine recognition, but at that point you might as well just simulate cortical columns and try Jeff Hawkins’ Thousand Brains idea.
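    The voting scheme the comment gestures at is just majority voting over independent proposers. A toy sketch, with stub callables standing in for the hypothetical separately-trained models:

    ```python
    from collections import Counter

    def ensemble_vote(models, prompt):
        # Each "model" is any callable returning a proposal string;
        # real LLMs with distinct training data are stood in for by stubs.
        proposals = [m(prompt) for m in models]
        winner, count = Counter(proposals).most_common(1)[0]
        # Return the majority proposal and its share of the vote.
        return winner, count / len(proposals)
    ```

    Three stubs answering "yes" and one answering "no" yields `("yes", 0.75)` — agreement across independent voters, not any one voter, is what carries the signal.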

    • noodlejetski@lemm.ee · 45 points · 1 month ago

      “LLMs aren’t reasoning. They can do some stuff okay, but they aren’t thinking”

      and the more people realize it, the better. which is why it’s good that research like this from a reputable company makes headlines.