• alphafalcon@feddit.de

    I’m with you on LLMs being overhyped, although that’s already dying down a bit. But regarding your claim that LLMs cannot “understand context”: I recently read an article showing that LLMs can have an internal world model:

    https://thegradient.pub/othello/

    Depending on your definition of “understanding”, that seems to indicate they’re more than a pure “stochastic parrot”.
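
    For anyone curious what “internal world model” means concretely: the Othello work trains probes on the network’s hidden activations to read off the board state, even though the model only ever saw move sequences. Here’s a rough sketch of that probing idea, with made-up stand-in data rather than the actual Othello-GPT activations and labels:

    ```python
    # Sketch of a probe: can a simple classifier read a "board square" state
    # out of a model's hidden activations? (Synthetic stand-in data; in the
    # real work the activations come from a GPT trained on Othello moves and
    # the labels are the true board states.)
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Pretend hidden states: one 512-d activation vector per game position.
    n_positions, d_model = 2000, 512
    hidden_states = rng.normal(size=(n_positions, d_model))

    # Pretend label: the state of one square (empty / mine / opponent's),
    # made linearly decodable on purpose so the probe has something to find.
    true_direction = rng.normal(size=d_model)
    scores = hidden_states @ true_direction
    square_state = np.digitize(scores, np.quantile(scores, [1 / 3, 2 / 3]))

    X_train, X_test, y_train, y_test = train_test_split(
        hidden_states, square_state, test_size=0.25, random_state=0
    )

    # If the probe decodes the square's state well above chance, the model is
    # representing that information internally -- the "world model" claim.
    probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"probe accuracy: {probe.score(X_test, y_test):.2f}")
    ```

    The interesting part in the article is that interventions on those internal representations change the model’s predictions in the way the board state would predict, which is hard to square with “just surface statistics”.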