It’s interesting: when you ask an LLM something it doesn’t know, it tends to spew out words that sound plausible but are wrong.
So a human who will admit they don’t have an answer is much more useful. Or the human acts like the LLM, spewing confident-sounding nonsense, and gets promoted instead.