It’s gonna turn out to be a Filipino call center, isn’t it.
we have finally achieved A Guy in India
That’s one way to put a sentient being on the other end of the request, I guess.
That’s OpenAI admitting that o1’s “chain of thought” is faked after the fact. The “chain of thought” does not show any internal processes of the LLM — o1 just returns something that looks a bit like a logical chain of reasoning.
I think it’s fake “reasoning” but I don’t know if (all of) OpenAI thinks that. They probably think hiding this data prevents the CoT training data from being extracted. I just don’t know how deep the stupid runs.