Greg Clarke

Mastodon: @greg@clar.ke

  • 67 Posts
  • 468 Comments
Joined 2 years ago
Cake day: November 9th, 2022


  • Yes of course I’m asserting that. While the performance of LLMs may be plateauing, the cost, context window, and efficiency is still getting much better. When you chat with a modern chat bot it’s not just sending your input to an LLM like the first public version of ChatGPT. Nowadays a single chat bot response may require many LLM requests along with other techniques to mitigate the deficiencies of LLMs. Just ask the free version of ChatGPT a question that requires some calculation and you’ll have a better understanding of what’s going on and the direction of the industry.