• merc@sh.itjust.works · 11 months ago

    > The model still calculates probability for each repetition

    Which is very cheap.

    > as expensive as other queries which is definitely not free

    It’s still very cheap; that’s why they allow people to play with the LLMs. It’s training them that’s expensive.
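
    To make that concrete, here is a minimal sketch of what the per-token probability step looks like. The sampling itself is just a softmax over logits; `model.forward` below is a hypothetical stand-in for one transformer pass, not a real API.

    ```python
    import numpy as np

    def sample_next_token(logits: np.ndarray, temperature: float = 1.0) -> int:
        """Turn one forward pass's logits into probabilities and sample a token id."""
        scaled = logits / temperature
        scaled = scaled - scaled.max()              # numerical stability
        probs = np.exp(scaled) / np.exp(scaled).sum()
        return int(np.random.choice(len(probs), p=probs))

    # Hypothetical generation loop: the costly part is the forward pass,
    # repeated once per output token; the probability/sampling step is trivial.
    # tokens = list(prompt_tokens)
    # for _ in range(max_new_tokens):
    #     logits = model.forward(tokens)            # one full transformer pass (assumed API)
    #     tokens.append(sample_next_token(logits[-1]))
    ```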

    • ExLisper@linux.community · 11 months ago

      Yes, it’s not expensive, but saying it’s ‘one of the easiest tasks a computer can do’ is simply wrong. It’s not like it just concatenates strings; it’s still performing complicated calculations using one of the most advanced AI techniques known today, and each query can be 1000x more expensive than a Google search. It’s cheap because a lot of things at scale are cheap, but pretty much any other publicly available API on the internet is ‘easier’ than this one.
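
      For a rough sense of scale, here’s a back-of-envelope sketch using the common rule of thumb of ~2 FLOPs per parameter per generated token for a dense transformer. The 70-billion-parameter model and 500-token answer are assumptions for illustration, not figures from this thread.

      ```python
      def inference_flops(n_params: float, n_tokens: int) -> float:
          """Rough rule of thumb: a dense transformer spends about 2 * n_params FLOPs per generated token."""
          return 2.0 * n_params * n_tokens

      # Illustrative numbers only (assumed 70B-parameter model, 500-token answer):
      print(f"{inference_flops(70e9, 500):.1e} FLOPs")   # ~7.0e+13
      ```

      That squares with both points above: vastly more arithmetic than a keyword lookup, yet still cheap per query once it’s running at scale on modern accelerators.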