• gandalf_der_12te
    6 hours ago

    Actually, I think it kind of is nice and easy to do; I'm just too lazy/cheap to rent a server with 8 GB of RAM, even though it would only cost around $15/month or so.

    • Smorty [she/her]
      6 hours ago

      It would also be super slow — you usually want a GPU for LLM inference… but you already know this, you are Gandalf der Zwölfte after all <3