• Smorty [she/her]
    17 hours ago

    big sad :(

    wish it was nice and easy to do stuff like this - yea, hosting it somewhere is probably best for ur money and phone.

    • gandalf_der_12te
      10 hours ago

      actually i think it kinda is nice and easy to do, i’m just too lazy/cheap to rent a server with 8GB of RAM, even though it would only cost $15/month or sth.

      • Smorty [she/her]
        9 hours ago

        it would also be super slow, u usually want a GPU for LLM inference… but u already know this, u are Gandalf der zwölfte after all <3
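
        As a rough sketch of why that 8GB server is tight (the arithmetic below is my own back-of-envelope, not from the thread): the weights of a model alone take roughly `params × bits_per_weight / 8` bytes, before counting the KV cache and runtime overhead.

        ```python
        # back-of-envelope memory estimate for LLM weights
        # (a sketch; real usage adds KV cache and runtime overhead)

        def model_mem_gb(params_billion: float, bits_per_weight: int) -> float:
            """Approximate memory for the weights alone, in GB."""
            return params_billion * bits_per_weight / 8

        # a hypothetical 8B-parameter model:
        print(model_mem_gb(8, 16))  # fp16 weights  -> 16.0 GB
        print(model_mem_gb(8, 4))   # 4-bit quantized -> 4.0 GB
        ```

        So even aggressively quantized, an 8B model roughly fills an 8GB box once the cache and OS are counted, and on CPU the tokens-per-second will crawl compared to a GPU.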