• LainTrain@lemmy.dbzer0.com · 1 year ago

              Honestly, I think speed is something I don’t care too much about with models, because even things like ChatGPT will be slower than Google for most things, and if something is more complex and a good use case for an LLM, speed is unlikely to be the primary bottleneck.

              My gf’s private chat bot right now is Mistral 7B with a custom finetune, and it directs some queries to ChatGPT if I ask (I got free tokens way back; might as well burn through them).
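
              That kind of routing (local finetune by default, explicit hand-off to ChatGPT on request) could be sketched roughly like this. This is a minimal illustration, not the actual bot’s code: the `route_query` function, the `!gpt` trigger prefix, and the backend names are all assumptions.

              ```python
              # Hypothetical router: a local Mistral 7B finetune handles queries by
              # default; an explicit "!gpt " prefix (assumed trigger) sends the query
              # to ChatGPT instead, to burn through the leftover free tokens.

              def route_query(query: str) -> tuple[str, str]:
                  """Return (backend, cleaned_query) for a user message."""
                  if query.lower().startswith("!gpt "):
                      # User explicitly asked for ChatGPT
                      return "chatgpt", query[5:]
                  # Default path: the local custom Mistral 7B finetune
                  return "mistral-7b-finetune", query
              ```

              The actual backend calls (a local inference server vs. the OpenAI API) would hang off whichever label comes back.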

              How much of an improvement is Mixtral over Mistral in practice?

              • just another dev@lemmy.my-box.dev · 1 year ago

                SillyTavern, by any chance?

                And I’d say the difference between Mistral and Mixtral is pretty big for general usage; it feels like a next-generation model.