They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

  • nu11@sh.itjust.works · 8 points · 3 hours ago

    I don’t understand the hate. It’s just a sidebar for the supported LLMs. Maybe I’m misunderstanding?

    Yes, I would prefer Mozilla focus on the browser, but to me, this seems like it was done in an afternoon.

  • onlooker@lemmy.ml · 8 points · 6 hours ago

    For a second I thought it said “experimental failure”. Would be more accurate, I think.

  • ocassionallyaduck@lemmy.world · 15 points · 8 hours ago

    Thing is, for your average user with no GPU who never thinks about RAM, running a local LLM is intimidating. But it shouldn’t be. Any system with an integrated GPU can run simple models locally, and the more RAM the better.

    The not-so-dirty secret is that ChatGPT 3 vs 4 isn’t that big a difference, and neither is leaps and bounds ahead of the publicly available models for about 99% of tasks. For that 1%, people will ooh and aah over it, but 99% of use cases see only marginal gains on 4o.

    And the simplified models that run “only” 95% as well? They use 90% fewer resources and give pretty much identical answers outside of hyperspecific use cases.

    Running a “smol” model, as some are called, gets you all the bang for none of the buck, and your data stays on your system and never leaves.

    I’ve been yelling from the rooftops to some stupid corporate types that once the model is trained, it’s trained. Unless you are training models yourself, you don’t need the massive AI clusters, just the model. Run it locally on your hardware at a fraction of the cost.
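
    For anyone curious, here’s a minimal sketch of what “run it local” can look like, querying a small model through Ollama’s HTTP API (the default port 11434 and the “llama3.2” model name are assumptions; any small model you’ve pulled works the same way):

    ```python
    # Minimal sketch: ask a question of a small model served locally by Ollama.
    # Assumes an Ollama server on its default port (11434) and a small model
    # such as "llama3.2" already pulled; nothing leaves your machine.
    import requests

    def ask_local_llm(prompt: str, model: str = "llama3.2") -> str:
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        print(ask_local_llm("In two sentences, why run a model locally?"))
    ```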

    • LWD@lemm.ee · 15 points · 6 hours ago

      That’s the tragedy of this new feature: they fast-tracked it past more popular requests, sticking it into Release Firefox.

      But they only rushed the part that connects to third parties. There was also a “localhost” option, originally listed alongside the Big Five corporate offerings, but Mozilla ultimately decided to bury it inside the about:config settings.

      • MrOtherGuy@lemmy.world · 6 points · 5 hours ago

        I’m guessing the reason (and a good one at that) is that simply having an option to connect to a local chatbot just leads to confused users, because they also need the actual chatbot running on their system. If you can set that up, then you can certainly toggle a simple switch in about:config to show the option.
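
        To make that concrete: the localhost option only does anything if an OpenAI-compatible server is already listening locally (llamafile and llama.cpp’s llama-server behave this way out of the box). Here’s a hedged sketch of that sanity check; the port 8080 and the /v1/models path are those projects’ defaults and may differ on your setup:

        ```python
        # Hedged sketch: check whether an OpenAI-compatible chat server is already
        # listening on localhost before bothering with a "local chatbot" option.
        # Port 8080 and /v1/models match llama.cpp's llama-server / llamafile
        # defaults; both are assumptions for any other setup.
        import requests

        def local_llm_available(base_url: str = "http://localhost:8080") -> bool:
            try:
                return requests.get(f"{base_url}/v1/models", timeout=2).ok
            except requests.RequestException:
                return False

        if __name__ == "__main__":
            if local_llm_available():
                print("A local chatbot is running; the localhost option would work.")
            else:
                print("Nothing is listening; the option would just confuse users here.")
        ```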

    • Lojcs@lemm.ee · 2 points · 4 hours ago

      Last time I tried using a local LLM (about a year ago) it generated only a couple of words per second and the answers were barely relevant. Also, I don’t see how a local LLM can fill the glorified-search-engine role that people use LLMs for.

      • TheDorkfromYork@lemm.ee · 1 point · 4 hours ago

        They’re fast and high quality now. ChatGPT is the best, but local LLMs are great, even with 10 GB of VRAM.

    • ilhamagh@lemmy.world · 3 points · 5 hours ago

      Can you point me to some resources for running a smol LLM?

      My use case is prob just to help “type up” miscellaneous ideas I have, or check for my grammatical errors, in English.

      Thanks in advance.

  • marcie (she/her)@lemmy.ml · 11 points · edited · 7 hours ago

    why a fucking chatbot? translate a page better for me, you fucking losers; all the translation options suck for privacy outside of specifically trained local AIs. this is the BEST use case for a small local LLM, yet mozilla with all its brains and resources couldn’t rub two neurons together for this.

    or they could do character prediction on your typing to make typing faster. just some legit examples. why waste resources building a chat ai into my browser when i can just open a website???
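
    for what it’s worth, this kind of specifically trained local translation is already easy to script; here’s a minimal sketch using a small Opus-MT model via Hugging Face transformers (the model name and language pair are just an example, and this is not the engine Firefox’s own translator uses):

    ```python
    # Minimal sketch: local page translation with a small, task-specific model.
    # Uses a Helsinki-NLP Opus-MT checkpoint via Hugging Face transformers;
    # the de->en pair is just an example, and after the one-time model
    # download nothing leaves your machine.
    from transformers import pipeline

    translate = pipeline("translation", model="Helsinki-NLP/opus-mt-de-en")

    text = "Der Browser übersetzt diese Seite lokal, ohne Cloud-Dienst."
    print(translate(text)[0]["translation_text"])
    ```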

    • Midnitte@beehaw.org · 3 points · 6 hours ago

      Perhaps Mozilla’s biggest “failure” is just communication…

      Firefox actually has this now.

  • Eiri@lemmy.ca · 14 points · 11 hours ago

    I wish I had telemetry on such features.

    I really doubt a significant number of people use AI chatbots often enough for a dedicated sidebar to be worth it.

  • graphito@sopuli.xyz · 4 points · 10 hours ago

    The chat isn’t the point; it’s needed as an interface for storing your logins to the summarization features.

    When the internet is written by AI, you do need a TL;DR.