New Mozilla AI project. It puts “trust” and “privacy” in the title and subtitle, but doesn’t support locally hosted models.

It exists as an add-on today. The model is Mistral 7B, hosted by Mozilla on GCP. Mozilla claims it won’t save data long term, promises not to use personal information to train models, and says it won’t share queries with Mistral or any other service.

Am I going to use it? No. Not without local model support.

Note: the mobile version of the page is broken (much of the content is missing). Best to view the desktop version for complete details.

  • hendrik@palaver.p3x.de · 37 points · 2 days ago

    We discussed this briefly a few days ago. No one understands why Mozilla likes to waste their time and money on random side projects that nobody likes or asked for… instead of something useful, or the things lots of people actually ask them to do.

    And summarization is among the worst things you can do with LLMs. I’m not against AI, but they’re really not good at this specific task. I’m not sure whether people will use it anyway, but I think this project is a waste of resources.

    • essteeyou@lemmy.world · 9 points · 1 day ago

      I feel like Mozilla are in a difficult position. They’re reliant on Google to exist, it seems. When they try to do something else to build an alternate revenue stream, everyone tells them to stick to the one thing they do that nobody in the world pays for.

      • vatlark@lemmy.world · 1 point · 23 hours ago

        I recently started donating to Mozilla. They have been delivering a good product for a very long time; the least I can do is pay for it.

    • trevor · 2 points · 21 hours ago

      Does Mistral actually provide the training datasets, or are they using the fake definition of """open source AI""" that the OSI has massaged into being as megacorp-friendly as possible?

    • kate@lemmy.uhhoh.com · 1 point · 5 hours ago

      Consumer-ish. I can run it on my MacBook Pro, and any 8 GB VRAM Nvidia card should be able to run it. Technically, any machine with 8 GB of system memory can too, if you’re willing to run it really slowly.

    • umami_wasabi@lemmy.ml (OP) · 1 point · 20 hours ago

      Yes. Depending on the actual hardware, parameters used, and model quantization, you can get 2–10 tok/s.
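      A rough back-of-the-envelope for why quantization is what makes a 7B model fit in 8 GB: weight memory is roughly parameter count times bytes per parameter. This sketch only estimates the weights; real usage is higher because of the KV cache and runtime overhead, and the exact numbers vary by quantization format.

      ```python
      # Rough weight-memory estimate: params * bits_per_param / 8 bytes.
      # Ignores KV cache and runtime overhead, so actual VRAM usage is higher.
      def weight_gb(params_billion: float, bits_per_param: int) -> float:
          bytes_total = params_billion * 1e9 * bits_per_param / 8
          return bytes_total / 1e9  # decimal GB

      for bits in (16, 8, 4):
          print(f"7B model at {bits}-bit: ~{weight_gb(7, bits):.1f} GB")
      # fp16 (~14 GB) won't fit in 8 GB; 8-bit (~7 GB) and 4-bit (~3.5 GB) will,
      # which matches the 8 GB VRAM figure mentioned above.
      ```
      
      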

  • john89@lemmy.ca · 2 points · 1 day ago

    AI should just be owned by the government and accessible to anyone.

    It legit should be regulated like a utility, because it will become that important to being competitive.