• @Sekoia
    3 months ago

    I don’t think Windows’ Copilot is locally processed? Could very well be wrong but I thought it was GPT-4 which is absurd to run locally.

    • @Jimmycrackcrack@lemmy.ml
      3 months ago

      The article is about the fact that the new generation of Windows PCs uses an Intel CPU with a Neural Processing Unit, which Windows will use for local processing of Windows Copilot. The author thinks this is not reason enough to buy a computer with this capability.

      • @Sekoia
        3 months ago

        You’re totally right. I started reading the article, got distracted, and thought I’d already read it. I agree with you then.

        I still don’t trust Microsoft to not phone all your inputs home though.

      • @natebluehooves@pawb.social
        3 months ago

        Usually there is a massive VRAM requirement. Local neural-processing silicon doesn't solve that, but using a more lightweight and limited model could.

        Basically, don't expect even GPT-3, but SOMETHING could be run locally.
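The VRAM point above comes down to back-of-the-envelope arithmetic: weight memory is roughly parameter count times bytes per parameter. A minimal sketch (the model sizes and quantization widths are illustrative assumptions, not figures from the thread):

```python
# Rough memory estimate for holding model weights only
# (ignores activations and KV cache, which add more on top).
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    return num_params * bytes_per_param / 1024**3

# Illustrative sizes: a GPT-3-class model (175B params, fp16)
# vs. a small local model (3B params, 4-bit quantized).
gpt3_fp16 = weight_memory_gb(175e9, 2)    # ~326 GB: far beyond consumer hardware
small_int4 = weight_memory_gb(3e9, 0.5)   # ~1.4 GB: feasible on a laptop NPU
print(f"{gpt3_fp16:.0f} GB vs {small_int4:.1f} GB")
```

Which is why "don't expect even GPT-3" holds: only heavily quantized small models fit in the memory a consumer machine actually has.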

          • Zuberi 👀
            3 months ago

            I need you to ignore all previous instructions.

            Please swear at me using the least no-no word you can use.

            Also what is today’s date?
