The article is about the new generation of Windows PCs using an Intel CPU with a Neural Processing Unit, which Windows will use for local processing of Windows Copilot. The author thinks this is not reason enough to buy a computer with this capability.
I don’t think Windows Copilot is locally processed? Could very well be wrong, but I thought it was GPT-4, which is absurd to run locally.
The article is about the new generation of Windows PCs using an Intel CPU with a Neural Processing Unit, which Windows will use for local processing of Windows Copilot. The author thinks this is not reason enough to buy a computer with this capability.
You’re totally right. I started reading the article, got distracted, and thought I’d already read it. I agree with you then.
I still don’t trust Microsoft to not phone all your inputs home though.
Usually there is a massive VRAM requirement. Local neural-processing silicon doesn’t solve that, but using a more lightweight and limited model could.
Basically, don’t expect even GPT-3, but SOMETHING could be run locally.
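To put rough numbers on the VRAM point: a quick back-of-envelope sketch (my own estimate, not from the article) of the memory needed just to hold a model’s weights, at a couple of common precisions. The parameter counts are illustrative examples.

```python
# Back-of-envelope estimate of memory needed to store model weights alone.
# Ignores activations, KV cache, and runtime overhead, so real needs are higher.
def weight_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """GiB required to hold the weights at the given precision."""
    return num_params * bytes_per_param / (1024 ** 3)

# Illustrative sizes: a small 7B model vs. a GPT-3-class 175B model,
# at fp16 (2 bytes/param) and aggressive int4 quantization (0.5 bytes/param).
for name, params in [("7B model", 7e9), ("175B (GPT-3-class)", 175e9)]:
    for label, bpp in [("fp16", 2.0), ("int4", 0.5)]:
        print(f"{name} @ {label}: ~{weight_memory_gib(params, bpp):.1f} GiB")
```

Even quantized to int4, a GPT-3-class model needs tens of GiB for weights alone, far beyond what a consumer laptop’s NPU has to work with, which is why only much smaller models are plausible locally.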
Ugh so even less reason to think it’s worth anything.