AI boom could falter without wider adoption, Microsoft chief Satya Nadella warns
(www.irishtimes.com)
Is that true? I haven’t heard MS say anything about enabling local LLMs. Genuinely curious and would like to know more.
Isn't that the whole shtick of the AI PCs no one wanted? Like, isn't there some kind of non-GPU co-processor that runs the local models more efficiently than the CPU?
I don't really want local LLMs but I won't begrudge those who do. Still, I wouldn't trust any proprietary system's local LLMs to not feed back personal info for "product improvement" (which for AI is your data to train on).
NPU = neural processing unit
That's why they have the "Copilot PC" hardware requirement, because they're using an NPU on the local machine.
*searches*
https://learn.microsoft.com/en-us/windows/ai/npu-devices/
It's not... terribly beefy. Like, I have a Framework Desktop with an APU and 128GB of memory that schlorps down 120W or something, and it substantially outdoes what you're going to do on a laptop. And that in turn is computationally weaker than something like the big Nvidia hardware going into datacenters.
But it is doing local computation.
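To put the laptop-vs-desktop-vs-datacenter comparison in rough numbers: LLM token generation is typically memory-bandwidth-bound, since each generated token streams the full set of model weights through memory once. A back-of-the-envelope sketch (the bandwidth and model-size figures below are illustrative assumptions, not verified specs for any particular device):

```python
def decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on LLM decode speed: each generated token
    requires streaming every model weight through memory once, so
    throughput is limited by bandwidth / model size."""
    return bandwidth_gb_s / model_size_gb

# All figures below are assumed, round numbers for illustration only.
MODEL_GB = 40  # e.g. a ~70B-parameter model at ~4-bit quantization

laptop = decode_tokens_per_sec(120, MODEL_GB)      # LPDDR5x-class laptop memory
desktop_apu = decode_tokens_per_sec(256, MODEL_GB) # unified-memory desktop APU
datacenter = decode_tokens_per_sec(3350, MODEL_GB) # HBM-class datacenter GPU

print(f"laptop ~{laptop:.0f} tok/s, desktop APU ~{desktop_apu:.0f} tok/s, "
      f"datacenter GPU ~{datacenter:.0f} tok/s")
```

The ratios are the point, not the exact numbers: a datacenter GPU's memory system is an order of magnitude or two faster than anything in a laptop, which is why local hardware feels so much weaker for big models.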