this post was submitted on 08 Jan 2026
1071 points (99.2% liked)
Technology
No, as it doesn't compute graphical information; it's solely for running "AI stuff" computations.
GPUs aren't just for graphics. They speed up vector operations, including those used in "AI stuff". I'd just never heard of NPUs before, so I imagine they may be hardwired for the graph architecture of neural nets instead of general linear algebra, which might be why they can't be used as GPUs.
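A minimal sketch of the kind of vector math meant above, using NumPy as a stand-in: the same arithmetic applied across whole arrays at once, which is exactly the pattern GPUs parallelize.

```python
import numpy as np

# The kind of vector operation GPUs accelerate: one operation
# applied across whole arrays at once instead of one scalar at a time.
x = np.linspace(0.0, 1.0, 1_000_000, dtype=np.float32)
y = np.linspace(1.0, 2.0, 1_000_000, dtype=np.float32)

# A fused "2*x + y" over a million elements: one SIMD-style kernel
# on a GPU, a million scalar operations on a plain scalar CPU.
z = 2.0 * x + y
```

The point isn't the library, just the shape of the workload: data-parallel linear algebra, whether the consumer is a shader or a neural net.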
Initially, x86 CPUs didn't have an FPU. It cost extra and was delivered as a separate coprocessor chip (the 8087).
In that sense, a GPU is just an overgrown SIMD FPU.
An NPU is a specialized GPU-like unit that operates on low-precision floating-point numbers and mostly does matrix multiply-and-add operations.
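A rough sketch of that multiply-and-add (MAC) pattern, with NumPy standing in for the hardware: multiply in low precision, accumulate in higher precision to limit rounding error, which is the usual arrangement in these units.

```python
import numpy as np

# Matrix multiply-and-accumulate (MAC), the core NPU operation:
# low-precision inputs, higher-precision accumulator.
a = np.arange(6, dtype=np.float16).reshape(2, 3)  # low-precision input matrix
b = np.ones((3, 2), dtype=np.float16)             # low-precision weights
acc = np.zeros((2, 2), dtype=np.float32)          # wider accumulator

# acc += A @ B, one multiply-and-add at a time, as a MAC array would do it
for i in range(2):
    for j in range(2):
        for k in range(3):
            acc[i, j] += np.float32(a[i, k]) * np.float32(b[k, j])
```

The explicit loop is just to show the primitive; real hardware runs thousands of these MACs in parallel per cycle.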
There is zero actual neural processing going on here. That would mean a chip operating on bursts of encoded analog signals, within a power budget of about 20 W, able to adjust itself on the fly, online, without a few datacenters spending excessive amounts of energy to update the model's weights.
What I meant, though, is that NPUs do those calculations far more efficiently than a GPU.