this post was submitted on 12 Aug 2025
166 points (94.1% liked)
you are viewing a single comment's thread
[–] Endmaker@ani.social 6 points 1 day ago* (last edited 1 day ago) (1 children)

Someone with the expertise should correct me if I am wrong; it's been 4-5 years since I learnt about NPUs during my internship, so I am very rusty:

You don't even need a GPU if all you want to do is run - i.e. perform inference with - a neural network (abbreviating it to NN). A CPU alone will do if the NN is sufficiently lightweight; the GPU is mainly needed to speed up the training of NNs.
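To illustrate the point, here's a minimal sketch in plain Python of CPU-only inference: a tiny two-layer network with hand-picked toy weights (everything here is made up for illustration - no framework, no GPU, just multiply-accumulate loops):

```python
def relu(v):
    # element-wise ReLU activation
    return [max(0.0, x) for x in v]

def dense(x, W, b):
    # y = W @ x + b, done with plain Python loops on the CPU
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(W, b)]

# hypothetical weights for a 2-input, 2-hidden, 1-output net
W1 = [[1.0, -1.0], [0.5, 0.5]]
b1 = [0.0, 0.0]
W2 = [[1.0, 1.0]]
b2 = [0.0]

x = [2.0, 1.0]
h = relu(dense(x, W1, b1))  # hidden layer
y = dense(h, W2, b2)        # output layer
```

A real lightweight model does the same thing, just with far more of these multiply-accumulates - which is exactly the workload a CPU can handle, only less efficiently than dedicated hardware.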

The thing is, the CPU is a general-purpose processor, so it won't be able to run the NN as efficiently as possible. Imagine you want to do something that requires the NN, and as a result you can't do anything else on your phone / laptop (it won't be a problem for desktops with GPUs though).

Where the NPU really shines is when there are performance constraints on the model: when it has to be fast (specifically, run in real time), lightweight and memory-efficient. Use cases include mobile computing and IoT.

In fact, there's news about live translation on Apple AirPods. I think this may be the perfect scenario for using NPUs - ideally housed within the earphones directly, but if not, within the phone.

Disclaimer: I am only familiar with NPUs in the context of "old-school" convolutional neural networks (boy, tech moves so quickly). I am not familiar with NPUs for transformers - and LLMs by extension - but I won't be surprised if NPUs have been adapted to work with them.

[–] rumba@lemmy.zip 4 points 1 day ago

I'm not exactly an expert either, but I believe the NPUs we're seeing in the wild here are more like efficiency cores for AI.

Using the GPU would be faster but consume much more energy. They're basically math co-processors that are good at matrix calculations.
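"Good at matrix calculations" is really the whole story - nearly all the work in an NN layer is one big matrix product. A toy sketch (naive Python, purely illustrative) of the multiply-accumulate pattern these co-processors accelerate, plus the count of MAC operations involved:

```python
def matmul(A, B):
    # naive matrix multiply; NPU/GPU hardware accelerates exactly
    # this nested multiply-accumulate pattern
    n, k, m = len(A), len(B), len(B[0])
    return [[sum(A[i][p] * B[p][j] for p in range(k))
             for j in range(m)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = matmul(A, B)

# an (n x k) @ (k x m) product costs n*k*m multiply-accumulates,
# which is why dedicated MAC arrays pay off so quickly
macs = len(A) * len(B) * len(B[0])
```

Scale those dimensions into the thousands and the MAC count hits the billions per inference - doable on a CPU, fast on a GPU, and fast *and* low-power on hardware built for nothing else.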