this post was submitted on 07 Jan 2026
385 points (99.5% liked)

Dell is now shifting its focus this year away from being ‘all about the AI PC.’

[–] ch00f@lemmy.world 40 points 4 days ago (8 children)

WTF even is an "AI PC"? I saw an ad for some AI laptop. To my knowledge, nobody is running LLMs on their personal hardware, so do these computers have like...a web browser?

[–] Ledivin@lemmy.world 19 points 4 days ago (1 children)

To my knowledge, nobody is running LLMs on their personal hardware

They absolutely are.

[–] msage@programming.dev 15 points 4 days ago (1 children)

Statistically relevant portion?

You know they were hyperbolic.

[–] Ledivin@lemmy.world 0 points 4 days ago (1 children)

"To my knowledge" really doesn't feel like hyperbole at all, IMO

[–] msage@programming.dev 5 points 3 days ago (1 children)

Do 5% of people you know use local LLMs?

If so, you don't know a lot of people, or you are heavy into the LLM scene.

[–] Ledivin@lemmy.world 2 points 3 days ago (1 children)

Do 5% of people you know watch hockey regularly? If not, I guess it must not be a real sport, and that definitely has absolutely nothing to do with your own bubble

[–] msage@programming.dev 2 points 3 days ago

Surprisingly a lot of people around me watch hockey.

And I also hear about a lot of people watching it.

But even though I'm very much into IT, I know very few people who self-host. Despite that being a small community, local genAI seems even smaller than that.

[–] Gsus4@mander.xyz 14 points 4 days ago* (last edited 4 days ago) (1 children)

You can run it on your laptop; I've tried it before (e.g. https://www.nomic.ai/gpt4all; it's fun to dig through your documents, but it wasn't as useful as I thought it would be for collecting ideas). What is truly hard is to train one. But yeah, what is an AI PC? Is it like a gaming rig with lotsa RAM and GPU(s)?

[–] virku@lemmy.world 6 points 4 days ago

It seems my laptop at work has a neural chip, I guess a special AI-only GPU. I don't think I could care less about a laptop feature.

It’s two things:

  • a machine that has NN-optimized segments on the CPU, or a discrete NPU
  • microslop’s idiotic marketing and branding around trying to get everyone to use Copilot
[–] capuccino@lemmy.world 11 points 4 days ago (1 children)

Computers now come with an NPU (Neural Processing Unit) to do that job... So yeah.

[–] ch00f@lemmy.world 9 points 4 days ago (3 children)

What kind of consumer-facing software runs on that NPU?

I know video editing software uses it for things like motion tracking.

It's all stuff your GPU can do, but the NPU can do it for like 1/10th to 1/100th the power.

[–] atomicbocks@sh.itjust.works 9 points 3 days ago

For what it’s worth an NPU is why your phone could tell you that photo is of a cat years before LLMs were the hot new thing. They were originally marketed as accelerators for machine learning applications before everybody started calling that AI.

[–] capuccino@lemmy.world 5 points 4 days ago (1 children)

New versions of Sony Vegas use the NPU to accelerate AI features, nothing humans couldn't do before.

[–] maccentric@sh.itjust.works 2 points 3 days ago

Sony still makes laptops? TIL

[–] artyom@piefed.social 10 points 4 days ago (1 children)

Lots of people are. Typically it means they have an NPU.

[–] ch00f@lemmy.world 6 points 4 days ago (1 children)

I'm talking about an ad I saw on broadcast television during a football game. I don't think the broader market is downloading models from Hugging Face or whatever.

[–] artyom@piefed.social 0 points 3 days ago (1 children)

The ad you saw said no one was running local AI?

[–] ch00f@lemmy.world 6 points 3 days ago (1 children)

The ad was people doing generic AI stuff. I think it was even showing Copilot.

Either way, the marketing for AI is far too nebulous for it to matter. Just looking for the ad, I found plenty (like this one) that explicitly mention "on-device AI," but show people just searching for shit or doing nebulous office work. This ad even shows generating images in MS Paint, which offloads the AI shit to the cloud.

[–] artyom@piefed.social 2 points 3 days ago

Ah yeah that's true. Just marketing BS to sell hardware mostly.

[–] halcyoncmdr@lemmy.world 8 points 4 days ago

Running an LLM locally is entirely possible with fairly decent modern hardware. You just won't be running the largest versions of the models. You're going to run ones intended for local use, almost certainly quantized versions. Those are usually intended to cover 90% of use cases. Most people aren't really doing super complicated shit with these advanced models. They're asking it the same questions they typed into Google before, just using phrasing they used 20+ years ago with Ask Jeeves.
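The memory math behind this is simple enough to sketch. Here's a back-of-the-envelope estimate (a rough sketch, assuming a hypothetical 7B-parameter model and counting weights only, ignoring activations and runtime overhead) showing why quantized models fit on ordinary consumer hardware:

```python
# Rough memory footprint of a hypothetical 7B-parameter model at
# different weight precisions. Weights only; activations, KV cache,
# and runtime overhead are ignored.
PARAMS = 7_000_000_000

def model_size_gib(bits_per_weight: int) -> float:
    """GiB needed to hold the weights at the given precision."""
    return PARAMS * bits_per_weight / 8 / 2**30

fp16 = model_size_gib(16)  # full-precision checkpoint as shipped
q4 = model_size_gib(4)     # 4-bit quantized, typical for local use

print(f"fp16: {fp16:.1f} GiB")  # ~13 GiB: wants a big GPU
print(f"q4:   {q4:.1f} GiB")    # ~3.3 GiB: fits in laptop RAM
```

Quantizing from 16-bit to 4-bit weights cuts the footprint 4x, which is the difference between needing a datacenter GPU and running comfortably on a laptop with 8 GB of RAM.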

Not sure what the dell computers are doing but with something like Alpaca it’s pretty easy to run local LLMs

[–] redknight942@sh.itjust.works 3 points 4 days ago

It is quite easy to run a distilled local model on a decent rig. I have one that I use right from the terminal.