Technology
Which posts fit here?
Anything that is at least tangentially connected to technology, social media platforms, information technology, and tech policy.
Post guidelines
[Opinion] prefix
Opinion (op-ed) articles must use the [Opinion] prefix before the title.
Rules
1. English only
The title and associated content have to be in English.
2. Use original link
The post URL should be the original link to the article (even if paywalled), with archived copies left in the body. This helps avoid duplicate posts when cross-posting.
3. Respectful communication
All communication has to be respectful of differing opinions, viewpoints, and experiences.
4. Inclusivity
Everyone is welcome here regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, education, socio-economic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation.
5. Ad hominem attacks
Any kind of personal attacks are expressly forbidden. If you can't argue your position without attacking a person's character, you already lost the argument.
6. Off-topic tangents
Stay on topic. Keep it relevant.
7. Instance rules may apply
If something is not covered by the community rules but is against the lemmy.zip instance rules, those rules will be enforced.
Companion communities
!globalnews@lemmy.zip
!interestingshare@lemmy.zip
Icon attribution | Banner attribution
If someone is interested in moderating this community, message @brikox@lemmy.zip.
WTF even is an "AI PC"? I saw an ad for some AI laptop. To my knowledge, nobody is running LLMs on their personal hardware, so do these computers have like...a web browser?
They absolutely are.
Statistically relevant portion?
You know they were being hyperbolic.
"To my knowledge" really doesn't feel like hyperbole at all, IMO
Do 5% of people you know use local LLMs?
If so, you don't know a lot of people, or you are heavy into the LLM scene.
Do 5% of people you know watch hockey regularly? If not, I guess it must not be a real sport, and that definitely has absolutely nothing to do with your own bubble
Surprisingly a lot of people around me watch hockey.
And I also hear about a lot of people watching it.
But even though I'm very much into IT, I know very few people who self-host. Despite that being a small community, local genAI seems even smaller than that.
You can run it on your laptop; I've tried it before (e.g. https://www.nomic.ai/gpt4all; it's fun to dig through your documents, but it wasn't as useful as I thought it would be for collecting ideas). What is truly hard is training one. But yeah, what is an AI PC? Is it like a gaming rig with lots of RAM and GPU(s)?
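If anyone wants to poke at it, here's a minimal sketch using the gpt4all Python bindings; the model filename is an assumption and may differ by release, and it gets downloaded on first use if it's missing:

```python
from gpt4all import GPT4All

# Model filename is an assumption; gpt4all downloads it on first use if it's missing.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    reply = model.generate("Give me three ideas for organizing my notes.", max_tokens=200)
    print(reply)
```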
It seems my laptop at work has a neural chip. I guess it's a special AI-only GPU. I don't think I could care less about a laptop feature.
It’s two things:
Computers now come with an NPU (Neural Processing Unit) to do that job... So yeah.
What kind of consumer-facing software runs on that NPU?
I know video editing software uses it for things like motion tracking.
It's all stuff your GPU can do, but the NPU can do it for like 1/10th to 1/100th the power.
For what it’s worth an NPU is why your phone could tell you that photo is of a cat years before LLMs were the hot new thing. They were originally marketed as accelerators for machine learning applications before everybody started calling that AI.
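For a rough idea of how an app ends up on the NPU at all: frameworks like ONNX Runtime let it ask for an NPU-backed execution provider and fall back to the CPU if the hardware or build doesn't offer one. This is just a hedged sketch; the provider depends on the machine and the model file is a placeholder.

```python
import onnxruntime as ort

# Prefer an NPU-backed execution provider (e.g. QNN on Qualcomm machines),
# and fall back to the CPU if this build/hardware doesn't offer one.
providers = ["CPUExecutionProvider"]
if "QNNExecutionProvider" in ort.get_available_providers():
    providers.insert(0, "QNNExecutionProvider")

# "image_classifier.onnx" is a placeholder for whatever model the app ships.
session = ort.InferenceSession("image_classifier.onnx", providers=providers)
print("Running on:", session.get_providers()[0])
```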
New versions of Sony Vegas use the NPU for AI features, though nothing humans couldn't do before.
Sony still makes laptops? TIL
Lots of people are. Typically it means they have an NPU.
I'm talking about an ad I saw on broadcast television during a football game. I don't think the broad market of people are downloading models from huggingface or whatever.
The ad you saw said no one was running local AI?
The ad was people doing generic AI stuff. I think it was even showing Copilot.
Either way, the marketing for AI is far too nebulous for it to matter. Just looking for the ad, I found plenty (like this one) that explicitly mention "on-device AI" but show people just searching for shit or doing nebulous office work. This ad even shows generating images in MS Paint, which offloads the AI shit to the cloud.
Ah yeah that's true. Just marketing BS to sell hardware mostly.
Running an LLM locally is entirely possible with fairly decent modern hardware. You just won't be running the largest versions of the models. You're going to run ones intended for local use, almost certainly quantized versions. Those are usually intended to cover 90% of use cases. Most people aren't really doing super complicated shit with these advanced models. They're asking it the same questions they typed into Google before, just using the phrasing they used 20+ years ago with Ask Jeeves.
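As a concrete example, here's a minimal sketch with llama-cpp-python and a quantized GGUF file; the filename is a placeholder for whatever 4-bit model you've already downloaded:

```python
from llama_cpp import Llama

# Placeholder path: any quantized (e.g. Q4_K_M) GGUF model you've downloaded locally.
llm = Llama(model_path="./llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=2048)

out = llm("Q: What is an AI PC supposed to be? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```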
Not sure what the Dell computers are doing, but with something like Alpaca it’s pretty easy to run local LLMs.
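If that's the Alpaca that fronts Ollama, the same backend is scriptable too. A minimal sketch with the ollama Python client, assuming the Ollama service is running and a small model has already been pulled (the model name here is an assumption):

```python
import ollama

# Assumes the Ollama service is running and a model has been pulled,
# e.g. `ollama pull llama3.2` (model name is an assumption).
response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "What does 'AI PC' actually mean?"}],
)
print(response["message"]["content"])
```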
It is quite easy to run a distilled local model using a decent rig. I have one that I use right from the terminal.