this post was submitted on 10 Feb 2026
65 points (95.8% liked)

Technology

found this on a linus tech tips video https://www.youtube.com/watch?v=o4e-Kt02rfc

partial_accumen@lemmy.world 10 points 4 hours ago

What is surprising about this? LLMs are giant memory consumers.

U7826391786239@lemmy.zip 9 points 4 hours ago

yea, i'm surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect

fuck AI, fuck HP, and fuck "laptop subscription"

the saddest thing is, people will sign up, if for no other reason than they have no other option

partial_accumen@lemmy.world 2 points 1 hour ago

> yea, i’m surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect

32GB is actually considered the bare minimum for most of the common locally run LLM models. Most folks don't run an LLM locally; they use a cloud service, so they don't need a huge pile of RAM on their own machine. However, more privacy-focused or heavy users with cost concerns might choose to run an LLM locally so they're not paying per token. Running locally vs using the cloud is comparable to buying a car outright vs renting one when you need it. If you only need a car once a year, renting is clearly the better choice; if you're driving to work every day, then buying the car yourself is clearly the better deal overall.
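For a sense of where that 32GB floor comes from, here's a rough back-of-the-envelope sketch of weight storage alone (the model size and quantization levels are illustrative examples, not tied to whatever HP actually ships):

```python
# Back-of-the-envelope RAM needed just to hold an LLM's weights.
# Real usage is higher: KV cache, context, and runtime overhead add on top.
def model_ram_gb(params_billions: float, bytes_per_weight: float) -> float:
    """Memory in GiB to store the weights at a given quantization."""
    return params_billions * 1e9 * bytes_per_weight / 1024**3

# Illustrative figures: a 30B-parameter model at 4-bit quantization
# (~0.5 bytes/weight) vs 8-bit (1 byte/weight).
print(f"30B @ 4-bit: {model_ram_gb(30, 0.5):.1f} GiB")  # ~14 GiB
print(f"30B @ 8-bit: {model_ram_gb(30, 1.0):.1f} GiB")  # ~28 GiB
```

At 8-bit, a mid-size model plus the OS and everything else running alongside it already crowds a 32GB machine.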

You are perfectly fine not liking AI, but you're also out-of-touch if you think 32GB is too big for anything. Lots of other use cases need 32GB or more and have nothing to do with AI.

I agree with your frustration with subscription laptops. I hope people don't use it.

U7826391786239@lemmy.zip 1 point 13 minutes ago

well hp is aware that laptops are quickly becoming out of reach money-wise for a larger and larger chunk of consumers, they just had to figure out some way to exploit that.

$420 a year for a laptop doesn't sound like robbery at first, until you consider it's just money out the window, and they're 100% harvesting every 1 and 0 of input and output from a laptop that they still own/control. i haven't even looked at the fine print, which i'm willing to bet makes the whole thing exponentially worse.
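The "money out the window" point is easy to put numbers on. A quick sketch, assuming a comparable laptop could be bought outright for $1,200 (a placeholder figure, not HP's actual pricing):

```python
# Quick sketch of the subscription math. The $1,200 purchase price is
# an assumed figure for a comparable machine, not anything from HP.
SUBSCRIPTION_PER_YEAR = 420    # from the post
ASSUMED_LAPTOP_PRICE = 1200    # hypothetical outright purchase

def subscription_total(years: int) -> int:
    """Cumulative amount paid after `years` of renting."""
    return SUBSCRIPTION_PER_YEAR * years

for years in (1, 3, 5):
    print(f"year {years}: paid ${subscription_total(years)}, own nothing")
```

Under that assumption, the subscriber has out-spent an outright purchase before the end of year 3, and still owns nothing at the end of it.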