this post was submitted on 10 Feb 2026
177 points (96.8% liked)

Technology


found this on a linus tech tips video https://www.youtube.com/watch?v=o4e-Kt02rfc

[–] partial_accumen@lemmy.world 11 points 2 months ago (6 children)

What is surprising about this? LLMs are giant memory consumers.

[–] U7826391786239@lemmy.zip 13 points 2 months ago (5 children)

yea, i'm surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect

fuck AI, fuck HP, and fuck "laptop subscription"

the saddest thing is that people will sign up, if for no other reason than that they have no other option

[–] partial_accumen@lemmy.world 5 points 2 months ago (4 children)

yea, i’m surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect

32GB is actually considered the bare minimum for most of the commonly run local LLM models. Most folks don't run an LLM locally; they use a cloud service, so they don't need a huge pile of RAM on their own machine. However, more privacy-focused users, or heavy users with cost concerns, might choose to run an LLM locally so they're not paying per token. Running locally vs. using a cloud service is comparable to buying a car outright vs. renting one when you need it. If you only need a car once a year, renting is clearly the better choice. If you're driving to work every day, then buying the car yourself is clearly the better deal overall.
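For a rough sense of why 32GB ends up as the floor: weight memory scales with parameter count times bits per weight. A back-of-the-envelope sketch (the 1.2 overhead factor for KV cache and runtime buffers is an illustrative assumption, not a measured number):

```python
# Back-of-the-envelope RAM estimate for running an LLM locally.
# overhead=1.2 is an assumed fudge factor for KV cache, activations,
# and runtime buffers -- real usage varies by runtime and context size.

def model_mem_gib(params_billions: float, bits_per_weight: int,
                  overhead: float = 1.2) -> float:
    """Approximate RAM (GiB) needed to hold a model's weights in memory."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes / 2**30 * overhead

for params, bits in [(7, 16), (13, 8), (30, 4), (70, 16)]:
    print(f"{params}B model @ {bits}-bit: ~{model_mem_gib(params, bits):.0f} GiB")
```

By this estimate a 4-bit-quantized 30B model or an 8-bit 13B model fits in 32GB with room for the OS, a 16-bit 13B model just barely fits, and a 16-bit 70B model doesn't come close.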

You're perfectly fine not liking AI, but you're also out of touch if you think 32GB is too big for anything. Lots of other use cases need 32GB or more and have nothing to do with AI.

I share your frustration with subscription laptops. I hope people don't buy into them.

[–] XLE@piefed.social 4 points 2 months ago (1 children)

It all reads like a giant racket. AI requires 32GB of RAM on your laptop, 32GB of RAM is expensive, so you have to lease, and it's expensive because AI requires RAM to run in the cloud. It's a problem in search of a solution, and it keeps making new problems along the way.

[–] partial_accumen@lemmy.world 3 points 2 months ago

It's only a problem if you want to run AI. If you don't want AI, locally or cloud-based, then there's no need to spend the money on the high-end 32GB model (for AI purposes) or to pay for a cloud subscription. No one is required to get the 32GB model if they don't want it.
