this post was submitted on 10 Aug 2025
758 points (99.3% liked)

Technology

[–] howrar@lemmy.ca 4 points 1 day ago (1 children)

I don't see why they can't be resold. As long as there's a market for new AI hardware, there will continue to be a market for the older stuff. You don't need the latest and greatest for development purposes or for workloads that scale horizontally.

[–] mojofrododojo@lemmy.world 11 points 1 day ago (4 children)

I didn't say they couldn't be resold; they simply won't have as wide a potential user market as a generic GPU would. But think about it for a sec: you've got thousands of AI-dedicated GPUs going stale whenever a datacenter gets overhauled or goes bust.

that's gonna put a lot more product on the market that other datacenters aren't going to touch - no one puts used hardware in their racks - so who's gonna gobble up all this stuff?

not the gamers. who else needs this kind of stuff?

[–] Passerby6497@lemmy.world 6 points 23 hours ago* (last edited 23 hours ago)

no one important puts used hardware in their racks

FTFY. Just about every MSP I've worked for has cut corners and gone with 2nd-hand (or possibly grey-market) hardware to save a buck, including the ones who colo in "real" data centers. I wouldn't be surprised to find that we're onboarding these kinds of cards to build a bespoke AI platform for our software customers in a few years.

[–] BJ_and_the_bear@lemmy.world 11 points 1 day ago (1 children)

It will be good for nerds who want to run models locally. Definitely not a huge market tho
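For what "running models locally" on one of these cards might look like, here's a minimal sketch assuming PyTorch plus Hugging Face `transformers`; the model id and VRAM figures are placeholders, not anything from the thread. The point is just that an ex-datacenter card shows up as an ordinary CUDA device:

```python
# Hypothetical sketch: local inference on a second-hand datacenter GPU.
# The model id and memory figures below are assumptions/placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # placeholder open-weight model

tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # roughly 14 GB of weights; fits on a 16-40 GB card
    device_map="auto",          # requires `accelerate`; places weights on the GPU
)

prompt = "Why might used datacenter GPUs be attractive for local inference?"
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tok.decode(out[0], skip_special_tokens=True))
```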

[–] Tollana1234567@lemmy.today 2 points 1 day ago

too niche for the hundreds of billions they invested; they'll never get ROI from it.

[–] jkercher@programming.dev 7 points 1 day ago (1 children)

Also depends on how hard the AI runs them. A good chunk of the graphics cards that were used for mining came out on life support, if not completely toasted. Games generally don't run the piss out of them like that 24/7, and many games are still CPU-bound.

[–] mojofrododojo@lemmy.world 2 points 1 day ago

yeah, cooked HBM ain't doing anyone any favors heh

[–] addie@feddit.uk 4 points 1 day ago (1 children)

I'm not sure that they're even going to be useful for gamers. Datacenter GPUs require a substantial external cooling solution to stop them from just melting. I believe Nvidia's new stuff is liquid-cooled only, so even if you've got an HVAC unit next to your l33t gaming PC, that won't be sufficient.

[–] mojofrododojo@lemmy.world 8 points 1 day ago

not just those constraints, good luck getting a fucking video signal out of 'em when they literally don't have hdmi/dp or any other connectors.
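That said, the lack of HDMI/DP only rules out driving a display; the cards still enumerate as normal compute devices. A minimal sketch, assuming a machine with PyTorch built with CUDA support and at least one card visible:

```python
# Minimal sketch: a headless, output-less accelerator still shows up
# as an ordinary CUDA device for compute work.
import torch

if not torch.cuda.is_available():
    print("No CUDA devices visible")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.1f} GiB VRAM")
```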