this post was submitted on 30 Oct 2025
808 points (99.6% liked)

Technology

[–] Jhex@lemmy.world 8 points 14 hours ago (2 children)

And even though NVIDIA is better placed, as they do produce something, the something in play has little value outside of the AI bubble.

NVIDIA could be left holding the bag on massively increased capacity to produce something that nobody wants anymore (or at least nowhere near the levels we have now), so they are still very much exposed.

[–] enumerator4829@sh.itjust.works 7 points 13 hours ago (2 children)

I want cheap GPUs at home please!

[–] Dojan@pawb.social 1 points 9 hours ago

I'd love this, but not Nvidia.

[–] Jhex@lemmy.world 2 points 12 hours ago

me too, but the GPUs used for AI are not the same as what we would use at home.

maybe the factories can produce both kinds and they would be cheaper, but it is speculation at this point

[–] kadu@scribe.disroot.org 3 points 13 hours ago* (last edited 13 hours ago) (2 children)

but the something in play has little value outside of the AI bubble.

You're delusional if you think GPUs are of little value. LLMs and fancy image generation are a bubble.

The gargantuan computational cost of running the machine learning processing that is now required for protein folding and molecular docking is not.

[–] ayyy@sh.itjust.works 7 points 13 hours ago (2 children)

Sure, but the scientists doing those kinds of workflows don’t have anywhere near the money to burn on GPUs. Even before they had all of their funding cut off for being too gay or brown or whatever crap the Nazis have come up with.

[–] kadu@scribe.disroot.org 1 points 7 hours ago

Sure, but the scientists doing those kinds of workflows don’t have anywhere near the money to burn on GPUs

I'm working in a lab that is purchasing a cluster with a price tag you wouldn't believe even if I could share it, which I can't. We are publicly funded. Scientists are buying this hardware, for this price, because the speed up we get is tremendous.

[–] bookmeat@lemmynsfw.com 1 points 11 hours ago

This is just a small part of the perpetual cycle of growth and contraction. Growth comes from breakthroughs and innovations. Contraction comes from misallocation of resources and the need to extract efficiency from the breakthrough and innovation.

So now everything is booming and growing. This will slow down, and if it becomes efficient enough, it will remain useful and accessible. If not, it will be discarded and another breakthrough will take its place.

[–] Jhex@lemmy.world 3 points 12 hours ago

The gargantuan computational cost of running the machine learning processing that is now required for protein folding and molecular docking is not.

Sure, but do you need the absolutely gargantuan capacity that is being built right now for that? If so, for how long and at what cost?

The point is not that GPUs per se are of little value... the point is: what would you do with 10,000 rocket ships if you only have 1,000 projects that might be able to use them? And what can those projects actually pay? Can they cover the cost of the 10,000 rockets you built?