this post was submitted on 03 Apr 2026
Technology
Data centers existing makes sense, but this specific aggressive AI data center buildout (with special-purpose hardware) doesn't: the two AI companies you mentioned, OpenAI and Anthropic, aren't making a profit, and they don't appear to have a viable path to one. OpenAI claims it'll be wildly profitable in just a few years, but they don't go into how.
"Aren't making a profit" gets into the mess that is bookkeeping, which is a giant rabbit hole people actively avoid because it's just easier to get angry at stupidity than at complex malfeasance.
But what makes something an "AI data center" outside of the branding?
The reality is that it's a shit ton of computers connected to a really fast internet connection. Preferably through a properly managed set of switches, but you do you. And that is the reason we still mostly use GPUs for "AI" rather than highly specialized hardware (although nvidia DID just buy groq a few months back...): they might do linear algebra on quarter-precision floats REALLY well, but they also do linear algebra on single- and double-precision floats pretty well too. And the CPUs and mobos (which are mostly optimized for moving data to offload onto said GPUs) are no slouches either.
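To make the precision trade-off concrete, here's a stdlib-only Python sketch (my own illustration, not anything from the thread): `struct` lets you round a regular Python float (which is a 64-bit double) down to 32 bits, showing the kind of accuracy lower-precision formats give up in exchange for throughput. Quarter-precision (fp8) loses far more, but the mechanism is the same.

```python
import struct

def to_float32(x: float) -> float:
    # Pack a Python float (float64) into a 32-bit float and unpack it again,
    # which rounds it to the nearest representable float32 value.
    return struct.unpack('f', struct.pack('f', x))[0]

# 0.5 is exactly representable in both formats, so nothing is lost.
print(to_float32(0.5) == 0.5)   # True

# 0.1 is not exactly representable; the float32 rounding differs from float64.
print(to_float32(0.1) == 0.1)   # False
print(abs(to_float32(0.1) - 0.1))  # a small but nonzero rounding error
```

The point being: the same silicon that chews through low-precision matrix math for "AI" workloads can still do the higher-precision math that scientific computing and plenty of other workloads want, just at lower throughput.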
Which is what most of these companies are planning for. openai is, arguably, really fucking stupid. Whereas anthropic has shown decent signs of "diversifying", as it were. And nvidia... if we lived in a world where they could get enough RAM, I think they would be fine. As it stands... Jensen (and a LOT of people) are kinda fucked, and I expect to see a hard pivot over the next 12 months.
Because if we banned ALL generative AI tomorrow? The people who think you can't use a computer without installing litellm first are gonna be fucked. But everyone else will just put other workloads on there and be... "fine" is a strong word but they won't go bankrupt. And the data centers themselves will still be incredibly valuable.
I wish GPUs in AI data centers (or worse, the ones purchased and not installed yet) were more general-purpose than they appear to be. That's the part that makes them AI data centers: the optimized hardware.
I do agree things are complex. And I like reading about the intricacies of that complexity. The overall picture is still a pretty bad one, though.