It's never a good idea to have the latest, very sensitive GPUs in orbit, where cosmic radiation can hit them and disrupt calculations. Or you put some old-school radiation-hardened versions up there, but then they're too slow compared to ground-based tech.
GPUs don't last that long anyway; 5 years is a number you see often, and less with constant radiation. There's no upgrading or replacing them up there. That means these satellites will just stay up for a few years until they get de-orbited, and then we have a million satellites burning up in the atmosphere.
GPUs are also very power-hungry. One modern rack of 72 GPUs sucks down 120 kW, while all the solar panels of the ISS together deliver 215 kW. These things won't be small. And you need hundreds to thousands of racks to match the capacity of one data center.
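To put rough numbers on that, here's a back-of-the-envelope sketch. The 120 kW per rack and 215 kW ISS figures are from above; the ~1361 W/m² solar constant is standard, and the 30% panel efficiency is an assumption on my part:

```python
# Back-of-the-envelope: solar array needed to power GPU racks in orbit.
# Figures from the comment: 120 kW per 72-GPU rack, ISS arrays ~215 kW.
# Assumed: solar constant ~1361 W/m^2 above the atmosphere, ~30% cell efficiency.

RACK_POWER_KW = 120.0         # one modern 72-GPU rack
ISS_ARRAY_KW = 215.0          # total ISS solar array output
SOLAR_CONSTANT_W_M2 = 1361.0  # sunlight intensity in Earth orbit
PANEL_EFFICIENCY = 0.30       # assumed

def array_area_m2(load_kw: float) -> float:
    """Panel area needed to supply `load_kw` in full sunlight."""
    return load_kw * 1000 / (SOLAR_CONSTANT_W_M2 * PANEL_EFFICIENCY)

for racks in (1, 10, 100):
    load = racks * RACK_POWER_KW
    print(f"{racks:>3} rack(s): {load:>6.0f} kW "
          f"= {load / ISS_ARRAY_KW:>5.1f}x ISS arrays, "
          f"~{array_area_m2(load):>6.0f} m^2 of panels (in sunlight)")
```

And that's in continuous sunlight; eclipse passes mean you also need batteries sized to carry the full load through the dark side of each orbit.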
So far, putting them in orbit doesn't seem to make things easier.
Don't forget heat. You can't just drain the local water supply to cool all your systems in space; you need to actually radiate away all those kilowatts after your chips convert them into heat.
And in a vacuum, thermal radiation (mostly infrared at these temperatures) is the only way to shed heat; no convection, no conduction, and it's by far the least effective of the three heat-transfer mechanisms.
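For scale, a minimal Stefan-Boltzmann sketch. The 120 kW rack figure is from above; the 0.9 emissivity and 300 K radiator temperature are assumptions on my part:

```python
# Radiator area to reject waste heat purely by thermal radiation (Stefan-Boltzmann law).
# Assumed: emissivity ~0.9, radiator surface at ~300 K, radiating from both faces,
# and (optimistically) nothing radiating back at the panel.

SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9    # assumed
T_RADIATOR = 300.0  # radiator temperature in kelvin, assumed

def radiator_area_m2(heat_kw: float, two_sided: bool = True) -> float:
    """Area needed to radiate `heat_kw` of waste heat into space."""
    flux = EMISSIVITY * SIGMA * T_RADIATOR**4  # W per m^2 per face, ~413 W/m^2
    faces = 2 if two_sided else 1
    return heat_kw * 1000 / (flux * faces)

print(f"One 120 kW rack needs ~{radiator_area_m2(120):.0f} m^2 of double-sided radiator")
```

For comparison, the ISS's entire active thermal control system rejects roughly 70 kW.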
Also, thanks to Moore's "Law", pretty much anything launched will have half the processing power of equivalent ground hardware every 2 years.
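A minimal sketch of that depreciation curve, assuming a clean two-year doubling on the ground (the real cadence has slowed, so treat this as the pessimistic-for-orbit case):

```python
# Relative compute value of launched hardware vs. ground hardware,
# assuming ground performance doubles every 2 years (idealized Moore's law).

DOUBLING_YEARS = 2.0  # assumed cadence

for year in range(6):
    relative = 0.5 ** (year / DOUBLING_YEARS)  # halves every DOUBLING_YEARS
    print(f"year {year}: orbital hardware at {relative:.0%} of ground equivalent")
```

So by the 5-year end-of-life mentioned above, the orbital hardware delivers under a fifth of what equivalent ground hardware would.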
Part of the success of cloud hosting is that, thanks to Moore's law, companies were hesitant to buy hardware only to have it quickly become outdated.*
*Cloud servers are actually pretty expensive, so it really didn't work out like this. But by the time that was obvious, the advantage of cloud was the built-in as-a-Service software (e.g. databases, load balancing, caching, etc.), and downstream of that is the death of open-source vendors being able to get by on selling support, but I'm sure that won't have any negative effects 🙄.