...and how should I access this cloud compute? Stick my fingers into a network socket and wiggle them?
So, what prediction did Bezos make back then that seems particularly relevant right now? Bezos thinks that local PC hardware is antiquated, and that the future will revolve around cloud computing, where you rent your compute from companies like Amazon Web Services or Microsoft Azure.
This isn't a new idea, and it certainly predates Bezos.
I'm older now, but throughout my life there has been a pendulum swing back and forth between local and remote compute power. The current run-up in RAM prices follows the exact same path; this has happened half a dozen times already in the last 50 years. Compute power gets cheap, then it gets expensive, then it gets cheap again. Bezos's statements are just the most recent example. He's no prophet. This has happened before, and it will revert again. Rinse, repeat:
- 1970s remote compute power: Dumb terminals couldn't really compute anything locally and required dialing into a mainframe over an analog telephone line to access the remote computing power.

- 1980s local compute power: CPUs got fast and cheap! Now you could do all your processing right on your desk without needing a central computer/mainframe.

- 1990s remote compute power: Thin clients! These were underpowered desktop units that accessed the compute power of a server running Citrix WinFrame/MetaFrame or SunOS (for SunRay thin clients). Honorable mention for retail units like Microsoft WebTV, which was the same concept with different hardware/software.

- 2000s local compute power: This was the widespread adoption of desktop PCs with 3D graphics cards as standard, along with high-power CPUs.

- 2010s remote compute power: VDI appears! This is things like VMware Horizon or Citrix Virtual Desktop, along with the launch of AWS for the first time.

- 2020s local compute power: Powerful CPUs and massively fast GPUs are now standard and affordable.

- 2030s remote compute power... in the cloud... probably
In the 2040s, do we just move our brains into our own self-hosted data centers?
For the 2040s, if the pattern holds, local compute power will become dirt cheap again, and there will be very few reasons to pay someone else to host your compute remotely. Maybe it will be supercomputers on everyone's wrist or something.
I don't buy computers from wax figures.
No.
Aren't people basically already doing this? There are lots of people who only have a phone and maybe a tablet, and for basically everything that might actually require computing power (e.g. photo editing) they end up using a web app or something.
I have a Linux box with decent hardware that I can still squeeze 4 years out of, a Steam Deck, and a Steam Frame soon. I'll be good for quite a while, long enough to ride out the AI-slop bubble.
Yes, rent a computer from the cloud while your Internet is capped at a terabyte.
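Napkin math, as a rough sketch: assuming a game or desktop stream eats about 25 Mbps (my assumption for a typical 1080p/60 stream; 4K can be 2-3x higher), a 1 TB cap doesn't go far:

```python
# Rough estimate: how long a 1 TB data cap lasts if you stream your "rented computer".
# The 25 Mbps bitrate is an assumption (typical 1080p/60 game stream); 4K is 2-3x more.
STREAM_MBPS = 25
CAP_TB = 1

cap_megabits = CAP_TB * 1_000_000 * 8        # 1 TB = 1,000,000 MB = 8,000,000 megabits
hours = cap_megabits / STREAM_MBPS / 3600    # seconds of streaming, converted to hours
print(f"~{hours:.0f} hours of streaming before hitting the cap")  # ~89 hours, ~3 h/day
```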
Elon can't keep his mouth shut, but it looks like they're all psychos...
Isn't this what Google did with Chrome OS and Chromebooks?
I already sort of do this with my gaming machine. It lives on a cloud host and I connect with a client.
It’s cheaper and more convenient than buying a new PC - especially since I’ve got three gamers in my house - and offloading graphics means I can get better battery life when playing on my laptop in my hammock.
However, if you’re more than a couple hundred miles from the data center, or if there are network problems, you won’t be having much fun. That’s the only reason I’d want an actual gaming machine, and even then I’d play via remote desktop from my hammock.
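To put a rough number on that distance limit, here's a minimal sketch assuming signals in fiber travel at about two-thirds the speed of light (an approximation) and ignoring routing hops and encode/decode overhead:

```python
# Physical floor on round-trip latency from distance alone. Ignores routing,
# video encoding/decoding, and display latency, which all add on top of this.
SPEED_IN_FIBER_KM_S = 200_000   # light in fiber is roughly 2/3 of c (assumption)

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time to a host distance_km away, in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

for km in (100, 300, 1000, 3000):
    print(f"{km:>5} km -> at least {min_rtt_ms(km):.1f} ms RTT")
# A couple hundred miles (~300 km) is only ~3 ms of physics; real-world RTTs run
# several times higher once routing and processing are included, which is why
# distance to the data center makes or breaks cloud gaming.
```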
From a different perspective, renting a PC and having a thin client would be a lot better for the environment and cheaper for the consumer.
Sadly, I don’t think that’s the goal of these companies.