So it takes ChatGPT ten minutes to an hour of server time and the energy equivalent of a tank or two of gas to complete a simple task the user could have done in thirty seconds using their 40 W brainmeats and a couple of pudgy fingers. That's just great. Good stuff, Altman. /s
You're not wrong today. But this is exactly the basis of the critique of computers in the 50s. And you probably created this post using a mobile, Internet-connected computer that fits in your pocket.
Okay, downvote away. Lemmy has such an ignorant hate boner against AI.
Computers were fucking trash in the 50s. Skeptics back then said the same shit people say about AI today: computers are unreliable, they create more problems than they solve, they're ham-fisted solutions to problems that require human interaction, and so on. Here are some of the HUGE problems computers had that we solved before the 70s:
Problem: No standard way to represent negative numbers in binary.
Solution: Two's complement became the standard.
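To see why that won, here's a quick Python sketch of the idea (the function name is mine, just for illustration). Taking x mod 2^n is exactly the two's-complement encoding, which is why the same adder circuit handles positive and negative numbers:

```python
# Two's complement: the n-bit pattern for x is just x mod 2**n.
# Negation becomes "invert the bits and add one", and the same
# adder hardware works for signed and unsigned values alike.
def twos_complement(x, bits=8):  # hypothetical helper, for illustration
    return format(x % (1 << bits), f"0{bits}b")

print(twos_complement(5))    # 00000101
print(twos_complement(-5))   # 11111011
print(twos_complement(-1))   # 11111111 (all ones is -1)
```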
Problem: Bit errors from unreliable hardware.
Solution: Hamming codes, CRC, and other ECC methods.
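For the curious, here's a toy Hamming(7,4) round trip in Python (my own sketch, not a production codec): four data bits get three parity bits, and any single flipped bit can be located and corrected.

```python
# Hamming(7,4): 4 data bits + 3 parity bits; any single flipped bit
# can be located and corrected.
def hamming_encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]    # codeword positions 1..7

def hamming_correct(c):
    # Each syndrome bit re-checks one parity group; together they
    # spell out the 1-based position of a single-bit error (0 = clean).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s3 * 4 + s2 * 2 + s1
    if pos:
        c[pos - 1] ^= 1                     # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]         # recover the data bits

code = hamming_encode([1, 0, 1, 1])
code[5] ^= 1                                # simulate a bit error in transit
print(hamming_correct(code))                # [1, 0, 1, 1]
```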
Problem: Inconsistent and error-prone real number math.
Solution: IEEE 754 standardized floating-point formats and behavior.
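You can still poke at the standardized behavior today. A small Python demo of the classic rounding surprise, plus the actual 64-bit layout (1 sign bit, 11 exponent bits, 52 fraction bits):

```python
import struct

# The classic decimal-fraction surprise: 0.1 has no exact binary form.
print(0.1 + 0.2 == 0.3)       # False
print(f"{0.1 + 0.2:.20f}")    # shows the tiny rounding error

# The standardized bit layout of a double, viewed as a 64-bit integer:
bits = struct.unpack(">Q", struct.pack(">d", 0.1))[0]
print(f"{bits:064b}")
```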
Problem: Each computer had its own incompatible instruction set.
Solution: Standardized ISAs like x86 and ARM became dominant.
Problem: Memory was slow, small, and expensive.
Solution: Virtual memory, caching, and paging systems.
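The eviction idea is easy to sketch in software. Here's a minimal LRU cache in Python illustrating the same keep-the-hot-stuff-close principle that hardware caches and paging systems use (the class and method names are mine):

```python
from collections import OrderedDict

# Same principle as a hardware cache: keep a small, fast subset of a
# big, slow store, and evict the least recently used entry when full.
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None                    # cache miss
        self.data.move_to_end(key)         # mark as recently used
        return self.data[key]

    def put(self, key, value):
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the coldest entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # touch "a", so "b" is now the coldest
cache.put("c", 3)      # evicts "b"
print(cache.get("b"))  # None
```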
Problem: Basic operations like sorting were inefficient.
Solution: Research produced efficient algorithms (e.g., Quicksort, Dijkstra’s).
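Quicksort fits in a few lines. Here's the textbook version in Python (not the in-place variant real libraries use, just the idea):

```python
# Textbook quicksort: average O(n log n), versus the O(n^2) of the
# naive sorts early machines ran.
def quicksort(xs):
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```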
Problem: No formal approach to designing logic circuits.
Solution: Boolean algebra, Karnaugh maps, and FSMs standardized design.
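An FSM is just a transition table. Here's a hypothetical Python example, a machine that accepts bit strings with an even number of 1s, using the same formalism applied to sequential circuit design:

```python
# A finite state machine as a transition table:
# (current state, input symbol) -> next state.
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def accepts(bits):
    state = "even"
    for b in bits:
        state = TRANSITIONS[(state, b)]
    return state == "even"

print(accepts("1011"))  # False (three 1s)
print(accepts("1001"))  # True  (two 1s)
```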
Problem: Programs used unstructured jumps and were hard to follow.
Solution: Structured programming and control constructs (if, while, etc.).
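It's hard to show the "before" in Python, since it thankfully has no goto, but here's the structured "after": one loop and one conditional where old code would test and jump between numbered lines (the function is a made-up example):

```python
# One loop construct and one conditional instead of
# "test, then GOTO line N" spaghetti: the flow reads top to bottom.
def find_first_negative(xs):
    for i, x in enumerate(xs):
        if x < 0:
            return i
    return None

print(find_first_negative([3, 7, -2, 5]))  # 2
```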
Problem: No standard way to represent letters or symbols.
Solution: ASCII and later Unicode standardized text encoding.
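A quick Python demo of how far that standardization reaches: the ASCII code for "A" hasn't moved since 1963, and UTF-8 extends the same table to everything else:

```python
print(ord("A"))                  # 65, the same ASCII code point since 1963
print("héllo".encode("utf-8"))   # b'h\xc3\xa9llo' (the é takes two bytes)
print(b"h\xc3\xa9llo".decode("utf-8"))  # round-trips back to 'héllo'
```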
Problem: Code was written in raw machine or assembly code.
Solution: High-level languages and compilers made programming more accessible.
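And you can watch a compiler work in any Python session: the standard `dis` module shows the stack-machine instructions one high-level line turns into (the function is just a throwaway example):

```python
import dis

def fahrenheit(c):
    return c * 9 / 5 + 32

dis.dis(fahrenheit)  # prints the compiled bytecode: loads, arithmetic, return
```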
It's just ignorant to act like the problems we face with AI won't be sorted out just as they were with computers.
I agree on the point of solving a problem; it's just a matter of time, skill, and some luck. The biggest problem I see with AI right now is that it's marketed as something it's not, which leads to a lot of the issues we have with "AI", a.k.a. LLMs, being put in places they shouldn't be. Surprisingly, they do manage pretty well a lot of the time, but when they fail, it's really bad. In other words, AI as sold is a remarkable illusion, one that everyone has bought into even while knowing full well it's nowhere near perfect.
The only thing that will "fix" current AI is the development of true AGI, which would make the difference obvious. AI/LLMs might be part of the path there; I don't know. They're not the real solution, though, no matter how many small countries' worth of energy we burn generating answers.
I say all this as an active casual experimenter with local LLMs. What they can do, and how they do it, is amazing, but I also know what I have, and it's not what I'd call AI. That term has been tainted once again by marketers trying to cash in on ignorance.
What I'm saying is that computers were also marketed as something they were not (yet) but eventually became.
And so, history repeats itself.