Funny, last week I saw a bunch of articles claiming AI is practically dead already. And now this?
Y'all sound like the people who think computers or the internet are just a fad. Shit like this is here to stay, whether you like it or not.
Not that I'm a fan of LLMs as they are right now; they're barely useful for googling something. But tools like these are here to stay because they make some things easier, and they'll get better at some point. Just like computers were subpar tools in the beginning, but as innovation chugged along, they got way better, not just at what they were originally intended for, but also at things you had no way of even imagining back then.
Two things can be true: AI is here to stay, and we're in a bubble. Look at the dot-com crash; the bubble super popped, yet we still have the web.
It's not the AI tech that's gonna die, it's the extreme overvaluation.
Really? Companies are going to keep building datacenters that need entire nuclear reactors to themselves without any of that converting into revenue? This is going to keep going forever in your mind?
Definitely a bubble that will burst at some point, unless we can harness energy and reduce waste substantially better than we do now.
A lot will fail, sure, but that happens in literally every single developing industry. There are plenty of industries out there that aren't profitable but are still going. Tesla wasn't profitable between 2003 and 2020, yet here we are: they not only make a profit, they've kickstarted the electric car industry. And that's despite the fact that they sell shitty cars and their CEO is a nazi.
What AI companies are profitable? Besides the one selling shovels in a gold rush.
They will be profitable in ten years, after everything crashes and only a few are left.
Of course. Just like VR, AR, the Metaverse, NFTs, cryptocurrency, and hundreds of other boom-and-bust, hype-cycle remnants. "AI" is a bubble. "AI" will burst. The tech will continue, just without the hype and the cohort of wildly over-funded moonshot start-ups.
I look forward to the day when ROI-focused tech executives aren't trying to cram non-intelligent LLMs into roles where they do not excel. Let people find their own uses, on their own terms, for these things. Perhaps someday people will train bespoke, subject-specific ML tools on their laptops in a matter of minutes with a single click, and it will be an unremarkable part of their day. I'd like to see that.
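For what it's worth, that "small, subject-specific tool trained on your own laptop" already sort of exists for narrow tasks. A minimal sketch, assuming scikit-learn and a handful of labeled examples you supply yourself (the categories and strings below are made up):

```python
# Minimal sketch: a tiny, subject-specific text classifier trained locally.
# Assumes scikit-learn is installed; the data here is purely illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A handful of hand-labeled examples (hypothetical).
texts = ["reset the router", "invoice overdue", "replace the toner", "payment failed"]
labels = ["it", "billing", "it", "billing"]

# Trains in well under a second on a laptop; no GPU or cloud service involved.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["the printer is jammed"]))  # most likely "it"
```

Nothing about this needs a datacenter; the missing piece is packaging it into that "single click" experience, not the compute.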
Crypto is still here; we had a StarCraft tournament funded in part by Bitcoin Cash just recently.
Exactly.
"Nvidia had good sales in the last 3 months" doesn't necessarily conflict with whatever drove those articles last week...
"A technology got more useful in the past" isn't a compelling reason to argue something else will get more useful...
Use your critical thinking skills lol
Well, it's "here to stay," I agree. But there are some real economic indicators that it is also a bubble. First, the number of products and services that can be improved by ham-fisting AI into them is perhaps reaching saturation. We need to see what the "killer app" is for the subsequent generation of AI. More cool video segments and LLM chatbots aren't going to cut it. Everyone is betting there will be a gen 2.0, but we don't know what it is yet.
Second, the valuations are all out of whack. Remember Lycos, AskJeeves, Pets.com, etc.? During the dot-com bubble, the concept of the internet was "here to stay," but many of the original huge sites weren't. They were massively overvalued based on general enthusiasm for the potential of the internet itself. It's hard to argue that's not where we are with AI companies now. Many observers have noted that the price-to-earnings ratios of the top AI-related companies are sky-high, meaning investors are parking a ton of capital in them, but the long-term earnings haven't materialized yet (a quick worked example follows this comment).
Third, at least in the US, investment in general is lopsided towards tech and AI companies. Again, look at the top growth companies, share price trends, etc. This could be a "bubble" in itself, since other sectors need to grow commensurately with the tech sector; if they don't, that signals its own economic problems. What if AI really does create a bunch of great new products and services, but no one can buy them because other areas of the economy stalled over the same period?
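To make the price-to-earnings point concrete, a toy calculation (every number here is invented, not a figure for any real company):

```python
# Toy price-to-earnings calculation; all values are hypothetical.
share_price = 180.00          # dollars per share (made up)
earnings_per_share = 1.50     # trailing twelve months (made up)

pe_ratio = share_price / earnings_per_share
print(f"P/E = {pe_ratio:.0f}")  # P/E = 120
```

A broad-market P/E is commonly cited as somewhere in the mid-teens to low twenties; a triple-digit ratio means investors are paying today for earnings they hope will show up much later, which is exactly the dot-com pattern described above.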
I disagree so hard words won't do it justice.
The internet and computers have value; AI doesn't. It's already unprofitable to run, and newer models consume even more power, so it won't suddenly turn into more profit.
It will crash if it doesn't turn out to be useful enough to actually MAKE money.
There are zero use cases other than it being a funny entertainment box that still summarizes shit wrong in almost 10% of cases.
You may say agents might replace humans at some point, but again, nobody has done that profitably yet. It can't do tasks because it can't think. It can repeat what someone tells it to do, but guess what, that's what programming was in the first place.
If you're not out actively trying to fuck up, it's already here for coders. It's going to become impossible to be a "junior" coder.
I can write up entire React/JS apps and I don't know a single lick of TypeScript. Would I drop it in prod? No. But is it good enough for a PR to a senior who knows what's up? Absolutely.
I hate it when people submit PRs they can't understand or explain. It is more work for me than just writing it myself. Also, this whole "AI can bootstrap an app!" line is fucking stupid. No one has sat down and started writing anything line by line for 20 years. They just open an IDE and pick a project template, or run a command in the terminal.
Which means it'll become impossible to become a senior one. Which would be a problem, right?
K, this is a weird take. You must have some really patient and forgiving seniors. If a junior pushed lazy, shitty code to me, they get it right back; I'm not going to fix it for them - it's not a senior dev's job to clean up a junior's code. If they keep doing it, they're going to get a PIP talk, because it's wasting my time.
I agree it's an issue.
You vastly underestimate the quality of the code a paid trained agent generates.
It's not going to replace developers but it will drive down the need.
Hmmm, possibly. I agree it'll drive down demand, at least short term. And maybe drive it back up in a rebound when critical systems start failing and costing companies real money, and they discover the edifice that's been built is unfixable and needs to be entirely rewritten. I don't believe the current LLM-only generation of AI is going to significantly improve, and it's already horrible at fixing code, so I foresee towers of Babel being built which are almost guaranteed to expensively collapse.
In about 10 years, we'll get another major innovation in AIGO, or some other area, and it'll be game over. I do believe we're only one major level step from AGI. I don't think we're there yet, and won't be for some years.
Man, vibe coders really think highly of themselves and their AI outputs.
As someone who actually understands the languages used in AI-generated scripts, I can say AI is shit at writing code. It sometimes gets decent wins and helps me figure something out quicker than I would without it, but I can count on my fingers the number of times I've gotten a good, usable bit of code from it. Vastly more often than not, I have to edit the code to make it run (because it hallucinates functions and parameters, and constantly uses reserved names even after being corrected dozens of times), only to find out it doesn't even give the right output, or more often, outputs nothing at all (the kind of fix-up involved is sketched below).
You show the quality of your knowledge by the inverse of the trust you put in AI code. It's decent at blocking out basic things, but anything past that is a crapshoot at best
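For anyone who hasn't lived that, here is a made-up example of the kind of fix-up being described, assuming Python and pandas (the "before" lines are shown as comments because they don't run):

```python
# Hypothetical illustration of cleaning up LLM output.
# An LLM might emit something like:
#   df.sort("score")            # DataFrame.sort() was removed from pandas years ago
#   sum = df["score"].mean()    # shadows the built-in sum()
# A version that actually runs:
import pandas as pd

df = pd.DataFrame({"name": ["a", "b", "c"], "score": [3, 1, 2]})
df = df.sort_values("score")       # the method that actually exists
mean_score = df["score"].mean()    # avoid shadowing built-ins
print(df)
print(mean_score)
```

None of this is hard to fix if you know the library, which is the point: the review burden lands on whoever does.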