this post was submitted on 04 Jul 2025
33 points (67.4% liked)

Technology


By 2030, AI will greatly outperform humans in some complex intellectual tasks. Discover how LLMs are doubling their capabilities every seven months.
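For context, the headline claim is plain exponential growth: if the task horizon doubles every seven months, it follows h(t) = h0 · 2^(t/7) with t in months. A minimal sketch of that extrapolation (the one-hour starting horizon is a made-up placeholder, not a figure from the article):

```python
# Minimal sketch of the "doubling every seven months" claim.
# The one-hour starting horizon is an illustrative assumption,
# not a number taken from the article.
horizon_hours = 1.0     # assumed task length an LLM can finish today
doubling_months = 7     # doubling period claimed in the article

for months in range(0, 61, 7):  # five years, in seven-month steps
    h = horizon_hours * 2 ** (months / doubling_months)
    print(f"month {months:2d}: ~{h:7.1f} h")
```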

top 24 comments
[–] MonkderVierte@lemmy.zip 2 points 2 hours ago

So only 10 years until it isn't a resource hog anymore...

[–] alehel@lemmy.zip 1 points 2 hours ago

Then why do I feel like its programming abilities are getting worse? I've stopped paying for it now because it causes more frustration than anything else. It works for simple "how can I simplify this code" queries when my head hurts, but that's about it.

[–] ReallyActuallyFrankenstein@lemmynsfw.com 4 points 5 hours ago* (last edited 5 hours ago)

Is it just me, or is this graph (first graph in the article) completely unintelligible?

The X-axis being time is self-explanatory, but the Y-axis is somehow exponential time that's also mapping assorted performance milestones, meaning those milestones are hard-linked to a time-based Y-axis? What?

[–] FartsWithAnAccent@fedia.io 22 points 10 hours ago (1 children)

How is completely fucking up literally 50% of the time outperforming exactly???

[–] rigatti@lemmy.world 15 points 8 hours ago

You see, in 7 months, they'll fuck up literally 100% of the time! Progress.

[–] db0@lemmy.dbzer0.com 34 points 11 hours ago (1 children)

This is such bullshit. Models have already consumed all available data and have nothing left to consume, while needing exponentially more data for each progressive advancement.

[–] PushButton@lemmy.world 2 points 6 hours ago

Apparently, throwing more data at it will not help much from now on... But whatever they're saying, I can't trust the snake oil seller; he's suspicious...

[–] spankmonkey@lemmy.world 76 points 14 hours ago (3 children)

This is like measuring the increasing speeds of cars in the early years and extrapolating that they would be supersonic by now by ignoring the exponential impact that air resistance has.

[–] Voroxpete@sh.itjust.works 1 points 21 minutes ago

My son has doubled in size every month for the last few months. At this rate he'll be fifty foot tall by the time he's seven years old.

Yeah, it's a stupid claim to make on the face of it. It also ignores practical realities. The first of those is training data, and the second is context windows. The idea that AI will successfully write a novel or code a large-scale piece of software like a video game would require it to hold that entire thing in its context window at once. Context windows are strongly tied to hardware usage, so scaling them to the point where they're big enough for an entire novel may never be feasible (at least from a cost/benefit perspective).
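To put rough numbers on the context-window point: in a standard transformer, the KV cache alone grows linearly with the number of tokens, and attention compute grows quadratically. A back-of-the-envelope sketch with made-up model dimensions (none of these are real model specs):

```python
# Back-of-the-envelope KV-cache estimate for a plain multi-head
# attention transformer.  Every dimension below is an illustrative
# assumption, not the spec of any real model.
layers, heads, head_dim = 48, 32, 128   # hypothetical model shape
bytes_per_value = 2                     # fp16

def kv_cache_gib(tokens: int) -> float:
    # 2 tensors (K and V) per layer, each tokens x heads x head_dim.
    return 2 * layers * tokens * heads * head_dim * bytes_per_value / 2**30

for tokens in (8_000, 128_000, 1_000_000):  # chat, long doc, ~a few novels
    print(f"{tokens:>9,} tokens -> ~{kv_cache_gib(tokens):6.1f} GiB of KV cache")
```

Under those assumptions, a million-token context needs hundreds of GiB for the cache alone, which is the hardware coupling described above (techniques like grouped-query attention shrink this, but only by a constant factor).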

I think there's also the issue of how you define "success" for the purpose of a study like this. The article claims that AI may one day write a novel, but how do you define "successfully" writing a novel? Is the goal here that one day we'll have a machine that can produce algorithmically mediocre works of art? What's the value in that?

[–] trillian@feddit.org 3 points 4 hours ago

Air resistance has a cubic, not exponential, impact.
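(In the usual quadratic-drag regime, drag force grows as v² and the power needed to overcome it as v³, i.e. polynomially rather than exponentially. A quick illustration of how differently cubic and exponential growth behave:)

```python
# Cubic (polynomial) growth vs genuinely exponential growth.
# Purely illustrative; the point is only how fast each blows up.
for v in (1, 2, 4, 8, 16, 32):
    print(f"v={v:2d}:  v**3 = {v**3:6d}   2**v = {2**v:12d}")
```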

[–] LovableSidekick@lemmy.world 5 points 4 hours ago* (last edited 4 hours ago)

Very good analogy. They're also ignoring that getting faster and faster at reaching a 50% success rate (a totally unacceptable success rate for meaningful tasks) doesn't imply ever achieving consistently acceptable success.

[–] goondaba@lemmy.world 61 points 14 hours ago (2 children)

*with 50 percent reliability.

Heck of an asterisk on this claim.

[–] eager_eagle@lemmy.world 13 points 9 hours ago

That sounds like a coin flip, but 50% reliability can be really useful.

If a model has a 50% chance of completing a task that would cost me an hour - and I can easily check whether it was completed correctly - then on average I'm saving half of the time it would take to do it myself.
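Worked through explicitly (the one-hour task and five-minute verification cost are illustrative assumptions):

```python
# Expected-value sketch of the argument above.  The 1-hour task and
# 5-minute verification cost are illustrative assumptions.
p_success = 0.5       # claimed model reliability
task_hours = 1.0      # time to do the task yourself
check_hours = 5 / 60  # time to verify the model's output (assumed cheap)

# You always pay for the check; on failure you also do the task yourself.
expected = check_hours + (1 - p_success) * task_hours
print(f"expected: {expected:.2f} h vs {task_hours:.2f} h unaided; "
      f"average saving {task_hours - expected:.2f} h")
```

The argument only holds while verification stays much cheaper than the task itself; if checking costs as much as doing, the saving disappears.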

That said, exponentials don't exist in the real world; we're just seeing the middle of a sigmoid curve, which will soon yield diminishing returns.

[–] spankmonkey@lemmy.world 1 points 10 hours ago

All that power used for a fucking coin flip.

[–] J52@lemmy.nz 4 points 8 hours ago
[–] flango@lemmy.eco.br 20 points 15 hours ago (1 children)
[–] ddplf@szmer.info 5 points 13 hours ago (2 children)
[–] MHLoppy@fedia.io 11 points 10 hours ago (1 children)

Do you not see any value in engaging with views you don't personally agree with? I don't think agreeing with it is a good barometer for whether it's post-worthy

[–] ddplf@szmer.info 1 points 3 hours ago (1 children)

Good point, thank you. I figured that sharing poor scientific articles essentially equals spreading misinformation (which I still think is a fair point), but I like your perspective too.

[–] Voroxpete@sh.itjust.works 1 points 29 minutes ago

I guess the value is that at some point you'll probably hear the core claim - "AI is improving exponentially" - regurgitated by someone making a bad argument, and knowing the original source and context can be very helpful in countering that disinformation.

[–] spankmonkey@lemmy.world 4 points 10 hours ago

So we can mock it!

[–] RagingSnarkasm@lemmy.world 10 points 13 hours ago
[–] SatanClaws@lemmy.world 6 points 13 hours ago

Is the performance increase related to computing power? I suspect the underlying massive datacenters running the cloud-based LLMs are expanding at a similar rate...

[–] call_me_xale@lemmy.zip 3 points 14 hours ago

"performance"