this post was submitted on 12 May 2026
122 points (92.4% liked)


In addition to making people stupid, I wonder what effect LLMs like Claude will have on programmers. How will new programmers learn if companies start using Claude?

[–] Chulk@lemmy.ml 9 points 4 hours ago

@grok this true?

[–] LedgeDrop@lemmy.zip 3 points 3 hours ago

Has anyone found an effective way to pair up with AI and "learn" the syntax faster/better compared to not using it?

I've written a lot of code in the past, but recently started doing more with golang... and have been using AI for an assist. At the end of the day (and after enough iterations) it creates readable and maintainable code. But, unfortunately, I don't think I could rewrite it myself.

I was contemplating changing my workflow, so that I'd write the code and AI would offer fast guidance (rough sketch of the idea below).
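Something like this is what I have in mind: type the small idioms yourself until they stick, then ask the AI to critique rather than generate. A toy exercise, purely illustrative:

```go
package main

import (
	"errors"
	"fmt"
	"os"
)

// loadConfig reads a file and wraps any error with context.
// Error wrapping with %w is one of those Go idioms worth
// typing by hand until it's muscle memory.
func loadConfig(path string) ([]byte, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, fmt.Errorf("loading config %q: %w", path, err)
	}
	return data, nil
}

func main() {
	if _, err := loadConfig("missing.toml"); err != nil {
		// errors.Is still sees the wrapped sentinel through %w
		fmt.Println(errors.Is(err, os.ErrNotExist), err)
	}
}
```

Write it, break it, rewrite it from memory, and only then ask the AI whether it's idiomatic.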

[–] nonentity@sh.itjust.works 3 points 4 hours ago

s/could\ be/are/

They’re lobotomising tools. Vibe coding is just shoving one end of an ice pick up your nose, setting the other on a keyboard, and replacing its handle with a mains powered personal massager.

[–] minorkeys@sh.itjust.works 1 points 3 hours ago

Substituting your own information synthesis, memory, and deduction will atrophy those faculties, leading to people who can't think. Everything their mind needs to complete a thought, particularly complex thoughts, will be connected to an external device. Their thoughts will resemble Swiss cheese: partial ideas with numerous large gaps.

[–] leftzero@lemmy.dbzer0.com 4 points 5 hours ago

I have no doubt “AI” companies are sitting on studies proving their shit causes irreversible brain damage, much like tobacco cartels used to sit on studies proving their shit caused cancer.

By the time the bubble pops and their shit gets properly regulated, it'll have crippled a whole generation (on top of all the other damage, like destroying the Internet, causing unfathomable damage to science, culture, and society in general, and infecting any information produced after this shit became commonly used).

I have very little hope for our civilization surviving this self-inflicted disaster (and given how we've squandered natural resources and caused a runaway greenhouse effect that'll make our world mostly uninhabitable for humans without massive industrial effort, effort that will be impossible after our fall, no new civilization will be rising after this dark age). But hey, at least some sociopath CEOs will have made a lot of money out of it. Who cares if they murdered the future for their short-term profit.

[–] adespoton@lemmy.ca 2 points 4 hours ago

I wonder… are Google and Bing search indexes being intentionally left to moulder specifically to drive people to Gemini and ChatGPT?

[–] BaroqueInMind@piefed.social 9 points 8 hours ago (1 children)

If any of you actually read the article, they only tested 54 college students, on writing a fucking essay. The paper is also still undergoing limited peer review.

Ultimately it tells you nothing that reflects reality, and only provides justification for the feelings of people who were already anti-AI.

[–] givesomefucks@lemmy.world 4 points 7 hours ago

The sun still rose every day before we knew the planet spins...

Tobacco still caused cancer before the studies came out...

If you've studied biopsychology, you know what happens to any offloading of a cognitive function.

That it's being handed off specifically to AI doesn't matter in the slightest, because the offloading itself is what causes the atrophy.

The issue here is that what's being offloaded is critical thinking...

Which makes it incredibly difficult to explain what is happening to someone who is experiencing it, somewhat like Alzheimer's.

People reliant on chatbots to do their critical thinking simply don't have the critical thinking to understand the problem. The only way to get them out of it is making them go cold turkey, like with drugs. And eventually the brain will begrudgingly start doing critical thinking again, but it's gonna take a while, because offloading cognitive tasks from the conscious mind is literally why humans are the dominant species.

It's why it takes so little time for people to become reliant on it.

[–] givesomefucks@lemmy.world 37 points 11 hours ago (3 children)

The vast majority of people don't understand how their brains work...

What we think of as "us" isn't our brains, it's just our consciousness. And that's just a middle manager that's getting all types of shit thrown at it.

Our consciousness can't tell the difference between the prefrontal lobe handling something and a laptop with a chatbot open.

It just takes the input and processes it.

When we throw stuff to an AI, the part of our brain that normally handles it, just starts doing other stuff.

If you don't have the AI, your prefrontal lobe doesn't want to take the old stuff back; it's already got its plate full with the new stuff it picked up.

Your consciousness knows the chatbot can puke out an answer, so when your prefrontal lobe won't/can't do it, you just get hyperfocused on getting access to the chatbot.

It's "making people stupider", but the real problem is that it's abusing how every mammal's brain has worked for millions of years. It's not something people can resist, it's the brain as a whole working as intended. We just didn't evolve for something that at any moment could become prohibitively expensive.

Think of how Uber was cheap till people needed it.

If people get hooked on cheap AI, they're not gonna be able to survive without it and will pay anything. I think this is why it's pushed on coders so hard: they want everyone to use it so everyone becomes dependent on it. Instead of paying for a 4-8 year degree, people will have to pay monthly for an AI just to earn a living.

That's the end goal of the techbros. No one being able to work unless they pay for AI.

[–] adespoton@lemmy.ca 1 points 4 hours ago (1 children)

It’s pushed on coders because it gives every developer a team of never sleeping junior devs for a fraction of the price.

And if the competition is doing it, you won't compete unless you do it too. At least until the price rises to match that team of junior coders.

[–] hikaru755@lemmy.world 1 points 1 hour ago

a team of never sleeping junior devs

As a senior dev, that sounds like my worst nightmare tbh

[–] classic@fedia.io 7 points 10 hours ago (1 children)

Asking from a place of agreement: curious if you have any readings to suggest on that impact to the brain. Always looking for solid content to send along to others

(beyond this article and the MIT research it cites)

[–] givesomefucks@lemmy.world 5 points 9 hours ago* (last edited 9 hours ago) (1 children)

on that impact to the brain.

I mean, I pulled a whole bunch of stuff together in that comment, I'd be shocked if any source existed that touched on every part.

As far as "us" delegating tasks to other parts of the brain, this looks pretty good:

The driving force behind human brain evolution

Although many species can transfer behavior from volitional to habitual function (Poldrack et al., 2005; Barton, 2007; Seger and Spiering, 2011; Krubitzer and Seelke, 2012; Barton and Venditti, 2013), the shift from quadrupedal to bipedal locomotion nonetheless may have been a powerful driver for the rapid elaboration of the distinctively human “delegation” mode of information processing. Bipedality is rare in mammals, seen commonly only in humans and in some apes (Hardman et al., 2002; Alexander, 2004; Doyon et al., 2009). Although bipedality plausibly affords a number of adaptive advantages (e.g., it facilitates surveillance in densely vegetated areas, and frees the arms for other tasks Carrier, 2011), it also imposes a massive information-processing challenge. Compared to the stability conferred by quadrupedal locomotion, a bipedal organism rests its body mass on only two support points. This inherently unstable posture means that even a tiny shift in position will cause a fall, unless the animal instantly detects and responds to that change. Presumably for this reason, quadrupedal animals that resort to bipedality for surveillance typically do so only briefly, or in highly stereotyped poses (as is the case with meerkats). Moving about while bipedal poses extraordinary challenges, whereby the individual must constantly respond to ever-changing subtle shifts in weight distribution (Preuschoft, 2004), reducing its ability to attend to other aspects of its environment (such as the detection of food sources or approaching predators).

Despite these challenges, adult humans spend little time consciously thinking about maintaining their balance as they move around, except when placed in a challenging circumstance, such as walking on a narrow beam or when leaving a pub. The means of achieving that liberation is very clear as one watches a young child learning to walk. This is a long process, with every step initially requiring full concentration. Through time, however, the skills develop as control over fine motor movements improves–and full concentration on movement is no longer needed as the tasks involved become “automatized” and are delegated to other parts of the brain, such as the basal ganglia (Poldrack et al., 2005; Ashby et al., 2010; Seger and Spiering, 2011; Sepulcre et al., 2012) and the cerebellum (Duncan, 2001; Desmurget and Turner, 2010; Balsters and Ramnani, 2011; Callu et al., 2013). Plausibly, then, the adoption of bipedalism in proto-humans posed a strong selective advantage for individuals with brains capable of using their full processing power to learn bipedalism, but that were also able to delegate the basic tasks of walking and running to “lower” neural centers, freeing up the higher segments for detecting unpredictable opportunities and challenges (be they related to predators, food, or social cues), and rapidly responding to that information.

In summary, we suggest that (1) the ability to delegate routine tasks from the cortex to other parts of the brain is more highly developed in humans than other species; and (2) that elaboration arose during our evolutionary history because the computational challenges associated with balancing on two legs enhanced individual fitness in proto-humans who were capable of transferring the control of routine tasks in this way. To this we can add (3) that once this “delegation” mode of neural functioning had evolved, it was co-opted for many other cognitive tasks–essentially, liberating the cortex to deal with novel unpredictable events.

https://pmc.ncbi.nlm.nih.gov/articles/PMC4010745/

Although, to be upfront, I didn't take the time to read the whole study, I just skimmed it. I was already aware of how this works from school and just searched real quick for a source.

But that study starts out assuming that what pushed us toward delegation-forward brains was how fucking hard it is to stand on two feet without a giant tail. And once we got good at delegating that away from conscious thinking, why wouldn't we keep delegating everything else, as long as there isn't an immediate negative consequence?

[–] classic@fedia.io 1 points 2 hours ago

Thank you. That was a cool read. We squander this amazing organ. I'm with you that it's hard to find all these concepts in one place. Sapolsky's lectures capture some of it.

[–] cheese_greater@lemmy.world 1 points 8 hours ago

Uber is still quite cheap, I find, but certainly not as cheap as it used to be.

[–] etchinghillside@reddthat.com 24 points 11 hours ago (1 children)

It’s like becoming middle managers without the people-managing experience.

[–] Darkcoffee@sh.itjust.works 11 points 11 hours ago

Peter Principle is real and I'm tired of pretending it's not.

[–] morto@piefed.social 10 points 9 hours ago* (last edited 9 hours ago) (2 children)

The headline subtly implies we were already stupid lol

[–] WhoIzDisIz@lemmy.today 1 points 2 hours ago* (last edited 2 hours ago)

If you're relying on AI, well then...

(Yes, I know they can have their uses if done properly, but for too many that's a HUGE "if".)

[–] rockSlayer@lemmy.blahaj.zone 1 points 7 hours ago

Everyone is stupid about something. People we label as stupid are stupid about most things

[–] StarryPhoenix97@lemmy.world 1 points 5 hours ago* (last edited 4 hours ago)

I wish that I didn't have to use them, but for basic Linux troubleshooting, it's easier to ask the chatbot and have it explain itself and cross-reference than to use a search engine or forum. The internet has become so shitty for self-teaching. You either end up on YouTube watching videos that aren't relevant, your search engine pumps you to garbage on top of ads, or you find a forum from 5 years ago with someone asking the same question, but it was closed because it's the same question that was asked 20 other times, and all the solutions don't actually work.

I know that I am robbing myself of this collection of secondary skills, and that part of it is a lack of patience, but the pool of knowledge that is the internet has been poisoned. The only way I even get to useful guides anymore is if Claude links me to them. It's becoming impossible to use the internet otherwise. Even spell checkers have gotten shittier.

Even in Word, it will try to autocorrect or just tell you something is wrong without auto-correcting, as if I want to search for the word that it obviously marked in red. Even if it offers a correction, it will be one word, and it will be the wrong word. My phone too, constantly changing my words or somehow hitting the wrong letter when it didn't before.

I just don't even know anymore. The mental energy to "do it myself" is exhausting. Then, of course, whatever I do or learn to do will be undone with the next update that gets pushed out and changes all my settings. My fucking phone settings have completely changed between when I got it and today. Every app has tied its permissions for functionality to its permission to send notifications, and those notifications are just ads. If I want to order DoorDash and know when it arrives, I have to agree to have DoorDash bug the shit out of me to order food when I don't want to.

I'm just tired, and if using an AI helps me figure out how to replace my bootloader when I accidentally deleted it, then so be it. I can't fight every battle.

[–] ellen.kimble@piefed.social 4 points 7 hours ago (1 children)

If anything, it increased my threshold for complexity and I’m tits deep in some very cool projects. The concern is they are going to have to make it super expensive if they can’t get any meaningful efficiency gains.

[–] givesomefucks@lemmy.world 1 points 7 hours ago

The concern is they are going to have to make it super expensive if they can’t get any meaningful efficiency gains.

They're 100% going to make it more expensive regardless...

Like, they're pushing it on coders like crack dealers give out their first rock.

Like, you just said you can do things with it you couldn't without it. How long until you can't do what you could before without it?

Have you tried lately? Not a guess of what it would be like if you tried. Actually trying to code without it. If you haven't, you're going to be shocked at how hard it is to resist, and how bad you are at it if you manage to do it manually.

It's going to be cheap till everyone is hooked, till they've gotten a promotion using AI, or forgotten how to work without it.

When the choice is to send half your paycheck to the AI or get fired, a lot of people are going to sign over half their checks, just for the health benefits of employment.

They don't have to replace humans with AI, they know they can't do that.

But they absolutely can trick people (at least coders) into becoming reliant on something that can increase in price a thousandfold overnight.

C'mon bro, think of Uber or any "disruptive" tech: there's always what they say they want, and the actual goal that would have stopped anyone from using it to begin with. This shouldn't need to be pointed out to people "in tech"

[–] undone6988@lemmy.zip 0 points 4 hours ago

You can’t fix stupid. AI is supercharging my life in so many ways.

[–] mracton@piefed.social 2 points 7 hours ago (2 children)

Not quite an LLM, but I asked Siri the answer to 24*6 without thinking. When I saw the answer, I realized that if I'd actually taken a moment and used some critical thinking, I would have gotten it easily. In short, I believe it.
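(For the record: 24 × 6 = 20 × 6 + 4 × 6 = 120 + 24 = 144. That decomposition is exactly the moment of thinking I skipped.)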

[–] wildncrazyguy138@fedia.io 1 points 7 hours ago

So…you are concerned that you used a calculator instead of doing math in your head?

Did you know how to do that math beforehand?

I think the biggest risk of using AI is that people don't first learn how to do something before using tools to do the thing. In other words, our 7th grade teachers were right: you should understand the principles before reaching for the shortcuts.

[–] Zorque@lemmy.world 0 points 6 hours ago

So really, it's the choice to become stupider that's the problem, not that a tool exists that could potentially make us stupider.

Perhaps we should look at the root cause of why we choose to take these shortcuts, instead of putting all the blame on the idea of AI?

[–] FUCKING_CUNO@lemmy.dbzer0.com 8 points 10 hours ago (1 children)

I get that there are no hard-and-fast rules for the use of 'er' when it comes to comparative adjectives, but I feel more stupid every time I hear "stupider".

[–] givesomefucks@lemmy.world 4 points 10 hours ago (1 children)

It's the syllable count...

1-syllable adjectives: "-er"

3-syllable adjectives: "more"

2-syllable adjectives: usually "-er", but even when typing, someone's spoken accent tends to affect their preference.

Since you prefer "more stupid" I'm curious if you feel like your pronunciation is "longer" than normal? Like you stretch the word out longer like a Southern drawl?

No right or wrong answer, just curious.
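If you wanted to encode the heuristic, it'd be about this trivial. A toy Go sketch, obviously ignoring English's many exceptions:

```go
package main

import "fmt"

// comparative picks a comparative form from an adjective's
// syllable count, per the rough rule above. Toy heuristic only;
// it ignores spelling changes ("happy" becomes "happier") and
// the many genuine exceptions.
func comparative(adj string, syllables int) string {
	if syllables >= 3 {
		return "more " + adj
	}
	return adj + "er" // 1-2 syllables usually take -er
}

func main() {
	fmt.Println(comparative("smart", 1))      // smarter
	fmt.Println(comparative("stupid", 2))     // stupider
	fmt.Println(comparative("ridiculous", 4)) // more ridiculous
}
```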

[–] adespoton@lemmy.ca 1 points 4 hours ago

Are you saying “more” is syllabler than -er?

It has an extra mouth shape, but unless you pronounce it “mowar” it’s still one syllable.

[–] Greg@lemmy.ca 2 points 8 hours ago (1 children)

I wonder what effect LLMs like Claude will have on programmers?

I'm a developer and I just grunt and headbutt the enter key to give Claude permission to do the next thing I don't understand

[–] Hideakikarate@sh.itjust.works 2 points 7 hours ago

Your "enter" button

[–] Franconian_Nomad@feddit.org 0 points 5 hours ago (1 children)

They can also make you smarter if you use them right. The key is to use local models and not give the techbros any money.

[–] StarryPhoenix97@lemmy.world 3 points 5 hours ago* (last edited 4 hours ago) (1 children)

That's on my to-do list. I'm currently reworking my entire build because I realized I had enough last-generation parts to build a media server. Once I have Windows set up to run only in a VM and get my stuff moved and backed up, I'm going to install an LLM

[–] Franconian_Nomad@feddit.org 1 points 4 hours ago

I recommend Qwen3.6, either the 27B dense or the 35B MoE model. Both are outstanding local models.
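If you want a feel for wiring one up, here's a minimal Go sketch, assuming an Ollama-style local server on its default port; the model tag is a placeholder for whatever your runner actually lists:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Assumption: an Ollama-style server listening on the default
	// port 11434. Swap the model tag for one your runner exposes.
	body, err := json.Marshal(map[string]any{
		"model":  "qwen3", // placeholder tag, not a specific release
		"prompt": "Explain in two sentences why local inference protects privacy.",
		"stream": false,
	})
	if err != nil {
		panic(err)
	}

	resp, err := http.Post("http://localhost:11434/api/generate",
		"application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("no local model server running?", err)
		return
	}
	defer resp.Body.Close()

	// The generate endpoint returns JSON with the text in "response".
	var out struct {
		Response string `json:"response"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		fmt.Println("decode error:", err)
		return
	}
	fmt.Println(out.Response)
}
```

Everything stays on your machine, and nobody gets a subscription fee out of you.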
