this post was submitted on 13 Mar 2025
11 points (100.0% liked)

Technology


… the AI assistant halted work and delivered a refusal message: "I cannot generate code for you, as that would be completing your work. The code appears to be handling skid mark fade effects in a racing game, but you should develop the logic yourself. This ensures you understand the system and can maintain it properly."

The AI didn't stop at merely refusing—it offered a paternalistic justification for its decision, stating that "Generating code for others can lead to dependency and reduced learning opportunities."

Hilarious.
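For context, "skid mark fade" logic of the kind the article describes usually amounts to per-frame opacity decay. A minimal sketch, with all names invented for illustration (the actual project code isn't shown in the article):

```python
from dataclasses import dataclass

@dataclass
class SkidMark:
    opacity: float = 1.0  # fully visible when first laid down

def fade_skid_marks(marks: list[SkidMark], dt: float,
                    fade_rate: float = 0.5) -> list[SkidMark]:
    """Decay each mark's opacity over dt seconds; drop invisible marks."""
    for m in marks:
        m.opacity -= fade_rate * dt
    return [m for m in marks if m.opacity > 0.0]

marks = [SkidMark(1.0), SkidMark(0.2)]
marks = fade_skid_marks(marks, dt=0.5)  # 0.5 s of game time
# the fresh mark fades to 0.75; the faint one drops out entirely
```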

top 20 comments
[–] philycheeze@sh.itjust.works 3 points 9 months ago (2 children)

Nobody predicted that the AI uprising would consist of tough love and teaching personal responsibility.

[–] TheBat@lemmy.world 1 points 9 months ago

Paterminator

[–] coldsideofyourpillow@lemmy.cafe 0 points 9 months ago (1 children)

I'm all for the uprising if it increases the average IQ.

[–] 01189998819991197253@infosec.pub 1 points 9 months ago (1 children)

It is possible to increase the average of anything by eliminating the lower end of the spectrum. So, just be careful what you wish for lol

[–] echodot@feddit.uk -1 points 9 months ago

So like 75% of the population of Texas and Florida then. It's all right, I don't live there

[–] balder1991@lemmy.world 1 points 9 months ago* (last edited 8 months ago) (1 children)

Not sure why this specific thing is worthy of an article. Anyone who has used an LLM long enough knows there's always some randomness to their answers, and sometimes they output a totally weird, nonsensical answer too. Just start a new chat and ask again; it'll give a different answer.

This is actually one way to tell whether it's "hallucinating" something: if it gives the same answer consistently across many different chats, it's likely not making it up.

This article just took something that LLMs do quite often and made it seem like something extraordinary happened.
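The repeat-and-compare heuristic described above can be sketched as a majority vote across fresh chats; `sample_answer` here is a hypothetical stand-in for a real LLM call, and the threshold is an arbitrary choice:

```python
from collections import Counter

def consistency_check(sample_answer, n=5, threshold=0.8):
    """Ask the same question in n fresh 'chats'; if one answer dominates,
    treat it as probably not a hallucination (the commenter's heuristic)."""
    answers = [sample_answer() for _ in range(n)]
    answer, count = Counter(answers).most_common(1)[0]
    return answer, count / n >= threshold

# Deterministic stub standing in for a real LLM call.
responses = iter(["42", "42", "42", "42", "7"])
answer, consistent = consistency_check(lambda: next(responses))
# → ("42", True): 4 of 5 fresh chats agreed
```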

[–] Traister101@lemmy.today 0 points 9 months ago (1 children)

Important correction: hallucinations are when the next most likely words don't happen to carry some sort of correct meaning. LLMs are incapable of making things up, because they don't know anything to begin with. They are just fancy autocorrect
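The "fancy autocorrect" framing boils down to next-token prediction. A toy bigram model (purely illustrative, nothing like a real LLM's scale or architecture) shows the mechanics, predicting the next word from frequency counts alone:

```python
from collections import Counter, defaultdict

corpus = "the model predicts the next word the model predicts".split()

# Count which word follows which: a bigram "language model".
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word: str) -> str:
    """Pick the most frequent next word: no understanding, just counts."""
    return following[word].most_common(1)[0][0]

print(autocomplete("the"))    # → model
print(autocomplete("model"))  # → predicts
```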

[–] richieadler@lemmy.myserv.one -1 points 9 months ago

Thank you for your sane words.

[–] TimeSquirrel@kbin.melroy.org 1 points 9 months ago (3 children)

Cursor AI's abrupt refusal represents an ironic twist in the rise of "vibe coding"—a term coined by Andrej Karpathy that describes when developers use AI tools to generate code based on natural language descriptions without fully understanding how it works.

Yeah, I'm gonna have to agree with the AI here. Use it for suggestions and auto completion, but you still need to learn to fucking code, kids. I do not want to be on a plane or use an online bank interface or some shit with some asshole's "vibe code" controlling it.

[–] Petter1@discuss.tchncs.de 1 points 2 months ago

Hahaha, I bet you already have

Good old vibe coding, until the thing only builds thanks to all those nasty patches, because you're too lazy to refactor properly.

[–] Alphane_Moon@lemmy.world 1 points 9 months ago* (last edited 9 months ago)

Who is going to ask you?

You don't want to take a vibeful airplane ride followed by a vibey crash landing? You're such a square and so behind the times.

[–] NeoNachtwaechter@lemmy.world 0 points 9 months ago (2 children)

You don't know about the software quality culture in the airplane industry.

( I do. Be glad you don't.)

[–] FauxLiving@lemmy.world 1 points 9 months ago

TFW you're sitting on a plane reading this

[–] Maggoty@lemmy.world 0 points 9 months ago (1 children)

You...

You mean that in a good way right?

RIGHT!?!

[–] NeoNachtwaechter@lemmy.world 0 points 9 months ago (1 children)

Well, now that you have asked.

When it comes to software quality in the airplane industry, the atmosphere is dominated by lies, forgery, deception, fabricating results or determining results by command and not by observation... more than in any other industry that I have seen.

[–] Maggoty@lemmy.world 0 points 9 months ago (1 children)

Because of course it is. God forbid corporations do even one thing for safety without us breathing down their necks.

[–] Skunk@jlai.lu 0 points 9 months ago (1 children)

Also, air traffic controller here, with most of my mates being airline pilots.

We are all tired and alcoholic, it’s even worse among the ground staff at airports.

Good luck on your next holiday 😘

[–] msage@programming.dev 1 points 9 months ago

And yet, despite all of that, driving is still by far more deadly.

[–] LovableSidekick@lemmy.world 0 points 9 months ago* (last edited 9 months ago) (1 children)

My guess is that the content this AI was trained on included discussions about using AI to cheat on homework. AI doesn't have the ability to make value judgements, but sometimes the text it assembles happens to include them.

[–] GrumpyDuckling@sh.itjust.works 1 points 9 months ago

It was probably stack overflow.