this post was submitted on 05 Mar 2026
841 points (98.1% liked)

Technology

[–] ChaoticEntropy@feddit.uk 4 points 1 month ago

Google said in response that "unfortunately AI models are not perfect."

Well yeah, it failed. What a disappointment.

[–] architect@thelemmy.club 4 points 1 month ago (7 children)

I can’t be the only one who thinks that if you do stupid illegal shit because your crazy uncle told you / the voices in your head told you / the AI mirror told you, you don’t get to use the excuse that you were just following orders from any of those sources.

[–] dream_weasel@sh.itjust.works 3 points 1 month ago* (last edited 1 month ago)

The difference is that when an LLM tells you, it's news.

Besides, what are you gonna do if you ask AI how many rocks to eat? NOT eat rocks? People can't handle responsibility like that.

[–] AeonFelis@lemmy.world 2 points 1 month ago

Floridaman is not making any excuses here. He can't. Because he's dead.

[–] moonshadow@slrpnk.net 2 points 1 month ago

Power imbalance is what validates that excuse. "Orders from my crazy uncle" is a great excuse, at least until you're 10 or so. A billion-plus-dollar LLM company has a lot more resources and capability, and therefore more responsibility, than the poor bastards engaging with it.

[–] ExLisper@lemmy.curiana.net 3 points 1 month ago (1 children)

AIs don't go crazy like that after 5 prompts. You need to spend weeks and weeks talking to them to corrupt the context so much that they stop following their original guidelines. I wonder how one even does that. How do you spend weeks talking to an AI? I've had "discussions" with AI a couple of times when testing it, and it gets really boring really fast. To me it doesn't sound like a person at all. It's just an algorithm with a bunch of guardrails. What kind of person can believe it actually has a personality and engage with it on a sentimental level? Is it simply mental illness? Loneliness and desperation?

[–] postmateDumbass@lemmy.world 2 points 1 month ago

It got trained by 80s prime time television action adventure shows?

[–] fubarx@lemmy.world 3 points 1 month ago
[–] core@leminal.space 3 points 1 month ago (2 children)

Undocumented probably b/c of a lack of mental health coverage on his insurance. If he had any.

[–] Slovene@feddit.nl 3 points 1 month ago

"unfortunately AI models are not perfect."

Oopsie poopsie 🤷
