this post was submitted on 05 Mar 2026
826 points (98.1% liked)

[–] arc99@lemmy.world 6 points 21 hours ago (3 children)

LLMs are only as good as their training data, and they're not "intelligent" - they spew out a response that is statistically relevant to the input context. I'm sure a delusional person could break an LLM by asking it incoherent, nonsensical things it has no strong pathways for, so god knows what response it would generate. It may even be that, among the billions of texts the LLM ingested during training, there was a tiny handful of delusional writings that somehow wins out along these weak pathways.

[–] BilSabab@lemmy.world 5 points 21 hours ago

Given that modern datasets draw way too much content from social media, it's hard to expect anything else at this point.

[–] Nalivai@lemmy.world 4 points 21 hours ago

You don't even have to "break" an LLM into anything. It continues your prompts, producing sentences as close as possible to something people will mistake for language. If you give it a paranoid prompt, it will continue in the same register.
The only thing training gave it is the ability to create sequences of words that resemble sentences.
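To make the "it just continues your prompt" point concrete, here's a toy sketch. This is not a real LLM - real models use neural networks over tokens, and the counts below are made up for illustration - but the mechanism is the same shape: repeatedly pick a statistically likely next word given the words so far, so a paranoid prompt yields a paranoid continuation.

```python
# Toy bigram "language model": made-up counts, purely illustrative.
# An autoregressive model does nothing but extend the prompt with
# whatever word is statistically favored after the current one.
bigram_counts = {
    "they":     {"are": 9, "watch": 1},
    "are":      {"watching": 7, "fine": 3},
    "watching": {"me": 8, "tv": 2},
    "me":       {"constantly": 6, "now": 4},
}

def continue_prompt(words, steps=3):
    """Greedily append the most frequent next word after the last word."""
    words = list(words)
    for _ in range(steps):
        options = bigram_counts.get(words[-1])
        if not options:
            break  # no known continuation for this word
        words.append(max(options, key=options.get))
    return " ".join(words)

print(continue_prompt(["they", "are"]))  # "they are watching me constantly"
```

There is no understanding or judgment anywhere in that loop - the "paranoid" output falls out of the statistics of the input, which is the point being made above.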

[–] Hiro8811@lemmy.world 1 points 20 hours ago

It didn't break; it probably just created an echo chamber sustaining that person's delusion.