this post was submitted on 25 Apr 2025
359 points (96.6% liked)
Technology
you are viewing a single comment's thread
Try this on your friends: make up an idiom, walk up to them, say it without context, and then ask "meaning?" and see how they respond.
Pretty sure most of mine would just make up a bullshit response and go along with what I'm saying unless I gave them more context.
There are genuinely interesting limitations to LLMs and the newer reasoning models, and I find it interesting to see what we can learn from them; this is just ham-fisted robo-gotcha journalism.
It highlights the fact that these LLMs refuse to say "I don't know," which essentially means we cannot rely on them for any factual reporting.
But a) they don't refuse: most will say so if you prompt them well, and b) you can't rely on them as the sole source of truth, but an information machine can still be useful if it's right most of the time.