this post was submitted on 01 Aug 2024
31 points (100.0% liked)

Technology

[–] oyo@lemm.ee 2 points 1 year ago (3 children)

LLMs: using statistics to generate reasonable-sounding wrong answers from bad data.

[–] pumpkinseedoil@sh.itjust.works 2 points 1 year ago (1 child)

Often the answers are pretty good. But you never know whether you got a good answer or a bad one.

[–] Blackmist@feddit.uk 2 points 1 year ago (1 child)

And the system doesn't know either.

For me this is the major issue. A human is capable of saying "I don't know". LLMs don't seem able to.

[–] xantoxis@lemmy.world 1 point 1 year ago

Accurate.

No matter what question you ask them, they have an answer. Even when you point out their answer was wrong, they just have a different answer. There's no concept of not knowing the answer, because they don't know anything in the first place.
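There is a mechanical reason for this: at every step, a language model's softmax layer turns its raw scores (logits) into a probability distribution over the whole vocabulary, and a token is always sampled from that distribution. There is no built-in outcome that represents "I don't know." A minimal toy sketch of that sampling step (the vocabulary and logit values here are invented for illustration):

```python
import math
import random

def softmax(logits):
    # Subtract the max for numerical stability, then normalize.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token candidates and logits after some question.
vocab = ["Paris", "London", "Berlin", "Madrid"]
logits = [2.1, 1.7, 0.3, -0.5]

probs = softmax(logits)

# The probabilities always sum to 1, so *some* answer is always
# emitted, whether or not any candidate is actually correct.
assert abs(sum(probs) - 1.0) < 1e-9

answer = random.choices(vocab, weights=probs, k=1)[0]
print(answer)
```

The point of the sketch: even if every candidate were wrong, the distribution still sums to 1 and sampling still produces a confident-looking token.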
