this post was submitted on 09 Feb 2026
599 points (98.7% liked)

Technology


Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn't ready to take on the role of the physician.”

“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”

(page 2) 50 comments
[–] supersquirrel@sopuli.xyz 4 points 1 month ago

pikachufacegravestone.jpeg

[–] Rhoeri@piefed.world 4 points 1 month ago* (last edited 1 month ago)

So the same tech that lonely incels use to make themselves feel important doesn’t make good doctors? Ya don’t say?

[–] Lembot_0006@programming.dev 4 points 1 month ago

You know what else is a bad doctor? My axe!

[–] FelixCress@lemmy.world 3 points 1 month ago

... You don't say.

[–] HubertManne@piefed.social 3 points 1 month ago

It's not ready to take on any role. It shouldn't be doing anything but assisting. So yeah, you can talk to a chatbot instead of filling out that checklist, and the output might be useful to the doc while he then talks with you.

[–] cecilkorik@piefed.ca 3 points 1 month ago

It's great at software development though /s

Remember that when AI-written software soon replaces the devices doctors use daily.

[–] Etterra@discuss.online 3 points 1 month ago

Don't worry guys, I found us a new doctor!

[–] WorldsDumbestMan@lemmy.today 2 points 1 month ago (1 children)

Use low temperature, FFS, if you want the same answer every time.

[–] XLE@piefed.social 4 points 1 month ago (2 children)

You can use zero randomization to get the same answer for the same input every time, but at that point you're just pinning down a black box whose answer is still arbitrary: deterministic doesn't mean correct. Even if you found a false positive or false negative, you can't really debug it out...
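For context on what "low temperature" means here, a minimal toy sketch (not any particular model's API) of temperature sampling: logits are divided by the temperature before a softmax, and at temperature 0 sampling collapses to greedy argmax, which is deterministic for a given input but no more likely to be correct.

```python
import math
import random

def sample(logits, temperature=1.0, rng=None):
    """Pick a token index from raw logits at the given temperature.

    temperature == 0 means greedy decoding: always take the highest
    logit, so the same input yields the same output every time.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    rng = rng or random.Random()
    # Softmax over temperature-scaled logits (subtract max for stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    # Inverse-CDF sampling over the resulting distribution.
    r = rng.random() * total
    acc = 0.0
    for i, e in enumerate(exps):
        acc += e
        if r < acc:
            return i
    return len(logits) - 1

toy_logits = [2.0, 1.0, 0.5]
# Greedy decoding always returns index 0, the largest logit.
assert all(sample(toy_logits, temperature=0) == 0 for _ in range(100))
```

At higher temperatures the distribution flattens and repeated calls can return different indices, which is the "randomized answers" the comment above is getting at.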

[–] NuXCOM_90Percent@lemmy.zip 2 points 1 month ago (4 children)

How much of that is the chat bot itself versus humans just being horrible at self reporting symptoms?

That is why "bedside manner" is so important. Connect the dots and ask follow-up questions for clarification, or just look at a person and assume they are wrong. Obviously there are some BIG problems with that (ask any black woman, for example) but... humans are horrible at reporting symptoms.

Which gets back to how "AI" is actually an incredible tool (especially in this case when it is mostly a human language interface to a search engine) but you still need domain experts in the loop to understand what questions to ask and whether the resulting answer makes any sense at all.

Yet, instead, people do the equivalent of just raw dogging whatever the first response on stack overflow is.
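The "domain experts in the loop" point above can be sketched as a simple triage rule. Everything here (the `Answer` type, the confidence score, the threshold) is a hypothetical illustration, not any real system's API:

```python
from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    confidence: float  # hypothetical self-reported score in [0, 1]

def triage(answer: Answer, high_stakes: bool, threshold: float = 0.9) -> str:
    """Route a model answer: auto-respond only when the question is
    low-stakes AND the model is confident; otherwise send it to a
    human domain expert for review."""
    if high_stakes or answer.confidence < threshold:
        return "escalate-to-expert"
    return "auto-respond"

# Medical symptoms are always high-stakes, so even a confident
# answer like the one from the article gets escalated.
print(triage(Answer("lie down in a dark room", 0.95), high_stakes=True))
# prints "escalate-to-expert"
```

The design choice is that stakes override confidence: a confidently wrong answer about a subarachnoid hemorrhage is exactly the case where an expert needs to see it.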
