Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn't ready to take on the role of the physician.”

“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”

[–] snooggums@piefed.world 0 points 3 hours ago* (last edited 3 hours ago) (1 children)

Rawdogging the first response from stack overflow to try and fix a coding issue isn't going to kill someone.

[–] NuXCOM_90Percent@lemmy.zip 1 points 3 hours ago (1 children)

It is if your software goes anywhere near infrastructure or safety.

Which is literally what Musk and the oligarchs were arguing for as a way to "fix" Air Traffic Control. And that is far from the first time tech charlatans have wanted to "disrupt" an industry.

[–] snooggums@piefed.world 0 points 3 hours ago (1 children)

Someone who uses stack overflow to solve a problem will be doing testing to confirm it worked as part of an overall development workflow.

Using an LLM as a doctor is like vibe coding, where there is no testing or quality control.

[–] NuXCOM_90Percent@lemmy.zip 0 points 3 hours ago* (last edited 3 hours ago)

So... they wouldn't be raw dogging stack overflow? Because raw dogging the code you get from a rando off stack overflow is a bad idea?

Because you can just as easily use generative AI as a component in test driven development. But the people pushing to "make coders more efficient" are looking at firing people. And they continue to not want to add the guard rails that would mean they fire 1 engineer instead of 5.
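Something like this, for instance (a minimal sketch; `parse_duration` is just a hypothetical stand-in, not anything from the article or this thread): the tests get written first, and whatever an LLM or a Stack Overflow answer hands back has to pass them before it goes anywhere near production.

```python
# Hypothetical example of using tests as the guard rail around
# pasted-in or generated code. The test cases encode the expected
# behaviour independently of where the implementation came from.

def parse_duration(text: str) -> int:
    """Convert a string like '2h30m' to seconds.

    Imagine this body came from a Stack Overflow answer or a code
    assistant; it only ships if the tests below pass.
    """
    total = 0
    digits = ""
    for ch in text:
        if ch.isdigit():
            digits += ch
        elif ch == "h":
            total += int(digits) * 3600
            digits = ""
        elif ch == "m":
            total += int(digits) * 60
            digits = ""
    return total


# Run with pytest: the tests are the quality gate, not the snippet's source.
def test_hours_and_minutes():
    assert parse_duration("2h30m") == 9000

def test_minutes_only():
    assert parse_duration("45m") == 2700

def test_empty_string():
    assert parse_duration("") == 0
```

The point is that the tests, not the origin of the snippet, are what does the quality control.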