this post was submitted on 07 Mar 2026
979 points (98.9% liked)

Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
(page 2) 50 comments
[–] SaveTheTuaHawk@lemmy.ca 5 points 1 month ago
[–] webkitten@piefed.social 4 points 1 month ago (2 children)

This bill gave us the "best" interaction:

https://bsky.app/profile/badmedicaltakes.bsky.social/post/3mghyg5eufk2m

A Bluesky skeet from @badmedicaltakes.bsky.social:

"Twitter user eoghan:

How dare poor people get free medical advice

<quote tweet from Twitter user Polymarket: BREAKING: New York bill would ban AI from answering questions related to medicine, law, dentistry, nursing, psychology, social work, engineering, & more.>

Twitter user YBrogard79094:
JUST MAKE HEALTHCARE ACCESSIBLE

Twitter user eoghan:

AI is literally free healthcare. Being a communist must be exhausting"

[–] Hiro8811@lemmy.world 4 points 1 month ago

You can google your symptoms, and there probably are some reliable sites, but a hallucinating chatbot is a bad idea. Not to mention some people suggested treating covid with chlorine, vinegar, etc.

[–] deliriousdreams@fedia.io 3 points 1 month ago

Some horses you can't even lead to water, let alone make them drink.

[–] NutWrench@lemmy.world 3 points 1 month ago

Chatbots should never give medical advice. Chatbots dispense basic, standalone factoids, like "aspirin is a pain reliever." But they don't know or care about dosages, comorbid conditions, or whether you live or die, so they won't ask follow-up questions.

[–] ArbitraryValue@sh.itjust.works 2 points 1 month ago (1 children)

If you don't want legal or medical advice from an AI, you can already simply not ask the AI for legal or medical advice. But I don't want your paternalistic restrictions on what I may ask.

[–] moroninahurry@piefed.social 2 points 1 month ago

Sir, did you pay for that medical advice, though? That's what these laws will eventually enforce: prescription advice.
