this post was submitted on 04 Mar 2026
756 points (98.0% liked)

Technology
[–] Bassman27@lemmy.world 18 points 1 month ago* (last edited 1 month ago) (18 children)

So someone who already has an underlying mental health condition, diagnosed or not, is at fault for their own death even if they were coerced into it?

Without the AI, these people most likely wouldn't have gotten to the point of taking their own lives. I believe the accusations are valid and that AI can be bad for mental health.

There is evidence throughout history of cults that committed mass suicides. If a human can convince another human to do this, why can't a robot trained to act and speak like a human do it too? It's not unreasonable to think an AI could push someone to suicide under the right circumstances.

[–] XLE@piefed.social 6 points 1 month ago (3 children)

Google, of all companies, probably has a better psychological profile of their users than the average doctor. They even offer a public-facing option to disable ads about gambling, alcohol, or pregnancy.

[–] TwilitSky@lemmy.world 2 points 1 month ago (2 children)

TBH, alcohol ads are INSUFFERABLE, but who needs pregnancy ads blocked?

[–] XLE@piefed.social 4 points 1 month ago

People who don't want their family getting suspicious, perhaps. The Target Incident comes to mind.

Of course, disabling these options doesn't mean Google stops knowing about your mental or physical issues. I'm sure you know the best way to prevent that is to just avoid Google altogether. This is probably just Google's way of looking less creepy to the average person.
