this post was submitted on 27 Aug 2025
52 points (98.1% liked)
Holy shit, I thought it would just be another story of the assistant answering a "Tell me how to die" request (and it did, which is terrible enough), but it gets even worse.
Like the part where the kid says he'd want to be stopped, and the assistant tells him to hide things better so that nobody can.
He had told it that he was writing a story, so supposedly all of this was for the story. He didn't get anything from ChatGPT that he couldn't have gotten from a search engine, a chat room, or Reddit.
He was mentally ill, his feelings were affirmed, and he made a stupid decision that he was clearly in no mental state to make, and it ended with severe consequences. Hopefully some people learn some lessons from that.
The "some people" should be the AI pushers.