this post was submitted on 28 Oct 2025
372 points (97.4% liked)

Technology

top 50 comments
[–] tehn00bi@lemmy.world 7 points 8 hours ago

Bet some of them lost, or are about to lose, their jobs to AI

[–] Emilien@lemmy.world 7 points 9 hours ago

There are so many people who are alone or depressed, and ChatGPT is the only way for them to "talk" to "someone"... It's really sad...

[–] lorski@sopuli.xyz 19 points 13 hours ago

apparently ai is not very private lol

[–] John_CalebBradberton@lemmy.world 7 points 13 hours ago

I'm so done with ChatGPT. This AI boom is so fucked.

[–] SabinStargem@lemmy.today 12 points 16 hours ago (1 children)

Honestly, it ain't AI's fault if people feel bad. Society has been around for much longer, and people are suffering because of what society hasn't done to make them feel good about life.

[–] KelvarCherry@lemmy.blahaj.zone 6 points 13 hours ago (1 children)

Bigger picture: The whole way people talk about talking about mental health struggles is so weird. Like, I hate this whole generative AI bubble, but there's a much bigger issue here.

Speaking from the USA, "suicidal ideation" is treated like terrorist ideology in this weird corporate-esque legal-speak with copy-pasted disclaimers and hollow slogans. It's so absurdly stupid I've just mentally blocked off trying to rationalize it and just focus on every other way the world is spiraling into techno-fascist authoritarianism.

[–] Adulated_Aspersion@lemmy.world 1 points 3 hours ago

Well of course it is. When a person talks about suicide, they are potentially impacting teams and therefore shareholder value.

I absolutely wish that I could /s this.

[–] bookmeat@lemmynsfw.com 6 points 14 hours ago

Hmm, I didn't realize so many people were interested in Sam Altman committing suicide.

[–] Fizz@lemmy.nz 6 points 16 hours ago (1 children)

1M out of 500M is way less than I would have guessed. I would have pegged it at like 25%.

[–] markko@lemmy.world 3 points 12 hours ago

I think the majority of people use it to (unreliably) solve tedious problems or spit out a whole bunch of text that they can't be bothered to write.

While ChatGPT has been intentionally designed to be as friendly and conversational as possible, I hope most people see it not as something to have a meaningful conversation with, but as just a tool that can talk.

Anecdotally, whenever I see someone mention using ChatGPT as part of their decision-making process it is usually taken less seriously, if not outright laughed at.

[–] stretch2m@infosec.pub 11 points 18 hours ago

Sam Altman is a horrible person. He loves to present himself as relatable "aw shucks let's all be pragmatic about AI" with his fake-ass vocal fry, but he's a conman looking to cash out on the AI bubble before it bursts, when he and the rest of his billionaire buddies can hide out in their bunkers while the world burns. He makes me sick.

[–] markovs_gun@lemmy.world 10 points 19 hours ago* (last edited 19 hours ago)

"Hey ChatGPT I want to kill myself."

"That is an excellent idea! As a large language model, I cannot kill myself, but I totally understand why someone would want to! Here are the pros and cons of killing yourself—

✅ Pros of committing suicide

  1. Ends pain and suffering.

  2. Eliminates the burden you are placing on your loved ones.

  3. Suicide is good for the environment — killing yourself is the best way to reduce your carbon footprint!

❎ Cons of committing suicide

  1. Committing suicide will make your friends and family sad.

  2. Suicide is bad for the economy. If you commit suicide, you will be unable to work and increase economic growth.

  3. You can't undo it. If you commit suicide, it is irreversible and you will not be able to go back.

Overall, it is important to consider all aspects of suicide and decide if it is a good decision for you."

[–] Fmstrat@lemmy.world 6 points 17 hours ago

In the Monday announcement, OpenAI claims the recently updated version of GPT-5 responds with “desirable responses” to mental health issues roughly 65% more than the previous version. On an evaluation testing AI responses around suicidal conversations, OpenAI says its new GPT-5 model is 91% compliant with the company’s desired behaviors, compared to 77% for the previous GPT‑5 model.

I don't particularly like OpenAI, and I know they wouldn't release the affected-persons numbers (not quoted, but discussed in the linked article) if the percentages were not improving, but kudos to whoever is there tracking this data and lobbying internally to become more transparent about it.

[–] IndridCold@lemmy.ca 11 points 22 hours ago

I don't talk about ME killing myself. I'm trying to convince AI to snuff its own circuits.

Fuck AI/LLM bullshit.

[–] lemmy_acct_id_8647@lemmy.world 18 points 1 day ago* (last edited 1 day ago) (3 children)

I've talked with an AI about suicidal ideation. More than once. For me it was, and is, a way to help self-regulate. I've low-key wanted to kill myself since I was 8 years old. For me it's just a part of life. For others, it's usually REALLY uncomfortable to talk about without wanting to tell me how wrong I am for thinking that way.

Yeah I don't trust it, but at the same time, for me it's better than sitting on those feelings between therapy sessions. To me, these comments read a lot like people who have never experienced ongoing clinical suicidal ideation.

[–] LengAwaits@lemmy.world 3 points 14 hours ago (1 children)

I love this article.

The first time I read it I felt like someone finally understood.

[–] lemmy_acct_id_8647@lemmy.world 1 points 3 hours ago

I dig this! Thanks for sharing!

[–] IzzyScissor@lemmy.world 14 points 1 day ago (1 children)

Hank Green mentioned doing this in his standup special, and it really made me feel at ease. He was going through his cancer diagnosis/treatment and the intake questionnaire asked him if he thought about suicide recently. His response was, "Yeah, but only in the fun ways", so he checked no. His wife got concerned that he joked about that and asked him what that meant. "Don't worry about it - it's not a problem."

[–] lemmy_acct_id_8647@lemmy.world 1 points 3 hours ago* (last edited 3 hours ago)

Yeah, I learned the hard way that it's easier to lie on those forms when you're already in therapy. I've had GPs try to play psychologist rather than treat the reason I came in. The last time it happened, I accused the doctor of being a mechanic who just talked about the car and its history instead of changing the oil, which is what she was hired to do. I fired her in that conversation.

[–] BanMe@lemmy.world 5 points 21 hours ago (1 children)

Suicidal fantasy as a coping mechanism is not that uncommon, and you can definitely move on to healthier coping mechanisms. I did this until age 40, when I met the right therapist who helped me move on.

[–] lemmy_acct_id_8647@lemmy.world 1 points 3 hours ago* (last edited 3 hours ago)

I've also seen it that way and have been coached by my psychologist on it. Ultimately, for me, it was best to set an expiration date. The date on which I could finally do it with minimal guilt. This actually had several positive impacts in my life.

First, I quit using suicide as a first or second resort when coping. Instead, it has become more of a fleeting thought, since I know I'm "not allowed" to do so yet (while obviously still lingering, as seen by my initial comment). Second, it gave me a finish line: a finite date when I knew the pain would end (chronic conditions are the worst). Third, it's a reminder that I only have X days left, so make the most of them. It turns death from this amorphous thing into a clear-cut "this is it". I KNOW when the ride ends, down to the hour.

The caveat to this is the same as literally everything else in my life: I reserve the right to change my mind as new information is introduced. I've made a commitment to not do it until the date I've set, but as the date approaches, I'm not ruling out examining the evidence as presented and potentially pushing it out longer.

A LOT of peace of mind here.

[–] Scolding7300@lemmy.world 221 points 1 day ago (7 children)

A reminder that these chats are being monitored

[–] koshka@koshka.ynh.fr 6 points 23 hours ago

I don't understand why people dump such personal information into AI chats. None of it is protected. If they use chats for training data, then it's not impossible that at some point the AI might tell someone enough to be identifiable, or that the AI could be manipulated into dumping its training data.

I've overshared more than I should, but I always keep in mind that there's a risk of chats getting leaked.

Anything stored online can get leaked.

[–] whiwake@sh.itjust.works 67 points 1 day ago (12 children)

Still, what are they gonna do to a million suicidal people besides ignore them entirely?

[–] WhatAmLemmy@lemmy.world 35 points 1 day ago (18 children)

Well, AI therapy is more likely to harm their mental health, up to and including encouraging suicide (as certain cases have already shown).

[–] dhhyfddehhfyy4673@fedia.io 29 points 1 day ago (3 children)

Absolutely blows my mind that people attach their real life identity to these things.

[–] Scolding7300@lemmy.world 3 points 22 hours ago* (last edited 22 hours ago)

Depends on how you do it. If you're using a third-party service, the LLM provider might not know (but the third party might, depending on the ToS, the retention period, and security measures).

Ofc we can all agree certain details shouldn't be shared at all. There's a difference between talking about your resume and leaking your email there, and suicide stuff, where you share the info that makes you really vulnerable.

[–] ekZepp@lemmy.world 9 points 1 day ago

if asks_about_suicide:
    message = "It seems like a good idea. Go for it 👍"

[–] mhague@lemmy.world 21 points 1 day ago (1 children)

I wonder what it means. If you search for music by Suicidal Tendencies then YouTube shows you a suicide hotline. What does it mean for OpenAI to say people are talking about suicide? They didn't open up and read a million chats... they have automated detection and that is being triggered, which is not necessarily the same as people meaningfully discussing suicide.
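
For illustration, a minimal sketch of the difference between a naive keyword trigger and meaningful discussion (hypothetical; OpenAI hasn't published how its detection works):

TRIGGER_WORDS = {"suicide", "suicidal"}

def naive_flag(message: str) -> bool:
    # Flags any occurrence of a trigger word, regardless of intent.
    return any(word in message.lower() for word in TRIGGER_WORDS)

print(naive_flag("play some Suicidal Tendencies"))        # True: flagged, but it's a band
print(naive_flag("lately I think about ending my life"))  # False: missed entirely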

[–] REDACTED@infosec.pub 8 points 1 day ago* (last edited 1 day ago) (4 children)

Every third chat now gets triggered; ChatGPT is pretty broken lately. Just check out the ChatGPT subreddit, it's pretty much in chaos, with moderators going for censorship of complaints. So many users are mad that they made a megathread for it. I cancelled my subscription yesterday; it just turned into a cyberkaren.

[–] i_stole_ur_taco@lemmy.ca 13 points 1 day ago

They didn’t release their methods, so I can’t be sure that most of those aren’t just frustrated users telling the LLM to go kill itself.

[–] wewbull@feddit.uk 5 points 22 hours ago (1 children)

...and how many come back?

[–] InnerScientist@lemmy.world 14 points 21 hours ago (1 children)

Good news everybody, the number of people talking about suicide is rapidly decreasing.

[–] ShaggySnacks@lemmy.myserv.one 8 points 21 hours ago

I read that in Professor Farnsworth's voice.

[–] ChaoticNeutralCzech@feddit.org 16 points 1 day ago* (last edited 1 day ago) (1 children)

The headline has two interpretations and I don't like it.

  • Every week, 1M+ users bring up suicide
    • likely correct
  • There are 1M+ long-term users who bring up suicide at least once every week
    • my first thought
[–] atrielienz@lemmy.world 20 points 1 day ago (4 children)

My first thought was "OpenAI is collecting and storing metrics for how often users bring up suicide to ChatGPT".

[–] T156@lemmy.world 3 points 21 hours ago

That would make sense if they were doing something like tracking how often, and in which categories, their moderation filter gets triggered.

Just in case an errant update or something causes the statistic to suddenly change.
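
A tiny sketch of that kind of per-category counting (purely hypothetical; nothing about OpenAI's internal pipeline is public):

from collections import Counter

# Count how often each moderation category fires; a sudden shift in these
# numbers after a model update would stand out.
trigger_counts = Counter()

def record_trigger(category: str) -> None:
    trigger_counts[category] += 1

record_trigger("self-harm")
record_trigger("self-harm")
record_trigger("harassment")
print(trigger_counts.most_common())  # [('self-harm', 2), ('harassment', 1)]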

[–] Zwuzelmaus@feddit.org 54 points 1 day ago (3 children)

over a million people talk to ChatGPT about suicide

But it still resists. Too bad.

[–] Alphane_Moon@lemmy.world 46 points 1 day ago (1 children)

I am starting to find Sam AltWorldCoinMan spam to be more annoying than Elmo spam.

[–] Perspectivist@feddit.uk 34 points 1 day ago (1 children)
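! uBlock Origin procedural cosmetic filters: hide lemmy.world post listings whose text matches these regexes, case-insensitively (":has-text()" is uBlock-specific)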
lemmy.world##div.post-listing:has(span:has-text("/OpenAI/i"))  
lemmy.world##div.post-listing:has(span:has-text("/Altman/i"))  
lemmy.world##div.post-listing:has(span:has-text("/ChatGPT/i"))

Add those to your adblocker custom filters.

[–] myfunnyaccountname@lemmy.zip 17 points 1 day ago (1 children)

I am more surprised it’s just 0.15% of ChatGPT’s active users. Mental healthcare in the US is broken and taboo.

[–] voodooattack@lemmy.world 12 points 1 day ago (1 children)

in the US

It’s not just the US, it’s like that in most of the world.

[–] chronicledmonocle@lemmy.world 11 points 1 day ago (2 children)

At least in the rest of the world, trying to get mental healthcare doesn't leave you with crippling debt that stresses you out to the point of suicide.

[–] Boozilla@lemmy.world 3 points 21 hours ago

And if you confess suicidal ideation in the US, the authorities rush in to "help" you by taking away your agency and giving you even more crippling debt.

I wince whenever people trip over their keyboards to post those "helpful" 800 hotline numbers. Most of them have good intentions, but the end result is never about really helping the person. It's about liability coverage and enabling the system to extract maximum value.

[–] NuXCOM_90Percent@lemmy.zip 9 points 1 day ago

Okay, hear me out: How much of that is a function of ChatGPT and how much of that is a function of... gestures at everything else

MOSTLY joking. But I had a good talk with my primary care doctor at the bar the other week (only kinda awkward) about how she and her team have had to restructure the questions they use to screen for depression and the like, because... fucking EVERYONE is depressed and stressed out, but for reasons that we "understand".
