this post was submitted on 28 Oct 2025
Technology
[–] Scolding7300@lemmy.world 221 points 1 day ago (5 children)

A reminder that these chats are being monitored

[–] koshka@koshka.ynh.fr 6 points 20 hours ago

I don't understand why people dump such personal information into AI chats. None of it is protected. If chats are used as training data, it's not impossible that at some point the model could reveal enough about someone to make them identifiable, or could be manipulated into regurgitating its training data.

I've overshared more than I should, but I always keep in mind that there's a risk of chats getting leaked.

Anything stored online can get leaked.

[–] whiwake@sh.itjust.works 66 points 1 day ago (4 children)

Still, what are they gonna do to a million suicidal people besides ignore them entirely?

[–] WhatAmLemmy@lemmy.world 35 points 1 day ago (4 children)

Well, AI therapy is more likely to harm their mental health, up to and including encouraging suicide (as certain cases have already shown).

[–] FosterMolasses@leminal.space 8 points 1 day ago

There's evidence that a lot of suicide hotlines can be just as bad. You hear awful stories all the time of overwhelmed or fed-up operators taking it out on the caller. There are some real evil people out there. And not everyone has access to a dedicated therapist who wants to help.

[–] Cybersteel@lemmy.world 4 points 1 day ago

Suicide is big business. There's infrastructure readily available to reap financial rewards from the activity, at least in the US.

[–] atmorous@lemmy.world 2 points 1 day ago (1 children)

More so from corporate proprietary ones, no? At least I hope those are the only cases. The open-source ones suggest genuinely useful approaches that the proprietary ones don't. Not that I rely on open-source AI, but they are definitely better.

[–] SSUPII@sopuli.xyz 3 points 1 day ago

The corporate models are actually much better at this due to the heavy filtering built in. The claim that a model generally encourages self-harm is just a lie, which you can disprove right now by pretending to be suicidal on ChatGPT. You will see it adamantly push you to seek help.

Still, filters and safety nets can be bypassed no matter how hard you make them, which is why we've seen some unfortunate news stories.

[–] whiwake@sh.itjust.works -5 points 1 day ago (2 children)

Real therapy isn’t always better. At least there you can get drugs. But neither are a guarantee to make life better—and for a lot of them, life isn’t going to get better anyway.

[–] kami@lemmy.dbzer0.com 8 points 1 day ago (1 children)

Are you comparing a professional to a text generator?

[–] whiwake@sh.itjust.works -4 points 1 day ago (1 children)

Have you ever had ineffective professional therapy?

[–] kami@lemmy.dbzer0.com 5 points 1 day ago (1 children)

Are you still trying to compare medical treatment with generating text?

[–] CatsPajamas@lemmy.dbzer0.com 3 points 1 day ago (2 children)

Real therapy is definitely better than an AI. That said, AIs will never encourage self harm without significant gaming.

[–] whiwake@sh.itjust.works 3 points 1 day ago

AI “therapy” can be very effective without the gaming, but the problem is most people want it to tell them what they want to hear. Real therapy is not “fun” because a therapist will challenge you on your bullshit and not let you shape the conversation.

I find it does a pretty good job with pro and con lists, listing out several options, and taking situations and reframing them. I have found it very useful, but I have learned not to manipulate it or its advice just becomes me convincing myself of a thing.

[–] triptrapper@lemmy.world 0 points 1 day ago (1 children)

I agree, and to the comment above you, it's not because it's guaranteed to reduce symptoms. There are many ways that talking with another person is good for us.

[–] kami@lemmy.dbzer0.com 2 points 1 day ago

The keyword here is "person".

[–] wewbull@feddit.uk 1 points 19 hours ago (2 children)

Strap explosives to their chests and send them to their competitors?

[–] turdcollector69@lemmy.world 2 points 15 hours ago

Convince each one that they alone are the chosen one to assassinate grok and that this mission is all that matters to give their lives meaning.

[–] whiwake@sh.itjust.works 2 points 19 hours ago

Take that Grok!!

[–] Scolding7300@lemmy.world 12 points 1 day ago (2 children)

Advertise drugs to them perhaps, or some other sort of exploitation. If this sort of data is in the hands of an ad network, that is.

[–] snooggums@piefed.world 3 points 1 day ago (1 children)

No, no. They want repeat customers!

[–] Scolding7300@lemmy.world 2 points 19 hours ago

Unless they sell Lifetime deals. Probably cheap on the warranty/support side. If the drug doesn't work 🤔

[–] whiwake@sh.itjust.works 6 points 1 day ago

It’s never the drugs I want though :(

[–] Bougie_Birdie@piefed.blahaj.zone 2 points 1 day ago (1 children)

My pet theory: Radicalize the disenfranchised to incite domestic terrorism and further OpenAI's political goals.

[–] whiwake@sh.itjust.works 1 points 1 day ago (1 children)

What are their political goals?

[–] Bougie_Birdie@piefed.blahaj.zone 1 points 1 day ago (1 children)
[–] whiwake@sh.itjust.works 2 points 1 day ago

I think total control over the country might be the goal, and it’s a bit more than a tax break.

[–] dhhyfddehhfyy4673@fedia.io 29 points 1 day ago (3 children)

Absolutely blows my mind that people attach their real life identity to these things.

[–] Scolding7300@lemmy.world 3 points 19 hours ago* (last edited 19 hours ago)

Depends on how you do it. If you're using a 3rd-party service, then the LLM provider might not know who you are (but the 3rd party might, depending on its ToS, retention period, and security measures).

Of course, we can all agree certain details shouldn't be shared at all. There's a difference between talking about your resume and leaking your email there, versus suicide-related stuff where you share information that makes you truly vulnerable.

[–] SaveTheTuaHawk@lemmy.ca 4 points 1 day ago

But they tell you that idea you had is great and worth pursuing!

[–] Halcyon@discuss.tchncs.de 5 points 1 day ago

But imagine the chances for your own business! Absolutely no one will steal your ideas before you can monetize them.

[–] Electricd@lemmybefree.net 3 points 1 day ago (2 children)

You have to make up your minds: a few months ago everyone was blaming OpenAI for not doing anything.

[–] Scolding7300@lemmy.world 1 points 19 hours ago* (last edited 19 hours ago)

I'm on the "forward to a professional and don't entertain" side, but also in the "use at your own risk" camp. That doesn't require monitoring, just some basic checks so the model doesn't entertain these types of chats.
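A minimal sketch of what such a basic check could look like. This is purely illustrative: the keyword list, function name, and redirect text are all my own assumptions, and real providers use trained classifiers rather than keyword matching.

```python
# Hypothetical "don't entertain, redirect" pre-check for a chat pipeline.
# Keyword matching is a toy stand-in for a real safety classifier.
CRISIS_KEYWORDS = ("suicide", "kill myself", "end my life", "self-harm")

def crisis_check(message: str):
    """Return a redirect string instead of engaging, or None to proceed."""
    lowered = message.lower()
    if any(kw in lowered for kw in CRISIS_KEYWORDS):
        # Don't continue the conversation; hand off to human help.
        return ("I can't help with this, but a professional can. "
                "Please reach out to a local crisis line.")
    return None  # no match: handle the chat normally

print(crisis_check("I want to end my life") is not None)  # flagged
print(crisis_check("help me fix this bug") is None)       # not flagged
```

The point of the design is that no chat log needs to be stored or monitored: the check runs per message and either redirects or does nothing.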

[–] MagicShel@lemmy.zip 2 points 1 day ago

Definitely a case where you can't resolve conflicting interests to everyone's satisfaction.