this post was submitted on 29 Jun 2025
511 points (95.7% liked)

Technology


A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

(page 3) 50 comments
[–] poopkins@lemmy.world 11 points 3 days ago (6 children)

Funny, I was just reading comments in another thread where people with mental health problems were proclaiming how terrific it is. Especially concerning is that they found value in the recommendations LLMs make and were "trying those out." One commenter described themselves as "neuro diverse" and was acting on "advice" from generated LLM responses.

And for something like depression, this is deeply bad advice. I feel somewhat qualified to weigh in as somebody who has struggled severely with depression and managed to get through it with the support of a very capable therapist. There's a tremendous amount of depth and context to somebody's mental condition, and understanding it takes deliberate probing, not stringing together words until they form sentences that mimic human interaction.

Let's not forget that an LLM will not be able to raise alarm bells, read medical records, write prescriptions, or work with other medical professionals. Another thing people often forget is that LLMs have finite context windows and cannot, by design, keep a detailed "memory" of everything that's been discussed.
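To illustrate that last point, here is a minimal sketch (not any vendor's actual API) of why chat "memory" is bounded: once a transcript exceeds the model's token budget, a client has to silently drop the oldest turns. Token counts are crudely approximated by word count here; real systems use a proper tokenizer.

```python
def trim_history(messages, budget):
    """Keep the most recent messages whose combined (approximate)
    token count fits within `budget`; everything older is forgotten."""
    kept, total = [], 0
    for msg in reversed(messages):      # walk newest-first
        cost = len(msg.split())         # crude stand-in for a tokenizer
        if total + cost > budget:
            break                       # older context is discarded
        kept.append(msg)
        total += cost
    return list(reversed(kept))         # restore chronological order

history = [
    "I have been feeling low since March",   # early detail the model loses
    "my sleep is getting worse",
    "work has been stressful lately",
    "I skipped meals again today",
]
print(trim_history(history, budget=15))
# → ['my sleep is getting worse', 'work has been stressful lately', 'I skipped meals again today']
```

The detail a therapist would anchor on ("feeling low since March") is exactly what falls off the front of the window first.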

It's effectively self-treatment with more steps.

[–] whalebiologist@lemmy.world 7 points 3 days ago (1 children)

LLM will not be able to raise alarm bells

this is supposedly the "benefit" LLM therapy would provide if it worked. The reality is that it doesn't, but it serves as proof of concept that there is a need for anonymous therapy. Therapy in the USA is only for people with socially acceptable illnesses. People rightfully live in fear of being labeled untreatable, a danger to themselves and others, and then at best dropped from therapy and at worst institutionalized.

yep, almost nobody wants to be committed to a psych ward without consent

[–] Blackmist@feddit.uk 13 points 4 days ago (5 children)

This thing has been trained on social media. Is that really wise?

[–] Geodad@lemmy.world 22 points 4 days ago

Some people would rather talk to something they know is fake than to a person who may or may not be.

[–] HugeNerd@lemmy.ca 2 points 2 days ago

Buy more. Buy more now.
