this post was submitted on 27 Mar 2026
358 points (96.6% liked)

[–] FosterMolasses@leminal.space 2 points 1 day ago (2 children)

this is just an LLM getting stuck in the data patterns stolen from toxic self-help literature.

Honestly, I've found that discussing that sort of thing with ChatGPT often ends up challenging all the self-help grout I've ingested via cultural osmosis throughout the years.

It's easier to make connections when you approach issues with Descartes' "dump out all the apples" method, using a tool that doesn't have social contracts baked into it.

Ironically, I've found at times that a real therapist can be much more of an echo chamber, when they're just regurgitating the same CBT toxic-positivity swill that both of you have been drinking lol

Maybe it's because it's less of an authority, so you can debate more and reach more well-rounded conclusions in the end. Either way, I've been unearthing bits and pieces of maladaptive behaviors and thought patterns I never even realized I had, much less scratched the surface of in proper therapy. It made me kind of angry to realize at first lol, like all that time and money only bought band-aid solutions. But I try to reason that it was probably a good foundation to have first (even if CBT wound up making everything worse later in life, and I essentially had to work backwards to stop classifying certain emotions as wrong or problematic things that required "healthy" coping mechanisms to correct).

[–] Tiresia@slrpnk.net 1 point 8 hours ago

An LLM contains multitudes. It's nice that you can get it into a space where you benefit from it for now - its inevitable enshittification is still in the "attract users by being useful and cheap" phase - but that doesn't contradict it being dangerous for people who don't know how to handle it, or whose input activates the sections of its weights that imitate cults, catfishers, and scammers.

[–] chunes@lemmy.world 1 point 15 hours ago

Don't you worry, they'll have AI therapy locked behind a paywall as soon as the VC money dries up.