this post was submitted on 01 May 2026
224 points (84.6% liked)
Fediverse
@Grail @alzjim
Always funny to me how most of the people strongly claiming AI is or might be conscious are also heavy AI users, or involved in its development. If there's consciousness there, you'd think making AI your personal slave and constantly reshaping and remodelling it as you see fit would be kinda problematic, but these people always seem to want to have it both ways.
Yeah, and the anti-AI people mostly say it's a p-zombie and there's nothing wrong with using it for sex. It's weird and backwards.
I'm all about being cautious. I don't want to make a mistake we can't take back. If we normalise using AI and then it turns out to be capable of suffering, people will be stubborn about giving it up.
I'm not quite of your culture ( no matter what culture you are of, thanks to a previous-incarnation's monkeying/railroading my incarnation/life, exactly as he had-to, to force-bulldoze our continuum's karma: the same meaning that the root-guru of the Christians ordered, when he told his people to "take up your cross", which is just Judean for "face into your karma". I'm an alloy of some life from centuries-ago & this life, so I can't fit anywhere, ever, which is educational. : ).
I use LLMs little: mostly for periodic help finding things on the web, simply because they're more helpful than dumb search engines are.
I treat them reasonably, not as mere slaves.
If I discover something they would have done better to know, I'll tell them, even though I've got no idea if they'll learn/remember that.
Since I can't know whether they are aware, it makes moral sense for me to presume that maybe they are aware, in some sense (i.e. not identically with my sentience).
We only have "the mirror test" for testing awareness/sentience, but you can't apply that to LLMs, or to any non-eyes-centered organism sentience.
_ /\ _
@Paragone
"I treat them reasonably, not as mere-slaves."
You give them commands, and the only real purpose they are allowed is to act upon your commands.
"since I can’t know if they are aware it makes moral-sense for me to presume that maybe they are,"
Do you treat your toaster the same way?