this post was submitted on 26 May 2025
-87 points (14.6% liked)

Technology


Do you think AI is, or could become, conscious?

I think AI might one day emulate consciousness to a high level of accuracy, but that wouldn't mean it would actually be conscious.

This article mentions a Google engineer who "argued that AI chatbots could feel things and potentially suffer". But surely in order to "feel things" you would need a nervous system, right? When you feel pain from touching something very hot, it's your nerves that are sending those pain signals to your brain... right?

[–] MagicShel@lemmy.zip 4 points 1 week ago (1 children)

I said on paper. They are just algorithms. When silicon can emulate meat, it's probably time to reevaluate that.

[–] amelia@feddit.org 1 points 1 week ago* (last edited 1 week ago) (1 children)

You talk like you know what the requirements for consciousness are. How do you know? As far as I know that's an unsolved philosophical and scientific problem. We don't even know what consciousness really is in the first place. It could just be an illusion.

[–] MagicShel@lemmy.zip 1 points 1 week ago* (last edited 1 week ago) (1 children)

I have a set of attributes that I associate with consciousness. We can disagree in part, but if your definition is so broad as to include math formulas, there isn't even common ground for us to discuss it.

If you want to say contemplation/awareness of self isn't part of it, then fine: I'm not as precious about an ant-like consciousness as I would be about a human-like perception of self, and people can debate what ethical obligations we have to one once we can achieve even that, but we aren't there yet. LLMs are nothing but a process of transforming input to output. I think consciousness requires rather more than that, or we wind up with erosion being considered a candidate for consciousness.
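The "transforming input to output" point can be made concrete with a toy sketch. This is a hypothetical stand-in, not a real model: `toy_llm` and its weights are invented for illustration. The idea it shows is that an LLM forward pass is a pure function of its inputs and fixed weights, with no state surviving between calls.

```python
def toy_llm(tokens: list[int], weights: dict[int, int]) -> list[int]:
    """A toy stand-in for a model's forward pass: a pure function.

    Each output depends only on the input tokens and the fixed
    weights; nothing is remembered after the call returns.
    """
    return [weights.get(t, 0) for t in tokens]

weights = {1: 10, 2: 20}
print(toy_llm([1, 2, 3], weights))  # [10, 20, 0]
print(toy_llm([1, 2, 3], weights))  # same input, same output, every time
```

Real LLMs are vastly larger and sample stochastically at inference time, but the forward computation itself is the same kind of thing: a fixed mapping from input to output, which is the commenter's point.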

So I'm not the authority, but if we don't adhere to some reasonable layman's definition, it quickly gets into weird wankery that I don't see any value in exploring.

[–] amelia@feddit.org 1 points 6 days ago* (last edited 6 days ago)

AI isn't just math formulas, though. AI is a complex dynamic system reacting to external input. There is no fundamental difference here from a human brain in that regard, imo. It's just that the processing isn't happening in biological tissue but in silicon. Is it way less complex than a human? Sure. Is there a fundamental qualitative difference? I don't think so. What's the qualitative difference in your opinion?