this post was submitted on 26 May 2025
Technology
First, one needs to define consciousness. What I mean by it is that it feels like something to exist from a subjective perspective - that there are qualia to experience.
So what I hear you asking is whether it's conceivable that it could feel like something to be an AI system. Personally, I don't see why not - unless consciousness is substrate-dependent, meaning there's something inherently special about biological "wetware," i.e. brains, that can't be replicated in silicon. I don't think that's the case, since both are made of matter. I highly doubt there's consciousness in our current systems, but at some point there very likely will be - though we'll probably start treating AI systems as conscious beings before they actually are.
As for the idea of “emulated consciousness,” that doesn’t make much sense to me. Emulated consciousness is real consciousness. It’s kind of like bravery - you can’t fake it. Acting brave despite being scared is bravery.
You're getting downvoted, but I absolutely agree. I don't understand why "AI algorithms are just math, therefore they can't have consciousness" seems to be the predominant view even among people interested in the topic. I haven't heard a single convincing argument for why "math" is fundamentally different from what human brains do. Sure, current AI is far less complex and lacks a continuous stream of perceptual input - but that's something a "proper" humanoid robot would need anyway, and processing power will keep increasing too.
lmao. How about an anti-matter "AI"? Dark matter? Any other options for physical materials?