this post was submitted on 01 Nov 2025
363 points (89.7% liked)
Technology
you are viewing a single comment's thread
This was bound to happen. Neural networks are inherently analog processes; simulating them digitally is massively expensive in terms of hardware and power.
The digital domain is good for exact computation; analog is better suited to the approximate computation that neural networks require.
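A toy sketch of the point being made (my own illustration, not from the thread): neural-network inference tolerates the kind of imprecision an analog substrate would introduce. Here a small random MLP is run exactly, then again with ~1% multiplicative noise on its weights, standing in for analog device variation; the outputs barely move.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, w2):
    # Two-layer perceptron with ReLU, computed exactly ("digital").
    return np.maximum(x @ w1, 0.0) @ w2

x = rng.normal(size=(64, 128))
w1 = rng.normal(size=(128, 256)) / np.sqrt(128)
w2 = rng.normal(size=(256, 10)) / np.sqrt(256)

exact = mlp(x, w1, w2)

# "Analog" pass: every weight perturbed by ~1% noise, a crude stand-in
# for the device variation an analog accelerator would exhibit.
noisy = mlp(x,
            w1 * (1 + 0.01 * rng.normal(size=w1.shape)),
            w2 * (1 + 0.01 * rng.normal(size=w2.shape)))

rel_err = np.linalg.norm(noisy - exact) / np.linalg.norm(exact)
print(f"relative output error: {rel_err:.4f}")  # small: the network shrugs it off
```

This is why "approximate computation" is a fair description of inference: the result degrades gracefully with hardware noise, unlike, say, a cryptographic hash, where a single flipped bit is fatal.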
You might benefit from watching Hinton's lecture; much of it details the technical reasons why digital is much better than analog for intelligent systems.
BTW, that is the opposite of what he set out to prove. He says the facts forced him to change his mind.
https://m.youtube.com/watch?v=IkdziSLYzHw
I wish researchers like Hinton would stick to discussing the tech. Any time he says something about linguistics or human intelligence, he sounds like a CS major smugly raising his hand in Phil 101 to a symphony of collective groans.
Hinton is a good computer scientist (with an infinitesimally narrow window of expertise). But the guy is philosophically illiterate.