Predicting the next word and predicting a word in the middle (then filling in backwards) aren't hugely different things. Either way, it's predicting parts of the passage based solely on other parts of the passage.
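To make that concrete, here's a toy sketch in plain Python (the token list, `<mask>` marker, and function names are all made up for illustration, not any real model's training code). It shows that the two objectives differ only in which positions are visible as context, not in what is being predicted:

```python
# Toy sketch: both objectives predict a token from other tokens
# in the same passage; only the visible context differs.

tokens = ["the", "cat", "sat", "on", "the", "mat"]

def next_token_context(tokens, i):
    """Causal LM objective: predict tokens[i] from everything to its left."""
    return tokens[:i]

def fill_in_middle_context(tokens, i):
    """Infilling objective: predict tokens[i] from both sides."""
    return tokens[:i] + ["<mask>"] + tokens[i + 1:]

i = 3  # target word: "on"
print("causal context:   ", next_token_context(tokens, i))
print("infilling context:", fill_in_middle_context(tokens, i))
# In both cases the prediction is conditioned solely on other
# parts of the same passage.
```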
Compare that to a human, who forms an abstract thought and then translates that thought into words. Which words I use has little to do with which other words I've used, except to make sure I'm following the rules of grammar.
Interesting that...
Yeah, but I think this is still the same, just not a single language. It might think in some mix of languages (which you can actually see sometimes if you push certain LLMs to their limit and they start producing mixed-language responses).
But it still has limitations because of the structure of language. This is actually a thing humans have as well: the limiting of abstract thought through internal-monologue thinking.
Probably, given that LLMs only exist in the domain of language. Still, it's interesting that they seem to have a "conceptual" system that is commonly shared between languages.