Actually, as to your edit, it sounds like you're fine-tuning the model on your data, not training it from scratch. So the LLM has already seen English and Chinese during its initial training. Also, these models represent words as vectors, and what usually happens is that similar words' vectors end up close together. So substituting e.g. Dad for Papa looks almost the same to an LLM. The same holds across languages. But that's not understanding, that's behavior that much simpler models also have.
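Rough sketch of what "close together" means, with made-up 4-dimensional vectors (real embeddings are learned during pretraining and have hundreds or thousands of dimensions, but the geometry works the same way):

```python
import numpy as np

def cosine_similarity(a, b):
    # Angle-based similarity: ~1.0 = nearly same direction, lower = less related
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical toy embeddings, hand-picked for illustration only
embeddings = {
    "dad":   np.array([0.81, 0.10, 0.45, 0.05]),
    "papa":  np.array([0.79, 0.12, 0.48, 0.07]),  # near-duplicate of "dad"
    "爸爸":  np.array([0.80, 0.11, 0.46, 0.06]),  # Chinese for "dad", also close
    "table": np.array([0.05, 0.90, 0.02, 0.60]),  # unrelated concept, far away
}

print(cosine_similarity(embeddings["dad"], embeddings["papa"]))   # ~0.999
print(cosine_similarity(embeddings["dad"], embeddings["爸爸"]))   # ~0.999
print(cosine_similarity(embeddings["dad"], embeddings["table"]))  # ~0.17
```

To the model, swapping "dad" for "papa" (or its Chinese equivalent) barely moves the input at all, which is why the substitution looks transparent without any "understanding" being involved.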
JustTesting
Sort of. Having a system that allows multiple parties, like in many European countries, certainly helps with representation and discourse. But looking at Europe, it clearly doesn't prevent a right-wing drift toward authoritarianism. There are sooo many other things needed for a healthy democracy, like education/literacy, strong independent institutions, unions, etc.
You can still end up with two right-wing parties, an extreme one and a moderate one, with the moderate one catering to the extreme party's positions (and being moderate in name only), and both of them forming a majority government and drifting toward authoritarianism, even when there are many parties.