In a nutshell, voltage incompatibility is generally more damaging than current mismatch, typically in a frightening or energetic manner. Many American tourists find this out when they bring their 120v AC hairdryers to an overseas hotel with 230v AC power. If there is only room for one number to be emblazoned on an outlet or plug, it should be the rated voltage, first and foremost.
For current protection, we've had thermal fuses since the 1890s, and thermo-magnetic circuit breakers since the 1940s. There are even fancier transistor-based current protections available for industrial equipment that can shut off extremely fast. In a sense, protection against over-current has basically been solved, at least in scenarios where there's enough risk to humans or property to justify it.
Whereas voltage mix-ups still happen, although consumer electronics are now moving toward automatic voltage detection (eg an 18v electric drill battery charger refusing to charge a 12v battery) and actively negotiated power parameters (eg USB PD). And even without human error, under- and over-voltage transients still happen in residential and commercial environments, leading to either instant damage or long-term product degradation (eg in domestic refrigerator motor drive circuits).
It should be noted that a current-starvation scenario, such as when an ebike is current-limited per regulations, does not generally cause a spike in voltage. Whereas in a voltage-starvation situation, constant-power loads such as motor drives and switching converters will indeed try to draw more current to compensate (a purely resistive load simply draws less). Hence why current protection is almost always built in rather than left to chance.
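As a rough illustration of that compensation, here's a small Python sketch with made-up numbers (not measurements from any real appliance) comparing how a constant-power load and a plain resistive load respond to a sagging supply:

```python
# Illustrative comparison (made-up numbers): under a sagging supply, a
# constant-power load draws MORE current to compensate, while a purely
# resistive load simply draws less.
NOMINAL_V = 230.0   # nominal mains voltage, volts (illustrative)
SAG_V = 190.0       # brown-out voltage, volts (illustrative)
POWER_W = 150.0     # constant-power load, e.g. a motor drive circuit
RESISTANCE = NOMINAL_V ** 2 / POWER_W   # resistive load sized for the same nominal power

for v in (NOMINAL_V, SAG_V):
    i_const_power = POWER_W / v      # I = P / V rises as V drops
    i_resistive = v / RESISTANCE     # I = V / R falls as V drops
    print(f"{v:5.0f} V supply: constant-power load draws {i_const_power:.2f} A, "
          f"resistive load draws {i_resistive:.2f} A")
```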
For an example of where constant current sources are used -- and IMO, deeply necessary -- we can look to the humble LED driver circuit. LEDs are fickle devices, on account of their very sharp voltage-current curve, which also changes with operating temperature and is not always consistent from the factory. As a practical matter, the current through an LED is what predominantly controls the brightness, so a constant current source will provide very steady illumination. If instead an LED were driven from a constant voltage source, that voltage would need to be exceedingly stable, since being off by even a few tens of millivolts can destroy some LEDs through over-current and/or over-heating.
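To get a feel for how touchy that is, here's a rough Python sketch using a toy exponential I-V model; the nominal operating point and ideality factor are illustrative assumptions, not any real LED's datasheet values. It shows how a few tens of millivolts of over-voltage multiplies the current:

```python
import math

# Toy exponential model of an LED's forward I-V curve around its operating
# point: I(V) ~= I_NOM * exp((V - V_NOM) / N_VT).  Parameters are made-up
# illustrative values, not datasheet numbers.
V_NOM = 3.00           # forward voltage at the rated current, volts (illustrative)
I_NOM = 0.020          # rated current at V_NOM, 20 mA (illustrative)
N_VT = 2.0 * 0.02585   # ideality factor times thermal voltage at ~300 K, volts

def led_current(v_forward: float) -> float:
    """Approximate LED current (A) at a given forward voltage."""
    return I_NOM * math.exp((v_forward - V_NOM) / N_VT)

for dv_mv in (0, 25, 50, 100):
    i = led_current(V_NOM + dv_mv / 1000.0)
    print(f"+{dv_mv:3d} mV over nominal -> {i * 1000:6.1f} mA "
          f"({i / I_NOM:.1f}x the rated current)")
```

With these assumed numbers, a mere 50 mV of over-voltage more than doubles the current, which is exactly why driving the current directly is the sane approach.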
For cheap appliances, some designs will use a simple resistor circuit to set the LED current (a sizing sketch follows below), and this may be acceptable provided that the current is nowhere near overdriving the LED. Think of small indicator LEDs that aren't that bright anyway. Whereas for expensive industrial LED projectors, it would be foolish not to have an appropriately designed current source, among other protective features.
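For reference, sizing that series resistor is just Ohm's law across the resistor. A back-of-the-envelope sketch using illustrative numbers rather than any particular datasheet:

```python
# Back-of-the-envelope sizing of a series ("ballast") resistor for a small
# indicator LED; supply voltage, forward voltage and target current are
# illustrative values, not taken from any particular part.
V_SUPPLY = 5.0     # supply rail, volts
V_FORWARD = 2.0    # LED forward voltage drop at the target current, volts
I_TARGET = 0.005   # 5 mA -- dim indicator, well below the LED's maximum

r_ohms = (V_SUPPLY - V_FORWARD) / I_TARGET    # Ohm's law across the resistor
p_watts = (V_SUPPLY - V_FORWARD) * I_TARGET   # power dissipated in the resistor

print(f"Series resistor: {r_ohms:.0f} ohm, dissipating {p_watts * 1000:.0f} mW")
# The resistor also softens the LED's sensitivity to supply variation: a 10%
# rise in V_SUPPLY only adds roughly (0.1 * V_SUPPLY) / r_ohms of extra
# current (under a millamp here), rather than multiplying it.
```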