this post was submitted on 16 Dec 2025
212 points (94.5% liked)
Technology
What a load of bullshit. LLMs will be used in a million ways to sideline neurodivergent people in society, whether it's BS AI "help" replacing a human teacher for a neurodivergent student, or employers using AI to illegally screen and filter neurodivergent people out of job applications. This is a bad decade for neurodivergent people, and it is likely only to get worse as societies collapse into bigotry under the endless stresses and catastrophes of runaway climate change.
Right, there was legal pressure on the inputs of decision-making, to make it more egalitarian or whatever, and by other criteria too. So what happens is full obfuscation of the inputs, in the form of LLMs.
Philosophically this is correct, in my opinion: trees should be judged by their fruit.
A simplified comparison is British vs Prussian military philosophy. In Prussia, when evaluating an officer's performance, they'd judge his decision-making process and its inputs, even if the result was catastrophic, while in the British army and navy they'd judge only the result, no matter how sound the decision-making. That has often been called unjust and not nuanced enough, but historically one approach lost and the other won, for a reason. Judging inputs has more failure points, and it causes degeneracy long-term.
A bit like how every metric used as a KPI ceases to be a useful metric (Goodhart's law); it's a commonly quoted MBA rule, except MBAs are generally not smart enough to remember it.
The alternative to this is responsibility for everything that happens downstream, no matter which inputs you get. In exchange, you are allowed to have any decision-making process at all; you just pay for it in full if something goes wrong.
We are being pressed by evolution (including technical progress) to adopt that approach, and that's good, but it'll probably take lots of wars and revolutions. People who hide malice behind formally correct inputs do resist. And they do hold power.
Instead of inputs, you should treat any social mechanism as a black box, and both limit and judge its outputs. If they fall outside the limits, discard and punish. If they fall inside the limits, evaluate and bill, in prison years or fines or both. Or reward.
You never know all the inputs anyway, and you can't tell whether they are correct.
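The rule described above (hard limits on outputs, then billing or rewarding within them, never inspecting the process) can be sketched as a toy function. Everything here is hypothetical illustration: the names, the flat penalty, and the proportional billing are made up to make the black-box idea concrete, not a claim about how such a mechanism would really be priced.

```python
from dataclasses import dataclass

@dataclass
class Verdict:
    accepted: bool
    penalty: float  # positive = fine/punishment, negative = reward

def judge_output(outcome: float, lower: float, upper: float) -> Verdict:
    """Judge a decision-maker purely by its outcome, ignoring all inputs.

    Outside the hard limits: discard and punish at a flat rate.
    Inside the limits: bill or reward in proportion to the outcome.
    """
    if outcome < lower or outcome > upper:
        # Output breached the hard limits: discard and punish.
        return Verdict(accepted=False, penalty=100.0)
    # Inside the limits: negative outcomes are billed, positive ones rewarded.
    return Verdict(accepted=True, penalty=-outcome)

# The judge never sees the decision process, only its result:
print(judge_output(5.0, lower=-10, upper=10))    # within limits, rewarded
print(judge_output(-20.0, lower=-10, upper=10))  # outside limits, punished
```

Note that the decision-making process never appears as a parameter: that is the whole point of the black-box stance being argued for.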