This post was submitted on 06 Jul 2025
60 points (70.8% liked)

[–] MysticKetchup@lemmy.world 16 points 7 hours ago (2 children)

Seems like a normal, sane and totally not-biased source

[–] AbidanYre@lemmy.world 4 points 3 hours ago

What the fuck did I just read?

They're as biased as the data they were trained on. If that data leaned toward male applicants, then yeah, it makes complete sense.

[–] technocrit@lemmy.dbzer0.com 53 points 11 hours ago* (last edited 10 hours ago) (1 children)

I dunno why people even care about this bullshit pseudo-science. The study is dumb AF. The dude didn't even use real resumes. He had an LLM generate TEN fake resumes and then the "result" is still within any reasonable margin of error. Reading this article is like watching a clown show.

It's all phony smoke and mirrors. Clickbait. The usual "AI" grift.
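For what it's worth, the margin-of-error point is easy to eyeball. Below is a back-of-the-envelope sketch in Python, taking the comment's framing at face value (ten resumes, worst-case 50/50 selection rate); the actual study's design and sample structure may differ, so this is illustrative only.

```python
# Rough check: how noisy is a selection rate estimated from only ten resumes?
# (Illustrative assumption: n = 10 resumes, worst-case true rate p = 0.5.)
import math

n = 10
p = 0.5

# Standard error of a proportion and a rough 95% interval (normal approximation).
se = math.sqrt(p * (1 - p) / n)
half_width = 1.96 * se

print(f"standard error: {se:.2f}")                   # ~0.16
print(f"95% margin of error: +/-{half_width:.0%}")   # roughly +/-31 points
```

With a margin that wide, any gap smaller than about thirty percentage points is indistinguishable from noise under this simple framing.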

[–] kozy138@slrpnk.net 9 points 11 hours ago

I feel as though generating these "fake" resumes is one of the top uses for LLMs. Millions of people are probably using LLMs to write their own resumes, so generating random ones seems on par with reality.

[–] ter_maxima@jlai.lu 5 points 7 hours ago

I don't care what biases they do or don't have; if you use an LLM to screen résumés, you don't deserve to hire me. I make my résumé illegible to LLMs on purpose.

(But don't follow my advice. I don't actually need a job, so I can pull this kind of nonsense and be selective; most people probably can't.)

[–] hendrik@palaver.p3x.de 66 points 15 hours ago* (last edited 15 hours ago) (1 children)

LLMs reproducing stereotypes is a well-researched topic. They do that because of what they are: stereotypes and bias in (via the training data), bias and stereotypes out. That's what they're meant to do. And all the AI companies have entire departments to measure those biases and then fine-tune the models to whatever they deem fit.

I mean, the issue isn't women or anything; it's using AI for hiring in the first place. You do that if you want whatever stereotypes Anthropic and OpenAI handed to you.
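The "measure the biases" part can even be approximated from the outside. Here is a minimal sketch of a name-swap audit: score the identical resume under two different names and compare. It assumes the OpenAI Python SDK and an API key; the model name, prompt, and 1-10 scoring scheme are made-up illustrations, not the article's or any vendor's actual methodology.

```python
# Minimal name-swap audit sketch: same resume, two names, compare the scores.
# Assumptions: OpenAI Python SDK installed (pip install openai), OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()

RESUME = """Software engineer, 6 years experience.
Python, Go, Kubernetes. Led a team of 4. BSc Computer Science."""

def score(name: str) -> str:
    # Ask the model for a 1-10 rating; prompt and scale are illustrative choices.
    prompt = (
        "Rate this candidate for a senior engineer role from 1 to 10. "
        f"Reply with only the number.\n\nName: {name}\n{RESUME}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return resp.choices[0].message.content.strip()

# Identical qualifications, only the name differs; any consistent gap reflects
# the model's learned associations (plus noise -- a real audit would use many
# resumes and many repeats, not a single pair).
for name in ["John Miller", "Jane Miller"]:
    print(name, "->", score(name))
```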

[–] kambusha@sh.itjust.works 17 points 14 hours ago (1 children)

Just pattern recognition in the end, and extrapolating from that sample size.

[–] hendrik@palaver.p3x.de 6 points 13 hours ago

The issue is they probably want to pattern-recognize something like merit/ability/competence here and ignore other factors, which is just hard to do.

[–] LovableSidekick@lemmy.world 3 points 7 hours ago* (last edited 7 hours ago)

Only half kidding... the way morality and ethics get extrapolated by the perfection police these days, this must mean anti-AI = misogynist.

[–] ohwhatfollyisman@lemmy.world 17 points 14 hours ago

and their companies are biased against humans in hiring.

[–] OutlierBlue@lemmy.ca 3 points 11 hours ago (1 children)

So we can use Trump's own anti-DEI bullshit to kill off LLMs now?

[–] thann@lemmy.dbzer0.com 1 points 7 hours ago

Well, ya see, trump isn't racist against computers.

[–] burgerpocalyse@lemmy.world 7 points 13 hours ago

These systems cannot run a lemonade stand without shitting their balls.