ChatGPT is biased against resumes with credentials that imply a disability
Submitted 4 months ago by bot@lemmy.smeargle.fans [bot] to hackernews@lemmy.smeargle.fans
https://www.washington.edu/news/2024/06/21/chatgpt-ai-bias-ableism-disability-resume-cv/
Comments
boredtortoise@lemm.ee 4 months ago
We’ve always lived in a world where resume evaluation is unjust; that’s just how it is. A resume can’t imply anything that could be used against you.
lvxferre@mander.xyz 4 months ago
studies how generative AI can replicate and amplify real-world biases
Emphasis mine. That’s a damn important factor, because deep “learning” models are prone to making human biases worse.
I’m not sure but I think that this is caused by two things:
- It’ll spam the most typical value unless explicitly asked otherwise, even if that value isn’t actually that common.
- It might treat co-dependent (correlated) variables as if they were orthogonal when weighting the output.
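The first point above can be sketched in a few lines. This is a deliberately degenerate toy (all names and numbers are hypothetical, not from the study): if human screeners rejected 70% of resumes in some group, a model that simply predicts the most likely label rejects that group 100% of the time, amplifying a 70/30 human bias into an absolute one.

```python
# Toy sketch of bias amplification: a "model" that always outputs
# the majority label turns a statistical bias into a certainty.
from collections import Counter

def train_majority(labels):
    """Return the single most common label -- a degenerate 'model'."""
    return Counter(labels).most_common(1)[0][0]

# Hypothetical screening history: humans rejected 70 of 100 resumes
# that mention a disability-related credential.
training_labels = ["reject"] * 70 + ["accept"] * 30

model = train_majority(training_labels)
predictions = [model for _ in range(100)]  # apply to 100 new resumes

human_reject_rate = training_labels.count("reject") / 100  # 0.7
model_reject_rate = predictions.count("reject") / 100      # 1.0
print(human_reject_rate, model_reject_rate)
```

Real models aren’t this crude, but the mechanism is the same: anything that pushes the output toward the most probable label exaggerates whatever skew was already in the training data.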
kata1yst@sh.itjust.works 4 months ago
Yet again, sanitization and preparation of training inputs proves to be a much harder problem to solve than techbros think.
SuperCub@sh.itjust.works 4 months ago
I’m curious what companies have been using to screen applications/resumes before ChatGPT. Seems like they already had shitty software.
andrew_bidlaw@sh.itjust.works 4 months ago
Let the underwhelming brain in a jar decide if your disability would make you less efficient at your work.
Deceptichum@sh.itjust.works 4 months ago
People are biased against resumes that imply a disability. ChatGPT is just picking up on that fact and unknowingly copying it.