AI medical tools found to downplay symptoms of women, ethnic minorities
Submitted 1 day ago by misk@piefed.social to technology@lemmy.zip
Comments
habitualTartare@lemmy.world 1 day ago
Biased training data = biased LLM. Who knew?
LostWanderer@fedia.io 1 day ago
Imagine: a hallucination engine mostly developed by white men and trained on data gathered by white men failing to take symptoms experienced by women and ethnic minorities seriously. Who would've guessed this outcome?!
Buelldozer@lemmy.today 1 day ago
Imagine a hallucination engine being developed globally by white men in China on data gathered by white men in India.
Wait…what?
leftzero@lemmy.dbzer0.com 1 day ago
Garbage in, garbage out.
Especially when you shove it into a garbage maker.
misk@piefed.social 1 day ago
Just like real doctors 👀
Tb0n3@sh.itjust.works 1 day ago
It’s funny that anybody would expect models trained on information from current doctors to not have the same blind spots.