AI medical tools found to downplay symptoms of women, ethnic minorities

110 likes

Submitted 1 day ago by misk@piefed.social to technology@lemmy.zip

https://arstechnica.com/health/2025/09/ai-medical-tools-found-to-downplay-symptoms-of-women-ethnic-minorities/


Comments

  • misk@piefed.social 1 day ago

    Just like real doctors 👀

    • Tb0n3@sh.itjust.works 1 day ago

      It’s funny that anybody would expect models trained on information from current doctors to not have the same blind spots.

  • habitualTartare@lemmy.world 1 day ago

    Biased training data = biased LLM. Who knew?

  • LostWanderer@fedia.io 1 day ago

    Imagine, a hallucination engine mostly developed by white men and trained on data gathered by white men failing to treat symptoms experienced by women and ethnic minorities seriously. Who would have guessed this outcome?!

    • Buelldozer@lemmy.today 1 day ago

      Imagine a hallucination engine being developed globally by white men in China on data gathered by white men in India.

      Wait…what?

  • leftzero@lemmy.dbzer0.com 1 day ago

    Garbage in, garbage out.

    Especially when you shove it into a garbage maker.
