Well it’d certainly benefit the medical industry. They’d be saddling tons of patients with surgeries, chemotherapy, mastectomies, and other treatments, “because doctor-GPT said so.”
But imagine being a patient getting physically and emotionally altered, plunged into irrecoverable debt, distressing your family, and it all being a whoopsy by some black-box software.
cecinestpasunbot@lemmy.ml 3 months ago
It’s a common problem in diagnostics, and it’s why mammograms aren’t recommended for women under 40.
Let’s say you have 10,000 patients. 10 have cancer or a precancerous lesion. Your test may be able to identify all 10 of those patients. However, if it has a false positive rate of 5%, that’s around 500 patients who will now get biopsies and potentially surgery that they don’t actually need. Those follow-up procedures carry their own risks and harms for those 500 patients. In total, that harm may outweigh the benefit of an earlier diagnosis for the 10 patients who actually have cancer.
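The arithmetic in that comment can be sketched in a few lines of Python. The numbers (10,000 patients, 10 true cases, perfect sensitivity, 5% false positive rate) are the hypothetical from the comment above, not figures from any real mammography study:

```python
# Hypothetical screening numbers from the comment above, not real study data.
population = 10_000
true_cases = 10              # patients with cancer or a precancerous lesion
sensitivity = 1.0            # assume the test catches all 10
false_positive_rate = 0.05   # 5% of healthy patients flagged anyway

true_positives = true_cases * sensitivity
false_positives = (population - true_cases) * false_positive_rate

# Positive predictive value: of everyone flagged, how many actually have cancer?
ppv = true_positives / (true_positives + false_positives)

print(f"false positives: {false_positives:.0f}")
print(f"positive predictive value: {ppv:.1%}")
```

Even with a test that never misses a real case, roughly 500 healthy people get flagged and only about 2% of positive results are true positives, which is the base-rate problem the comment is describing.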