Comment on But Claude said tumor!

rho50@lemmy.nz 8 months ago
I know of at least one other case in my social network where GPT-4 identified a gas bubble in someone’s large bowel as “likely to be an aggressive malignancy,” leading said person to fully expect they’d be dead by July, when in fact they were perfectly healthy.
These things are not ready for primetime, and certainly not capable of doing the stuff that most people think they are.
The misinformation is causing real harm.

B0rax@feddit.de 8 months ago
To be honest, it was never made to diagnose medical scans, and it is not supposed to. There are dedicated AIs trained exactly for that purpose, and they are usually not public.

rho50@lemmy.nz 8 months ago
Exactly. So the organisations creating and serving these models need to be clearer about the fact that they’re not general-purpose intelligence; they’re contextual language generators.
I’ve seen demos of models used as actual diagnostic aids, and they’re not LLMs (and they still require a doctor to verify the result).
JohnEdwa@sopuli.xyz 8 months ago
This is nothing but a modern spin on “Hey internet, what’s wrong with me?” “WebMD: it’s cancer, you’ll be dead in a week.”