Comment on “ChatGPT’s hallucination problem is getting worse according to OpenAI’s own tests and nobody understands why”
homesweethomeMrL@lemmy.world 1 year ago
*raises hand*
Because it never “understood” what any “word” ever “meant” anyway?
Yeah, it’s all hallucinations - it’s just that sometimes the hallucinations manage to approximate correctness, and it can’t tell one from the other.
geekwithsoul@lemm.ee 1 year ago
> Yeah, it’s all hallucinations - it’s just that sometimes the hallucinations manage to approximate correctness, and it can’t tell one from the other.