Comment on What is a good eli5 analogy for GenAI not "knowing" what they say?

skillissuer@discuss.tchncs.de 7 months ago

it’s a spicy autocomplete. it doesn’t know anything, doesn’t understand anything, and doesn’t reason, and it won’t stop until your boss decides it’s good enough at your job to justify a “restructuring” (it’s not). any illusion of knowledge comes from the fact that its source material is mostly factual. when you drift into niche topics, or into something missing from the training data entirely, spicy autocomplete does what it does best: it makes shit up. some people call this hallucination, but it’s closer to confidently making shit up without knowing any better. humans do that too, but at least a human usually knows when they’re doing it
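
to make the “spicy autocomplete” point concrete, here’s a toy sketch in python (purely illustrative: real LLMs are neural nets over tokens, not word counts, and everything named here is made up for the example). it’s a bigram counter that picks the next word by weighted sampling, with a temperature knob for the “spice”. the key bit is the fallback: when it hits a word it never saw in training, there is no “i don’t know” path, it just samples something and keeps going

```python
import random
from collections import defaultdict

# Toy "spicy autocomplete" (hypothetical illustration, not a real LLM):
# count which word follows which in the training text, then generate
# by repeatedly sampling a likely next word.

training_text = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(training_text, training_text[1:]):
    counts[prev][nxt] += 1

def next_word(prev, temperature=1.0):
    followers = counts[prev]
    if not followers:
        # Unseen context: there is no "I don't know" option, so fall
        # back to a uniform guess over known words and carry on --
        # the confidently-making-shit-up failure mode.
        followers = {w: 1 for w in counts}
    words = list(followers)
    # Temperature reshapes the distribution: higher = more random
    # ("spicier"), lower = closer to plain argmax autocomplete.
    weights = [followers[w] ** (1.0 / temperature) for w in words]
    return random.choices(words, weights=weights)[0]

random.seed(0)
word, out = "the", ["the"]
for _ in range(8):
    word = next_word(word, temperature=0.8)
    out.append(word)
print(" ".join(out))  # fluent-looking word salad, zero understanding
```

notice that nowhere in that loop does the model check whether the output is *true*; it only checks whether it’s statistically likely to come next. scale that up by a few billion parameters and you get fluent text with the exact same blind spot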
