Comment on What is a good eli5 analogy for GenAI not "knowing" what they say?

BCOVertigo@lemmy.world ⁨1⁩ ⁨month⁩ ago

It’s valid to point out that we have difficulty defining knowledge, but the output from these machines is inconsistent at a conceptual level, and you can easily get them to contradict themselves in the spirit of being helpful.

If someone told you that a wheel can be made entirely of gas, would you have confidence that they have a firm grasp of a wheel’s purpose? Tool use is a widely agreed-upon marker of intelligence, so failing to grasp the purpose of a thing they can describe at great length and in exhaustive detail, while also making boldly incorrect claims on occasion, should raise an eyebrow.
