Comments on: What is a good eli5 analogy for GenAI not "knowing" what they say?
Drummyralf@lemmy.world 7 months ago
I commented something similar on another post, but this is exactly why I find this phenomenon so hard to describe.
A teenager in a new group still has some understanding and has a mind. They know the meaning of most of the words being said. Sure, some catchphrases might be new, but general topics shouldn’t be too hard to follow.
This is nothing like genAI. GenAI doesn’t know anything at all. It has (simplified) a list of words that are somehow connected to each other. But the AI has no concept of a wheel, of what round is, what rolling is, what rubber is, what an axle is. NO understanding. Just words that happen to describe all of it. For us humans it is so difficult to grasp that something can use language without knowing ANY of the meaning.
How can we describe this so our brains accept that you can have language without understanding? The Chinese Room thought experiment comes close, but I think it is quite complicated to explain as well.
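One way to make the "words connected to each other, with no meaning attached" idea concrete is a toy bigram model. This is a deliberately simplified sketch (real LLMs use neural networks over learned token embeddings, not raw word-pair counts, and the tiny corpus here is invented for illustration), but it shows the core point: the program produces grammatical-looking text purely from which word tends to follow which, while storing nothing about what a wheel, rubber, or an axle actually is.

```python
import random
from collections import defaultdict

# A tiny made-up corpus. The model will only ever see word order,
# never the meaning of any word.
corpus = (
    "the wheel is round and the wheel can roll "
    "the wheel is made of rubber and turns on an axle"
).split()

# For each word, record every word that ever followed it.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length, seed=0):
    """Emit text by repeatedly picking a word that has followed
    the current word in the corpus. No semantics involved."""
    random.seed(seed)
    word, out = start, [start]
    for _ in range(length):
        if word not in follows:
            break  # dead end: this word never had a successor
        word = random.choice(follows[word])
        out.append(word)
    return " ".join(out)

print(generate("the", 8))
```

The output reads like plausible English about wheels, yet the program "understands" wheels exactly as much as a spreadsheet of word-pair counts does, which is the intuition the comment above is reaching for.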
> How can we describe this so our brains make sense that you can have language without understanding?
I think it is really impossible to describe in easy and limited words.
> NO understanding. Just words that happened to describe all of it.
If being able to describe it does not mean understanding, then what is understanding?
Joel Haver has a sketch in which one person in a group laughs at an inside joke from a trip they didn’t go on. When pressed, I think they say something like they laughed because everyone else was. As someone who has been in this situation, it’s true. Even though I don’t understand the specific reference being made, it’s usually delivered in a funny manner, so the storytelling is enjoyable and humorous. Or I’m able to use context clues to guess what they might be joking about, and it’s funny even if my guess is off.
Zos_Kia@lemmynsfw.com 7 months ago
I think a flaw in this line of reasoning is that it assigns a magical property to the concept of knowing. Do humans know anything? Or do they just infer meaning from identifying patterns in words? Ultimately this question is a spiritual question and does not hold any water in a scientific conversation.
BCOVertigo@lemmy.world 7 months ago
It’s valid to point out that we have difficulty defining knowledge, but the output from these machines is inconsistent at a conceptual level, and you can easily get them to contradict themselves in the spirit of being helpful.
If someone told you that a wheel can be made entirely of gas, would you have confidence that they have a firm grasp of a wheel’s purpose? Tool use is a widely agreed-upon marker of intelligence, so failing to grasp the purpose of a thing they can describe at great length and in exhaustive detail, while also making boldly incorrect claims on occasion, should raise an eyebrow.