Comment on What is a good eli5 analogy for GenAI not "knowing" what they say?

Drummyralf@lemmy.world 6 months ago

I commented something similar on another post, but this is exactly why I find this phenomenon so hard to describe.

A teenager in a new group still has some understanding and a mind. They know the meaning of many of the words that are said. Sure, some catchphrases might be new, but the general topics shouldn’t be too hard to follow.

This is nothing like genAI. GenAI doesn’t know anything at all. It has (simplified) a list of words that are somehow connected to each other. But the AI has no concept of a wheel, of what round is, what rolling is, what rubber is, what an axle is. NO understanding. Just words that happen to describe all of it. For us humans it is very difficult to grasp that something can use language without knowing ANY of the meaning.
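The “list of words that are somehow connected” idea can be made concrete with a toy sketch. This is my own illustration, not how a real LLM works (real models use neural networks over token statistics at vastly larger scale), and the corpus and function names are invented for the demo. A bigram model produces plausible-looking sentences purely from which word followed which, with zero grounding in what any word means:

```python
import random
from collections import defaultdict

# Invented toy corpus: the model only ever sees word adjacency,
# never what a wheel, rubber, or an axle actually is.
corpus = ("the wheel is round and the wheel rolls on the axle "
          "the tire is rubber and the tire rolls on the road").split()

# Record which words were observed to follow each word.
followers = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    followers[a].append(b)

def babble(start, n, rng):
    """Emit up to n words by repeatedly picking a statistically
    plausible next word. No understanding is involved anywhere."""
    out = [start]
    for _ in range(n - 1):
        nxt = followers.get(out[-1])
        if not nxt:
            break  # dead end: no observed follower for this word
        out.append(rng.choice(nxt))
    return " ".join(out)

print(babble("the", 8, random.Random(0)))
```

Every output follows the statistics of the training text, so it tends to read as grammatical English about wheels and tires, yet nothing in the program represents roundness or rolling. Scaling this idea up (with far richer statistics) is, very loosely, the sense in which language can be produced without meaning.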

How can we describe this in a way our brains can accept: language without understanding? The Chinese Room thought experiment comes close, but I think it is quite complicated to explain as well.
