Comment on What is a good eli5 analogy for GenAI not "knowing" what they say?

Asifall@lemmy.world 1 month ago

I always thought the Chinese Room argument was kinda silly. It’s predicated on the idea that humans have some unique capacity to understand the world that can’t be replicated by a purely syntactic system, but no attempt is made to actually define what that capacity is.
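For what it’s worth, the “syntactic system” in the thought experiment is easy to make concrete: a program that matches incoming symbols against a rulebook and copies out the paired response, with no representation of meaning anywhere. A minimal Python sketch (the rulebook entries here are made up purely for illustration):

```python
# Toy "Chinese Room": a purely syntactic system that maps input symbols to
# output symbols by rule. Nothing in here represents meaning.
# (Illustrative sketch only; the rulebook contents are invented for this example.)

RULEBOOK = {
    "你好吗": "我很好，谢谢",        # "How are you?" -> "I'm fine, thanks"
    "你叫什么名字": "我叫小房间",    # "What's your name?" -> "I'm called Little Room"
}

def operator(symbols: str) -> str:
    """Match the incoming symbols against the rulebook and copy out the response.

    The operator never parses, translates, or understands anything;
    it only compares shapes and returns whatever the rulebook pairs with them.
    """
    return RULEBOOK.get(symbols, "请再说一遍")  # fallback: "Please say that again"

if __name__ == "__main__":
    print(operator("你好吗"))  # fluent-looking reply, zero comprehension
```

Searle’s claim is that running rules like these produces the right outputs without any understanding; the objection above is that the argument never says what the missing ingredient actually is.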

The whole argument depends on our intuition that we think and know things in a way inanimate objects don’t. In other words, it’s circular: it draws the conclusion that computers can’t think from the premise that computers can’t think.
