Comment on What is a good eli5 analogy for GenAI not "knowing" what they say?

relevants@feddit.de 1 month ago

Ok, I actually like this description a lot; it's a quick and effective way to explain the effects of having no backtracking. A lot of the answers here are either too reductive or too technical to actually make this behavior understandable to a layperson. "It just predicts the next word" is easy to forget when the thing is so easy to anthropomorphize subconsciously.
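
To make the "no backtracking" point concrete, here is a minimal Python sketch of a toy word-by-word generator. The probability table and words are made up for illustration; real LLMs predict over tokens with a neural network, not a lookup table, but the commit-and-move-on loop is the same idea: each choice is final, so a step that looked fine locally can still steer the sentence somewhere wrong, and nothing ever goes back to fix it.

```python
import random

# Hypothetical toy "language model": given the previous word, a
# probability distribution over the next word. Purely illustrative.
BIGRAMS = {
    "<start>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "moon": 0.5},
    "a": {"dog": 1.0},
    "cat": {"sat": 0.7, "landing": 0.3},
    "moon": {"landing": 0.8, "sat": 0.2},
    "dog": {"sat": 1.0},
    "sat": {"<end>": 1.0},
    "landing": {"<end>": 1.0},
}

def generate(seed):
    """Emit one word at a time. Every choice is committed immediately:
    there is no mechanism to revisit an earlier word, even if the
    sentence it leads to ends up nonsensical."""
    rng = random.Random(seed)
    word, sentence = "<start>", []
    while word != "<end>":
        options = BIGRAMS[word]
        # Sample the next word from the current distribution and move on.
        word = rng.choices(list(options), weights=list(options.values()))[0]
        if word != "<end>":
            sentence.append(word)
    return " ".join(sentence)

# Each step is locally plausible, but a chain like "the cat landing"
# can come out, because no step ever looks back at the whole sentence.
for seed in range(5):
    print(generate(seed))
```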
