Comment on AGI achieved 🤖

jsomae@lemmy.ml 6 days ago

You’re talking about hallucinations. That’s different from tokenization reflection errors. I’m specifically talking about the model’s inability to count how many of a certain letter occur in a word that it can spell correctly. This is not a hallucination per se; at least, it’s caused by a completely different mechanism than whatever causes other factual errors. This specific problem is due to tokenization, and that’s why I say it has little bearing on other shortcomings of LLMs.
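
To make the mechanism concrete, here’s a minimal sketch using OpenAI’s tiktoken library (the library choice and the example word are my illustration, not something from the comment above): the model receives subword token IDs rather than letters, so counting letters means recalling a spelling it never directly sees.

```python
# Minimal sketch, assuming tiktoken is installed (`pip install tiktoken`).
import tiktoken

# cl100k_base is the encoding used by GPT-4-era models.
enc = tiktoken.get_encoding("cl100k_base")

word = "strawberry"
token_ids = enc.encode(word)

# Decode each token ID separately to reveal the chunks the model actually
# operates on. The exact split is vocabulary-dependent; the point is that
# the chunks are multi-character units, so counting the r's requires the
# model to reconstruct a spelling it was never shown letter by letter.
chunks = [enc.decode([tid]) for tid in token_ids]
print(token_ids)  # a short list of integer IDs
print(chunks)     # multi-character subword strings, not single letters
```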
