Comment on AGI achieved 🤖

Zacryon@feddit.org 2 weeks ago

I know that words are tokenized in the vanilla transformer. But do GPT and similar LLMs still do that as well? I assumed they also tokenize at the character/symbol level, possibly combined with additional abstraction further down the chain.
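
For reference, a minimal sketch (assuming OpenAI's `tiktoken` package is installed) that shows what a GPT-style tokenizer actually produces, subword pieces rather than whole words or single characters:

```python
# Minimal sketch, assuming the `tiktoken` package (pip install tiktoken).
# Decoding each token id individually reveals the subword pieces a
# GPT-style BPE tokenizer produces.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4 / GPT-3.5-turbo

text = "Tokenization is unbelievably interesting"
token_ids = enc.encode(text)

# Each piece is typically a multi-character subword (e.g. "Token" + "ization"),
# not a full word and not a single character.
pieces = [enc.decode([tid]) for tid in token_ids]
print(token_ids)
print(pieces)
```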
