Comment on [deleted]

Quatity_Control@lemm.ee 8 months ago
Not really. The LLMs use tokens instead of actual words to understand the words. There’s a layer of disassociation. That’s different from taking pre-existing knowledge, understanding it, and using it to divine more knowledge.

macallik@kbin.social 8 months ago
The layer of disassociation is present w/ humans speaking different languages too though, right? My point is that once we can understand each other, we are all building on what already exists.

Quatity_Control@lemm.ee 8 months ago
Language is not disassociation.
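For anyone curious what "tokens instead of actual words" means in practice: a minimal, toy sketch of subword tokenization (not any real LLM's tokenizer; the vocabulary and greedy longest-match splitting here are made up for illustration). The model only ever sees the integer IDs on the last lines, never the words themselves:

```python
# Toy subword vocabulary: hypothetical, for illustration only.
vocab = {"under": 0, "stand": 1, "ing": 2, "token": 3, "s": 4}

def tokenize(word, vocab):
    """Greedy longest-match subword split, returning integer token IDs."""
    ids = []
    i = 0
    while i < len(word):
        # Try the longest remaining prefix first, shrinking until a match.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab:
                ids.append(vocab[piece])
                i = j
                break
        else:
            raise ValueError(f"no token covers {word[i:]!r}")
    return ids

print(tokenize("understanding", vocab))  # [0, 1, 2]
print(tokenize("tokens", vocab))         # [3, 4]
```

Real tokenizers (BPE and friends) learn their vocabularies from data and split on bytes rather than letters, but the upshot is the same: "understanding" becomes something like three IDs, which is the layer of disassociation being argued about here.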
rufus@discuss.tchncs.de 8 months ago
Yeah. We’re all embedded in a context and in a world surrounding us. So are computer language models.
slazer2au@lemmy.world 8 months ago
RFC 1925, truth 6a ("It is always possible to add another level of indirection") at it again