Comment on Have LLMs killed all future programming languages?

tym@lemmy.world 1 week ago

ITT: People not understanding how LLMs are trained. They tokenize words and phrases (assign them serial numbers to index), learn the relationships and distances between those tokens, and mimic the most common outcomes in their training data.

It’s not magic, it’s a parrot.
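To make the point concrete, here's a toy sketch of that "mimic the most common outcome" idea. This is not how any real LLM works internally (actual models use subword tokenizers and learned vector embeddings, not raw counts), but it shows the same pipeline in miniature: words become integer token IDs, the model tallies which token follows which, and "prediction" just echoes the most frequent continuation.

```python
from collections import Counter, defaultdict

# Toy corpus; a real model trains on trillions of tokens.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Step 1: tokenize -- assign each word a serial number (token ID).
vocab = {}
for word in corpus:
    vocab.setdefault(word, len(vocab))
ids = [vocab[w] for w in corpus]

# Step 2: learn relationships -- count how often each token
# follows another (bigram statistics stand in for embeddings).
follows = defaultdict(Counter)
for a, b in zip(ids, ids[1:]):
    follows[a][b] += 1

# Step 3: "predict" -- parrot the most common continuation of "the".
inv = {i: w for w, i in vocab.items()}
next_id = follows[vocab["the"]].most_common(1)[0][0]
print(inv[next_id])  # prints "cat": it follows "the" most often here
```

The parrot never understands "cat"; it just knows which serial number tends to come after which.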
