I have many conversations with people about Large Language Models like ChatGPT and Copilot. The idea that “it makes convincing sentences, but it doesn’t know what it’s talking about” is difficult to convey or wrap your head around, precisely because the sentences are so convincing.
Any good examples of how to explain this in simple terms?
Tar_alcaran@sh.itjust.works 7 months ago
It’s a really well-trained parrot. It responds to what you say, and then it responds to what it hears itself say.
But despite knowing which sounds go together, based on the sounds it has heard, it doesn’t actually speak English.
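The parrot analogy can be made concrete with a toy sketch. This is not how a real LLM works internally (real models use neural networks over huge corpora), but a deliberately tiny “parrot” that only counts which word tends to follow which, then feeds its own last word back in as the next prompt. The corpus and function names here are made up for illustration.

```python
import random

# Toy "parrot": count which word tends to follow which.
# It has no notion of meaning -- only co-occurrence statistics.
corpus = "the cat sat on the mat and the cat ran".split()

follows = {}
for prev, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(prev, []).append(nxt)

def babble(start, length=6):
    out = [start]
    for _ in range(length):
        # The only input is the parrot's own previous word:
        # it "responds to what it hears itself say".
        choices = follows.get(out[-1])
        if not choices:
            break
        out.append(random.choice(choices))
    return " ".join(out)

print(babble("the"))
```

Every output is locally plausible (each word really did follow the previous one somewhere in the training text), yet the parrot has no idea what a cat or a mat is. Scale the same feedback loop up by many orders of magnitude and the sentences get far more convincing, but the gap between “plausible continuation” and “knowing what it’s talking about” is the same kind of gap.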