Comment on AGI achieved 🤖
outhouseperilous@lemmy.dbzer0.com 1 month ago
It doesn’t know things.
It’s a statistical model. It cannot synthesize information or problem-solve, only show you a rough average of its library of inputs graphed by proximity to your input.
jsomae@lemmy.ml 1 month ago
Congrats, you’ve discovered reductionism. The human brain also doesn’t know things, as it’s composed of electrical synapses made of molecules that obey the laws of physics and direct one’s mouth to make words in response to signals that come from the ears.
I’m not saying LLMs don’t know things, but your argument for why they don’t has no merit.
outhouseperilous@lemmy.dbzer0.com 1 month ago
Oh, that’s why everything else you said seemed a bit off.
jsomae@lemmy.ml 1 month ago
sorry, I only have a regular brain, haven’t updated to the metaphysical edition :/