Comment on We're all a little crazy
MotoAsh@lemmy.world 7 months ago
No, because LLMs are just a mathematical blender with ONE goal in mind: construct a good sentence. They have no thoughts, they have no corrective motion, they just spit out sentences.
You MIGHT get to passing a Turing test with enough feedback tied in, but then the “consciousness” is coming specifically from that systemic complexity, and still very much not from the LLM itself.
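(To make the “one goal” point concrete, here is a minimal sketch of the loop an autoregressive LLM runs: predict a next token, append it, repeat. GPT-2 via Hugging Face transformers and greedy decoding are used purely as illustrative stand-ins, not anything anyone in this thread specified.)

```python
# Minimal sketch: an autoregressive LM's whole job is to pick the next token
# from its learned distribution, append it, and do it again.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # illustrative model choice
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("LLMs have no thoughts, they", return_tensors="pt").input_ids
for _ in range(20):                                  # generate 20 tokens, one at a time
    logits = model(ids).logits[:, -1, :]             # scores for the next token only
    next_id = logits.argmax(dim=-1, keepdim=True)    # greedy: take the most likely token
    ids = torch.cat([ids, next_id], dim=-1)          # append and repeat

print(tokenizer.decode(ids[0]))
```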
Hotzilla@sopuli.xyz 7 months ago
In my opinion you are giving way too much credit to human beings. We are mainly just machines that spit out sentences.
MotoAsh@lemmy.world 7 months ago
No, you are giving too much credit to LLMs. Thinking LLMs are capable of sentience is as stupid as thinking individual neurons could learn physics.
maynarkh@feddit.nl 7 months ago
So you’re saying it’s not good enough for a sentient personality, but it might be good enough for an average politician?
MotoAsh@lemmy.world 7 months ago
Oh, if we’re talking about what it takes to replace politicians, technology has been capable of that for years.
nxdefiant@startrek.website 7 months ago
Hell, maybe even above average if the model can update itself in real time.