Comment on We're all a little crazy

MotoAsh@lemmy.world 2 months ago

No, because LLMs are just a mathematical blender with ONE goal in mind: predict the next token well enough to construct a good sentence. They have no thoughts, they have no corrective feedback loop, they just spit out sentences.
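(FWIW, the "blender" really is just a next-token prediction loop run over and over. A toy sketch in Python, using a made-up bigram table instead of a real model, shows the shape of it:

```python
import random

# Toy "language model": probability of the next word given the current one.
# A real LLM is vastly bigger, but the generation loop is the same shape:
# pick a statistically likely next token, append it, repeat. No goals, no thoughts.
bigram_probs = {
    "<s>": [("the", 0.6), ("a", 0.4)],
    "the": [("cat", 0.5), ("dog", 0.5)],
    "a":   [("cat", 0.5), ("dog", 0.5)],
    "cat": [("sat", 0.7), ("ran", 0.3)],
    "dog": [("sat", 0.3), ("ran", 0.7)],
    "sat": [("<e>", 1.0)],
    "ran": [("<e>", 1.0)],
}

def generate(max_tokens=10):
    token = "<s>"
    out = []
    for _ in range(max_tokens):
        # Sample the next token from the model's distribution.
        choices, weights = zip(*bigram_probs[token])
        token = random.choices(choices, weights=weights)[0]
        if token == "<e>":
            break
        out.append(token)
    return " ".join(out)

print(generate())  # e.g. "the cat sat" -- a "good sentence", nothing more
```

The only "goal" anywhere in that loop is picking a plausible next word.)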

You MIGHT get to passing a Turing test with enough feedback tied in, but then the “consciousness” is specifically coming from the systemic complexity at that point, and still very much not from the LLM itself.
