Comment on: Yea well it still can't have an existential crisis like humans can! Take that!
AmbiguousProps@lemmy.today 1 week ago
Then why do you bring up code reviews and 500 lines of code? We were not talking about your “simulations” or whatever else you bring up here.
I have no idea what you’re trying to say with your first paragraph. Are you trying to say it’s impossible for it to coincidentally get a correct result? Because that’s literally all it can do. LLMs do not think, they do not reason, they do not understand. They are not capable of that. They are hallucinating all of the time, because that’s how they work.
Eheran@lemmy.world 1 week ago
I never said anything about code reviews.