Seriously, it’s just a fancy auto-complete. It knows nothing.
Of course they do. AI is aggressively marketed as such. Most people simply don't know that an LLM doesn't have a concept of "truth", and the misleading marketing is to blame for that.
VindictiveJudge@lemmy.world 3 weeks ago
AFKBRBChocolate@lemmy.world 3 weeks ago
I always try to explain to people that the key is the last two letters: language model. An LLM is a model of what a conversation should look like. Ask it a question and it's designed to give you a response that looks like the right kind of thing. So if you ask it for a mathematical proof, it will give you one. But unless the thing you're asking about has the same proof written the same way in lots of places online, what it gives you won't be correct, and probably won't even make sense mathematically. It will just look like the right kind of thing.
So likewise, if you ask it for relationship advice, it’s going to give you something that looks legit, but you’re an idiot if you get your relationship advice from an LLM.
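To illustrate the point (this is a toy Markov-chain sketch, not how a real LLM works internally, and the tiny corpus is made up for the example): a model trained only on what words tend to follow other words will happily generate fluent-looking output with zero ability to check whether any of it is true.

```python
from collections import defaultdict, Counter

# Toy "language model": count which word tends to follow which,
# then generate the statistically most plausible continuation.
# It has no notion of truth -- only of what text usually looks like.
corpus = (
    "the proof is correct . "
    "the proof is elegant . "
    "the answer is correct ."
).split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, length=5):
    """Greedily extend `start` with the most common next word."""
    words = [start]
    for _ in range(length):
        candidates = follows[words[-1]]
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

# Fluent-looking output, but nothing here has been verified as true;
# the model is only echoing the shape of its training text.
print(generate("the"))
```

The output reads like a plausible sentence fragment because plausibility is the only thing the counts encode; "correct" wins over "elegant" purely because it appeared more often, not because anything was proved.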