Sadly I have to agree. While people on Lemmy are generally much nicer, there are some really extreme views here that are heavily detached from reality.
I’ve probably had more heavy downvotes and arguments on Lemmy in 9 months than I had on Reddit in over 15 years. The highlight recently was me discussing how expert systems are used with LLMs, given that I’m a software engineer who works in AI at a big tech company for a living. Nope, I’m wrong, LLMs aren’t real AI, downvotes… Pair this with me questioning customer data access rules in big tech, which ended with someone arguing against my view of something I literally helped build and telling me to “open source it to prove it”.
Barbarian@sh.itjust.works 4 months ago
I think that’s mostly a semantics issue. When people talk about AI here on Lemmy, they generally mean AGI. LLMs are not AGI, as far as I understand it.
EnderMB@lemmy.world 4 months ago
They’re absolutely not. Where most people on Lemmy are wrong is in saying that LLMs just parrot back trained text. The reality is that an LLM’s action plan will likely call out to expert APIs to provide valuable context. These days an LLM is really an orchestration platform: it passes ambiguities to an expert system that knows how to answer or ask for clarification.
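Roughly what I mean, as a very hand-wavy sketch (every name here, `call_llm`, `EXPERT_APIS`, the expert functions, is a made-up placeholder rather than any real vendor API):

```python
# Minimal sketch of the "LLM as orchestrator" idea above: the model drafts an
# action plan, and specialised or ambiguous parts get routed to expert APIs.
# All names are hypothetical placeholders, not a specific provider's API.

def call_llm(prompt: str) -> str:
    """Stand-in for the real model call; swap in your provider's client."""
    return "none"  # demo stub: a real model would return a plan or an answer


def weather_expert(query: str) -> str:
    """Placeholder expert API, e.g. a structured weather service."""
    return "18C, light rain"


def units_expert(query: str) -> str:
    """Placeholder expert API for exact unit conversions."""
    return "1 mile = 1.609 km"


EXPERT_APIS = {
    "weather": weather_expert,
    "units": units_expert,
}


def orchestrate(user_query: str) -> str:
    # 1. Ask the LLM for an action plan: which expert, if any, to consult.
    plan = call_llm(
        f"Query: {user_query}\n"
        f"Available experts: {sorted(EXPERT_APIS)}\n"
        "Reply with one expert name, or 'none' to answer directly."
    ).strip().lower()

    # 2. If the plan names an expert, call it and keep its answer as context.
    context = EXPERT_APIS[plan](user_query) if plan in EXPERT_APIS else ""

    # 3. Final call: the LLM composes the reply, grounded in the expert output.
    return call_llm(
        f"Query: {user_query}\nExpert context: {context}\nAnswer the user:"
    )


if __name__ == "__main__":
    print(orchestrate("Will it rain in Leeds tomorrow?"))
```

The point being that the text-prediction part is only one piece; the value comes from routing to systems that actually know things.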
We’re decades away from “AGI”, if we ever get there at all.
laughterlaughter@lemmy.world 4 months ago
The irony is that I’ve been antagonized just for looking for examples of AI that aren’t LLMs. As in: “lol LLMs are AI!!!”, to which I have to reply “yeah, I know that, but what other types are being used out there?”, only to get “just shut up with your anti-LLM stuff!”