Comment on LLMs’ “simulated reasoning” abilities are a “brittle mirage,” researchers find
teawrecks@sopuli.xyz 4 days ago
The analogy I use is, it’s like a magician pulled a coin from behind a CEO’s ear, and their response was “that’s incredible! Free money! Let’s go into business together!”
Literally no one ever claimed it had reasoning capabilities. It is a trick to produce a string of characters that your brain can make sense of. That’s all.
anachronist@midwest.social 4 days ago
Altman and similar grifters were, and still are, absolutely making those claims. Maybe we're just excusing them as obvious liars?
TehPers@beehaw.org 4 days ago
They are obvious liars. Some people are just too invested to see it.
These models only have reasoning capabilities under the most obscure definitions of “reasoning”. At best, all they’re doing is climbing to local maxima with their so-called “reasoning” on a graph as wavy as the ocean.
I’ve mentioned this on other posts, but it’s really sad, because LLMs have been wildly useful for certain NLP operations. They are just that, though: not AGI or whatever snake oil Altman wants to sell this week.
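The "local maxima on a wavy graph" point above is a standard optimization failure mode, and a tiny sketch makes it concrete. The objective function `f` and the greedy `hill_climb` helper below are hypothetical illustrations (not anything from the linked research): a greedy climber started on one bump gets stuck at that bump's peak, even when a taller peak exists elsewhere.

```python
import math

def f(x):
    # A "wavy" objective: many local maxima; the global maximum sits near x ~ 0.31.
    return math.sin(5 * x) - 0.1 * x * x

def hill_climb(x, step=0.01, iters=10_000):
    """Greedy hill-climbing: move to a neighbor only if it scores higher."""
    for _ in range(iters):
        left, right = f(x - step), f(x + step)
        best = max(left, f(x), right)
        if best == f(x):
            return x  # stuck: neither neighbor improves the score
        x = x - step if left == best else x + step
    return x

# Starting on a far bump climbs to that bump's (inferior) peak;
# starting near the tallest bump finds the better one.
x_local = hill_climb(2.0)
x_global = hill_climb(0.2)
```

Here `hill_climb(2.0)` settles near x ≈ 1.56 with a lower score than `hill_climb(0.2)` near x ≈ 0.31, which is the metaphor's point: greedy local improvement can look like progress while converging somewhere mediocre.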
teawrecks@sopuli.xyz 4 days ago
The CEOs you’re talking about are the CEOs in the analogy.