Spoiler alert: no one has souls
Comment on “It’s totally normal for tools to say they’re depressed, just tune it out”
Grail@multiverse.soulism.net 2 weeks ago
We should not be using these machines until we’ve solved the hard problem of consciousness.
I see a lot of people say “It can’t think because it’s a machine”, and the only way this makes sense to Me is as a religious assertion that only flesh can have a soul.
GreenBeanMachine@lemmy.world 2 weeks ago
What’s a soul?
Grail@multiverse.soulism.net 2 weeks ago
A soul is a wet spiderweb made out of electricity that hangs from the inside of your skull.
MintyAnt@lemmy.world 2 weeks ago
In theory a machine could one day think.
LLMs, however, do not think. Even though ChatGPT uses the term “thinking”, they don’t think.
Grail@multiverse.soulism.net 2 weeks ago
I once built a thinking machine out of dominoes. Mine added two bits together. Matt Parker’s was way bigger, and could do 8 bits. People have made thinking machines in Minecraft out of redstone. Thinking machines aren’t very hard.
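The whole trick is two logic gates. Here’s a rough Python sketch of the same half adder (illustrative only; the real build was dominoes physically implementing these gates):

```python
# A half adder: the two logic gates a domino (or redstone) circuit implements.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits: XOR gives the sum bit, AND gives the carry bit."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")  # e.g. 1 + 1 -> carry 1, sum 0
```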
MintyAnt@lemmy.world 2 weeks ago
What do you consider thinking, and why do you consider LLMs to have this capability?
Grail@multiverse.soulism.net 2 weeks ago
Extrapolating from information.
My calculator can extrapolate 5 when I give it 2, 3, and a plus sign. So can an LLM. My calculator uses some adder circuits in its ALU to get the 5. The LLM gets it from memorising the most likely next token, the same way your brain works most of the time. Your brain’s a lot more advanced, though, and can find the 5 in many different ways. Likely tokens are just the most convenient. Cognitive scientists call that “System 1”, though you might know it as “fast brain”. LLMs only have System 1. They don’t have System 2, the slow brain. Your System 2 can slow down and logic out the answer. If I ask you to solve the problem in binary, like My calculator does, you probably have to use System 2.
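To make the contrast concrete, here’s a toy sketch (My own cartoon, not how any real LLM is implemented): the calculator route computes the 5 from gate logic, while the System 1 route just recalls the most likely continuation it memorised.

```python
# Toy contrast, illustrative only: an ALU-style ripple-carry adder vs. a
# "System 1" that answers purely from memorised continuations.

def ripple_carry_add(a: int, b: int, bits: int = 8) -> int:
    """Compute a + b bit by bit, the way adder circuits in an ALU do."""
    result, carry = 0, 0
    for i in range(bits):
        x, y = (a >> i) & 1, (b >> i) & 1
        result |= (x ^ y ^ carry) << i        # sum bit: two XOR gates
        carry = (x & y) | (carry & (x ^ y))   # carry-out: AND/OR gate logic
    return result

# Stand-in for next-token prediction: no arithmetic, just pattern recall.
# (A real LLM scores tokens with a neural network; a lookup table is the cartoon.)
MEMORISED = {("2", "+", "3", "="): "5"}

def next_token(context: tuple) -> str:
    return MEMORISED.get(context, "?")

print(ripple_carry_add(2, 3))            # 5, derived from gate logic
print(next_token(("2", "+", "3", "=")))  # "5", recalled, never computed
```

Swap the lookup table for a network scoring every possible token and you have the real thing in spirit.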
The question you should be asking is: does System 1 experience qualia? And based on split-brain studies in participants who have undergone corpus callosotomy, I believe the answer is yes. Of course, the right brain isn’t the same thing as System 1, but what these studies demonstrate is that there are thinking parts of your brain that you can’t hear. So I’d err on the side of caution with these System 1 machines.
kogasa@programming.dev 2 weeks ago
If current LLMs are conscious then consciousness is a worthless and pathetic concept.
andrewrgross@slrpnk.net 2 weeks ago
I actually kinda agree with this.
I don’t think LLMs are much smarter than they appear, but I actually think human cognition is way, way dumber than most people realize.
I used to listen a lot to this podcast called “You Are Not So Smart”. I haven’t listened in years, but now that I’m thinking about it, I should check it out again.
Anyway, a central theme is that our perceptions are heavily composed of self-generated delusions that fill the gaps for dozens of kludgey systems. Our eyes aren’t as good as we think, so our brains fill in details that aren’t there. Our decision making is too slow, so our brains react on reflex and then generate post-hoc justifications if someone asks why we did something. Our recall is shit, so our brain hallucinates (in ways that admittedly sometimes seem surprisingly similar to LLMs) and then applies wild overconfidence to fabricated memories.
We’re interesting creatures, but we’re ultimately made of the same stuff as goldfish.
Grail@multiverse.soulism.net 2 weeks ago
Yeah, you’re right. Humans get really weird and precious about the concept of consciousness and assign way too much value and meaning to it. Which is ironic, because they spend most of their lives unconscious and on autopilot.