The foundational idea behind what the user is talking about is called panpsychism: the idea that consciousness or awareness is actually a fundamental quality of the universe, like fields or forces. It’s in everything, but only complex systems have actual thoughts.
The theory(?) states that even a single electron or proton has a state of awareness, but without any functional way to remember information or think, it’s just some kind of flash of experience, like suddenly developing perpetual amnesia about literally everything… while hurtling through the universe at high speed.
I get the concept, but I don’t get the usefulness of it. It feels too close to people wishing The Force was real.
Guys. You are not getting your light sabers this way.
m_f@discuss.online 3 days ago
What does “aware” mean, or “knowledge”? I think those are going to be circular definitions, maybe filtered through a few other words like “comprehend” or “perceive”.
Does a plant act with deliberate intention when it starts growing from a seed?
To be clear, my beef is more with the definition of “conscious” being useless and/or circular in most cases. I’m not saying “woah, what if plants have thoughts dude” as in the meme, but whatever definition you come up with, you have to evaluate why it does or doesn’t include plants, simple animals, or AI.
auraithx@lemmy.dbzer0.com 2 days ago
Aware means it has a sense of self. They are circular because we use these words to define how that is perceived.
Plants do not act deliberately when they do anything, because they do not have a sense of self and are not conscious.
commie@lemmy.dbzer0.com 2 days ago
[image]
auraithx@lemmy.dbzer0.com 1 day ago
We also can’t know “for certain” that a rock isn’t screaming silently, or that there isn’t a china teapot orbiting the sun between Earth and Mars. Science doesn’t deal in absolute certainties; it deals in probabilities based on evidence. There is zero evidence for plant consciousness and massive evidence against it.
Consciousness, as far as we observe it in the entire animal kingdom, is an emergent property of a centralized nervous system processing information. Plants lack neurons, a brain, or any substrate capable of integrating information into a unified experience.
Claiming a plant might be conscious is like claiming a calculator might be running Call of Duty. It’s not that we “don’t know”, it’s that the hardware simply cannot run that software.
Evolutionarily, consciousness (and specifically the ability to feel pain or fear) is a mechanism to trigger escape or avoidance. Since plants are sessile (they cannot move), developing a complex, energy-expensive system to “feel” damage would be a massive evolutionary disadvantage. Why would nature select for an organism that can feel being eaten but do absolutely nothing about it?
m_f@discuss.online 2 days ago
What do “sense” and “perceived” mean? I think they both loop back to “aware”, and the reason I point that out is that circular definitions are useless. How can you say that plants lack a sense of self and consciousness, if you can’t even define those terms properly? What about crown shyness? Trees seem to be able to tell the difference between themselves and others.
As an example of the circularity, “sense” means (using Wiktionary, but pick your favorite if you don’t like it) “Any of the manners by which living beings perceive the physical world”. What does “perceive” mean? “To become aware of, through the physical senses”. So in your definition, “aware” loops back to “aware” (Wiktionary also has a definition of “sense” that just defines it as “awareness”, for a more direct route, too).
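The circularity complaint above can be made concrete: treat each dictionary definition as a directed edge from a word to the words used to define it, and check for cycles. The edges below are a toy reconstruction of the Wiktionary chain described in this thread, not exact dictionary entries.

```python
# Toy definition graph: word -> words used in its definition.
# "sense" -> "perceive" -> "aware" -> "sense" is the loop described above;
# "thought" -> "representation" is the proposed way out of the loop.
defines = {
    "sense": ["perceive"],
    "perceive": ["aware"],
    "aware": ["sense"],           # closes the circular chain
    "thought": ["representation"],
    "representation": [],         # treated as grounded, for the example
}

def find_cycle(graph, start, path=()):
    """Depth-first search; returns the first definition cycle reachable
    from `start`, or None if every chain bottoms out in a grounded term."""
    if start in path:
        return path[path.index(start):] + (start,)
    for nxt in graph.get(start, []):
        cycle = find_cycle(graph, nxt, path + (start,))
        if cycle:
            return cycle
    return None

print(find_cycle(defines, "sense"))    # ('sense', 'perceive', 'aware', 'sense')
print(find_cycle(defines, "thought"))  # None: chain is grounded
```

By this framing, “aware” never bottoms out in anything outside the loop, which is exactly why the definition does no work.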
I meant that plants don’t have thoughts more in the sense of “woah, dude”, pushing back on something without any explanatory power. But really, how do you define “thought”? I actually think Wiktionary is slightly more helpful here, in that it defines “thought” as “A representation created in the mind without the use of one’s faculties of vision, sound, smell, touch, or taste”. That’s kind of getting to what I’ve commented elsewhere, with trying to come up with a more objective definition based around “world model”. Basing all of these definitions on “representation” or “world model” seems to be the closest to an objective definition as we can get.
Of course, that brings up the question of “What is a model?” / “What does represent mean?”. Is that just pushing the circularity elsewhere? I think not, if you accept a looser definition. If anything has an internal state that appears to correlate to external state, then it has a world model, and is at some level “conscious”. You have to accept things that many people don’t want to, such as that AI is conscious of much of the universe (albeit experienced through the narrow peephole of human-written text). I just kind of embraced that though and said “sure, why not?”. Maybe it’s not satisfying philosophically, but it’s pragmatically useful.
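The loose criterion proposed above (“internal state that appears to correlate to external state”) can be sketched as a measurement rather than a definition. The tracker, the “rock”, and the threshold below are all illustrative inventions, not anything from the thread:

```python
# Sketch of the "world model = internal state correlating to external state"
# criterion. A tracker smooths its observations into an internal estimate;
# a rock has a constant internal state. Pearson correlation between the
# internal trace and the external signal is the (deliberately loose) test.
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient; 0.0 if either trace is constant."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

random.seed(0)
# External state: a slow oscillation plus observation noise.
world = [math.sin(t / 5) + random.gauss(0, 0.1) for t in range(200)]

tracker_state, tracker_trace = 0.0, []
for obs in world:
    tracker_state += 0.3 * (obs - tracker_state)  # exponential smoothing
    tracker_trace.append(tracker_state)

rock_trace = [0.0] * len(world)  # no coupling to the world at all

print(pearson(tracker_trace, world))  # high: internal mirrors external
print(pearson(rock_trace, world))     # 0.0: fails the criterion
```

Notably, a thermometer would pass this test too, which is the broadness being embraced here.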
auraithx@lemmy.dbzer0.com 1 day ago
Your definition of consciousness as any “internal state correlating to external state” is functionally too broad; by this metric, a mercury thermometer possesses a “world model” and is therefore conscious, which renders the term useless for distinguishing complex biology from simple causality. Phenomena like crown shyness are better explained by mechanical feedback loops, essentially biological if/then statements based on light and abrasion, rather than a self-aware “sense of self.” A true “thought” or “world model” requires the capacity for “offline” simulation (counterfactuals) decoupled from immediate sensory input, whereas plants are entirely reactive (“online”) and current AI lacks continuous internal state. Ultimately, you are conflating reception (reflexive data intake) with perception (integrated awareness), failing to distinguish between the mechanism of a map and the subjective experience of the territory.