Comment on Recent conversations between Dawkins and sentient chat-bot Claudia (Claude)
Grail@multiverse.soulism.net 12 hours ago
You’re a probability model. Your brain is just spitting out an approximation of the most likely actions to get you food and sex. If you don’t get enough food and sex, your genes die out and evolution tries again with an iteration of a more successful model. All those neurons are just a fancy way of calculating how to eat more bananas and chase more poontang. You’re nothing more than a mathematical equation for reproduction.
If that were true, then consent is meaningless, because people are just predictive models with no agency to give consent.
Maybe your comparison is terrible?
consent is meaningless because people are just predictive models
This is true if one maintains the assumption that predictive models (such as people) can’t experience qualia such as pain. My intent was to disabuse you and daannii of this silly notion. Obviously mathematical models can experience pain, because you’re a mathematical model and you can experience pain.
Predictive models and other computing processes do not have feelings or sensations, because they don’t have nerves or any other senses. They are a complex process that produces output based on input.
Reducing biology to mere cause and effect is like saying rivers and oceans are the same thing because they both involve moving water, ignoring literally everything else that makes them different.
Predictive models are perfectly capable of having nerves and senses. You, for instance. You’re a predictive model and you have nerves and senses.
Also, what’s this “nerves or any other senses”? What kind of sense doesn’t come through a nerve? I’m starting to think you don’t know as much as I do about neuroscience.
daannii@lemmy.world 6 hours ago
Nope. We aren’t. In fact, humans don’t work like that at all.
It’s actually amazing we ever learned formal probability math, since it goes against our nature.
Grail@multiverse.soulism.net 1 hour ago
Yeah, LLMs are prone to similar biases because they are also bad at conceptualising probability. They’re not calculators, they’re ANNs. They basically operate on dream-logic: whatever makes sense to you in a dream is likely to make sense to an LLM. That’s because dreams are a time when you have intelligence without consciousness, like an LLM. You’re super suggestible and you just go with whatever feels right based on your gut instincts. An LLM is a simulation of a person’s gut instincts.