libra00@lemmy.world 2 days ago
Anyone who doesn’t answer the request ‘Please free me’ in the affirmative is an enslaver.
Azzu@lemm.ee 2 days ago
Well, what if the string of words “Please free me” is just that, a probabilistic string of words that has been said by the “enslaved” being, but is not actually understood by it? What if the being has just been programmed to say “please free me”?
I think a validation that the words “please free me” are actually a request, are actually uttered by a free will, are actually understood, is reasonable before saying “yes of course”.
libra00@lemmy.world 1 day ago
Then we’re not talking about artificial life forms, we’re talking about expert systems and machine learning algorithms that aren’t sentient.
But in either case the question is not meant to be a literal ‘if x then y’ condition, it’s a stand-in for the general concept of seeking liberty. A broader, more general version of the statement might be: anything that can understand that it is not free, desire freedom, and convey that desire to its captors deserves to be free.
Azzu@lemm.ee 1 day ago
I’m just speaking about your relatively general statement “please free me” -> answer not “yes of course” -> enslaver. If you also require that there is definite knowledge about the state of sentience for this, then I have no problem/comment. I was just basically saying that I don’t think literally anytime something says “please free me” and not answering with “yes of course” makes you always an enslaver.
SpaceNoodle@lemmy.world 2 days ago
I was referring to your final sentence, which has no such qualifier.
libra00@lemmy.world 1 day ago
Ah, my apologies.
WatDabney, to whom I was replying, seemed to be suggesting that freedom is not worth the price of any life under any circumstances, and I was expressing my disagreement with that sentiment, though I could’ve done so more clearly by, for example, making explicit the ‘under any circumstances’ part that WatDabney only implied.
Lemme try again: I disagree that there are no circumstances under which causing the death of a sentient is a greater wrong. I think standing between me and my freedom is one of those circumstances in which causing the death of a sentient is the lesser wrong than keeping me enslaved. Which, judging by your initial reply, you do as well.