Link to original Reddit post by /u/MayCaesar
Watching Sam Harris's recent conversation with Lex Fridman on consciousness got me thinking about the nature of rights. In virtually all liberal (in the classical sense) philosophies, conscious beings have rights, while unconscious ones do not. Hence, for example, a human has rights, a rock does not, and a cow may or may not have some limited rights.
The problem, however, is that science still has nothing to say about what determines consciousness. One popular theory, for example, suggests that consciousness arises in any system containing more than one particle, so even an atom consisting of multiple electrons, protons, and neutrons is conscious. So is the planet Mars. So is the Andromeda galaxy. So is the table in your room.
Now, obviously, the consciousness of such entities does not make much sense in practical terms. Whether your table is conscious or not is a philosophical question, and regardless of the answer, talking about the "rights" of the table is not constructive. But what about more "advanced" entities, such as dolphins? It is possible that one day we will learn that dolphins are even more intelligent, self-aware, and conscious than humans are. Would they acquire rights as a consequence?
The question is deeper than it might seem, as it rests on a more fundamental one: do creatures acquire rights only once their consciousness has been established, or do they have rights prior to that? If we cannot even know whether a certain creature can have rights, then how do we go about making sure that we do not violate them? When the spectrum of possibilities ranges from "everything is conscious" to "nothing but me is conscious, and even I ultimately am just a simulated creature in a virtual world", how should anyone's rights be determined?
Another closely related question concerns human creations. It is possible (and I like to believe that it is true, however bizarre it would be if it were) that, for example, all NPCs in the video games we play are conscious: we simulate them in our virtual worlds, and they are actually self-aware on some "plane of existence", experiencing joy, pain, hunger, and satisfaction. It is possible that one day science will determine that this is the case. Would this mean that we are no longer free to make video games with violence in them, given that such games would violate the rights of the simulated creatures?
In a free market society these questions would likely be answered in a decentralized manner, with multiple legal systems competing against each other. The whole concept of "rights" is ultimately conditional on accepting a certain set of assumptions, and those assumptions differ between individuals. There can be wide agreement on the right of every human being to live, but much narrower agreement on the right of, say, human-made advanced robots to live. Hopefully, the disagreement can be resolved without violence, through pure market competition, with the concept of rights that produces the best overall outcome for the individual ultimately winning the market game.

But it is also possible that this outcome would deny rights to many conscious creatures, such as "conscious" video game characters. That could be problematic not only from a purely philosophical perspective but also from a practical one: in the not-so-distant future, humans may start uploading their minds to electronic devices, becoming progressively closer to video game characters, just much more complicated ones. It is possible that the concept of rights will ultimately evolve into something that we nowadays would see as extremely tyrannical and restrictive.
What do you think about this? Is this something worth discussing? Something to worry about? Or is it all too far-fetched to apply to anything in the foreseeable future?