Cross-posted on By Way of Contradiction
In my morals, at least up until recently, one of the most obvious universal rights was freedom of thought. Agents should be allowed to think whatever they want, and should not be discouraged from doing so. This feels like a terminal value to me, but it is also instrumentally useful. Freedom of thought encourages agents to be rational and to search for the truth. If you are punished for believing something true, you might not want to search for truth, and this could slow science and hurt everyone. Religions, on the other hand, often discourage freedom of thought, and this is a major reason for my moral problems with them. It is not just that religions are wrong; everyone is wrong about lots of stuff. It is that many religious beliefs restrict freedom of thought by punishing doubters with ostracism or threats of eternal suffering. I recognize that there are some "religions" which do not exhibit this flaw (as much).
Recently, my tune has changed. Two things have caused me to question the universality of the virtue of freedom of thought:
1) Some truths can hurt society
Topics like unfriendly artificial intelligence (UFAI) make me question the assumption that I always want intellectual progress in all areas. If modern society were to pick one topic on which restricting thought might be very useful, UFAI seems like a good choice. Freedom of thought on this issue might be a necessary casualty to avoid a much worse outcome.
2) Simulations
This is the main point I want to talk about. If we get to the point where minds can simulate other minds, then we run into major issues. Should one mind be allowed to simulate another mind and torture it? It seems like the answer should be no, but this rule seems very hard to enforce without sacrificing not only free thought but what would seem like the most basic right to privacy. Even today, people can have preferences over the thoughts of other people, but our intuition tells us that the one doing the thinking should get the final say. If a mind is simulating another mind, shouldn't the simulated mind also have rights? What makes an advanced mind simulating torture so much worse than a human today thinking about torture? (Or even worse, thinking about 3^^^^3 people with dust specks in their eyes. (That was a joke; I know we can't actually think about 3^^^^3 people.))
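For anyone unfamiliar with the notation: 3^^^^3 is Knuth's up-arrow notation, where a single arrow is exponentiation and each extra arrow iterates the previous operation. Here is a minimal sketch in Python, just to show how quickly these numbers explode (the helper name `up` is my own):

```python
# Sketch of Knuth's up-arrow notation: up(a, n, b) computes a with n arrows
# applied to b. One arrow is plain exponentiation; each additional arrow
# iterates the operation below it.
def up(a, n, b):
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))

print(up(3, 1, 3))  # 3^3 = 27
print(up(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
# 3^^^3 is already a power tower of 7625597484987 threes, and
# up(3, 4, 3), i.e. 3^^^^3, is far too large to evaluate or even imagine.
```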
The first issue seems like a possible practical concern, but it does not bother me nearly as much as the second. The first seems like just an example of the basic right of freedom of thought contradicting another basic right, the right to safety. The second, however, confuses me. It makes me wonder whether I should treat freedom of thought as a virtue as much as I currently do. I am also genuinely not sure whether advanced minds should be free to do whatever they want to simulations in their own minds. I think they should not be, but I am not sure about this, and I do not know whether this restriction should be extended to humans.
What do you think? Where would you draw the moral line between the rights of a simulator and the rights of a simulatee? Do simulations within human minds have any rights at all? What conditions (if any) would make you think rights should be given to simulations within human minds?
The way I see it in sci-fi terms:
If a human mind is the first copy of a brain that has been uploaded to a computer, then it deserves the same rights as any human. There is a rule against running more than one instance of the same person at the same time.
A human mind created on my own computer from first principles, so to speak, does not have any rights, but there is also a law in place to prevent such agents from being created, as human minds are dangerous toys.
Plans to enforce thought-taboo devices are likely to fail, as no self-respecting human being would allow such a crude intrusion by third parties into their own thought process. I mean, it starts with NO THINKING ABOUT NANOTECHNOLOGY and in time changes to NO THINKING ABOUT RESISTANCE.
EDIT:
Also, assuming that there is really a need to extract some information from an individual, I would reluctantly grant the government the right to create a temporary copy of that individual, interrogate (i.e. torture) the copy, and then delete it shortly afterwards. It is squicky, but to my mind superior to leaving the original target with memories of the interrogation.
Of course, provided the alternative is not to just be killed here and now.
Men with weapons have been successfully persuading other people to do something they don't want to do for ages.