Cross-posted on By Way of Contradiction
In my morals, at least until recently, one of the most obvious universal rights was freedom of thought. Agents should be allowed to think whatever they want, and should not be punished for doing so. This feels like a terminal value to me, but it is also instrumentally useful: freedom of thought encourages agents to be rational and to search for the truth. If you are punished for believing something true, you might not want to search for truth at all. This could slow science and hurt everyone. Religions, on the other hand, often discourage freedom of thought, and this is a major reason for my moral problems with them. It is not just that religions are wrong; everyone is wrong about lots of stuff. It is that many religious beliefs restrict freedom of thought by punishing doubters with ostracism or eternal suffering. I recognize that there are some "religions" which do not exhibit this flaw (as much).
Recently, my tune has changed. There are two things which have caused me to question the universality of the virtue of freedom of thought:
1) Some truths can hurt society
Topics like unfriendly artificial intelligence make me question the assumption that I always want intellectual progress in all areas. If we as a modern society were to choose one topic where restricting thought might be very useful, UFAI seems like a good choice. Freedom of thought on this issue might be a necessary casualty to avoid a much worse outcome.
2) Simulations
This is the main point I want to talk about. If we get to the point where minds can simulate other minds, then we run into major issues. Should one mind be allowed to simulate another mind and torture it? It seems like the answer should be no, but this rule seems very hard to enforce without sacrificing not only free thought, but what would seem like the most basic right to privacy. Even today, people can have preferences over the thoughts of other people, but our intuition tells us that the one who is doing the thinking should get the final say. If a mind is simulating another mind, shouldn't the simulated mind also have rights? What makes an advanced mind simulating torture so much worse than a human today thinking about torture? (Or even worse, thinking about 3^^^^3 people with dust specks in their eyes. (That was a joke, I know we can't actually think about 3^^^^3 people.))
The first thing seems like a possible practical concern, but it does not bother me nearly as much as the second one. The first seems like it is just an example of the basic right of freedom of thought contradicting another basic right, safety. However, the second thing confuses me. It makes me wonder whether or not I should treat freedom of thought as a virtue as much as I currently do. I am also genuinely not sure whether or not I believe that advanced minds should be free to do whatever they want to simulations in their own minds. I think they should not, but I am not sure about this, and I do not know if this restriction should be extended to humans.
What do you think? What is your view on the morality of drawing the line between the rights of a simulator and the rights of a simulatee? Do simulations within human minds have any rights at all? What conditions (if any) would make you think rights should be given to simulations within human minds?
Sam Harris has said that there are some beliefs so dangerous that we could have to kill someone for believing them.
Imagine an agent with an (incorrect) belief that only by killing everyone would the world be the best place possible, and a prior against anything realistically causing it to update away from that belief. Such an agent would have to be stopped somehow, because of what it thinks (and what that causes it to do).
Y'know the intentional stance?
Belief + Desire --> Intentional Action
(In fact, the agent sounds similar to a religious person who believes that killing everyone ensures believers an eternity in heaven and evil people an eternity in hell, or something similar - and who knows that doubts are only offered by the devil. Sam Harris talks about this idea in the context of a discussion of people who believe in Islam and act on those beliefs by blowing themselves up in crowded places.)
I'm not sure what practical advice this gives; I'm just making the general point that what you think becomes what you do, and there are a lot of bad things you can do.
That doesn't quite follow.
Thinking something does not make it so, and there is a vanishingly small number of people who could realistically act on a desire to kill everyone. The only time you have to be deeply concerned about someone with those beliefs is if they managed ...