MaoShan comments on Local Ordinances of Fun - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Well, how would you answer my hypothetical?
And suppose I rephrased it thus: your friend needs help, say, getting through a painful divorce, and you know that this will be a difficult process taking many years. But you also know that if you put yourself in an experience machine for the rest of your life, you could soothe your (virtual) friend's wounded soul in half an hour. Supposing the move to the experience machine doesn't interfere with any of your other plans (they could be simulated too, of course), would you consider the experience machine simply a more efficient means to your end? Or would it fail to achieve your end at all?
That is sort of an illogical question. What it boils down to is: "Is your goal to feel like you are helping somebody, or is your goal to help somebody to whom you are actually emotionally attached?" If the former, then aside from realizing I have some pretty vapid and pointless goals, I'd get in the machine. And if I had the clarity to recognize my part in that goal system, helping others probably wouldn't be high on my list of things to do anyway; I would get in without a glance backward and start thinking up something interesting. But if I genuinely believed that I cared about them and got in the box anyway, would I really qualify as a sentient being? The machine might as well be a meat-grinder in that case.
If I genuinely cared about the person in question, I would realize that with me inside the machine, he would still be suffering. My social programming would not easily allow me to deviate from the "right thing to do," so I would refuse to get in.