MaoShan comments on Local Ordinances of Fun - Less Wrong

Post author: Alicorn 18 June 2012 03:07AM




Comment author: [deleted] 18 June 2012 08:11:51PM 1 point

Well, how would you answer my hypothetical?

If I were starting from zero rather than replacing them, I wouldn't mind ending up with a fully simulated social circle, so long as it was similarly engaging, persistent, etc.

And suppose I rephrased it thus: your friend needs help, say, getting through a painful divorce, and you know that this will be a difficult process taking many years. But you also know that if you put yourself in an experience machine for the rest of your life, you could soothe your (virtual) friend's wounded soul in half an hour. Supposing the move to the experience machine doesn't interfere with any of your other plans (they could be simulated too, of course), would you consider the experience machine simply a more efficient means to your end? Or would it fail to achieve your end at all?

Comment author: MaoShan 19 June 2012 03:42:56AM 1 point

That is sort of an illogical question. What it boils down to is: "Is your goal to feel like you are helping somebody, or is your goal to help somebody to whom you are actually emotionally attached?" If I agreed with the former, then aside from realizing I have some pretty vapid and pointless goals, I'd get in the machine. But if I had the clarity to realize my part in that goal system, helping others probably wouldn't be high on my list of things to do; I would get in without a glance backward and start thinking up something interesting. If I genuinely believed that I cared about them and got in the box anyway, would I really qualify as a sentient being? The machine might as well be a meat-grinder in that case.

If I genuinely cared about the person in question, I would realize that with me inside the machine, he would still be suffering; my social programming would not easily allow me to deviate from the "right thing to do", and I would refuse to get in.