MugaSofer comments on Morality is Awesome - Less Wrong

86 [deleted] 06 January 2013 03:21PM




Comment author: MugaSofer 06 January 2013 10:27:40PM -1 points [-]

Person A: If you become orgasmium, you would feel more pleasure than you otherwise would.

Person B: But I don't want to become orgasmium.

Person A: But if you want to feel as much pleasure as possible, then you should become orgasmium!

Person B: But... I don't want to become orgasmium.

I see Person B's position as being the final word on the matter (especially if, as you say, we're ignoring external consequences). Person A may be entirely right — but so what? Why should that affect Person B's judgments? Why should the mathematical requirements behind Person A's framework have any relevance to Person B's decisions? In other words, why should we be hedonistic utilitarians, if we don't want to be?

The difficulty here, of course, is that Person B is using a cached heuristic that outputs "no" for "become orgasmium"; and we cannot be certain that this heuristic is correct in this case. Just as Person A is using the (almost certainly flawed) heuristic "feel as much pleasure as possible", which outputs "yes" for "become orgasmium".

Comment author: SaidAchmiz 06 January 2013 10:30:46PM *  0 points [-]

The difficulty here, of course, is that Person B is using a cached heuristic that outputs "no" for "become orgasmium"

Why do you think so?

we cannot be certain that this heuristic is correct in this case.

What do you mean by "correct"?

Edit: I think it would be useful for any participants in discussions like this to read Eliezer's Three Worlds Collide. Not as fictional evidence, but as an examination of the issues, which I think it does quite well. A relevant quote, from chapter 4, "Interlude with the Confessor":

A sigh came from that hood. "Well... would you prefer a life entirely free of pain and sorrow, having sex all day long?"

"Not... really," Akon said.

The shoulders of the robe shrugged. "You have judged. What else is there?"

Comment author: MugaSofer 06 January 2013 11:17:52PM *  1 point [-]

A sigh came from that hood. "Well... would you want to live forever?"

"Not... really," Akon said.

The shoulders of the robe shrugged. "You have judged. What else is there?"

Humans are not perfect reasoners.

[Edited for clarity.]

Comment author: Armok_GoB 07 January 2013 06:25:32PM 0 points [-]

I give a decent probability to the optimal order of things containing absolutely zero pleasure. I assign a lower, but still significant, probability to it containing an infinite amount of pain in any given subjective interval.

Comment author: MugaSofer 08 January 2013 06:09:55PM -2 points [-]

... why? Humans definitely appear to want to avoid pain and enjoy pleasure. I suppose I can see pleasure being replaced with "better" emotions, but I'm really baffled regarding the pain. Is it to do with punishment? Challenge? Something I haven't thought of?

Comment author: Armok_GoB 08 January 2013 07:09:04PM 0 points [-]

Agreed, pretty much. I said significant probability, not big. I'm not good at translating anticipations into numbers, but no more than 5%. Mostly based on extreme outside view, as in "something I haven't thought of".

Comment author: MugaSofer 09 January 2013 09:52:33AM *  -1 points [-]

Oh, right. "Significance" is subjective, I guess. I assumed it meant, I don't know, >10% or whatever.

Comment author: MugaSofer 07 January 2013 09:36:59PM -2 points [-]

Is this intended as a reply to my comment?

Comment author: Armok_GoB 08 January 2013 05:17:39PM 0 points [-]

Reply to the entire thread, really.

Comment author: MugaSofer 08 January 2013 06:06:52PM *  -1 points [-]

Fair enough.

Comment author: MugaSofer 07 January 2013 08:48:16PM 0 points [-]

Is this intended as a reply to my comment?