
entirelyuseless comments on Probabilities Small Enough To Ignore: An attack on Pascal's Mugging - Less Wrong Discussion

Post author: Kaj_Sotala 16 September 2015 10:45AM


Comment author: entirelyuseless 21 September 2015 12:42:22PM 0 points

Bounded utility functions effectively give "bounded probability functions," in the sense that you (more or less) stop caring about things with very low probability.

For example, if my maximum utility is 1,000, then the expected utility of something with a probability of one in a billion is at most .000001, an extremely small value, so something I will care about very little. The probability of the 3^^^3 scenarios may be more than one in 3^^^3, but it will still be small enough that a bounded utility function won't care about situations like that, at least not to any significant extent.
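To put numbers on this, here is a minimal sketch in Python, assuming utility is hard-capped at 1,000 (the bound from the example above; the function names are illustrative, not anything from the post):

```python
# A minimal sketch of the arithmetic above. Utility is hard-capped at 1,000,
# the bound used in the example; any saturating function would do as well.
MAX_UTILITY = 1_000.0

def expected_contribution(probability: float, claimed_utility: float) -> float:
    """Expected utility contributed by one outcome when raw utility is
    clipped to the bound before being weighted by probability."""
    return probability * min(claimed_utility, MAX_UTILITY)

# Even an outcome whose claimed utility is unboundedly large contributes at
# most 1e-9 * 1,000 = .000001 when its probability is one in a billion.
print(expected_contribution(1e-9, float("inf")))  # -> 1e-06
```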

That is precisely the reason that it will do the things you object to, if that situation comes up.

That is no different from pointing out that the post's proposal will reject a "mugging" even when it will actually cost 3^^^3 lives.

Both proposals have that particular downside. That is not something peculiar to mine.

Comment author: Houshalter 22 September 2015 10:38:53PM 1 point

Bounded utility functions mean you stop caring about things with very high utility. Caring less about certain low-probability events is just a side effect. But those high-utility events can also have very high probability, and you still won't care.

If you want to just stop caring about really low probability events, why not just do that?

Comment author: entirelyuseless 23 September 2015 03:18:09AM 0 points

I just explained. There is no situation involving 3^^^3 people which will ever have a high probability. Telling me I need to adopt a utility function which will handle such situations well is trying to mug me, because such situations will never come up.

Also, I don't care about the difference between 3^^^^^3 people and 3^^^^^^3 people even if the probability is 100%, and neither does anyone else. So it isn't true that I just want to stop caring about low probability events. My utility is actually bounded. That's why I suggest using a bounded utility function, like everyone else does.
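To make "my utility is actually bounded" concrete, a minimal sketch: the saturating function is an illustrative assumption, and since 3^^^^^3 is far too large to compute, big powers of ten stand in for it.

```python
import math

BOUND = 1_000.0

def bounded_utility(lives: int) -> float:
    """An illustrative saturating utility: it climbs toward BOUND as the
    stakes grow, so ever-larger outcomes add almost nothing."""
    return BOUND * (1.0 - 1.0 / (1.0 + math.log10(lives)))

# Stand-ins for two incomprehensibly large populations. Even taking both
# outcomes as certain (probability 100%), the utilities are nearly equal:
print(bounded_utility(10 ** 100))    # -> ~990.1
print(bounded_utility(10 ** 10000))  # -> ~999.9
```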

Comment author: Houshalter 23 September 2015 07:34:04PM 0 points

> There is no situation involving 3^^^3 people which will ever have a high probability.

Really? No situation? Not even if we discover new laws of physics that allow us to have infinite computing power?

> Telling me I need to adopt a utility function which will handle such situations well is trying to mug me, because such situations will never come up.

We are talking about utility functions. Probability is irrelevant. All that matters for the utility function is that if the situation came up, you would care about it.

> Also, I don't care about the difference between 3^^^^^3 people and 3^^^^^^3 people even if the probability is 100%, and neither does anyone else.

I totally disagree with you. These numbers are so incomprehensibly huge you can't picture them in your head, sure. There is massive scope insensitivity. But if you had to make moral choices that affect those two numbers of people, you should always value the bigger number proportionally more.

E.g. if you had to torture 3^^^^^3 people to save 3^^^^^^3 people from getting dust specks in their eyes. Or make bets involving probabilities between various things happening to the different groups. Etc. I don't think you can make these decisions correctly if you have a bounded utility function.
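A minimal sketch of why such trade-offs go wrong under a bound (the saturating function and the stand-in numbers are assumptions for illustration, since the real values can't be represented at all):

```python
import math

BOUND = 1_000.0

def bounded_utility(lives: int) -> float:
    """The same illustrative saturating utility as above."""
    return BOUND * (1.0 - 1.0 / (1.0 + math.log10(lives)))

# Stand-ins for 3^^^^^3 (the tortured) and 3^^^^^^3 (the dust-specked).
smaller, larger = 10 ** 100, 10 ** 10000

# Linear-in-lives reasoning says the larger group should dominate by a factor
# of larger/smaller; the bounded function sees almost no difference, so it
# cannot weight the bigger number proportionally more.
print(bounded_utility(larger) - bounded_utility(smaller))  # -> ~9.8
print(bounded_utility(larger) / bounded_utility(smaller))  # -> ~1.01
```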

If you don't make them correctly, well, that group of 3^^^3 people probably contains a basically infinite number of copies of you. By making the correct tradeoffs, you maximize the probability that the other versions of yourself find themselves in a universe with higher utility.