MixedNuts comments on St. Petersburg Mugging Implies You Have Bounded Utility - Less Wrong

10 Post author: TimFreeman 07 June 2011 03:06PM




Comment author: Armok_GoB 12 June 2011 07:15:01PM 2 points

One very critical factor you forgot is goal uncertainty! Your argument is actually even better than you think it is. If you assign an extremely low but non-zero probability that your utility function is unbounded, then you must still multiply it by infinity. And 1 is not a probability... There is no possible state that represents sufficient certainty that your utility function is bounded, so nothing can justify not giving all your money to the mugger.
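The divergence being appealed to here can be sketched numerically. This is an illustrative example, not from the original thread: the function name, the truncation scheme, and the 10^-12 probability are my own choices. The point is that any nonzero credence in an unbounded utility function makes the expected utility of a St. Petersburg-style gamble grow without bound, no matter how small that credence is.

```python
def truncated_st_petersburg_eu(p_unbounded, n_terms):
    """Expected utility of a St. Petersburg gamble truncated at n_terms,
    weighted by the credence p_unbounded that utility really is unbounded.
    Round k pays 2**k utils with probability 2**-k, so each term is 1 util;
    the truncated sum therefore equals n_terms and diverges as n_terms grows."""
    return p_unbounded * sum((0.5 ** k) * (2 ** k) for k in range(1, n_terms + 1))

p = 1e-12  # extremely low, but not zero ("1 is not a probability")
print(truncated_st_petersburg_eu(p, 10))    # tiny so far...
print(truncated_st_petersburg_eu(p, 1000))  # ...but grows linearly without bound
```

However small `p` is, it only rescales the divergent sum; it never caps it, which is why no finite certainty that one's utility function is bounded blocks the argument.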

I WOULD send you my money, except that SIAI is many orders of magnitude more likely than you to become a god (you didn't specify it'd be instant or direct), and they have a similar offer, so I'm mugged into maximizing the amount of help given to SIAI instead. But I DO bite the bullet of small probabilities of extremely large utilities, however repugnant and counterintuitive it seems.

Comment author: MixedNuts 12 June 2011 08:02:53PM 1 point

If I am a god, then it will be instant and direct; also, I'll break the laws of physics/the Matrix/the meta-Matrix/etc. to reach states the SIAI can't reach. If I am a god and you do not give me any money, then I'll change the universe into the most similar universe where SIAI's probability of success is divided by 2.

Can I get money?

Comment author: Armok_GoB 12 June 2011 08:17:31PM 0 points

The probability of the AI doing all of that (hey, time travel) is still much, much larger.