DanielLC comments on Open thread, Dec. 15 - Dec. 21, 2014 - Less Wrong

2 Post author: Gondolinian 15 December 2014 12:01AM




Comment author: DanielLC 19 December 2014 06:33:40AM 1 point [-]

There are other ways of taking Pascal's mugging into account. You shouldn't do that based on lack of computing power. And if you aren't doing it based on lack of computing power, why involve randomness at all? Why not work out what an agent would probably do after N samples, or something like that?

Comment author: [deleted] 19 December 2014 08:52:24PM *  0 points [-]

You shouldn't do that based on lack of computing power. And if you aren't doing it based on lack of computing power, why involve randomness at all?

Well, it's partially because sampling-based approximate inference algorithms are massively faster than exact marginalization over large numbers of nuisance variables. It's also because sampling-based inference makes all the expectations behave correctly in the limit, while still yielding boundedly approximately correct reasoning even when computing power is very limited.
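To illustrate the speed claim, here is a toy sketch (the factorized distribution and quantity of interest are made-up stand-ins, not anything from the discussion above): exact marginalization over n binary nuisance variables costs 2**n terms, while a Monte Carlo estimate costs a fixed number of samples regardless of n.

```python
import itertools
import random

N_VARS = 16
P_ONE = 0.3  # each nuisance bit is 1 with this probability, independently

def f(bits):
    """Quantity of interest: fraction of bits set."""
    return sum(bits) / len(bits)

def exact_expectation():
    """Sum over all 2**N_VARS configurations -- exponential cost."""
    total = 0.0
    for bits in itertools.product((0, 1), repeat=N_VARS):
        prob = 1.0
        for b in bits:
            prob *= P_ONE if b else (1 - P_ONE)
        total += prob * f(bits)
    return total

def sampled_expectation(n_samples):
    """Monte Carlo estimate -- cost independent of 2**N_VARS."""
    total = 0.0
    for _ in range(n_samples):
        bits = [1 if random.random() < P_ONE else 0 for _ in range(N_VARS)]
        total += f(bits)
    return total / n_samples
```

Here both converge to 0.3, but doubling N_VARS doubles the sampling cost per draw while squaring the cost of the exact sum.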

So we beat the Mugging while still being able to have an unbounded utility function: even in the limit, Mugging-level absurd possible-worlds can dominate our decision-making only on the overwhelmingly rare occasions when they actually get drawn, which effectively requires a sample size larger than the multiplicative inverse of their probability, and that basically never happens in reality.
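A minimal sketch of that argument, with illustrative made-up numbers: the mugger's world has tiny probability and enormous utility, so it dominates the exact expectation, but with any feasible sample size (far below the inverse of its probability) it is almost never drawn, and the sampled estimate ignores it.

```python
import random

# Hypothetical toy setup: one "mugging" world plus an ordinary world.
P_MUGGING = 1e-12
U_MUGGING = 1e15      # payoff the mugger promises
U_ORDINARY = 1.0      # utility of just walking away

def sample_utility():
    """Draw one possible world and return its utility."""
    if random.random() < P_MUGGING:
        return U_MUGGING
    return U_ORDINARY

def monte_carlo_expected_utility(n_samples):
    return sum(sample_utility() for _ in range(n_samples)) / n_samples

# The exact expectation is dominated by the mugging term (about 1001)...
exact = P_MUGGING * U_MUGGING + (1 - P_MUGGING) * U_ORDINARY
# ...but with 10**4 samples (versus 1/P_MUGGING = 10**12), the mugging
# world is essentially never drawn, so the estimate stays near 1.0.
estimate = monte_carlo_expected_utility(10_000)
```

The chance that even one of the 10,000 draws hits the mugging world is about 10**-8, so almost every run of this sampler decides as if the Mugging weren't there.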

Comment author: bogus 20 December 2014 12:18:29PM *  0 points [-]

Importance sampling wouldn't have you ignore Pascal's Muggings, though. At its most basic, 'sampling' is just a way of probabilistically computing an integral.
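As a concrete instance of "sampling is just a way of probabilistically computing an integral", here is a minimal importance-sampling sketch. The target and proposal densities are illustrative choices, not anything specified in the thread: we estimate an expectation under p by drawing from a heavier-tailed proposal q and reweighting each draw by p(x)/q(x).

```python
import math
import random

def p_density(x):
    """Target: standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def q_density(x):
    """Proposal: normal with mean 0, std 2 (heavier tails than p)."""
    return math.exp(-x * x / 8) / (2 * math.sqrt(2 * math.pi))

def q_sample():
    return random.gauss(0, 2)

def importance_estimate(f, n):
    """Estimate E_p[f(X)] as the average of f(x) * p(x)/q(x), x ~ q."""
    total = 0.0
    for _ in range(n):
        x = q_sample()
        total += f(x) * p_density(x) / q_density(x)
    return total / n

# E_p[x^2] = 1 for a standard normal; the weighted average converges to it.
est = importance_estimate(lambda x: x * x, 200_000)
```

Note that the weights p(x)/q(x) keep rare-but-possible regions in the estimate; importance sampling reallocates where the samples come from, it doesn't zero anything out, which is exactly bogus's point.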

Comment author: [deleted] 20 December 2014 05:31:31PM 0 points [-]

Importance sampling wouldn't have you ignore Pascal's Muggings, though.

Well, they shouldn't be ignored, as long as they have some finite probability. The idea is that by sampling (importance or otherwise), we almost never give in to the Mugging: we spend our finite computing power on strictly more probable scenarios, even though the Mugging (by definition) would dominate our expected-utility calculation in the case of a completed infinity.