linkhyrule5 comments on The Empty White Room: Surreal Utilities - Less Wrong

Post author: linkhyrule5 23 July 2013 08:37AM


Comment author: linkhyrule5 23 July 2013 07:59:59PM 0 points

The first is entirely up to you. The second are worth 0.0001ω, 0.01ω, and 0.99ω, respectively, and each is still larger than any secular value. This is working as planned, as far as I'm concerned...
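(To make the tier structure concrete, here is a minimal sketch of my own, not from the post: model a two-tier surreal utility as an (ω-coefficient, finite-coefficient) pair compared lexicographically, so that any positive ω-tier value dominates any finite "secular" value, exactly as claimed above.)

```python
# Hypothetical sketch: a two-tier "surreal" utility as (omega, finite),
# compared lexicographically so any positive omega-tier amount beats
# any finite secular amount, no matter how large.
from functools import total_ordering

@total_ordering
class TieredUtility:
    def __init__(self, omega=0.0, finite=0.0):
        self.omega = omega    # coefficient on the omega (life) tier
        self.finite = finite  # ordinary "secular" utility

    def _key(self):
        # Python compares tuples lexicographically: omega tier first.
        return (self.omega, self.finite)

    def __eq__(self, other):
        return self._key() == other._key()

    def __lt__(self, other):
        return self._key() < other._key()

    def scale(self, p):
        # Expected utility under probability p scales each tier.
        return TieredUtility(self.omega * p, self.finite * p)

# Even the smallest omega-tier value outweighs any secular value:
assert TieredUtility(omega=0.0001) > TieredUtility(finite=10**9)
```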

Comment author: shminux 23 July 2013 08:36:41PM 0 points

This is working as planned, as far as I'm concerned...

Are you saying that any odds of your request causing Frank's death, no matter how small, are unacceptable? Then you will never be able to ask for anything.

Comment author: linkhyrule5 23 July 2013 08:47:50PM 0 points

Yes. See: Flaws. This is Pascal's Mugging; it shows up in real-valued utility systems too. You need a slightly more unlikely set-up, but it's still a plausible scenario. It's not a problem the real utility system doesn't have.

Comment author: shminux 23 July 2013 09:08:59PM * -1 points

It's not a problem the real utility system doesn't have.

Well, the usual real-valued utilitarianism does not have this particular problem; it trades it for the repugnant conclusion that torture wins.

Anyway, I don't see how your approach avoids any of the standard pitfalls of utilitarianism, though it might be masking some.

Comment author: linkhyrule5 23 July 2013 09:13:13PM 0 points

Surreal Utilities can support that conclusion as well: how you decide on Torture v. Dust Specks depends entirely on your choice of tiers.

I'm talking purely about Pascal's Mugging, where someone shows up and says "I'll save 3^^^3 lives if you give me five dollars." This is isomorphic to the same problem on the surreals, where someone says "I'll give you omega-utility (save a life) at a probability of one in one quadrillion."
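(A numerical sketch of that isomorphism, my own illustration with made-up numbers: represent a utility as an (ω-coefficient, finite-coefficient) pair compared lexicographically. Then a one-in-a-quadrillion chance of an ω-tier payoff still has a positive ω-coefficient in expectation, which beats keeping five dollars of purely secular utility.)

```python
# Hypothetical sketch of the surreal Pascal's Mugging.
# Utilities are (omega_part, finite_part) pairs; Python's tuple
# comparison is lexicographic, so the omega tier is compared first.
def expected(p, utility):
    """Expected utility: probability p scales each tier."""
    omega, finite = utility
    return (p * omega, p * finite)

save_a_life = (1.0, 0.0)    # omega-tier payoff
five_dollars = (0.0, 5.0)   # purely secular payoff

p = 1e-15  # one in one quadrillion
# The tiny-probability omega-tier gamble still dominates the sure $5:
assert expected(p, save_a_life) > five_dollars
```

The point the sketch makes is that no finite stake can ever outweigh any nonzero probability of an ω-tier outcome, which is exactly the mugging vulnerability discussed above.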