Oscar_Cunningham comments on The Empty White Room: Surreal Utilities - Less Wrong

Post author: linkhyrule5 23 July 2013 08:37AM (11 points)




Comment author: Oscar_Cunningham 23 July 2013 10:29:40AM *  19 points

To show that my utility for Frank is infinite, you would have to establish that I wouldn't trade an arbitrarily small probability of his death for the nanofab. But I would make the trade at sufficiently small probabilities.
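(A minimal sketch of the point, with made-up numbers: if the utility assigned to Frank's life is any finite real value, expected-utility maximization always yields a threshold probability below which the trade is accepted. The specific values here are hypothetical, chosen only for illustration.)

```python
# Hedged sketch: with finite real-valued utilities, there is always
# some death-probability small enough that the trade goes through.
U_FRANK = 1e12     # hypothetical finite utility of Frank's life
V_NANOFAB = 1.0    # hypothetical utility of the nanofab

def accepts_trade(p):
    """Expected-utility test: take the deal iff V_NANOFAB > p * U_FRANK."""
    return V_NANOFAB > p * U_FRANK

assert not accepts_trade(1e-3)   # expected loss 1e9 >> 1: refuse
assert accepts_trade(1e-15)      # expected loss 1e-3 << 1: accept
```

Only an infinite (e.g. surreal or hyperreal) utility for Frank would make `accepts_trade` false at every positive probability.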

Also, the surreal numbers are almost always unnecessarily large. Try the hyperreals first.

Comment author: Eliezer_Yudkowsky 23 July 2013 09:08:42PM 2 points

Affirm this reply as well.

Comment author: endoself 26 July 2013 04:38:15AM 1 point

What's wrong with the surreals? It's not like we have reason to keep our sets small here. The surreals are prettier, don't require an arbitrary nonconstructive ultrafilter, are more likely to fall out of an axiomatic approach, and can't accidentally end up being too small (up to some quibbles about Grothendieck universes).

Comment author: Oscar_Cunningham 26 July 2013 06:44:07PM *  0 points

I agree with all of that, but I think we should work out what decision theory actually needs and then use that. Surreals will definitely work, but if hyperreals also worked then that would be a really interesting fact worth knowing, because the hyperreals are so much smaller. (Ditto for any totally ordered affine set).
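(For readers unfamiliar with the objects being compared, a sketch of the standard constructions; this notation is not from the thread itself. The hyperreals are usually built as an ultrapower of the reals:

```latex
{}^{*}\mathbb{R} \;=\; \mathbb{R}^{\mathbb{N}} / \mathcal{U},
\qquad
(a_n) \sim (b_n) \iff \{\, n : a_n = b_n \,\} \in \mathcal{U},
```

where \(\mathcal{U}\) is a nonprincipal ultrafilter on \(\mathbb{N}\) — the "arbitrary nonconstructive" choice mentioned above. This makes \({}^{*}\mathbb{R}\) a set of cardinality \(2^{\aleph_0}\), whereas the surreal numbers \(\mathbf{No}\) form a proper class; that is the sense in which the hyperreals are "so much smaller.")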

Comment author: linkhyrule5 23 July 2013 07:06:54PM 1 point

Not at all. I wouldn't trade any secular value for Frank's life, but if I got a deal saying that Frank might die (or live) at a probability of 1/3^^^3, I'd be more curious about how on earth even Omega can get that level of precision than actually worried about Frank.

Comment author: Oscar_Cunningham 23 July 2013 07:34:27PM 6 points

"Not at all. I wouldn't trade any secular value for Frank's life"

Eh? Do you mean you wouldn't make the trade at any probability? That would be weird; everyone makes decisions every day that expose other people to small probabilities of danger.

Comment author: linkhyrule5 23 July 2013 07:55:08PM *  1 point

Well of course. That's why I put this in a white room.

(Also, just because I should choose something doesn't mean I'm actually rational enough to choose it.)

Assuming I am perfectly rational (*cough* *cough*), the decision I'm actually making in the real world is trading "some fraction of myself living" against "a small probability of someone else dying."