topynate comments on Precommitting to paying Omega. - Less Wrong

5 Post author: topynate 20 March 2009 04:33AM


Comment author: topynate 20 March 2009 09:19:52AM *  2 points [-]

> The whole scenario becomes a complicated way of saying that the coin has a probability of 1 of coming up tails, and that Omega has given my future self the chance to come back and tell me so before I make the bet.

The coin is deterministic, but your state of knowledge at time t does not include the information that tails will appear with probability 1. Your best estimate is that p(tails) = 1/2. Therefore, if you want to maximize your expected utility given your bounded knowledge, you should, if possible, precommit to paying the $100. I explore what that means.
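The expected-utility arithmetic behind this can be sketched numerically. The comment itself states only the $100 payment; the $10,000 heads-branch reward below is the figure from the standard statement of Vladimir Nesov's counterfactual mugging, and is an assumption here.

```python
# Expected utility of precommitting, evaluated from the agent's state of
# knowledge *before* the flip, where p(tails) = 1/2.
# ASSUMPTION: Omega pays $10,000 on heads iff you would pay $100 on tails
# (the standard counterfactual-mugging reward; not stated in the comment).
P_TAILS = 0.5
REWARD = 10_000   # heads-branch payoff to a precommitted agent (assumed)
COST = 100        # what you hand over on tails if precommitted

ev_precommit = (1 - P_TAILS) * REWARD + P_TAILS * (-COST)
ev_refuse = 0.0   # Omega pays nothing to an agent who would refuse

print(ev_precommit)              # 4950.0
print(ev_precommit > ev_refuse)  # True
```

So under bounded knowledge the precommitting policy dominates, even though it loses $100 in the tails branch you actually end up observing.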

> I am suggesting that this counterfactual mugging situation amounts to progressively stretching reality as far as we can, trying to thwart Newcomb's problem, until we are forced to admit that we're just creating a convoluted paradox.

The actual state of the universe, as I see it, involves a Tegmark ensemble existing. Thinking on those lines led me to the conclusion I gave here. However, I now believe this is the wrong way to think about the problem. If I am incapable of precommitting, then belief in a Tegmark ensemble at least leads me not to inflict huge suffering on other people. If, however, I can precommit, that is a utility improvement whether the Tegmark ensemble exists or not.

ETA: I need to make something extremely clear here. When I say "probability", you should probably assume I mean it exactly as E.T. Jaynes does. I may occasionally slip up, but you will be 'less wrong' if you follow this prescription.

Comment deleted 20 March 2009 09:44:32AM *  [-]
Comment author: topynate 20 March 2009 09:49:53AM *  2 points [-]

So you imagine your current self in such a situation. So do I, and I reach the same conclusion as you:

"Right. No, I don't want to give you $100."

I then go on to show why that's the case. Actually the article might be better if I wrote out Bellman's equation and showed how the terms involving "heads" states drop out once you enter the "tails appeared" states.
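The derivation gestured at here can be sketched. In a generic form (this is a sketch, not the article's actual working; V is the value function, R the reward, P the transition kernel, none of which are defined in the comment):

```latex
V(s) \;=\; \max_{a}\Big[\, R(s,a) \;+\; \sum_{s'} P(s' \mid s, a)\, V(s') \,\Big]
```

Before the flip, the successor states split into "heads" and "tails" branches with probability 1/2 each. Once you occupy a "tails appeared" state, \(P(s_{\text{heads}} \mid s_{\text{tails}}, a) = 0\) for every action \(a\), so every term of the sum involving "heads" states vanishes, and nothing in the remaining expression rewards handing over the $100.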

In other words, the quote from MBlume is just wrong: a rational agent is perfectly capable of wanting to precommit to a given action in a given situation, while not performing that action in that situation. Rather, a perfectly rational and sufficiently powerful agent, one that has pre-actions available that will put it in certain special states, will always perform the action.
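The distinction drawn here (wanting to precommit to an action while not wanting to perform it) can be made concrete by evaluating a *policy* before the flip and an *action* after tails is observed. The payoff numbers are the assumed counterfactual-mugging stakes ($10,000 / $100), not figures from the comment:

```python
P_TAILS = 0.5
REWARD = 10_000  # assumed heads-branch payoff to a committed agent
COST = 100

def ev_policy(pays_on_tails: bool) -> float:
    """Expected value of a policy, evaluated ex ante (before the flip).
    Omega rewards on heads only those agents whose policy is to pay on tails."""
    heads = REWARD if pays_on_tails else 0
    tails = -COST if pays_on_tails else 0
    return (1 - P_TAILS) * heads + P_TAILS * tails

def ev_action_given_tails(pays: bool) -> float:
    """Value of an action chosen ex post, after tails is observed:
    the heads-branch term has already dropped out."""
    return -COST if pays else 0

# Ex ante, the paying policy wins; ex post, refusing wins.
print(ev_policy(True), ev_policy(False))  # 4950.0 0.0
print(ev_action_given_tails(True))        # -100
```

This is exactly the gap precommitment closes: a pre-action that binds the agent converts the ex-post choice into the ex-ante one.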

The question is how one can actually precommit. Eliezer claims that he has precommitted. I am genuinely curious to know how he has done that in the absence of brain hacking.

Let me ask you a question. Suppose you were transported to Omega world (as I define it in the article). Suppose you then came to the same conclusions that Vladimir Nesov asks us to take as facts: that Omega is trustworthy, etc. Would you then seek to modify yourself such that you would definitely pay Omega $100?

Comment deleted 20 March 2009 10:47:56AM [-]
Comment author: topynate 20 March 2009 05:18:40PM *  2 points [-]

> I don't think we're on the same page. I imagine myself in a different situation, in which there is a tails-only coin.

How is it different? If you get zapped to Omega world, then you are in some deterministic universe, but you don't know which one exactly. You could be in a universe where Omega was going to flip tails (and some other things are true which you don't know about), or one where Omega was going to flip heads (and some other things are true which you don't know about), and you are in complete ignorance as to which set of universes you now find yourself in. Then either Omega will appear and tell you that you're in a "heads" universe, and pay you nothing, or appear and tell you that you're in a "tails" universe, in which case you will discover that you don't want to pay Omega $100. As would I.

> Have you made the assumption that handing over the $100 proves that you have made a precommitment?

It proves one of the following:

a) you are literally incapable of doing otherwise;

b) you genuinely get more benefit/utility from handing the $100 over than from keeping it, where "benefit" is a property of your brain that you rationally act to maximize; or

c) your actions are irrational, in the sense that you could have taken another action with higher utility.

When I refer to "you", I mean "whoever you happen to be at the moment Omega appears^W^W you make your decision", not "you as you would be if pushed forward through time to that moment".

Comment deleted 20 March 2009 10:58:52AM [-]
Comment author: topynate 20 March 2009 05:31:47PM *  0 points [-]

What is the smallest alteration to the situations proposed in which you would?