Cameron_Taylor comments on Precommitting to paying Omega. - Less Wrong

Post author: topynate 20 March 2009 04:33AM




Comment author: topynate 20 March 2009 09:49:53AM  2 points

So you imagine your current self in such a situation. So do I, and I reach the same conclusion as you:

"Right. No, I don't want to give you $100."

I then go on to show why that's the case. Actually, the article might be better if I wrote out Bellman's equation and showed how the terms involving "heads appeared" drop out once you enter the "tails appeared" states.
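That point can be sketched as follows (a minimal sketch, assuming the usual counterfactual-mugging payoffs of $10000 on heads and $100 demanded on tails, as in Vladimir Nesov's setup; the notation is the standard Bellman optimality equation, not anything specific to this article):

```latex
% Bellman optimality equation over states s and actions a:
%   V(s) = \max_a \Big[ R(s,a) + \sum_{s'} P(s' \mid s, a)\, V(s') \Big]
%
% Before the flip, the value of being an agent who pays includes both branches:
%   V(\text{pre-flip, payer}) = \tfrac{1}{2}(+10000) + \tfrac{1}{2}(-100) = 4950
%
% After "tails appeared", every heads-branch term has
% P(\text{heads branch} \mid \text{tails}, a) = 0 for all a, so the +10000
% term drops out of the sum, leaving only:
%   V(\text{tails}) = \max\{\,-100,\; 0\,\} = 0 \quad \text{(i.e., refuse)}
```

So an agent maximizing this V wants, before the flip, to be the kind of agent that pays, yet once in the tails state the maximization no longer contains any term that rewards paying.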

In other words, the quote from MBlume is just wrong: a rational agent is perfectly capable of wanting to precommit to a given action in a given situation while not performing that action in that situation. Rather, it is a perfectly rational and sufficiently powerful agent, one that has pre-actions available that will put it in certain special states, that will always perform the action.

The question is how one can actually precommit. Eliezer claims that he has precommitted. I am genuinely curious to know how he has done that in the absence of brain hacking.

Let me ask you a question. Suppose you were transported to Omega world (as I define it in the article). Suppose you then came to the same conclusions that Vladimir Nesov asks us to take as facts: that Omega is trustworthy, etc. Would you then seek to modify yourself such that you would definitely pay Omega $100?

Comment deleted 20 March 2009 10:58:52AM
Comment author: topynate 20 March 2009 05:31:47PM  0 points

What is the smallest alteration to the situations proposed in which you would?