Cameron_Taylor comments on Precommitting to paying Omega. - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
So you imagine your current self in such a situation. So do I, and I reach the same conclusion as you.
I then go on to show why that's the case. Actually, the article might be better if I wrote out Bellman's equation and showed how the terms involving "heads appeared" drop out once you enter the "tails appeared" states.
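The point about the heads terms dropping out can be sketched numerically. This is a minimal illustration, assuming the usual counterfactual-mugging payoffs ($10,000 on heads for an agent disposed to pay, $100 demanded on tails); the function names and structure are mine, not from the article:

```python
def value_before_flip(pays_on_tails: bool) -> float:
    """Expected value evaluated before Omega's fair coin flip.
    The heads branch rewards the disposition to pay on tails."""
    heads_payoff = 10_000 if pays_on_tails else 0
    tails_payoff = -100 if pays_on_tails else 0
    return 0.5 * heads_payoff + 0.5 * tails_payoff

def value_in_tails_state(pays: bool) -> float:
    """Expected value once 'tails appeared' is known: the heads
    terms have dropped out of the equation entirely."""
    return -100 if pays else 0

# Before the flip, the paying disposition is worth 4950 vs. 0,
# so precommitting (a pre-action) is optimal.
print(value_before_flip(True), value_before_flip(False))

# After tails appears, paying is worth -100 vs. 0, so an agent
# re-deriving the optimal action in that state declines to pay.
print(value_in_tails_state(True), value_in_tails_state(False))
```

The asymmetry between the two evaluations is exactly why wanting to precommit and wanting to perform the action, once in the state, come apart.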
In other words, the quote from MBlume is just wrong: a rational agent is perfectly capable of wanting to precommit to a given action in a given situation while not performing that action in that situation. Rather, it is a perfectly rational and sufficiently powerful agent, one that has pre-actions available that will put it into certain special states, that will always perform the action.
The question is how one can actually precommit. Eliezer claims that he has precommitted. I am genuinely curious to know how he has done that in the absence of brain hacking.
Let me ask you a question. Suppose you were transported to Omega world (as I define it in the article). Suppose you then came to the same conclusions that Vladimir Nesov asks us to take as facts: that Omega is trustworthy, etc. Would you then seek to modify yourself such that you would definitely pay Omega $100?
What is the smallest alteration to the situations proposed under which you would?