Vladimir_Nesov comments on Newcomb's Problem and Regret of Rationality - Less Wrong

Post author: Eliezer_Yudkowsky 31 January 2008 07:36PM


Comment author: Xan 13 October 2010 11:43:49AM 2 points

Mr Eliezer, I think you've missed a few points here. However, I've probably missed more. I apologise for errors in advance.

  1. To start with, I speculate that any system of decision making will consistently give the wrong results on some specific problem. The whole point of decision theory is to find principles which usually end up with a better result. As such, you can always formulate a situation in which a given theory gives the wrong answer: perhaps one of the facts you thought you knew was incorrect, and it led you astray. (At the very least, Omega may decide to reward only those who have never heard of a particular brand of decision theory.)

It's like file compression. In bitmaps, there are frequently large areas of similar colour. Knowing this, we can design a format that stores such regions in less space. However, if we then try to compress a random bitmap, it will take more space than it did before compression. Same thing with human minds: they work simply and relatively efficiently, but there's a whole field dedicated to finding flaws in their methods. If you use causal decision theory, you sacrifice your performance in games against superhuman creatures that can predict the future, in return for better decision making when that isn't the case. That seems like a reasonably fair trade-off to me. Any theory which gets this one right opens itself to either getting another one wrong, or being more complex and thus harder for a human to use correctly.
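The compression trade-off is easy to demonstrate with a toy run-length encoder (a sketch for illustration, not any real image format): a flat region shrinks dramatically, while random data expands, since almost every byte becomes its own (count, value) pair.

```python
import random

def rle_encode(data):
    """Run-length encode a byte sequence as (count, value) pairs."""
    encoded = []
    for b in data:
        if encoded and encoded[-1][1] == b and encoded[-1][0] < 255:
            encoded[-1] = (encoded[-1][0] + 1, b)  # extend the current run
        else:
            encoded.append((1, b))  # start a new run
    return encoded  # each pair costs 2 bytes when serialised

# A "bitmap row" of one flat colour compresses well...
flat = [7] * 1000
print(len(rle_encode(flat)) * 2)   # 8 bytes instead of 1000

# ...but a random row expands, because runs of length 1 dominate.
random.seed(0)
noise = [random.randrange(256) for _ in range(1000)]
print(len(rle_encode(noise)) * 2)  # roughly 2000 bytes instead of 1000
```

No lossless scheme escapes this: by a pigeonhole argument, any encoder that shrinks some inputs must expand others, just as any decision theory tuned for ordinary cases can be made to lose on an adversarially chosen one.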

  2. The scientific method and what I know of rationality make the initial assumption that your belief does not affect how the world works. "If a phenomenon feels mysterious, that is a fact about our state of knowledge, not a fact about the phenomenon itself." etc. However, this isn't something which we can actually know.

Some Christians believe that if you pray over someone with faith, they will be immediately healed. If that is true, rationalists are at a disadvantage, because they aren't as good at self-delusion or doublethink as the untrained. They might never end up finding out that truth. I know that religion is the mind-killer too; I'm just using the most common example of the supremely effective standard method being unable to deal with an idea. It's necessarily incomplete.

  3. I don't agree with you that "reason" means "choosing what ends up with the most reward". You're mixing up means and ends. Arguing against a method of decision making because it comes up with the wrong answer in a specific case is like complaining that mp3 compression does a lousy job of compressing silence. I don't think that reason can be the only tool used, just one of them.

Incidentally, I would totally only take the $1000 box, and claim that Omega told me I had won immortality, to confuse all decision theorists involved.

Comment author: Vladimir_Nesov 13 October 2010 12:04:28PM 2 points

See chapters 1-9 of this document for a more detailed treatment of the argument.

Comment author: themusicgod1 09 February 2015 07:05:16PM 0 points

This link is 404ing. Does anyone have a copy of this?

Comment author: Vladimir_Nesov 10 February 2015 05:20:35AM 2 points

The current version is here. (It's Eliezer Yudkowsky (2010). Timeless Decision Theory.)