
Wei_Dai comments on What Are Probabilities, Anyway? - Less Wrong

Post author: Wei_Dai, 11 December 2009 12:25AM (22 points)




Comment author: Wei_Dai, 12 December 2009 04:07:31AM, 2 points

It would be much more pleasant to endorse some other prior - for example, one where everything just happens to work out to match my preferences - the "wishful thinking" prior.

Presumably you don't do that because that's not your actual prior - you don't just care about one particular possible world where things happen to turn out exactly the way you want. You also care about other possible worlds and want to make decisions in ways that make those worlds better.

In general, if there is no fact of the matter about what is real, then why would anyone bother to endorse anything other than their own personal wishful thinking as real?

It would be for the same reason that you don't change your utility function to give everything an infinite utility.

Comment deleted 12 December 2009 06:14:52AM
Comment author: Wei_Dai, 12 December 2009 12:12:52PM, 4 points

It sounds like you're assuming that people use a wishful-thinking prior by default, and have to be argued into a complexity-based prior. This seems implausible to me.

I think the phenomenon of wishful thinking doesn't come from one's prior, but from evolution being too stupid to design a rational decision process. That is, a part of my brain rewards me for increasing the anticipation of positive future experiences, even if that increase is caused by faulty reasoning instead of good decisions. This causes me to engage in wishful thinking (i.e., miscalculating the implications of my prior) in order to increase my reward.

Perhaps I could frame it this way: the complexity prior is (in fact) counterintuitive and alien to the human mind.

I dispute this. Sure, some of the implications of the complexity prior are counterintuitive, but it would be surprising if none of them were. I mean, some theorems of number theory are counterintuitive, but that doesn't mean integers are alien to the human mind.

Why should I pay special attention to worlds that conform to it (simple worlds)?

Suppose someone gave you a water-tight argument that all possible worlds are in fact real, and you have to make decisions based on which worlds you care more about. Would you really adopt the "wishful-thinking" prior and start putting all your money into lottery tickets or something similar, or would your behavior be more or less unaffected? If it's the latter, don't you already care more about worlds that are simple?

"if I use a complexity prior to repeatedly make decisions, then my subjective experience will be (mostly) of winning"

Perhaps this is just one of the ways an algorithm that cares about each world in proportion to its inverse complexity could feel from the inside?
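To make the "caring about each world in proportion to its inverse complexity" idea concrete, here is a toy sketch (not from the comment above; all world descriptions and utilities are invented for illustration). Worlds are identified with bitstring descriptions, each world gets unnormalized weight 2^-length(description), and the agent picks actions by expected utility under that weighting. A "wishful-thinking prior" that dumps all its weight on the one jackpot world rates buying lottery tickets highly; the length-based prior does not, because the simple losing worlds dominate.

```python
# Toy illustration: a length-based "complexity prior" over possible worlds
# versus a "wishful-thinking prior". The specific worlds and payoffs below
# are hypothetical examples, not anything from the original discussion.

def complexity_prior(worlds):
    """Weight each world description d by 2**-len(d), then normalize."""
    weights = {d: 2.0 ** -len(d) for d in worlds}
    total = sum(weights.values())
    return {d: w / total for d, w in weights.items()}

def expected_utility(prior, utility):
    """Expected utility of an action whose payoff per world is `utility`."""
    return sum(p * utility[d] for d, p in prior.items())

# Two simple worlds where the lottery loses, one complex world where it wins.
worlds = ["00", "01", "1111111111"]
utility_buy_ticket = {"00": -1.0, "01": -1.0, "1111111111": 100.0}
utility_save_money = {"00": 5.0,  "01": 5.0,  "1111111111": 5.0}

prior = complexity_prior(worlds)
wishful = {"00": 0.0, "01": 0.0, "1111111111": 1.0}  # all weight on the jackpot

eu_buy = expected_utility(prior, utility_buy_ticket)     # negative: losing worlds dominate
eu_save = expected_utility(prior, utility_save_money)    # 5.0 in every world
eu_wishful_buy = expected_utility(wishful, utility_buy_ticket)  # 100.0

print(eu_buy, eu_save, eu_wishful_buy)
```

Under the complexity prior the agent saves its money; under the wishful-thinking prior it buys the ticket. The point matches the comment: an algorithm that weights worlds this way would, from the inside, mostly experience "winning," since the bulk of its care goes to the worlds its decisions actually do well in.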

Comment deleted 12 December 2009 08:44:20PM
Comment author: timtyler, 12 December 2009 08:50:09PM, 0 points

You don't believe in affirmations? The self-help books about the power of positive thinking don't work for you? What do you make of the following quote?

"Personal optimism correlates strongly with self-esteem, with psychological well-being and with physical and mental health. Optimism has been shown to be correlated with better immune systems in healthy people who have been subjected to stress."

Comment deleted 12 December 2009 09:04:41PM