army1987 comments on Accuracy Versus Winning - Less Wrong

12 Post author: John_Maxwell_IV 02 April 2009 04:47AM


Comment author: John_Maxwell_IV 02 April 2009 04:28:34PM 3 points

OK, I see you don't believe me that you should sometimes accept, and sometimes reject, an offer of increased epistemic rationality for a price. So here's a simple mathematical model:

Let's say agent A accepts the offer of increased epistemic rationality for a price, and agent N does not. P is the probability that A will decide differently than N would have. F(X) is the expected value of N's original course of action when taken by agent X, and S(A) is the expected value of the course of action that A might switch to instead. If there is a cost C associated with becoming agent A, then agent N should become agent A if and only if

(1 - P) * F(A) + P * S(A) - C >= F(N)

The left side of the inequality is not bigger than the right side "by definition"; whether it is depends on the circumstances. Eliezer's dessert-ordering example is a situation where the inequality does not hold.

If you complain that agent N can't possibly know all the variables in the inequality, then I agree with you: he will be estimating them somewhat poorly. However, that complaint in no way supports the view that the left side is in fact bigger. Someone once said that "Anything you need to quantify can be measured in some way that is superior to not measuring it at all." Just as the difficulty of measuring utility is not a valid objection to utilitarianism, the difficulty of guessing what a better-informed self would do is not a valid objection to using this inequality.
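To make the inequality concrete, here is a minimal Python sketch of the decision rule. All the numbers are made up for illustration, and `should_become_A` is just a name I've given the rule; it is not from the original comment.

```python
def should_become_A(P, F_A, F_N, S_A, C):
    """N should become A iff (1 - P) * F(A) + P * S(A) - C >= F(N).

    P   -- probability that A decides differently than N would have
    F_A -- expected value of N's original course of action, taken by A
    F_N -- expected value of N's original course of action, taken by N
    S_A -- expected value of the action A might switch to
    C   -- cost of becoming agent A
    """
    return (1 - P) * F_A + P * S_A - C >= F_N

# A case where the inequality holds: the information is fairly likely to
# change the decision (P = 0.4) and the switch is much better.
print(should_become_A(P=0.4, F_A=10, F_N=10, S_A=25, C=2))
# True: 0.6 * 10 + 0.4 * 25 - 2 = 14 >= 10

# A dessert-ordering-style case: the information almost never changes the
# decision, the switch is barely better, and the knowledge itself costs C.
print(should_become_A(P=0.05, F_A=10, F_N=10, S_A=11, C=1))
# False: 0.95 * 10 + 0.05 * 11 - 1 = 9.05 < 10
```

The second call illustrates John's point: with these numbers, remaining ignorant (the right side) wins, so accepting the offer is not better "by definition".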

that luck only favors us a small fraction of the time, by definition.

That's a funny definition of "luck" you're using.

Comment author: Furcas 02 April 2009 06:28:56PM 0 points

Yes, the right side can be bigger, and occasionally it will be. If you get lucky.

If the information that N chooses to remain ignorant of happens to be of little relevance to any decision N will take in the future, and if his self-deception allows him to be more confident than he would have been otherwise, and if this increased confidence grants him a significant advantage, then the right side of the equation will be bigger than the left side.

That's a funny definition of "luck" you're using.

It is? Why do you think people are pleasantly surprised when they get lucky, if not because it's a rare occurrence?

Comment author: John_Maxwell_IV 02 April 2009 06:43:00PM 3 points

If the information that N chooses to remain ignorant of happens to be of little relevance to any decision N will take in the future, and if his self-deception allows him to be more confident than he would have been otherwise, and if this increased confidence grants him a significant advantage, then the right side of the equation will be bigger than the left side.

Not quite.

  • The information could be highly relevant, yet happen not to change his mind.

  • He could be choosing among close alternatives, so switching to a slightly better alternative could be of limited value.

  • Remember also that failure to search for disconfirming evidence doesn't necessarily constitute self-deception.

It is? Why do you think people are pleasantly surprised when they get lucky, if not because it's a rare occurrence?

Sorry, I guess your definition of luck was reasonable. But in this case, it's not necessarily true that the probability of the right side being greater is lower than 50%, in which case you wouldn't always have to "get lucky".

Comment author: Furcas 02 April 2009 10:49:03PM 3 points

I've been thinking about this on and off for an hour, and I've come to the conclusion that you're right.

My mistake comes from the fact that the examples I was using to think about this were all ones where one has low certainty about whether the information is relevant to one's decision making. In such cases, the odds are that remaining ignorant will yield a less than maximal chance of success.

However, there are situations in which it's possible to know with great certainty that some piece of information is irrelevant to one's decision making, even without knowing what the information is. These are mostly situations that are limited in scope and involve a short-term goal, like making a favorable first impression or giving a good speech. For instance, you might suspect that your audience hates your guts, and knowing that this is in fact the case would make you less confident during your speech than merely suspecting it, so you'd be better off waiting until after the speech to find out about that particular fact.

Although, if I were in that situation, and they did hate my guts, I'd rather know about it and find a way to remain confident that doesn't involve willful ignorance. That said, I have no difficulty imagining a person who is simply incapable of finding such a way.

I wonder, do all situations where instrumental rationality conflicts with epistemic rationality have to do with mental states over which we have no conscious control?

Comment author: John_Maxwell_IV 02 April 2009 11:58:04PM 3 points

I've been thinking about this on and off for an hour, and I've come to the conclusion that you're right.

Wow, this must be like the 3rd time someone on the internet has said that to me! Thanks!

Although, if I were in that situation, and they did hate my guts, I'd rather know about it and find a way to remain confident that doesn't involve willful ignorance.

If you think of a way, please tell me about it.

I wonder, do all situations where instrumental rationality conflicts with epistemic rationality have to do with mental states over which we have no conscious control?

Information you have to pay money for doesn't fit into this category.