steven0461 comments on Theism, Wednesday, and Not Being Adopted - Less Wrong

56 points · Post author: Alicorn 27 April 2009 04:49PM

Comment author: steven0461 27 April 2009 05:15:49PM 1 point

Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn't place enough priority on being right to see that this is the case. This is why we should avoid the "rationalists should win" mantra -- figuring out what "winning" means is at least as essential as actually winning.

I reject out of hand the idea that she should deconvert in the closet and systematically lie to everyone she knows.

Rejecting options out of hand is bad, especially when the alternatives suck.

Comment author: Alicorn 27 April 2009 05:22:20PM 0 points

Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn't place enough priority on being right to see that this is the case.

Can you help me disentangle what you mean by this? There seems to be some equivocation.

Rejecting options out of hand is bad, especially when the alternatives suck.

I rejected that option for ethical reasons. The alternatives do suck, but "carry on believing as always" and "deconvert, then tell an uncomfortable truth" are at least not unethical.

Comment author: steven0461 27 April 2009 06:37:04PM 4 points

For clarification, see my reply to MrHen.

The alternatives do suck, but "carry on believing as always" and "deconvert, then tell an uncomfortable truth" are at least not unethical.

Choosing to believe falsely and then speaking honestly is at least as unethical as choosing to believe truly and then lying. The former amounts to lying and then committing the further ethical crime of believing one's own lies.

Comment author: Vladimir_Nesov 27 April 2009 05:32:01PM * 0 points

Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn't place enough priority on being right to see that this is the case. This is why we should avoid the "rationalists should win" mantra -- figuring out what "winning" means is at least as essential as actually winning.

That's open to interpretation. The procedure by which you figure out what "winning" means is itself a rational pursuit, and it had better be precisely targeted, with "winning" in that meta-game already fixed. You have to stop somewhere and actually write the code.

Comment author: steven0461 27 April 2009 06:42:48PM 0 points

You do indeed have to stop somewhere, but any algorithm that stops before rejecting everything that's at least one tenth as wrong as Mormonism is broken.

Comment author: Vladimir_Nesov 27 April 2009 08:17:19PM 1 point

Huh? The algorithm doesn't stop; the meta-meta-goal has to be fixed at some point.

Comment author: MrHen 27 April 2009 06:19:12PM * 0 points

Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn't place enough priority on being right to see that this is the case.

After parsing this, I think you are saying:

  1. Many people who think they have higher priorities than being right
  2. Do not have higher priorities than being right
  3. But do not know that they do not have higher priorities than being right
  4. Because they do not place a high enough priority on being right

So, replacing "priorities" with "X" and "being right" with "Y", we get this:

  1. Many people who think they have higher X than Y
  2. Do not have higher X than Y
  3. But do not know that they do not have higher X than Y
  4. Because they do not place a high enough X on Y

Which is a very mean and uncharitable way of saying that I do not know what you mean. I think my difficulty is that I rank priorities against each other. To me, "a priority of 55" makes no sense; "Fifty-fifth Priority" does. Bumping priority up means replacing a higher rank with a lower rank. If something has no higher priority, it is First Priority. With these definitions, your statement makes no sense, because (2) and (4) are incompatible.

Comment author: steven0461 27 April 2009 06:28:45PM * 4 points

OK, I can see how that was unclear, but I stand by the statement. Figuring out what one's true goals are is itself a problem that one can apply rationality to. Many people think applying rationality doesn't help achieve their goals well enough to be worth the costs. But they're wrong: rationality helps achieve their true goals well enough to be worth the costs. If they applied rationality enough, they'd find out that their true goals aren't what they thought they were, and conclude that applying rationality was indeed worth it.

An irrational person cannot reliably assess the cost of being irrational. A rational person can. People who have chosen rationality almost always agree choosing rationality was worth it.

A red box and a blue box, one of which contains a diamond. Wednesday asks, "How would this 'rationality' thing help me get to the red box, which contains the diamond?" But the diamond is in the blue box.

Comment author: MrHen 27 April 2009 07:02:37PM 2 points

Yeah, that makes more sense. I think there is a danger in telling someone they do not know what they really want or what their true goals are, but I understand your point and agree.

Comment author: Nick_Tarleton 27 April 2009 07:09:20PM * 2 points

I don't think the danger is in saying that another doesn't know their true goals so much as in thinking that you do know them.

Comment author: conchis 27 April 2009 08:39:34PM * 3 points

An irrational person cannot reliably assess the cost of being irrational. A rational person can.

Yes, a fully rational person is better able to assess the relative costs of being irrational vs. rational. But this knowledge won't help them much if it turns out that the costs of being irrational were lower after all.