Delta comments on Rationality Quotes September 2012 - Less Wrong

7 Post author: Jayson_Virissimo 03 September 2012 05:18AM




Comment author: buybuydandavis 03 September 2012 10:45:47AM 15 points [-]

I think he's mischaracterizing the issue.

Beliefs serve multiple functions. One is modeling accuracy; another is signaling. The question isn't whether the environment is harsh or easy, it's which function you need. There are many harsh environments where what you need is the signaling function, not the modeling function.

Comment author: Delta 05 September 2012 12:44:36PM 1 point [-]

I think the quote reflects reality (humans aren't naturally rational so their beliefs are conditioned by circumstance), but is better seen as an observation than a recommendation. The best approach should always be to hold maximally accurate beliefs yourself, even if you choose to signal different ones as the situation demands. That way you can gain the social benefits of professing a false belief without letting it warp or distort your predictions.

Comment author: buybuydandavis 05 September 2012 07:03:38PM *  2 points [-]

The best approach should always be to hold maximally accurate beliefs yourself, even if you choose to signal different ones as the situation demands.

No, that wouldn't necessarily be the case. We should expect a cost in effort and effectiveness to try to switch on the fly between the two types of truths. Lots of far truths have little direct predictive value, but lots of signaling value. Why bear the cost for a useless bit of predictive truth, particularly if it is worse than useless and hampers signaling?

That's part of the magic of magisteria - segregation of modes of truth by topic reduces that cost.

Comment author: Delta 06 September 2012 12:58:37PM 0 points [-]

Hmm, maybe I shouldn't have said "always", given that acting ability is required to signal a belief you don't hold, but I do think what I suggest is the ideal. Someone who trained themselves to do this, by studying people skills and so forth, would do better: they'd get the social benefits of conformity without the disadvantage of false beliefs clouding their predictions (though admittedly the time investment of learning these skills would have to be considered).

Short version: I think this is possible with training and would make you "win" more often, and thus it's what a rationalist would do (unless the cost of training proved prohibitive, which I doubt, since these skills are highly transferable).

I'm not sure what you meant by the magisteria remark, but I get the impression that holding spiritual or long-term beliefs to less stringent standards than short-term ones isn't generally seen as a good thing (see Eliezer's "Outside the Laboratory" post, among others).