
Zetetic comments on [SEQ RERUN] The Martial Art of Rationality - Less Wrong Discussion

Post author: Unnamed 19 April 2011 07:41PM


Comment author: Zetetic 20 April 2011 03:31:08PM 0 points

I think we can take something clear and simple from the posts below: rationality should not only help you accomplish your goals, but also help you define goals clearly and identify easy and (more importantly) useful goals that are likely to induce a prolonged (preferably indefinite) period of well-being.

Comment author: [deleted] 20 April 2011 07:58:58PM 4 points

Can we at least agree that these three imperatives

  1. Believe true things
  2. Achieve goals
  3. Induce well-being

are not identical? There seems to be a "rationality thesis" here that the best way to go about 2. and 3. is to sort out 1. first. I would like to see this thesis stated more clearly.

Comment author: MrMind 21 April 2011 07:29:40AM 0 points

This may very well be the case today, or in our society, but it's not really difficult to imagine a society in which you have to hold really crazy ideas in order to win. Also, believing true things is an endeavour that is never completed per se: it is surely not possible to have it sorted out simpliciter before attaining 2 (the third imperative I really see as a subgoal of the second).

The thesis, after all, conflicts with basically the whole history of humanity: Homo sapiens has won more and more without attaining perfect accuracy. However, it seems to me that it has won more where it has accumulated a greater stock of truths.

So I wouldn't really say that in order to win you have to be accurate, but I think a strong case can be made that accuracy increases the probability of winning.

What, then, is the real purpose of rationality? I'm perfectly fine if we accept the conjunction "truth /\ winning", with the proviso that P(winning | high degree of truth) > P(winning | low degree of truth). However, if Omega is going to pop up and ask:

You must choose between two alternatives. I can give you the real TOE and remove your cognitive biases if you agree to live a miserable life, or you can live a very comfortable and satisfying existence, provided that you let me implant a belief in the Flying Spaghetti Monster.

I confess I would guiltily choose the second.