FormallyknownasRoko comments on Making your explicit reasoning trustworthy - Less Wrong

82 Post author: AnnaSalamon 29 October 2010 12:00AM


Comments (93)


Comment author: komponisto 21 November 2010 08:29:15PM · 3 points

The simpler way is just to recognize that, as a human in a western society, you won't lose much more or win much more than the other humans around you

Well, unless you actually take specific steps to win more... which is kind of what this is about.

which subgroup of humans do you join? How do you make the tradeoff between different subcultures, etc.?

But still, you don't even need a general solution to that problem; you only need to decide which of the handful of specific subcultures available to you seems best for you.

Note that people probably tend to end up here by this very process. That is, of all the subcultures available to them, the subculture of people who are interested in

carefully building up [their] abstract-reasoning ability to the point where it produces usefully accurate, unbiased, well-calibrated probability distributions over relevant outcome spaces

is the most attractive.

Comment author: FormallyknownasRoko 21 November 2010 08:47:14PM · 5 points

Note that people probably tend to end up here by this very process. That is, of all the subcultures available to them, the subculture of people who are interested in

True ... but I suspect that people who end up here do so because they take the herd's verbally endorsed beliefs more literally than average. Rationality as memetic immune disorder, failure to compartmentalize, etc.

Perhaps I should amend my original comment to say that if you are cognitively very different from the herd, you may want to use a bit of rationality/self-development like a corrective lens. You'll have to run compartmentalization in software.

Maybe I should try to start a new trend: use {compartmentalization} when you want to invalidate an inference that most people, because of compartmentalization, would not make?

E.g. "I think all human lives are equally valuable"

"Then why did you spend $1000 on an iPad rather than giving it to GiveWell?"

"I refute it thus: {compartmentalization: near mode/far mode}"