FormallyknownasRoko comments on Making your explicit reasoning trustworthy - Less Wrong

82 points · Post author: AnnaSalamon · 29 October 2010 12:00AM


Comment author: FormallyknownasRoko · 21 November 2010 08:01:03PM · 4 points

There is a much simpler way of winning than carefully building up your abstract-reasoning ability to the point where it produces usefully accurate, unbiased, well-calibrated probability distributions over relevant outcome spaces.

The simpler way is just to recognize that, as a human in a western society, you won't lose much more or win much more than the other humans around you. So you may as well dump the abstract reasoning and rationality, and pick some humans who seem to live relatively non-awful lives (e.g. your colleagues/classmates) and take whatever actions they take. Believe what they believe, even if it seems irrational. Do what they do.

Careful probability estimation, and actions chosen by anticipating consequences, are the kind of cognitive algorithm befitting a lone agent who actually reaps what he or she sows. For a human, the herd mentality seems to be the more elegant solution: elegant in the sense that even though the epistemology is hard to get right, there is a robust argument about consequences and utilities: almost all of the humans in the herd who follow a roughly average strategy will get roughly the same deal out of life.

Research from hedonic psychology on the "Hedonic Treadmill" effect backs this up further: even if you make more (or less) money than average, you probably won't end up noticeably happier (or unhappier) in the long run.

Of course there are details and complications: which subgroup of humans do you join? How do you make the tradeoff between different subcultures, etc.? Still, you don't even need a general solution to that problem; you only need to decide which of the handful of specific subcultures available to you seems best for you.

And, of course, it goes without saying that this strategy is useless for someone who is determined to invest emotionally in a nonstandard life-narrative, like utilitarian charity or life extension. From this point of view, one might object that joining the herd is selfish, in the sense that it isn't the action which maximizes utility across the herd. But then again, most people don't have a utilitarian concept of selfishness and don't count benefit to random strangers as part of their actual near-mode, actionable goal set; so from their axiological point of view, herding is an acceptable solution.

Comment author: komponisto · 21 November 2010 08:29:15PM · 3 points

The simpler way is just to recognize that, as a human in a western society, you won't lose much more or win much more than the other humans around you.

Well, unless you actually take specific steps to win more... which is kind of what this is about.

which subgroup of humans do you join? How do you make the tradeoff between different subcultures etc. But still, you don't even need a general solution to that problem, you only need to decide which of the handful of specific subcultures available to you seems best for you.

Note that people probably tend to end up here by this very process. That is, of all the subcultures available to them, the subculture of people who are interested in

carefully building up [their] abstract-reasoning ability to the point where it produces usefully accurate, unbiased, well-calibrated probability distributions over relevant outcome spaces

is the most attractive.

Comment author: FormallyknownasRoko · 21 November 2010 08:47:14PM · 5 points

Note that people probably tend to end up here by this very process. That is, of all the subcultures available to them, the subculture of people who are interested in

True... but I suspect that people who end up here do so because they take the verbally endorsed beliefs of the herd more literally than the average person does. Rationality as memetic immune disorder, failure to compartmentalize, etc.

Perhaps I should amend my original comment to say that if you are cognitively very different from the herd, you may want to use a bit of rationality/self-development as a corrective lens. You'll have to run compartmentalization in software.

Maybe I should try to start a new trend: use {compartmentalization} when you want to invalidate an inference which most people would not make because of compartmentalization?

E.g. "I think all human lives are equally valuable"

"Then why did you spend $1000 on an iPad rather than giving it to GiveWell?"

"I refute it thus: {compartmentalization: nearmode/farmode}"