jacoblyles comments on Why Our Kind Can't Cooperate - Less Wrong
There is no guarantee of a benevolent world, Eliezer. There is no guarantee that what is true is also beneficial. There is no guarantee that what is beneficial for an individual is also beneficial for a group.
You conflate many things here. You conflate what is true with what is right and what is beneficial. You assume that these sets are identical, or at least largely overlapping. However, unless a galactic overlord designed the universe to please Homo sapiens rationalists, I don't see any compelling rational reason to believe this to be the case.
Irrational belief systems often thrive because they overcome the prisoner's dilemmas that individually rational action creates at the group level. Rational people cannot mimic this. The prisoner's dilemma and the tragedy of the commons are not new ideas. Telling people to act in the group interest because God said so is effective. It is easy to see how informing people of the true costs of acting, because truth is noble and people ought not to be lied to, can be counterproductive.
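To make the game-theoretic claim concrete, here is a minimal sketch of the prisoner's dilemma with hypothetical payoffs (chosen only to satisfy the standard ordering T > R > P > S); the payoff numbers and function names are illustrative, not from the original discussion:

```python
# Hypothetical payoffs for a one-shot prisoner's dilemma.
# Keys are (my_move, their_move); values are my payoff (higher is better).
PAYOFFS = {
    ("C", "C"): 3,  # R: reward for mutual cooperation
    ("C", "D"): 0,  # S: sucker's payoff
    ("D", "C"): 5,  # T: temptation to defect
    ("D", "D"): 1,  # P: punishment for mutual defection
}

def best_response(their_move):
    """The individually rational choice against a fixed opponent move."""
    return max(("C", "D"), key=lambda my: PAYOFFS[(my, their_move)])

# Defection dominates: it is the best response to either opponent move...
assert best_response("C") == "D"
assert best_response("D") == "D"
# ...yet the resulting outcome (D, D) pays both players less than (C, C),
# which is the sense in which individually rational action is
# group-suboptimal.
assert PAYOFFS[("D", "D")] < PAYOFFS[("C", "C")]
```

The point of the sketch is that defection is dominant for each player even though mutual defection is worse for both than mutual cooperation.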
Perhaps we should stop striving for the maximally rational society, and start pursuing the maximally rational society that is stable in the long term. That is, maybe we ought to set our goal as minimizing irrationality, recognizing that we will never eliminate it.
If we cannot purposely introduce a small bit of beneficial irrationality into our group, then fine: memetic evolution will weed us out and there is nothing we can do about it. People will march by the millions to the will of saints and emperors while rational causes wither on the vine. Not much will change.
Robin made an excellent post along similar lines, which captures half of what I want to say:
http://lesswrong.com/lw/j/the_costs_of_rationality/
I'll be writing up the rest of my thoughts soon.
Sorry, I can't find the motivation to jump on the non-critical bandwagon today. I had the idea about a week ago that there is no guarantee that truth = justice = prudence, and that is going to be the hobby-horse I ride until I get a good statement of my position out, or read one by someone else.
"However, unless a galactic overlord designed the universe to please Homo sapiens rationalists, I don't see any compelling rational reason to believe this to be the case."
Except that we are free to adopt any version of rationality that wins. Rationality should be responsive to a given universe design, not the other way around.
"Irrational belief systems often thrive because they overcome the prisoner's dilemmas that individually rational action creates at the group level. Rational people cannot mimic this."
Really? Most of the "individual rationality -> suboptimal outcomes" results assume that actors have no influence over the structure of the games they are playing. This doesn't reflect reality particularly well. We may not have infinite flexibility here, but changing the structure of the game is often quite feasible, and quite effective.
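One way to illustrate "changing the structure of the game": if players can set up an enforcement mechanism that penalizes defection, the same base payoffs no longer produce a dilemma. This is a sketch under assumed, hypothetical payoffs and an assumed flat fine, not a claim about any particular real institution:

```python
# Base payoffs of a standard prisoner's dilemma: (my_move, their_move) -> my payoff.
BASE = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

# A structural change to the game: an enforced fine levied on any defector.
# The fine of 3 is an arbitrary illustrative value, large enough to outweigh
# the temptation payoff.
FINE = 3

def payoff(my, their):
    """Payoff in the modified game, after the fine is applied."""
    base = BASE[(my, their)]
    return base - FINE if my == "D" else base

def best_response(their_move):
    """The individually rational choice in the modified game."""
    return max(("C", "D"), key=lambda my: payoff(my, their_move))

# With the fine in place, cooperation is the best response to either move,
# so individually rational play now lands on the group-optimal (C, C).
assert best_response("C") == "C"
assert best_response("D") == "C"
```

The design point: nothing about the players' rationality changed, only the payoff structure, which is exactly the kind of feasible intervention the comment above describes.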
In that case, believing in truth is often non-rational.
Many people on this site have bemoaned the confusing dual meanings of "rational" (the economic utility maximizing definition and the epistemological believing in truth definition). Allow me to add my name to that list.
I believe I consistently used the "believing in truth" definition of rational in the parent post.
I agree that the multiple definitions are confusing, but I'm not sure that you consistently employ the "believing in truth" version in your post above.* It's not "believing in truth" that gets people into prisoners' dilemmas; it's trying to win.
*And if you did, I suspect you'd be responding to a point that Eliezer wasn't making, given that he's been pretty clear on his favored definition being the "winning" one. But I could easily be the one confused on that. ;)
"In that case, believing in truth is often non-rational."
Fair enough. Though I wonder whether, in most of the instances where that seems to be true, it's true for second-best reasons. (That is, if we were "better" in other (potentially modifiable) ways, the truth wouldn't be so harmful.)