Nebu comments on Why Our Kind Can't Cooperate - Less Wrong

Post author: Eliezer_Yudkowsky, 20 March 2009 08:37AM




Comment author: jacoblyles 20 March 2009 09:54:09AM, 6 points

There is no guarantee of a benevolent world, Eliezer. There is no guarantee that what is true is also beneficial. There is no guarantee that what is beneficial for an individual is also beneficial for a group.

You conflate many things here. You conflate what is true with what is right and what is beneficial. You assume that these sets are identical, or at least largely overlapping. However, unless a galactic overlord designed the universe to please Homo sapiens rationalists, I don't see any compelling rational reason to believe this to be the case.

Irrational belief systems often thrive because they overcome the prisoner's dilemmas that individually rational action creates at the group level. Rational people cannot mimic this. The prisoner's dilemma and the tragedy of the commons are not new ideas. Telling people to act in the group interest because God said so is effective. It is easy to see how informing people of the costs of action, because truth is noble and people ought not to be lied to, can be counterproductive.

Perhaps we should stop striving for the maximally rational society and instead pursue the most rational society that is stable in the long term. That is, maybe we ought to set our goal as minimizing irrationality, recognizing that we will never eliminate it.

If we cannot purposely introduce a small bit of beneficial irrationality into our group, then fine: memetic evolution will weed us out and there is nothing we can do about it. People will march by the millions to the will of saints and emperors while rational causes wither on the vine. Not much will change.

Robin made an excellent post along similar lines, which captures half of what I want to say:

http://lesswrong.com/lw/j/the_costs_of_rationality/

I'll be writing up the rest of my thoughts soon.

Sorry, I can't find the motivation to jump on the non-critical bandwagon today. I had the idea about a week ago that there is no guarantee that truth = justice = prudence, and that is the hobby-horse I will ride until I get a good statement of my position out, or read one by someone else.

Comment author: conchis 20 March 2009 01:32:23PM, 4 points

"However, unless a galactic overlord designed the universe to please Homo sapiens rationalists, I don't see any compelling rational reason to believe this to be the case."

Except that we are free to adopt any version of rationality that wins. Rationality should be responsive to a given universe design, not the other way around.

"Irrational belief systems often thrive because they overcome the prisoner's dilemmas that individually rational action creates at the group level. Rational people cannot mimic this."

Really? Most of the "individual rationality -> suboptimal outcomes" results assume that actors have no influence over the structure of the games they are playing. This doesn't reflect reality particularly well. We may not have infinite flexibility here, but changing the structure of the game is often quite feasible, and quite effective.
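The point about changing the structure of the game can be made concrete. Here is a minimal Python sketch, with hypothetical payoff numbers chosen only to satisfy the standard prisoner's dilemma ordering, and an illustrative fine of 3 standing in for any enforcement mechanism (a contract, a tax, a reputation cost):

```python
# Hypothetical payoffs for a two-player prisoner's dilemma.
# payoff[(my_move, their_move)] = my payoff; moves are "C" (cooperate) or "D" (defect).
base_game = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def dominant_move(payoff):
    """Return the move that is at least as good no matter what the opponent
    does, or None if neither move dominates."""
    for mine in ("C", "D"):
        other = "D" if mine == "C" else "C"
        if all(payoff[(mine, theirs)] >= payoff[(other, theirs)]
               for theirs in ("C", "D")):
            return mine
    return None

# In the base game, defection dominates: individually rational play
# leads both players to the suboptimal (D, D) outcome.
assert dominant_move(base_game) == "D"

# Restructure the game: an enforced fine of 3 for defecting
# shifts the defector's payoffs downward.
restructured = {(m, t): p - (3 if m == "D" else 0)
                for (m, t), p in base_game.items()}

# Now cooperation dominates -- no false beliefs required,
# only a change to the rules of the game.
assert dominant_move(restructured) == "C"
```

The sketch illustrates conchis's claim: the "individual rationality -> suboptimal outcomes" result holds only for a fixed payoff matrix, and a modest change to the payoffs dissolves the dilemma.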

Comment author: Nebu 20 March 2009 06:32:39PM, 2 points

"However, unless a galactic overlord designed the universe to please Homo sapiens rationalists, I don't see any compelling rational reason to believe this to be the case."

"Except that we are free to adopt any version of rationality that wins. Rationality should be responsive to a given universe design, not the other way around."

I don't think your argument applies to jacoblyles' argument. Jacoblyles claims that there is no reason for "rational" to equal "(morally/ethically) right" unless an intelligent designer designed the universe in line with our values.

So it's not about winning versus losing. It's that unless the rules of the game are set up in just a certain way, winning may entail causing suffering to others (e.g. to our rivals).

Comment author: jacoblyles 20 March 2009 06:54:56PM, 2 points

My writing in these comments has not been perfectly clear, but Nebu, you have nailed one point that I was trying to make: "there is no guarantee that morally good actions are beneficial".

Christian morality is interesting here. Christians admit up front that following their religion may lead to persecution and suffering. Their God was tortured and killed, after all. They don't claim that what is good will be pleasant, as the rationalists do. To that degree, the Christians seem more honest and open-minded. Perhaps this is just a function of Christianity being an old religion that has had time to work out the philosophical kinks.

Of course, they make up for it by offering infinite bliss in the next life, which is cheating. But Christians do have a more honest view of this world in some ways.

Maybe we conflate true, good, and prudent because our "religion" is a hard sell otherwise. If we admitted that true and morally right things may be harmful, our pitch would become "Believe the truth, do what is good, and you may become miserable. There is no guarantee that our philosophy will help you in this life, and there is no next life". That's a hard sell. So we rationalists cheat by not examining this possibility.

There is some truth to the Christian criticism that Atheists are closed-minded and biased, too.