wuwei comments on Intelligence enhancement as existential risk mitigation - Less Wrong

17 [deleted] 15 June 2009 07:35PM

Comment author: wuwei 16 June 2009 11:11:03PM 0 points [-]

I suspect you aren't sufficiently taking into account the magnitude of people's irrationality and the non-monotonicity of rationality's rewards. I agree that intelligence enhancement would have greater overall effects than rationality enhancement, but rationality's effects will be more careful and targeted -- and therefore more likely to work as existential risk mitigation.

Comment author: MichaelBishop 17 June 2009 01:44:27AM 0 points [-]

...the non-monotonicity of rationality's rewards

Could you elaborate on the shape of the rewards to rationality?

Comment author: gwern 17 June 2009 03:43:06AM 1 point [-]

This was covered in some LW posts a while ago (which I cannot be arsed to look up and link); the paradigmatic example in those posts, I think, was a LWer who used to be a theist with a theist girlfriend, until reading OB/LW material convinced him of the irrationality of belief in God. Then his girlfriend left his hell-bound hide for greener pastures, and his life is now in general poorer than when he started reading OB/LW and striving to be more rational.

The suggestion is that the relationship between rationality and well-being is U-shaped: you can be well-off as a Bible-thumper, and well-off as a stone-cold Bayesian atheist, but the middle is unhappy.

Comment author: Eliezer_Yudkowsky 17 June 2009 09:00:41AM 2 points [-]

and his life is in general poorer

I'm not sure this is a fair statement. He did say he wouldn't go back if he had the choice.

Comment author: wuwei 17 June 2009 02:33:08AM *  0 points [-]

Increases in rationality can, with some regularity, lead to decreases in knowledge or utility (hopefully only temporarily and in limited domains).