MichaelBishop comments on Intelligence enhancement as existential risk mitigation - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Could you elaborate on the shape of the rewards to rationality?
This was covered in some LW posts a while ago (which I cannot be arsed to look up and link); the paradigmatic example in those posts, I think, was a LWer who used to be a theist with a theist girlfriend, until reading OB/LW convinced him that belief in God was irrational. His girlfriend then left his hell-bound hide for greener pastures, and his life is in general poorer than before he started reading OB/LW and striving to be more rational.
The suggestion is that well-being as a function of rationality is U-shaped: you can be well-off as a bible-thumper, and well-off as a stone-cold Bayesian atheist, but the middle is unhappy.
I'm not sure that's a fair summary. He did say he wouldn't go back if he had the choice.
Increases in rationality can, with some regularity, lead to decreases in knowledge or utility (hopefully only temporarily and in limited domains).