PhilGoetz comments on An Xtranormal Intelligence Explosion - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Like many people, I don't think this idea will work. But I voted it up, because I vote based on a comment's expected value. On a topic that is critical to solve, and for which there are no good ideas, entertaining crazy ideas is worthwhile. So I'd rather hear one crazy idea that a good Yudkowskian would consider sacrilege than ten well-reasoned points that are already overrepresented on LessWrong. It's analogous to the way that the optimal mutation rate is high when your current best solution is very sub-optimal, and the optimal selection strength (reproduction probability as a function of fitness) is low when your population is nearly homogeneous (as ideas about FAI on LessWrong are).
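The mutation-rate claim in the analogy can be checked with a toy simulation (my own sketch, not from the original comment): a minimal (1+1) evolutionary algorithm on a bitstring, started from an all-wrong solution, where a high per-bit mutation rate climbs much faster early on than a cautious one. All names and parameters here are illustrative assumptions.

```python
import random

def one_plus_one_ea(start, mut_rate, steps, rng):
    """Minimal (1+1) EA on a bitstring: flip each bit with probability
    mut_rate, keep the mutant only if it is at least as fit
    (fitness here = number of 1-bits; the optimum is all 1s)."""
    cur = start[:]
    for _ in range(steps):
        child = [b ^ (rng.random() < mut_rate) for b in cur]
        if sum(child) >= sum(cur):
            cur = child
    return sum(cur)

n, steps, trials = 100, 30, 20
far_start = [0] * n  # every bit wrong: a very sub-optimal starting point

def mean_fitness(mut_rate, seed0):
    """Average final fitness over several independent runs."""
    return sum(one_plus_one_ea(far_start, mut_rate, steps,
                               random.Random(seed0 + t))
               for t in range(trials)) / trials

low = mean_fitness(0.01, 0)     # cautious mutation: ~1 bit flipped/step
high = mean_fitness(0.20, 100)  # aggressive mutation: ~20 bits flipped/step
print(f"low-rate mean fitness:  {low:.1f}")
print(f"high-rate mean fitness: {high:.1f}")
```

From a near-optimal start the ranking reverses: aggressive mutation mostly breaks what already works and gets rejected by selection, which is the other half of the analogy.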