RobinZ comments on Simplified Humanism, Positive Futurism & How to Prevent the Universe From Being Turned Into Paper Clips - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I don't know that Eliezer Yudkowsky has talked enough about AI theory in this forum for his competence to be obvious - but either way, the math of the decision theory is not as simple as "do what you are best at".
It might not even be as simple as comparative advantage, but there are certainly more good writers in the world than good AI theorists.