Eliezer_Yudkowsky comments on Existential Risk and Public Relations - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
whpearson mentioned this already, but if you think that the most important thing we can be doing right now is publicizing an academically respectable account of existential risk, then you should be funding the Future of Humanity Institute.
Funding SIAI is optimal only if you think that the pursuit of Friendly AI is by far the most important component of existential risk reduction, and indeed they're focusing on persuading more people of this particular claim. As you say, by focusing on something specific, radical and absurd, they run more of a risk of being dismissed entirely than does FHI, but their strategy is still correct given the premise.
Agreed. (Modulo a caveat about marginal ROI eventually balancing if FHI got large enough or SIAI got small enough.)