Eliezer_Yudkowsky comments on Existential Risk and Public Relations - Less Wrong

36 Post author: multifoliaterose 15 August 2010 07:16AM




Comment author: Eliezer_Yudkowsky 18 August 2010 02:46:17PM 6 points

if you think that the most important thing we can be doing right now is publicizing an academically respectable account of existential risk, then you should be funding the Future of Humanity Institute. Funding SIAI is optimal only if you think that the pursuit of Friendly AI is by far the most important component of existential risk reduction

Agreed. (Modulo a caveat about marginal ROI eventually balancing if FHI got large enough or SIAI got small enough.)