Many people have an incorrect view of the Future of Humanity Institute's funding situation, so this is a brief note to correct that; think of it as a spiritual successor to this post. As John Maxwell puts it, FHI is "one of the three organizations co-sponsoring LW [and] a group within the University of Oxford's philosophy department that tackles important, large-scale problems for humanity like how to go about reducing existential risk." (If you're not familiar with our work, this article is a nice, readable introduction, and our director, Nick Bostrom, wrote Superintelligence.) Though we are a research institute in an ancient and venerable institution, this does not guarantee funding or long-term stability.
Really? Before, MIRI was being constantly criticized for not publishing any papers.
I see.
I take it that this is a damned if you do and damned if you don't kind of situation.
I'm not able to find the source right now (the one that criticized MIRI on said grounds), but I'm pretty certain it wasn't a very authentic or respectable source to begin with. As far as I can recall, it was Stephen Bond, the same guy who wrote the article on "the cult of Bayes' theorem". There used to be a link to his page from Yudkowsky's Wikipedia page, but it's not there anymore.
I simply brought up this example to show how easy it is to tarnish an image, something ...