Evan_Gaensbauer comments on [LINK] Author's Note 119: Shameless Begging - Less Wrong Discussion

7 Post author: Evan_Gaensbauer 11 March 2015 12:14AM

Comments (66)

Comment author: Evan_Gaensbauer 12 March 2015 05:41:37AM 5 points

I just want to note we're no longer discussing laser-mounted mosquito-terminating drones. That's fine. Anyway, I'm a bit older than Eliezer was when he founded the Singularity Institute for Artificial Intelligence. While starting a non-profit organization at that age seems impressive to me, once an organization has been legally incorporated, I'd guess its founders can slap just about any name they like on it. The SIAI doesn't seem to have achieved much in its first few years of operation.

Based on the history given on the Wikipedia page for the Machine Intelligence Research Institute, the notability of the organization's achievements seems commensurate with how long it's been around. For several years, as the Singularity Institute, it also ran the Singularity Summit, which it eventually sold to Singularity University for one million dollars. Eliezer Yudkowsky contributed two chapters to Global Catastrophic Risks in 2008, at the age of 28, without having completed either secondary school or a university education.

On the other hand, MIRI has made serious mistakes in operations, research, and outreach over its history. Eliezer Yudkowsky is obviously an impressive person for various reasons. My conclusion is that Eliezer sometimes assumes he's enough of a 'rationalist' that he can get away with being lazy in how he plans or portrays his ideas. He doesn't seem to be much of a communications consequentialist, and he seems reluctant to declare mea culpa when he makes those sorts of mistakes. All else being equal, especially if one hasn't tallied Eliezer's track record, we should remain skeptical of plans he bases on shoddy grounds. I also don't believe we should take the bonus requests and ideas at the end of the post seriously.