Eliezer_Yudkowsky comments on Lone Genius Bias and Returns on Additional Researchers - LessWrong

Post author: ChrisHallquist 01 November 2013 12:38AM




Comment author: Eliezer_Yudkowsky 02 November 2013 08:40:27PM 14 points

They might also believe that AMF donations would have a greater impact on potential intelligence explosions

It is neither probable nor plausible that AMF, a credible maximum of short-term reliable known impact on lives saved valuing all current human lives equally, should happen to also possess a maximum of expected impact on future intelligence explosions. It is as likely as that donating to your local kitten shelter should be the maximum of immediate lives saved. This kind of miraculous excuse just doesn't happen in real life.

Comment author: dankane 03 November 2013 07:10:40PM 1 point

OK. Granted. Even a belief that AMF is better than average at affecting intelligence explosions would be unlikely to justify the claim that it is the best, and thus would not justify the behavior described.

Comment author: joaolkf 27 December 2013 04:23:03PM 0 points

Amazing how, even after reading all of Eliezer's posts (many more than once), I can still get surprise, insight, and irony at a rate sufficient to produce laughter for more than a minute.