AlexMennen comments on MIRI strategy - Less Wrong

5 Post author: ColonelMustard 28 October 2013 03:33PM


Comment author: AlexMennen 29 October 2013 04:08:01PM 9 points

Pamphlets work for wells in Africa. They don't work for MIRI's mission. The inferential distance is too great, the ideas are too Far, the impact is too far away.

Didn't you get convinced about AI risk by reading a short paragraph by I. J. Good?

Comment author: lukeprog 29 October 2013 09:43:30PM 3 points

Certainly there exist people who will be pushed to useful action by a pamphlet. They're fairly common for wells in Africa, and rare for risks from self-improving AI. To get 5 "hits" with well pamphlets, you've got to distribute maybe 1,000 pamphlets. To get 5 hits with self-improving AI pamphlets, you've got to distribute maybe 100,000. Obviously you should be able to target the pamphlets better than that, but then distribution and planning costs are a lot higher, and the cost per New Useful Person looks higher to me on that plan than on distributing HPMoR to leading universities and tech companies, a plan for which we already have good evidence of effectiveness, and which we are therefore doing.