aletheilia comments on Singularity Institute Strategic Plan 2011 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Thank you so much for doing this. It makes a very big difference.
Some comments:
Strategy #1, Point 2e seems to cover things that belong in point 3 or 4. Points 3 and 4 also seem to bleed into each other.
If the rationality training is being spun off to allow SingInst to focus on FAI, why isn't the same done with the Singularity Summit? The slightly bad-faith interpretation of the lack of explanation would be that retaining the training arm faces internal opposition while the Summit does not. If this is not an inference you like, it should be addressed.
The level 2 plan includes "Offer large financial prizes for solving important problems related to our core mission". I remember cousin_it mentioning that he has had very good success asking for answers in communities like MathOverflow, but that the main cost was in formalizing the problems. It seems intuitive that geeks are not much motivated by cash, but are very much motivated by a delicious open problem (and the status that solving it brings). Before resorting to 'large financial prizes', shouldn't level 1 include 'formalize open problems and publicise them'?
Thank you again for publishing a document so that this discussion can be had.
The trouble is, 'formalizing open problems' seems like by far the toughest part here, and it would thus be nice if we could employ collaborative problem-solving to somehow crack this part of the problem... by formalizing how to formalize various confusing FAI-related subproblems and throwing that on MathOverflow? :) Actually, I think LW is a more appropriate environment for at least attempting this endeavor, since it is, after all, what a large part of Eliezer's Sequences tried to prepare us for...