The Singularity Institute is undergoing a series of important strategic discussions. There are many questions for which we wish we had more confident answers. On some of them, we can get more confident answers by asking top-level mathematicians and mathletes (e.g. a Putnam fellow, an IMO top scorer, or a successful academic mathematician or CS researcher).
If you are such a person and want to directly affect Singularity Institute strategy, contact me at luke@intelligence.org.
Thank you.
Now back to your regularly scheduled rationality programming...
I remember being tempted, but ultimately it felt nosy. I wouldn't request that SIAI make all its strategic thinking public, and there's nothing special about this particular bit of it, from my perspective.
Why not? Edit: See previous discussion of this topic here.