The Singularity Institute is undergoing a series of important strategic discussions. There are many questions for which we wish we had more confident answers. On some of them, we can get more confident answers by asking top-level mathematicians & mathletes (e.g. Putnam fellows, IMO top scorers, or successful academic mathematicians / CS researchers).
If you are such a person and want to directly affect Singularity Institute strategy, please contact me at luke@intelligence.org.
Thank you.
Now back to your regularly scheduled rationality programming...
Thanks for that link; I wound up reading the whole comment thread, and it changed my mind.
So, people of SIAI: What were the discussions about? What conclusions did you reach? Or are you not finished yet? If not, how's it going?